Mixed-effects conditional logistic regression is particularly effective for studying qualitative differences in longitudinal pollution data and their implications for heterogeneous subgroups. This study argues that conditional logistic regression is a robust evaluation method for environmental studies, through an analysis of environmental pollution as a function of oil production and environmental factors. Theoretically, the primary objective of model selection in this research is to identify the candidate model that is optimal for the conditional design: it should achieve generalizability, goodness of fit, and parsimony, and strike a balance between bias and variance. In practice, however, it is more realistic to capture the most significant parameters of the research design through the best-fitting candidate model. Simulation studies demonstrate that mixed-effects conditional logistic regression is more accurate for pollution studies, and that fixed-effects conditional logistic regression models can generate flawed conclusions, because the mixed-effects model provides detailed insight into cluster-level variation that the fixed-effects model largely overlooks.
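The mixed-effects model itself requires specialized software, but the fixed-effects conditional likelihood it extends can be sketched directly: for a 1:1 matched design it reduces to a logistic model, without intercept, on within-pair covariate differences. The data, parameter values, and Newton-Raphson fit below are synthetic illustrations, not the study's models or dataset.

```python
import numpy as np

# Hypothetical 1:1 matched pairs.  The conditional likelihood depends only on
# the within-pair covariate difference d = x_case - x_control:
#     P(first member is the case | pair) = 1 / (1 + exp(-beta * d))
rng = np.random.default_rng(0)
true_beta = 0.8
n = 500
d = rng.normal(size=n)                        # within-pair covariate differences
p = 1.0 / (1.0 + np.exp(-true_beta * d))
y = (rng.random(n) < p).astype(float)         # 1 if the first member is the case

# Newton-Raphson on the conditional log-likelihood (logistic, no intercept).
beta = 0.0
for _ in range(25):
    p_hat = 1.0 / (1.0 + np.exp(-beta * d))
    grad = np.sum((y - p_hat) * d)            # score
    hess = -np.sum(p_hat * (1.0 - p_hat) * d * d)  # observed information (negated)
    beta -= grad / hess
print(round(beta, 3))                         # estimate near true_beta = 0.8
```

A mixed-effects version would add a cluster-level random coefficient on top of this likelihood, which is where the subgroup heterogeneity discussed above enters.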
In this paper, effect size measures are discussed that are useful in many estimation processes for the direct effect and its relation to the indirect and total effects. In addition, an algorithm is proposed to calculate a suggested measure of effect size, defined as the ratio of the direct effect to the coefficient estimated by regressing the dependent variable on the mediator variable without including the independent variable in the model. This algorithm demonstrates that such a regression equation can be used in mediation analysis, where the mediator and the independent variable are usually entered together when the dependent variable is regressed on them. The algorithm also shows how the effect of the …
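The ratio described above can be illustrated on simulated data. The simple-mediation setup, coefficient values, and variable names below are illustrative assumptions, not the paper's exact algorithm: the direct effect c' comes from regressing Y on X and M together, while the comparison coefficient comes from regressing Y on the mediator M alone.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
x = rng.normal(size=n)                        # independent variable
m = 0.5 * x + rng.normal(size=n)              # mediator
y = 0.3 * x + 0.6 * m + rng.normal(size=n)    # dependent variable

def ols(cols, y):
    # Least-squares coefficients; an intercept is prepended as column 0.
    X = np.column_stack([np.ones(len(y)), *cols])
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Direct effect c': coefficient of x when y is regressed on x and m together.
c_prime = ols([x, m], y)[1]
# Coefficient of m when y is regressed on the mediator alone (no x in the model).
b_star = ols([m], y)[1]
# Suggested effect-size measure: ratio of the direct effect to that coefficient.
ratio = c_prime / b_star
print(round(c_prime, 3), round(b_star, 3), round(ratio, 3))
```

With these simulated coefficients the population values are c' = 0.3 and b* = 0.72, so the ratio settles near 0.42.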
Steganography is the technique of concealing secret data within other, everyday files of the same or a different type. Hiding data has become essential to digital information security. This work aims to design a stego method that can effectively hide a message inside the images of a video file. A video steganography model is proposed by training a model to hide a video (or images) within another video using convolutional neural networks (CNNs). Using a CNN in this approach, two main goals can be achieved for any steganographic method; the first, increasing security (making the hidden data hard to observe or break with a steganalysis program), was achieved in this work because the weights and architecture are randomized. Thus, …
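The CNN model itself cannot be reproduced from the abstract. As a point of reference, the classical least-significant-bit (LSB) scheme, the kind of baseline that steganalysis programs target and that CNN-based methods aim to improve on, hides one message bit per pixel; the toy frame and message below are made-up data.

```python
import numpy as np

def lsb_embed(cover: np.ndarray, bits: np.ndarray) -> np.ndarray:
    # Overwrite the least significant bit of each pixel with a message bit.
    flat = cover.flatten()
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits
    return flat.reshape(cover.shape)

def lsb_extract(stego: np.ndarray, n_bits: int) -> np.ndarray:
    # Read the message back out of the low-order bits.
    return stego.flatten()[:n_bits] & 1

rng = np.random.default_rng(2)
cover = rng.integers(0, 256, size=(8, 8), dtype=np.uint8)  # toy grayscale frame
msg = rng.integers(0, 2, size=16, dtype=np.uint8)          # 16 message bits
stego = lsb_embed(cover, msg)
recovered = lsb_extract(stego, msg.size)
# Each pixel changes by at most 1 gray level, so the cover looks unchanged.
print(np.array_equal(recovered, msg))
```

Unlike this fixed embedding rule, the proposed CNN approach learns the embedding, and its randomized weights and architecture act as an additional secret.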
Titanium dioxide nanoparticles (TiO2 NPs) are widely used in applications such as the plastics industry, the paper industry, paints, toothpaste, cosmetics, and sunscreens, and in various other aspects of daily life. Because of this vast range of applications, our daily exposure to these nanoparticles, and the lack of information on their effects on animal and human health, this study was designed to reveal the dose- and time-dependent effects of TiO2 NPs on the thyroid gland and kidney functions in male rats.
For this study, 54 Sprague-Dawley albino adult male rats were divided into three main groups of 18 rats each, treated for a particular duration (1, 2, and 4 weeks, respectively). Each group was subdivided i…
The tagged research observes and investigates the concepts of consistency and harmony in contemporary Iraqi painting (selected models) in order to reveal the mechanisms and rules of these two concepts in the artistic field and how they operate. How are the tools of consistency and harmony reflected in contemporary Iraqi painting? What is consistency, and what are its mechanisms and principles? Is consistency a quality of the unity of the work? Are there similarities between consistency and harmony? What is harmony, and what are its principles and rules? The second chapter included two topics: the first dealt with consistency and harmony between concept and significance, while the second dealt with the histor…
In this paper, the transfer function model in time series analysis is estimated using different methods: a parametric approach, represented by the conditional likelihood function method, and two nonparametric approaches, local linear regression and the cubic smoothing spline method. This research aims to compare these estimators for the nonlinear transfer function model using simulation, studying two models for the output variable and one model for the input variable, in addition t…
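To illustrate one of the nonparametric estimators named above, a minimal local linear regression with a Gaussian kernel can be sketched as follows; the bandwidth, sample size, and sine test function are illustrative choices, not the paper's simulation design.

```python
import numpy as np

def local_linear(x0, x, y, h):
    """Local linear estimate of E[y | x = x0] with a Gaussian kernel of bandwidth h."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)           # kernel weights
    X = np.column_stack([np.ones_like(x), x - x0])   # local design: intercept + slope
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y) # weighted least squares
    return beta[0]                                   # intercept = fitted value at x0

rng = np.random.default_rng(3)
x = np.sort(rng.uniform(0.0, 2.0 * np.pi, 400))
y = np.sin(x) + 0.1 * rng.normal(size=400)           # noisy nonlinear signal
fit = np.array([local_linear(x0, x, y, h=0.3) for x0 in x])
print(round(float(np.max(np.abs(fit - np.sin(x)))), 3))  # pointwise error of the smoother
```

The cubic smoothing spline alternative replaces the kernel weights with a roughness penalty, but both recover the same kind of smooth regression function.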
Dimension reduction and variable selection are very important topics in multivariate statistical analysis. When two or more predictor variables are linked by complete or incomplete regression relationships, a problem of multicollinearity occurs, which violates one of the basic assumptions of the ordinary least squares method and yields incorrect estimates.
Several methods have been proposed to address this problem, including partial least squares (PLS), which is used to reduce the dimensionality of the regression analysis by applying linear transformations that convert a set of highly correlated variables into a set of new independent variables that are uncorrelated with each other.
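A minimal sketch of how PLS builds such uncorrelated components is given below, using single-response NIPALS-style deflation; the synthetic predictors are collinear by construction, and all data and coefficients are made-up illustrations.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 300
z = rng.normal(size=n)
# Two nearly collinear predictors plus one independent predictor.
X = np.column_stack([z, z + 0.01 * rng.normal(size=n), rng.normal(size=n)])
y = X @ np.array([1.0, 1.0, 0.5]) + 0.1 * rng.normal(size=n)
X = X - X.mean(axis=0)                 # center, as PLS assumes
y = y - y.mean()

scores = []
for _ in range(2):                     # extract two PLS components
    w = X.T @ y
    w /= np.linalg.norm(w)             # weight vector
    t = X @ w                          # component scores
    p = X.T @ t / (t @ t)              # X loadings
    X = X - np.outer(t, p)             # deflate X ...
    y = y - t * (y @ t) / (t @ t)      # ... and y, removing what t explains
    scores.append(t)

t1, t2 = scores
print(abs(t1 @ t2))                    # successive scores are orthogonal
```

The deflation step is what guarantees the new components are uncorrelated, which is exactly the property that resolves the multicollinearity described above.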
The current study presents a simulation study and evaluation of MANET mobility models under a UDP traffic pattern, to determine the effects of this traffic pattern on mobility models in MANET. The study is implemented in NS-2.35 and measured by various performance metrics (throughput, AED (average end-to-end delay), dropped packets, NRL (normalized routing load), and PDF (packet delivery fraction)) under various parameters such as different velocities, environment areas, numbers of nodes, traffic rates, traffic sources, pause times, and simulation times. The AODV (Ad hoc On-demand Distance Vector) routing protocol was exploited, together with the RWP (Random Waypoint), GMM (Gauss-Markov Model), and RPGM (Reference Point Group Mobility) …
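The metrics listed above reduce to simple ratios over counts extracted from a simulation trace. The counts below are made-up placeholders, not results from the study's NS-2.35 runs.

```python
# Hypothetical counts from one simulation trace.
sent_data     = 1000   # data packets originated by the traffic sources
received_data = 950    # data packets delivered to their destinations
routing_pkts  = 380    # routing-control packets transmitted (e.g. by AODV)
total_delay_s = 47.5   # summed end-to-end delay of delivered packets (seconds)

pdf = 100.0 * received_data / sent_data   # Packet Delivery Fraction (%)
nrl = routing_pkts / received_data        # Normalized Routing Load
aed = total_delay_s / received_data       # Average End-to-end Delay (seconds)
dropped = sent_data - received_data       # dropped data packets

print(pdf, nrl, aed, dropped)  # → 95.0 0.4 0.05 50
```

Throughput is computed analogously as delivered bytes (or bits) divided by the simulation time.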
Encryption translates data into another form or symbol so that only people with access to the secret key or a password can read it. Encrypted data are generally referred to as ciphertext, while unencrypted data are known as plaintext. Entropy can be used as a measure of the number of bits needed to code the data of an image: as the pixel values within an image are dispersed across more gray levels, the entropy increases. The aim of this research is to compare the CAST-128 method with a proposed adaptive key against the RSA encryption method for video frames, to determine which method is more accurate, i.e., yields the highest entropy. The first method is achieved by applying the "CAST-128" and …
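The entropy criterion used for the comparison is Shannon entropy over the gray-level histogram. A sketch on two extreme toy images follows: a single-gray-level image has entropy 0, while an 8-bit image with a uniform gray-level distribution attains the maximum of 8 bits per pixel, which is why higher entropy indicates better-randomized ciphertext frames.

```python
import numpy as np

def image_entropy(img: np.ndarray) -> float:
    """Shannon entropy (bits per pixel) of an 8-bit grayscale image."""
    counts = np.bincount(img.ravel(), minlength=256)  # gray-level histogram
    p = counts / counts.sum()
    p = p[p > 0]                       # skip empty gray levels (0 * log 0 := 0)
    return float(-np.sum(p * np.log2(p)) + 0.0)

flat = np.full((16, 16), 128, dtype=np.uint8)             # one gray level only
uniform = np.arange(256, dtype=np.uint8).reshape(16, 16)  # every level once

print(image_entropy(flat), image_entropy(uniform))  # → 0.0 8.0
```

For real video frames the entropy falls between these extremes, and the method whose ciphertext frames land closer to 8 bits per pixel is judged the stronger one.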