Mixed-effects conditional logistic regression is demonstrably more effective for studying qualitative differences in longitudinal pollution data and their implications for heterogeneous subgroups. This study shows that conditional logistic regression is a robust evaluation method for environmental studies, through an analysis of environmental pollution as a function of oil production and environmental factors. Theoretically, the primary objective of model selection in this research is to identify the candidate model that is optimal for the conditional design: one that achieves generalizability, goodness of fit, and parsimony while balancing bias against variance. In practice, however, it is more realistic to capture the most significant parameters of the research design through the best-fitting candidate model. Simulation studies demonstrate that mixed-effects conditional logistic regression is more accurate for pollution studies, and that fixed-effects conditional logistic regression models can generate flawed conclusions, because the mixed-effects model provides detailed insight into clusters that the fixed-effects model largely overlooks.
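As an illustrative sketch of the kind of estimation this abstract refers to (an assumed matched-pair setup, not the study's data or code): for one case and one control per stratum, the conditional logistic likelihood reduces to logistic regression on within-pair covariate differences, which a few Newton steps can maximize.

```python
import numpy as np

rng = np.random.default_rng(0)
n_pairs, beta_true = 400, 1.0

# simulate one case and one control per stratum (matched-pair design)
x1 = rng.normal(size=n_pairs)  # covariate of pair member 1
x2 = rng.normal(size=n_pairs)  # covariate of pair member 2
# conditional on exactly one case per pair, member 1 is the case with
# probability sigmoid(beta * (x1 - x2))
p1 = 1.0 / (1.0 + np.exp(-beta_true * (x1 - x2)))
member1_is_case = rng.random(n_pairs) < p1
d = np.where(member1_is_case, x1 - x2, x2 - x1)  # case-minus-control difference

# conditional log-likelihood: sum log sigmoid(beta * d); maximize by Newton
beta = 0.0
for _ in range(25):
    p = 1.0 / (1.0 + np.exp(-beta * d))
    grad = np.sum((1.0 - p) * d)            # first derivative
    hess = -np.sum(p * (1.0 - p) * d ** 2)  # second derivative (concave)
    beta -= grad / hess

print(beta)  # estimate of the within-stratum log-odds effect, near beta_true
```

The mixed-effects version discussed in the abstract would add a random effect per cluster on top of this conditional likelihood; the sketch shows only the fixed-effects core.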
As is well known, transportation studies are regarded as among the most important and most difficult areas of research, and one of their difficulties arises from the process of data updating: the researcher faces the problem of balancing old data against the collection of new data.
This research presents a question that can be summarized as follows: can we use old data, after updating it, as an alternative? Or must researchers collect new data for their studies so that it reflects the present situation, even though security, economic, and time difficulties sometimes prevent them from completing their work?
The research used two kinds of data: old data belonging to the period (1998), and new data …
Summary
In this research, we examined factorial experiments and tested the significance of the main effects, the factor interactions, and their simple effects using the F test (ANOVA) to analyze the data of a factorial experiment. The analysis of variance requires several assumptions to hold; when one of these conditions is violated, the data are transformed in order to satisfy the conditions of the analysis of variance. It has been noted, however, that these transformations do not always produce accurate results, so we resort to non-parametric tests or methods, which serve as a solution or alternative to the parametric tests. These methods …
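A minimal numpy sketch of the factorial ANOVA decomposition this summary describes, on a hypothetical 2×3 design with replicates (not the experiment's data):

```python
import numpy as np

rng = np.random.default_rng(1)
a_levels, b_levels, r = 2, 3, 5
# cell means with a real effect of factor A only
mu = np.array([[10.0, 10.0, 10.0], [13.0, 13.0, 13.0]])
y = mu[:, :, None] + rng.normal(scale=1.0, size=(a_levels, b_levels, r))

grand = y.mean()
mean_a = y.mean(axis=(1, 2))  # factor A level means
mean_b = y.mean(axis=(0, 2))  # factor B level means
cell = y.mean(axis=2)         # cell means

# sums of squares for main effects, interaction, and error
ss_a = b_levels * r * np.sum((mean_a - grand) ** 2)
ss_b = a_levels * r * np.sum((mean_b - grand) ** 2)
ss_ab = r * np.sum((cell - mean_a[:, None] - mean_b[None, :] + grand) ** 2)
ss_err = np.sum((y - cell[:, :, None]) ** 2)

df_err = a_levels * b_levels * (r - 1)
f_a = (ss_a / (a_levels - 1)) / (ss_err / df_err)
print(f_a)  # large F for factor A, reflecting its real effect
```

When the normality or equal-variance assumptions fail, the same layout can instead be fed to a rank-based (non-parametric) test, as the summary notes.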
Variable selection in Poisson regression with high-dimensional data has been widely used in recent years. In this paper we propose using a penalty function based on the arctangent (Atan) function. The Atan estimator was compared with the Lasso and adaptive Lasso. A simulation study and an application show that the Atan estimator has an advantage in coefficient estimation and variable selection.
The aim of this research is to determine the most important factors that lead to preeclampsia, and to find suitable solutions for eliminating and avoiding these factors in order to prevent preeclampsia. To achieve this, a case-study sample of (40) patients from the Medical City - Oncology Teaching Hospital was used, with data collected by a questionnaire containing (17) candidate causes. The statistical package (SPSS) was used to compare the results of the data analysis through two methods, a radial basis function network and factor analysis. Important results were obtained: the two methods identified the same factors as the direct causes of preeclampsia …
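A hedged sketch of the radial basis function network method mentioned above, on synthetic stand-in data (the study's questionnaire data are not reproduced here): Gaussian hidden units at fixed centres, with output weights fitted by least squares.

```python
import numpy as np

rng = np.random.default_rng(4)
# toy binary outcome driven by 2 risk-factor scores (illustrative only)
X = rng.normal(size=(80, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

# RBF network: Gaussian hidden layer at centres sampled from the data
centres = X[rng.choice(len(X), 10, replace=False)]
width = 1.0
d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=-1)
H = np.exp(-d2 / (2 * width**2))            # hidden-layer activations
w = np.linalg.lstsq(H, y, rcond=None)[0]    # linear output weights
pred = (H @ w > 0.5).astype(float)
print((pred == y).mean())  # training accuracy of the sketch
```

In the study's setting, the hidden-to-output weights would indicate which questionnaire factors drive the predicted outcome, to be compared against the factor-analysis loadings.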
Gross domestic product (GDP) is an important measure of the size of an economy's production. Economists use it to gauge decline and growth in countries' economies, and to rank countries and compare them with one another. The research aims to describe and analyze GDP for the public and private sectors during the period from 1980 to 2015, and then to forecast GDP in subsequent years up to 2025. To achieve this goal, two methods were used: linear and nonlinear regression, and time-series analysis with Box-Jenkins models, using the statistical packages (Minitab17) and (GRETLW32) to extract the results, and then comparing the two methods …
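The Box-Jenkins workflow referred to above can be sketched for an assumed AR(1) model on a synthetic series standing in for annual GDP (the actual GDP data and orders identified in the study are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(3)
# synthetic AR(1)-with-drift series: y[t] = c + phi*y[t-1] + noise
n, phi, c = 60, 0.7, 1.0
y = np.empty(n)
y[0] = c / (1 - phi)  # start at the stationary mean
for t in range(1, n):
    y[t] = c + phi * y[t - 1] + rng.normal(scale=0.3)

# Box-Jenkins estimation step: fit AR(1) by conditional least squares
X = np.column_stack([np.ones(n - 1), y[:-1]])
c_hat, phi_hat = np.linalg.lstsq(X, y[1:], rcond=None)[0]

# forecasting step: iterate the fitted recursion h steps ahead
h, fc = 10, [y[-1]]
for _ in range(h):
    fc.append(c_hat + phi_hat * fc[-1])
print(phi_hat, fc[1:4])  # estimated coefficient and first forecasts
```

A full Box-Jenkins analysis would first difference the series if needed and choose the AR/MA orders from the autocorrelation functions; this sketch shows only the fit-then-iterate forecasting mechanics.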
A group acceptance sampling plan for testing products was designed for the case where the lifetime of an item follows a log-logistic distribution. The minimum number of groups (k) required for a given group size and acceptance number is determined when various values of the consumer's risk and the test termination time are specified. All results for these sampling plans and the probability of acceptance are presented in tables.
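Under one common formulation of a group acceptance sampling plan (accept the lot when the total failures among g groups of r items by the termination time t0 do not exceed c; the parameter values below are illustrative, not the paper's tables), the acceptance probability follows directly from the log-logistic CDF:

```python
from math import comb

def loglogistic_cdf(t, alpha, beta):
    """P(T <= t) for a log-logistic lifetime with scale alpha, shape beta."""
    return 1.0 / (1.0 + (alpha / t) ** beta)

def prob_accept(g, r, c, t0, alpha, beta):
    """Accept if at most c of the g*r items on test fail by time t0."""
    p = loglogistic_cdf(t0, alpha, beta)  # per-item failure probability
    n = g * r
    # binomial probability of c or fewer failures
    return sum(comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(c + 1))

pa = prob_accept(g=5, r=4, c=2, t0=0.5, alpha=1.0, beta=2.0)
print(round(pa, 3))
```

The design step described in the abstract then searches for the smallest k (number of groups) such that this probability meets the stated consumer's risk constraint.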
Microwave heating is caused by the ability of materials to absorb microwave energy and convert it to heat. The aim of this study is to determine the changes that occur when the high-strength aluminum alloy AA7075-T73 is heat-treated in a microwave furnace in different mediums (dry and acidic solution) for different times (30 and 60 minutes), in terms of mechanical properties and fatigue life. The experimental results showed variations in the mechanical properties (ultimate stress, yield stress, fatigue strength, fatigue life, and hardness) with the variation in mediums and duration times when compared with untreated samples. The ultimate stress, yield stress, and fatigue strength …
One of the most difficult issues in the history of communication technology is the transmission of secure images. On the internet, photos are used and shared by millions of individuals for both private and business purposes. One way to achieve safe image transfer over a network is to use encryption methods that change the original image into an unintelligible or scrambled version. Cryptographic approaches based on chaotic logistic theory provide several new and promising options for developing secure image-encryption methods. The main aim of this paper is to build a secure system for encrypting grayscale and color images. The proposed system consists of two stages; the first stage is the encryption process, in which the keys are generated …
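A minimal sketch of the chaotic-logistic idea (an illustrative keystream-XOR step only; this is not the paper's two-stage system and not a vetted cipher):

```python
import numpy as np

def logistic_keystream(x0, r, n):
    """Iterate the chaotic logistic map x <- r*x*(1-x) from key (x0, r)
    and quantize each state to a byte, giving a key stream."""
    x, out = x0, np.empty(n, dtype=np.uint8)
    for i in range(n):
        x = r * x * (1.0 - x)
        out[i] = int(x * 256) % 256
    return out

# toy 4x4 grayscale "image"
img = np.arange(16, dtype=np.uint8).reshape(4, 4)
ks = logistic_keystream(x0=0.3141, r=3.9999, n=img.size).reshape(img.shape)
enc = img ^ ks  # encrypt by XOR with the chaotic key stream
dec = enc ^ ks  # XOR with the same key stream recovers the image
print(np.array_equal(dec, img))  # True
```

The sensitivity of the logistic map to (x0, r) is what makes the key stream hard to reproduce without the exact key, which is the property such schemes rely on.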
Today, problems of spatial data integration have been further complicated by the rapid development of communication technologies and the increasing number of data sources available on the World Wide Web. Web-based geospatial data sources can be managed by different communities, and the data themselves can vary in quality, coverage, and purpose. Integrating such multiple geospatial datasets remains a challenge for geospatial data consumers. This paper concentrates on the integration of geometric and classification schemes for official data, such as Ordnance Survey (OS) national mapping data, with volunteered geographic information (VGI), such as data derived from the OpenStreetMap (OSM) project. Useful descriptions …
This study involved the treatment of textile wastewater contaminated with Direct Blue 15 dye (DB15) using a heterogeneous photo-Fenton-like process. Bimetallic iron/copper nanoparticles loaded on bentonite clay were used as heterogeneous catalysts, prepared via a liquid-phase reduction method using eucalyptus leaf extract (E-Fe/Cu@BNPs). The resulting nanoparticles (NPs) were characterized by SEM, BET, and FTIR; they were porous and spherical, with a specific surface area of 28.589 m²/g. The effect of the main parameters on the photo-Fenton-like degradation of DB15 was investigated in batch and continuous fixed-bed systems. In batch mode, pH, H2O2 dosage, DB15 concentration …