Mixed-effects conditional logistic regression is evidently more effective for studying qualitative differences in longitudinal pollution data and their implications for heterogeneous subgroups. This study shows that conditional logistic regression is a robust evaluation method for environmental studies, through the analysis of environmental pollution as a function of oil production and environmental factors. It has been established theoretically that the primary objective of model selection in this research is to identify the candidate model that is optimal for the conditional design: the candidate model should achieve generalizability, goodness of fit, and parsimony, and strike a balance between bias and variability. In practice, however, it is more realistic to capture the most significant parameters of the research design through the best-fitted candidate model. Simulation studies demonstrate that mixed-effects conditional logistic regression is more accurate for pollution studies, while fixed-effects conditional logistic regression models can generate flawed conclusions, because the mixed-effects model provides detailed insights into clusters that the fixed-effects model largely overlooks.
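The core of the conditional design can be sketched in a few lines. The following is a minimal, hypothetical illustration (not the study's actual model or data): for 1:1 matched strata, the conditional logistic likelihood depends only on the within-pair covariate difference, so the coefficient can be fitted as a no-intercept logistic regression on those differences. All names and the simulated "exposure" covariate are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n_pairs, true_beta = 500, 1.2

# Within-pair covariate difference (case minus control), e.g. an exposure level.
diff = rng.normal(0.0, 1.0, n_pairs)

# Under the conditional model, P(first unit is the case | pair) is logistic in
# the covariate difference; flip the pairs the model labels the other way.
p = 1.0 / (1.0 + np.exp(-true_beta * diff))
diff = np.where(rng.random(n_pairs) < p, diff, -diff)

# Conditional log-likelihood for 1:1 matching: logistic regression on the
# differences with no intercept term.
def neg_log_lik(beta):
    return np.sum(np.log1p(np.exp(-beta * diff)))

beta_hat = minimize(neg_log_lik, x0=np.array([0.0])).x[0]
```

The recovered `beta_hat` should land near `true_beta`; a mixed-effects extension would add a random effect per cluster on top of this fixed coefficient.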
This study aims to identify the concept of web-based information systems, since it is one of the important topics usually neglected by our organizations; to design a web-based information system to manage the customer data of Al-Rasheed Bank, as a unified information system specialized for the customers' banking transactions with the bank; and to provide a suggested model for applying a virtual private network as a tool to protect the data transmitted through the web-based information system.
This study is considered important because it deals with one of the vital topics nowadays, namely: how to make it possible to use a distributed informat…
The purpose of this article is to improve the signal and minimize its noise by studying wavelet transforms and showing how to use the most effective ones for processing and analysis. We outline several transformation techniques, among them the discrete wavelet transformation, along with the methodology for applying them to remove noise from the signal based on the threshold value and the threshold functions: the lifting transformation, the wavelet transformation, and the packet discrete wavelet transformation. A comparison was made between them using AMSE, and the best was selected. When the aforementioned techniques were applied to actual data represented by each of the prices, it became evident that the lift…
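The threshold-based denoising idea described above can be sketched in plain NumPy. This is a hedged, minimal example (not the article's exact procedure): a one-level Haar discrete wavelet transform, soft thresholding of the detail coefficients, and the inverse transform.

```python
import numpy as np

# One-level Haar DWT denoising with a soft threshold (illustrative sketch).
def haar_denoise(signal, threshold):
    s = np.asarray(signal, dtype=float)
    approx = (s[0::2] + s[1::2]) / np.sqrt(2)   # low-pass (approximation)
    detail = (s[0::2] - s[1::2]) / np.sqrt(2)   # high-pass (noise-rich detail)
    # Soft threshold: shrink detail coefficients toward zero.
    detail = np.sign(detail) * np.maximum(np.abs(detail) - threshold, 0.0)
    # Inverse Haar transform.
    out = np.empty_like(s)
    out[0::2] = (approx + detail) / np.sqrt(2)
    out[1::2] = (approx - detail) / np.sqrt(2)
    return out

rng = np.random.default_rng(1)
clean = np.sin(np.linspace(0, 4 * np.pi, 256))
noisy = clean + rng.normal(0, 0.3, 256)
denoised = haar_denoise(noisy, threshold=0.5)
```

A multi-level transform, or the lifting and wavelet-packet variants the abstract mentions, would apply the same thresholding step at each decomposition level.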
The objective of this research is to determine the extent to which Iraqi and Arab companies apply the criteria of sustainability accounting and disclosure, and to analyze the content of the annual financial reports of the companies listed in the financial market to determine their compliance with the Sustainability Accounting Standards Board (SASB). Analysis of the annual reports shows that the commitment of telecommunications companies to implementing sustainability issues related to the telecommunications-services standard reached a general average of (54%) for the research sample, which means there is an acceptable degree of application of the standard. The highest level of reporting against the criterion was that of (Jordan Telec…
Connectedness of the chain of transmission is a condition for the validity of a hadith: each narrator must have heard from his sheikh. Some narrators claimed to have heard from one who narrated to them and was their contemporary; although such a narrator is innocent of the stigma of fraud, this claimed hearing has no basis in fact.
Abstract: The wavelet shrinkage estimator is an attractive technique for estimating nonparametric regression functions, but it is very sensitive to correlation in the errors. In this research, a low-degree polynomial model was used to address the boundary problem in wavelet shrinkage, together with level-dependent threshold values for the case of correlated errors, since these treat the coefficients at each level separately, unlike universal threshold values, which treat all levels simultaneously. The methods considered are VisuShrink, the False Discovery Rate method, Improvement Thresholding, and SureShrink. The study was conducted on real monthly data represented by the rates of theft crimes f…
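For context, the VisuShrink rule named above uses the universal threshold λ = σ√(2 ln n), with the noise level σ estimated robustly from the finest-level detail coefficients via the median absolute deviation (MAD). A minimal sketch of that computation, on simulated coefficients:

```python
import numpy as np

# Universal (VisuShrink) threshold from detail coefficients:
# lambda = sigma * sqrt(2 * ln(n)), sigma estimated by MAD / 0.6745.
def universal_threshold(detail_coeffs):
    d = np.asarray(detail_coeffs, dtype=float)
    sigma = np.median(np.abs(d - np.median(d))) / 0.6745  # robust noise estimate
    return sigma * np.sqrt(2.0 * np.log(d.size))

rng = np.random.default_rng(2)
detail = rng.normal(0.0, 1.0, 1024)  # pure-noise detail coefficients, sigma = 1
lam = universal_threshold(detail)
```

Level-dependent schemes, as used in the study for correlated errors, would compute a separate σ (and hence a separate λ) from the detail coefficients at each decomposition level instead of one global value.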
The estimation of the regular regression model requires several assumptions to be satisfied, such as linearity. One problem occurs when the regression curve is partitioned into two (or more) parts that are then joined by threshold point(s); this situation is regarded as a violation of the linearity of regression. The multiphase regression model has therefore received increasing attention as an alternative approach that describes the changing behavior of the phenomenon through threshold-point estimation. The maximum likelihood estimator (MLE) has been used for both the model and the threshold-point estimation. However, the MLE is not resistant to violations such as the existence of outliers or a heavy-tailed error distribution. The main goal of t…
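The two-phase idea can be sketched as follows. This is an illustrative simplification (a grid search over candidate change points minimizing the total residual sum of squares of two separately fitted lines), not the paper's MLE or robust procedure; all data are simulated.

```python
import numpy as np

# Two-phase (segmented) regression: choose the threshold (change point) that
# minimises the combined SSE of two separately fitted straight lines.
def fit_two_phase(x, y, min_seg=5):
    order = np.argsort(x)
    x, y = x[order], y[order]
    best = (np.inf, None)
    for k in range(min_seg, len(x) - min_seg):
        sse = 0.0
        for xs, ys in ((x[:k], y[:k]), (x[k:], y[k:])):
            coef = np.polyfit(xs, ys, 1)            # fit one line per segment
            sse += np.sum((ys - np.polyval(coef, xs)) ** 2)
        if sse < best[0]:
            best = (sse, x[k])
    return best[1]  # estimated threshold point

rng = np.random.default_rng(3)
x = np.linspace(0, 10, 200)
y = np.where(x < 6, 1.0 * x, 6.0 + 3.0 * (x - 6)) + rng.normal(0, 0.2, 200)
threshold = fit_two_phase(x, y)
```

With the break placed at x = 6 in the simulation, the grid search recovers a threshold close to 6.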
This study investigates asset returns within the Iraq Stock Exchange by employing both the Fama-MacBeth regression model and the Fama-French three-factor model. The research involves the estimation of cross-sectional regressions wherein model parameters are subject to temporal variation, and the independent variables function as proxies. The dataset comprises information from the first quarter of 2010 to the first quarter of 2024, encompassing 22 publicly listed companies across six industrial sectors. The study explores methodological advancements through the application of the Single Index Model (SIM) and Kernel Weighted Regression (KWR) in both time series and cross-sectional analyses. The SIM outperformed the K…
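The Fama-MacBeth two-step procedure the study employs can be sketched schematically. This is an illustrative simulation (not the study's Iraq Stock Exchange data): step 1 estimates each asset's beta from a time-series regression, and step 2 runs a cross-sectional regression of returns on those betas in every period and averages the slopes over time to estimate the risk premium.

```python
import numpy as np

rng = np.random.default_rng(4)
n_assets, n_periods, true_premium = 22, 57, 0.8

betas = rng.uniform(0.5, 1.5, n_assets)
market = rng.normal(0.0, 1.0, n_periods)
# Simulated returns: beta exposure to the market plus idiosyncratic noise,
# with an expected premium proportional to beta.
returns = (true_premium * betas[:, None]
           + betas[:, None] * market[None, :]
           + rng.normal(0.0, 0.3, (n_assets, n_periods)))

# Step 1: time-series betas (slope of each asset's returns on the market).
est_betas = np.array([np.polyfit(market, returns[i], 1)[0]
                      for i in range(n_assets)])

# Step 2: per-period cross-sectional slopes, then their time average.
slopes = [np.polyfit(est_betas, returns[:, t], 1)[0] for t in range(n_periods)]
premium_hat = np.mean(slopes)
```

Averaging the per-period slopes is what lets the parameters vary over time while still yielding a single premium estimate, which is the feature of the method the abstract highlights.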
A new distribution, the Epsilon Skew Gamma (ESΓ) distribution, first introduced by Abdulah [1], is used on near-Gamma data. We first restate the ESΓ distribution, its properties, and its characteristics; we then estimate its parameters using the maximum likelihood and moment estimators, and finally use these estimators to fit the data with the ESΓ distribution.
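To illustrate the moment-estimation step, here is a minimal sketch for the ordinary Gamma distribution (the ESΓ moment equations themselves are not reproduced here): matching the sample mean and variance to the theoretical moments gives shape = mean²/var and scale = var/mean.

```python
import numpy as np

# Method-of-moments estimators for the ordinary Gamma distribution
# (illustrative; the ESGamma case adds a skewness parameter).
def gamma_moment_estimators(data):
    m, v = np.mean(data), np.var(data)
    return m * m / v, v / m  # (shape, scale)

rng = np.random.default_rng(5)
sample = rng.gamma(shape=3.0, scale=2.0, size=5000)
shape_hat, scale_hat = gamma_moment_estimators(sample)
```

On the simulated sample the estimators recover the true shape (3.0) and scale (2.0) closely; the ESΓ estimators in the paper follow the same moment-matching logic with an extra equation for the skew parameter.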
In this paper, we investigate some of the most recent energy-efficient routing protocols for wireless body area networks. This technology has seen recent advancements in which wireless sensors are implanted in the human body to sense and measure body parameters such as temperature, heartbeat, and glucose level. These tiny wireless sensors gather body data and send it over a wireless network to the base station. The measurements are examined by the doctor or physician, and suitable treatment is suggested. The whole communication is carried out through routing protocols in a network environment. A routing protocol consumes energy while supporting non-stop communic…
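The energy trade-off at the heart of such protocols can be sketched with a toy relay-selection rule. This is a hypothetical example (node names, energies, and the distance-squared cost model are assumptions, not any specific protocol from the survey): among candidate relays, prefer the node with the most residual energy per unit of transmission cost.

```python
# Energy-aware next-hop selection sketch for a WBAN (illustrative only).
def choose_relay(relays):
    """relays: list of (name, residual_energy_joules, distance_m)."""
    def score(r):
        name, energy, dist = r
        # Radio cost grows roughly with distance squared, so favour close
        # nodes that still have energy left.
        return energy / (dist ** 2)
    return max(relays, key=score)[0]

candidates = [("wrist", 0.9, 0.4), ("chest", 0.5, 0.2), ("ankle", 0.8, 0.7)]
next_hop = choose_relay(candidates)  # "chest": closest, despite lower energy
```

Real WBAN protocols add factors such as link quality, node temperature, and data criticality to this score, which is what distinguishes the protocols the survey compares.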