Purpose: The research aims to estimate models for phenomena that follow the logic of circular (angular) data, accounting for the 24-hour periodicity in measurement. Theoretical framework: The regression model is developed to account for the periodic nature of the circular scale, whether the periodicity lies in the dependent variable y, the explanatory variables x, or both. Design/methodology/approach: Two estimation methods were applied: a parametric model, the Simple Circular Regression (SCR) model, and a nonparametric model, the Nadaraya-Watson (NW) circular regression model. The analysis used real data from 50 patients at Al-Kindi Teaching Hospital in Baghdad. Findings: The Mean Circular Error (MCE) criterion was used to compare the two models, leading to the conclusion that the NW circular model outperformed the parametric model in estimating the circular regression relationship. Research, Practical & Social Implications: The study recommends the Nadaraya-Watson nonparametric smoothing method for capturing nonlinearity in the data. Originality/value: The results indicated that the NW circular model outperformed the parametric model. Paper type: Research paper.
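As a rough illustration of the two ingredients named in the abstract, the sketch below implements a Nadaraya-Watson estimator with a von Mises-style kernel for a circular predictor, together with one common form of the Mean Circular Error. The kernel choice, the smoothing parameter `kappa`, and the function names are assumptions for illustration, not the paper's exact estimator.

```python
import numpy as np

def nw_circular(theta_train, y_train, theta_new, kappa=5.0):
    """Nadaraya-Watson estimate with a von Mises-style kernel for a
    circular predictor theta (radians). Larger kappa -> narrower
    kernel -> less smoothing. Illustrative sketch only."""
    theta_new = np.atleast_1d(np.asarray(theta_new, dtype=float))
    # the angle difference enters only through cos(.), so the weights
    # respect the 2*pi periodicity of the circular scale
    w = np.exp(kappa * np.cos(theta_new[:, None] - np.asarray(theta_train)[None, :]))
    return (w @ np.asarray(y_train)) / w.sum(axis=1)

def mce(y, y_hat):
    """One common form of the Mean Circular Error criterion:
    the average of 1 - cos(residual); 0 means a perfect fit."""
    return np.mean(1.0 - np.cos(np.asarray(y) - np.asarray(y_hat)))
```

Because the kernel depends on the angle only through a cosine, predictions at 0 and at 2π coincide, which is exactly the periodicity requirement the abstract describes; MCE can then compare such a fit against a parametric one on the same data.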
Net pay is one of the most important parameters used in determining the initial oil in place of a reservoir. It can be delineated using limiting values of the petrophysical properties of the reservoir; these limiting values are known as cutoffs. This paper provides insight into the application of the regression line method for estimating porosity, clay volume, and water saturation cutoff values in the Mishrif reservoir of the Missan oil fields. The study included 29 wells distributed across seven oilfields: Halfaya, Buzurgan, Dujaila, Noor, Fauqi, Amara, and Kumait.
This study is carried out by applying two types of linear regression: least squares and reduced major axis regression.
The Mishrif formation was …
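The two regression types named in the abstract differ only in how the slope is computed; a minimal numpy sketch, assuming simple x-y pairs (function names are ours, not the paper's):

```python
import numpy as np

def ols_line(x, y):
    """Ordinary least squares: slope = cov(x, y) / var(x)."""
    b = np.cov(x, y)[0, 1] / np.var(x, ddof=1)
    return b, y.mean() - b * x.mean()

def rma_line(x, y):
    """Reduced major axis: slope = sign(r) * sd(y) / sd(x),
    treating x and y symmetrically (both measured with error)."""
    r = np.corrcoef(x, y)[0, 1]
    b = np.sign(r) * np.std(y, ddof=1) / np.std(x, ddof=1)
    return b, y.mean() - b * x.mean()
```

On noisy data the RMA slope is steeper than the OLS slope by a factor of 1/|r|, which is why the two methods can give different cutoff values from the same cross-plot.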
Imaging by ultrasound (US) is an accurate and useful modality for assessing gestational age (GA), estimating fetal weight, and monitoring fetal growth during pregnancy; it is a routine part of prenatal care and can greatly impact obstetric management. Estimating GA is important in obstetric care, and making appropriate management decisions requires an accurate appraisal of GA. Accurate GA estimation may assist obstetricians in appropriately counseling women at risk of preterm delivery about likely neonatal outcomes, and it is essential for evaluating fetal growth and detecting intrauterine growth restriction. Many formulas are used to estimate fetal GA around the world, but they are not specific for …
Abstract:
In this research we discuss parameter estimation and variable selection in the Tobit quantile regression model in the presence of multicollinearity. We use the elastic net technique, an important technique for dealing with both multicollinearity and variable selection. Based on the data, we propose a Bayesian hierarchical Tobit model with four levels of prior distributions. We assume both tuning parameters are random variables and estimate them together with the other unknown parameters in the model. A simulation study is used to demonstrate the efficiency of the proposed method, and we then compare our approach with (Alhamzwi 2014 & standard QR). The results illustrate that our approach …
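The abstract's model is a Bayesian Tobit quantile regression; the sketch below shows only the elastic-net penalty itself, applied for simplicity to the ordinary squared loss via coordinate descent (a standard non-Bayesian formulation, with hypothetical names), to make concrete how the L1 part selects variables while the L2 part stabilizes correlated predictors:

```python
import numpy as np

def soft(z, g):
    """Soft-thresholding operator, the proximal map of the L1 penalty."""
    return np.sign(z) * np.maximum(np.abs(z) - g, 0.0)

def elastic_net_cd(X, y, lam, alpha=0.5, iters=200):
    """Elastic-net coordinate descent for standardized columns of X.
    alpha=1 is pure lasso, alpha=0 pure ridge. Squared loss shown for
    simplicity; the paper uses a Bayesian Tobit quantile model."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(iters):
        for j in range(p):
            # partial residual: remove every effect except variable j
            r_j = y - X @ beta + X[:, j] * beta[j]
            z = X[:, j] @ r_j / n
            beta[j] = soft(z, lam * alpha) / (X[:, j] @ X[:, j] / n + lam * (1 - alpha))
    return beta
```

The L1 term zeroes out weak coefficients exactly (variable selection), while the L2 term in the denominator shrinks the rest, which is the mechanism the abstract invokes against multicollinearity.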
The Bayesian approach offers promising features for classification and regression tree models: it takes advantage of prior information about the explanatory variables, both jointly and at every stage, in addition to obtaining posterior information at each node during construction of the classification tree. Although Bayesian estimates are generally accurate, the logistic model remains a strong competitor for binary responses through its flexibility and mathematical representation. Three methods of data processing are therefore used in this research: the logistic model, the classification and regression tree model, and the Bayesian regression tree model …
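For context on the parametric competitor, logistic regression for a binary response can be fitted by Newton-Raphson in a few lines (an illustrative numpy sketch with names of our choosing; the tree-based competitors in the abstract require substantially more machinery):

```python
import numpy as np

def logistic_fit(X, y, iters=25):
    """Logistic regression via Newton-Raphson (equivalently IRLS).
    X: (n, p) predictors, y: (n,) binary labels in {0, 1}."""
    X1 = np.column_stack([np.ones(len(y)), X])  # prepend intercept
    beta = np.zeros(X1.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X1 @ beta))    # fitted probabilities
        W = p * (1 - p)                         # IRLS weights
        H = X1.T @ (X1 * W[:, None])            # Hessian of the log-likelihood
        beta += np.linalg.solve(H, X1.T @ (y - p))  # Newton step
    return beta
```

Its log-odds are linear in the predictors, which is the "flexibility and mathematical representation" the abstract credits the logistic model with, in contrast to the piecewise-constant surfaces of trees.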
Crime is an unlawful activity of all kinds and is punished by law. Crimes affect a society's quality of life and economic development. With a large rise in crime globally, there is a need to analyze crime data to bring down the crime rate; this encourages the police and the public to take the required measures and restrict crime more effectively. The purpose of this research is to develop predictive models that can aid in crime pattern analysis and thus support the Boston department's crime prevention efforts. The geographical location factor has been adopted in our model because it is an influential factor in several situations, whether traveling to a specific area or living …
The phenomenon of poverty is a substantial topic that determines the future of societies and governments and the way they deal with education, health, and the economy. Poverty sometimes takes multidimensional forms through education and health. This research aims to study multidimensional poverty in Iraq using penalized regression methods to analyze big data sets from demographic surveys collected by the Central Statistical Organization in Iraq. We choose a classical penalized regression method, ridge regression, and another penalized method, the Smooth Integration of Counting and Absolute Deviation (SICA), to analyze big data sets related to the different forms of poverty in Iraq. Euclidean distance …
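Of the two penalties named above, ridge has a closed form that is easy to state (SICA, being nonconvex, requires iterative algorithms); a minimal sketch with synthetic data, not the survey data the abstract analyzes:

```python
import numpy as np

def ridge(X, y, lam):
    """Ridge estimator: beta = (X'X + lam * I)^{-1} X'y.
    lam = 0 recovers ordinary least squares; lam > 0 shrinks
    coefficients toward zero, stabilizing collinear designs."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)
```

The shrinkage makes ridge well suited to the wide, correlated predictor sets that arise from large demographic surveys, at the cost of a small bias.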
This research presents an unusual approach to analyzing simple linear regression via linear programming with the two-phase method known in Operations Research ("O.R."). The estimate is found by solving an optimization problem after adding artificial variables Ri. Another method for analyzing simple linear regression is also introduced, in which the conditional median of y is considered by minimizing the sum of absolute residuals, instead of the conditional mean of y, which depends on minimizing the sum of squared residuals; this is called "median regression". An iteratively reweighted least squares procedure, with weights based on the absolute residuals, is also performed as another method to …
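The iteratively reweighted least squares idea in the last sentence can be sketched as follows: giving each point weight 1/|residual| makes repeated weighted least squares approximate the L1 (median regression) objective. An illustrative sketch under our own naming, not the paper's exact algorithm:

```python
import numpy as np

def lad_irls(x, y, iters=50, eps=1e-6):
    """Median (LAD) regression by IRLS: weights 1/max(|r|, eps)
    turn the weighted squared loss sum w_i * r_i^2 into an
    approximation of the absolute loss sum |r_i|."""
    X1 = np.column_stack([np.ones(len(y)), x])   # intercept + slope
    beta = np.linalg.lstsq(X1, y, rcond=None)[0]  # OLS start
    for _ in range(iters):
        w = 1.0 / np.maximum(np.abs(y - X1 @ beta), eps)
        beta = np.linalg.solve(X1.T @ (X1 * w[:, None]), X1.T @ (w * y))
    return beta
```

Because the absolute loss downweights large residuals, the fit tracks the conditional median and is far less sensitive to outliers than ordinary least squares.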
This paper considers and proposes new estimators that depend on the sample and on prior information, in the cases that the two either are or are not equally important in the model. The prior information is described as linear stochastic restrictions. We study the properties and performance of these estimators compared to other common estimators, using the mean squared error as a criterion for goodness of fit. A numerical example and a simulation study are presented to explain the performance of the estimators.
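One classical estimator of this type is the Theil-Goldberger mixed estimator, which combines the sample model y = Xβ + ε with stochastic restrictions r = Rβ + v. A sketch under the assumption of known variances (the paper's proposed estimators may weight the two information sources differently):

```python
import numpy as np

def mixed_estimator(X, y, R, r, sigma2=1.0, V=None):
    """Theil-Goldberger mixed estimator: pools sample information
    (y = X beta + e, Var e = sigma2 * I) with stochastic prior
    restrictions (r = R beta + v, Var v = V)."""
    if V is None:
        V = np.eye(len(r))
    Vi = np.linalg.inv(V)
    A = X.T @ X / sigma2 + R.T @ Vi @ R   # combined precision
    b = X.T @ y / sigma2 + R.T @ Vi @ r   # combined score
    return np.linalg.solve(A, b)
```

When the restrictions carry no weight (V very large) the estimator reduces to OLS; when they are exact (V small) it approaches restricted least squares, so the variance matrices control how "important" each information source is.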
Abstract: The wavelet shrinkage estimator is an attractive technique for estimating nonparametric regression functions, but it is very sensitive to correlation in the errors. In this research, a low-degree polynomial model is used to address the boundary problem in wavelet shrinkage, together with level-dependent threshold values for the case of correlated errors, which treat the coefficients at each level separately, unlike global threshold values, such as VisuShrink, that treat all levels simultaneously. The methods considered are VisuShrink, the False Discovery Rate (FDR) method, Improvement Thresholding, and SureShrink. The study was conducted on real monthly data representing rates of theft crimes …
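The universal-threshold idea behind VisuShrink can be sketched with a one-level Haar transform. This is an illustrative sketch on a synthetic signal; the paper compares several methods with level-dependent thresholds on real crime data:

```python
import numpy as np

def haar_dwt(x):
    """One-level Haar transform: (approximation, detail) coefficients."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return a, d

def haar_idwt(a, d):
    """Inverse one-level Haar transform."""
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def visushrink(x):
    """Denoise with the universal threshold sigma * sqrt(2 log n),
    soft-thresholding the detail coefficients (one level shown)."""
    a, d = haar_dwt(x)
    sigma = np.median(np.abs(d)) / 0.6745          # robust noise scale (MAD)
    t = sigma * np.sqrt(2 * np.log(len(x)))        # universal threshold
    d = np.sign(d) * np.maximum(np.abs(d) - t, 0)  # soft thresholding
    return haar_idwt(a, d)
```

The universal threshold assumes independent errors; under correlated errors the detail coefficients no longer share one noise scale across levels, which is exactly why the level-by-level thresholds discussed in the abstract are needed.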