Purpose: The research aims to estimate models for phenomena that follow the logic of circular (angular) data, accounting for the 24-hour periodicity in measurement. Theoretical framework: The regression model is developed to account for the periodic nature of the circular scale, considering periodicity in the dependent variable y, the explanatory variables x, or both. Design/methodology/approach: Two estimation methods were applied: a parametric model, represented by the Simple Circular Regression (SCR) model, and a nonparametric model, represented by the Nadaraya-Watson (NW) circular regression model. The analysis used real data from 50 patients at Al-Kindi Teaching Hospital in Baghdad. Findings: The Mean Circular Error (MCE) criterion was used to compare the two models, leading to the conclusion that the nonparametric NW circular model outperformed the parametric model in estimating the parameters of the circular regression model. Research, Practical & Social Implications: The recommendation emphasized using the Nadaraya-Watson nonparametric smoothing method to capture the nonlinearity in the data. Originality/value: The results indicated that the NW circular model outperformed the parametric model. Paper type: Research paper.
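As an illustration of the nonparametric estimator named above, the following is a minimal sketch of a Nadaraya-Watson circular smoother with a von Mises kernel, together with the Mean Circular Error (MCE) criterion. The kernel concentration `kappa` and the function names are illustrative choices, not the paper's implementation or its fitted values.

```python
import math

def nw_circular(x0, xs, ys, kappa=4.0):
    """Nadaraya-Watson estimate of a circular response at angle x0.

    Weights come from a von Mises kernel in the circular predictor;
    the weighted circular mean of the responses is taken via atan2."""
    w = [math.exp(kappa * math.cos(x0 - xi)) for xi in xs]
    s = sum(wi * math.sin(yi) for wi, yi in zip(w, ys))
    c = sum(wi * math.cos(yi) for wi, yi in zip(w, ys))
    return math.atan2(s, c)

def mean_circular_error(y_true, y_pred):
    """MCE = average of 1 - cos(residual); 0 indicates a perfect fit."""
    return sum(1.0 - math.cos(a - b)
               for a, b in zip(y_true, y_pred)) / len(y_true)
```

A smaller MCE means the fitted angles sit closer to the observed angles on the circle, which is the sense in which the paper ranks the two models.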
Multiple linear regression is concerned with studying and analyzing the relationship between a dependent variable and a set of explanatory variables; from this relationship the values of the variables are predicted. In this paper, a multiple linear regression model with three covariates was studied in the presence of autocorrelated errors, where the random errors follow an exponential distribution. Three methods were compared: generalized least squares, the robust M-estimator, and the robust Laplace method. Simulation studies were employed, and the mean squared error criterion was calculated for sample sizes (15, 30, 60, 100). Further, the best method was applied to real experimental data representing the varieties of …
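To make the simulation setup concrete, here is a small sketch in the spirit of the study: a regression with AR(1)-autocorrelated errors driven by centered exponential innovations, estimated by ordinary least squares and by a quasi-differencing (Cochrane-Orcutt-style) GLS step. It uses one covariate rather than three, and all parameter values are illustrative assumptions, not the paper's design.

```python
import random

def ols_simple(x, y):
    """Ordinary least squares for y = b0 + b1*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b1 = sxy / sxx
    return my - b1 * mx, b1

def gls_ar1(x, y, rho):
    """Quasi-differenced GLS for AR(1) errors with known rho;
    the intercept is back-transformed by dividing by (1 - rho)."""
    xs = [x[t] - rho * x[t - 1] for t in range(1, len(x))]
    ys = [y[t] - rho * y[t - 1] for t in range(1, len(y))]
    a, b = ols_simple(xs, ys)
    return a / (1.0 - rho), b

def simulate(n, rho=0.6, beta0=1.0, beta1=2.0, seed=1):
    """y = beta0 + beta1*x + u, with AR(1) errors u_t = rho*u_{t-1} + e_t,
    where e_t is a centered exponential innovation (mean 0, variance 1)."""
    rng = random.Random(seed)
    x = [rng.uniform(0.0, 10.0) for _ in range(n)]
    u, y = 0.0, []
    for xi in x:
        u = rho * u + (rng.expovariate(1.0) - 1.0)
        y.append(beta0 + beta1 * xi + u)
    return x, y
```

Repeating such a simulation over the stated sample sizes and averaging squared estimation errors reproduces the kind of MSE comparison the abstract describes.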
In the current paradigms of information technology, cloud computing is the most essential kind of computing service. It satisfies the needs of high-volume customers, provides flexible computing capabilities for a range of applications such as database archiving and business analytics, and supplies extra computing resources that deliver financial value for cloud providers. The purpose of this investigation is to assess the viability of auditing data remotely within a cloud computing setting. The theory behind cloud computing and distributed storage systems is discussed, as well as the method of remote data auditing. This research addresses how to safeguard data that is outsourced and stored in cloud serv…
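The core idea of remote data auditing (checking that an untrusted server still holds your data without downloading it) can be sketched with a simple hash-based challenge-response round. This is a generic illustration under my own assumptions, not the specific auditing scheme the paper evaluates.

```python
import hashlib
import os

def precompute_challenges(data, n_challenges=5):
    """Auditor side, before outsourcing: pick random nonces and record
    the expected digests, so the file itself need not be retained."""
    table = []
    for _ in range(n_challenges):
        nonce = os.urandom(16)
        expected = hashlib.sha256(nonce + data).hexdigest()
        table.append((nonce, expected))
    return table

def prove_possession(stored_data, nonce):
    """Cloud side: answer a challenge by hashing nonce || stored data."""
    return hashlib.sha256(nonce + stored_data).hexdigest()

def audit(table, response_fn):
    """Auditor side: spend one unused challenge and verify the response."""
    nonce, expected = table.pop()
    return response_fn(nonce) == expected
```

Each precomputed challenge is single-use; production schemes avoid this limitation with homomorphic authenticators, but the verification logic follows the same challenge-response shape.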
Multicollinearity is a problem that occurs whenever two or more predictor variables are correlated with each other. It constitutes a breach of one of the basic assumptions of the ordinary least squares method and results in biased estimates. Several methods have been proposed to handle this problem. In this research, comparisons are made between the biased method and the unbiased method, together with a Bayesian method using the Gamma distribution, in addition to the ordinary least squares metho…
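Before choosing among such estimators, multicollinearity is typically diagnosed. A minimal sketch for the two-predictor case: the variance inflation factor (VIF) reduces to 1/(1 - r²), where r is the Pearson correlation between the predictors; VIF near 1 means no inflation, while large values (often above 5 or 10) flag a problem. The data below are illustrative.

```python
import math

def pearson_r(a, b):
    """Pearson correlation coefficient between two samples."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / math.sqrt(va * vb)

def vif_pair(a, b):
    """Variance inflation factor for two predictors: 1 / (1 - r^2)."""
    r = pearson_r(a, b)
    return 1.0 / (1.0 - r * r)
```

With more than two predictors, the same idea applies by regressing each predictor on all the others and using that regression's R² in place of r².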
In this paper, a comparison is made between the regression tree model and the negative binomial regression model. These models represent two types of statistical methods: the first, a nonparametric method, is the regression tree, which aims to divide the data set into subgroups; the second, a parametric method, is negative binomial regression, which is usually used with medical data, especially for large sample sizes. The methods are compared according to the mean squared error (MSE), using simulation experiments and taking different sample …
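The subgroup-splitting idea behind the regression tree can be shown at its smallest scale: a single-split "stump" that chooses the threshold minimizing the within-group sum of squared errors. This is a didactic sketch of the splitting criterion, not the paper's full tree procedure, and the data in the test are invented.

```python
def best_split(x, y):
    """Single-split regression tree ("stump"): choose the threshold on x
    that minimizes the total within-group sum of squared errors, and
    predict each side by its group mean."""
    order = sorted(range(len(x)), key=lambda i: x[i])
    best = None
    for k in range(1, len(x)):
        left = [y[i] for i in order[:k]]
        right = [y[i] for i in order[k:]]
        ml, mr = sum(left) / len(left), sum(right) / len(right)
        sse = (sum((v - ml) ** 2 for v in left)
               + sum((v - mr) ** 2 for v in right))
        if best is None or sse < best[0]:
            thr = (x[order[k - 1]] + x[order[k]]) / 2
            best = (sse, thr, ml, mr)
    return best[1], best[2], best[3]  # threshold, left mean, right mean
```

A full regression tree applies this search recursively to each resulting subgroup until a stopping rule is met.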
Petrophysical properties, including the volume of shale, porosity, and water saturation, are significant parameters for petroleum companies in evaluating reservoirs and determining hydrocarbon zones. These can be obtained through conventional petrophysical calculations from well log data such as gamma ray, sonic, neutron, density, and deep resistivity. The well logging operations of the targeted limestone Mishrif reservoirs in the Ns-X Well, Nasiriya Oilfield, south of Iraq, could not be completed due to problems related to the well condition. The gamma ray log was the only log recorded through the cased borehole. Therefore, evaluating the reservoirs and estimating the perforation zones could not be performed, and the drilled well was …
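Since gamma ray was the only log available, the standard first step it supports is a shale-volume estimate. A common workflow (one assumption here; the paper may use a different correction) computes the gamma ray index and then applies the Larionov correction for older, consolidated rocks; the clean-sand and shale baselines below are illustrative values.

```python
def gamma_ray_index(gr, gr_clean, gr_shale):
    """Linear gamma ray index: 0 at the clean-sand baseline,
    1 at the shale baseline (readings in API units)."""
    return (gr - gr_clean) / (gr_shale - gr_clean)

def vshale_larionov_older(igr):
    """Larionov correction for older (pre-Tertiary) consolidated rocks:
    Vsh = 0.33 * (2**(2*IGR) - 1)."""
    return 0.33 * (2.0 ** (2.0 * igr) - 1.0)
```

Using the index directly as Vsh overstates shale content; the Larionov curve bends it downward, which is why the correction is applied between the two baselines.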
Background: The study's objective was to estimate the effects of radiation on testosterone-related hormones and blood components in prostate cancer patients. Materials and Method: This study investigates the effects of radiation on 20 male prostate cancer patients at the Middle Euphrates Oncology Centre. Blood samples were collected before and after radiation treatment, with a total dose of 60-70 Gy, and the blood parameters were analyzed. The hospital laboratory conducted the blood analysis using an analyzer (Diagon D-cell5D) to test blood components before and after radiation. Hormonal examinations included testosterone levels, using the VIDAS 30 multiparametric immunoassay system. Results: The study assessed the socio-demogra…
Merging biometrics with cryptography has become more familiar, and a rich scientific field has emerged for researchers. Biometrics adds a distinctive property to security systems, because biometric features are unique and individual for every person. In this study, a new method is presented for ciphering data based on fingerprint features. This is done by placing the plaintext message, at positions derived from minutiae extracted from a fingerprint, into a generated random text file, regardless of the size of the data. The proposed method can be explained in three scenarios. In the first scenario, the message was placed inside the random text directly at the positions of the minutiae; in the second scenario, the message was encrypted with a chosen word before ciphering …
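The first scenario (message characters scattered directly into random text at minutiae-derived positions) can be sketched as follows. The positions here are arbitrary integers standing in for minutiae coordinates, and the buffer size and seed are illustrative assumptions, not the paper's parameters.

```python
import random

def embed_message(message, positions, length=1000, seed=7):
    """Hide message characters inside a pseudo-random lowercase text
    buffer at the given minutiae-derived positions; every other cell
    stays random filler, so the result looks like uniform noise text."""
    rng = random.Random(seed)
    buf = [chr(rng.randrange(ord('a'), ord('z') + 1)) for _ in range(length)]
    for ch, pos in zip(message, positions):
        buf[pos % length] = ch
    return ''.join(buf)

def extract_message(cipher_text, positions, count):
    """Recover the message by reading the same positions back, which
    requires the receiver to derive the same minutiae positions."""
    return ''.join(cipher_text[pos % len(cipher_text)]
                   for pos in positions[:count])
```

The security of such a scheme rests entirely on the secrecy of the position sequence; the second scenario in the abstract strengthens this by encrypting the message with a keyword before embedding.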
Compaction of triticale grain with three moisture contents (8%, 12%, and 16% wet basis) was measured at five applied pressures (0, 7, 14, 34, and 55 kPa). Bulk density increased with increasing pressure for all moisture contents and was significantly (p < 0.0001) dependent on both moisture content and applied pressure. A Verhulst logistic equation was found to model the changes in bulk density of triticale grain with an R² of 0.986. The model showed similar beha…
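For reference, the Verhulst (logistic) form mentioned above gives a saturating density-pressure curve. A minimal sketch, with parameter values that are purely illustrative (the paper's fitted coefficients are not reproduced here):

```python
import math

def verhulst_density(pressure, rho_max, a, k):
    """Logistic (Verhulst) bulk-density curve:
    rho(P) = rho_max / (1 + a * exp(-k * P)).
    rho_max is the asymptotic density, a sets the zero-pressure
    density, and k controls how fast compaction saturates."""
    return rho_max / (1.0 + a * math.exp(-k * pressure))
```

The curve rises monotonically from rho_max/(1 + a) at zero pressure toward the asymptote rho_max, matching the observed behavior of bulk density leveling off at high applied pressure.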
A field experiment was conducted to measure slippage, effective field capacity, field efficiency, soil volume disturbed, and specific tillage productivity in silty clay loam soil at a depth of 18 cm in Baghdad, Iraq. A split-split plot design under a randomized complete block design with three replications was used, with the Least Significant Difference test at 5%. Three factors were used in this experiment: two types of plows (chisel and disk), which represented the main plots; three tire inflation pressures (1.1, 1.8, and 2.7 bar) as the second factor; and three forward tillage speeds (2.35, 4.25, and 6.50 km/h) as the third factor. Results show that the chisel plow recorded the best performance parameters …
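Two of the measured quantities have standard defining formulas, sketched below with illustrative values (implement width and efficiency are assumptions, not the experiment's figures): wheel slip compares theoretical and actual travel speed, and effective field capacity combines speed, working width, and field efficiency.

```python
def wheel_slip_percent(theoretical_speed, actual_speed):
    """Slip (%) = (Vt - Va) / Vt * 100, where Vt is the no-slip
    (theoretical) speed and Va the actual ground speed."""
    return (theoretical_speed - actual_speed) / theoretical_speed * 100.0

def effective_field_capacity(speed_kmh, width_m, field_eff):
    """Effective field capacity in ha/h:
    (speed [km/h] * width [m] / 10) * field efficiency."""
    return speed_kmh * width_m / 10.0 * field_eff
```

Higher tire inflation pressure typically raises slip on tilled soil, which is one mechanism by which the pressure factor in the experiment affects field capacity.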