Purpose: The research aims to estimate regression models for phenomena measured on a circular (angular) scale, accounting for the 24-hour periodicity of the measurements. Theoretical framework: The regression model is developed to respect the periodic nature of the circular scale, whether the periodicity lies in the dependent variable y, in the explanatory variables x, or in both. Design/methodology/approach: Two estimation methods were applied: a parametric model, represented by the Simple Circular Regression (SCR) model, and a nonparametric model, represented by the Nadaraya-Watson (NW) circular regression model. The analysis used real data from 50 patients at Al-Kindi Teaching Hospital in Baghdad. Findings: The Mean Circular Error (MCE) criterion was used to compare the two models; the Nadaraya-Watson (NW) circular model outperformed the parametric model in estimating the circular regression relationship. Research, Practical & Social Implications: The study recommends the Nadaraya-Watson nonparametric smoothing method for capturing the nonlinearity in the data. Originality/value: The results indicate that the nonparametric NW circular model can outperform a parametric alternative on real data. Paper type: Research paper.
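As a concrete illustration of the nonparametric estimator named above, here is a minimal sketch of Nadaraya-Watson regression for circular data, assuming a von Mises kernel and an angular response; the simulated data, the concentration parameter kappa, and all function names are illustrative assumptions rather than the paper's actual implementation. The MCE below follows the usual definition, the mean of 1 - cos(residual).

```python
import numpy as np

def vm_kernel(d, kappa):
    # von Mises kernel on circular distances; the normalizing
    # constant cancels in the Nadaraya-Watson ratio
    return np.exp(kappa * np.cos(d))

def nw_circular(theta_grid, theta, y, kappa=4.0):
    # NW estimate of a circular response: average the sine and cosine
    # components with kernel weights, then recover the angle via atan2
    est = []
    for t in theta_grid:
        w = vm_kernel(t - theta, kappa)
        s = np.sum(w * np.sin(y)) / np.sum(w)
        c = np.sum(w * np.cos(y)) / np.sum(w)
        est.append(np.arctan2(s, c))
    return np.array(est)

def mce(y, y_hat):
    # Mean Circular Error: near 0 when predicted angles match observed ones
    return np.mean(1.0 - np.cos(y - y_hat))

# toy data on [0, 2*pi), e.g., times of day mapped to angles
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 50)
y = (theta + 0.3 * np.sin(theta) + rng.normal(0, 0.2, 50)) % (2 * np.pi)
print("MCE:", mce(y, nw_circular(theta, theta, y, kappa=4.0)))
```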
The Weibull distribution is a member of the Generalized Extreme Value (GEV) family of distributions and plays a crucial role in modeling extreme events in fields such as hydrology, finance, and the environmental sciences. Bayesian methods play a strong role in estimating the parameters of the GEV distribution because they can incorporate prior knowledge and handle small sample sizes effectively. In this research, we compare several shrinkage Bayesian estimation methods based on the squared error and linear exponential (LINEX) loss functions; they were adopted and compared by the Monte Carlo simulation method. The performance of these methods is assessed by their accuracy and computational efficiency in estimating the distribution's parameters.
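To make the two loss functions concrete, here is a minimal sketch of the corresponding Bayes estimators computed from posterior draws: the posterior mean under squared error loss, and -(1/a) log E[exp(-a*theta) | data] under LINEX loss. The gamma stand-in posterior and the value of a are illustrative assumptions, not the paper's model.

```python
import numpy as np

def bayes_sel(draws):
    # Bayes estimator under squared error loss: the posterior mean
    return np.mean(draws)

def bayes_linex(draws, a=1.0):
    # Bayes estimator under LINEX loss L(d) = exp(a*d) - a*d - 1:
    # theta_hat = -(1/a) * log E[exp(-a * theta) | data]
    return -np.log(np.mean(np.exp(-a * draws))) / a

# illustrative posterior draws for a Weibull shape parameter
rng = np.random.default_rng(1)
draws = rng.gamma(shape=20.0, scale=0.1, size=10_000)  # stand-in posterior
print("SEL estimate:  ", bayes_sel(draws))
print("LINEX estimate:", bayes_linex(draws, a=1.0))
```

The LINEX estimator penalizes over- and under-estimation asymmetrically (controlled by the sign of a), which is why it can differ noticeably from the posterior mean even for a symmetric-looking posterior.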
In this paper, we employ the maximum likelihood estimator together with a shrinkage estimation procedure to estimate the system reliability.
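The abstract is truncated at the reliability expression, so the following sketch only illustrates the general shrinkage idea it names: pulling a maximum likelihood estimate toward a prior point guess. The exponential stress-strength setup, the prior guess, and the weight k are assumptions chosen for illustration.

```python
import numpy as np

def shrink(est_mle, prior_guess, k=0.5):
    # generic shrinkage estimator: convex combination of the MLE
    # and a prior point estimate
    return k * est_mle + (1.0 - k) * prior_guess

# assumed stress-strength setup R = P(Y < X) with exponential X, Y;
# the plug-in MLE is x_bar / (x_bar + y_bar)
rng = np.random.default_rng(2)
x = rng.exponential(2.0, 30)   # strength sample
y = rng.exponential(1.0, 30)   # stress sample
r_mle = x.mean() / (x.mean() + y.mean())
print("R_MLE:", r_mle, " shrunk:", shrink(r_mle, prior_guess=0.6, k=0.7))
```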
This research studies fuzzy sets, one of the most modern concepts in application across various practical and theoretical fields of life. It addresses the fuzzy random variable, whose values are not real numbers but fuzzy numbers, because it expresses vague or uncertain phenomena whose measurements are not definitive. Fuzzy data were presented for a two-sample test, and the analysis-of-variance method for fuzzy random variables was examined; this method depends on a number of assumptions, which prevents its use when those assumptions are not realized.
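As background for the fuzzy-data methods mentioned above, here is a minimal sketch of triangular fuzzy numbers, one common representation of fuzzy observations; the component-wise sample mean and the membership function below follow standard conventions and are not the paper's specific procedure.

```python
import numpy as np

# A triangular fuzzy number (l, m, r) has membership 1 at the mode m,
# falling linearly to 0 at the endpoints l and r.
obs = np.array([
    [1.0, 2.0, 3.5],   # each row: (left, mode, right) of one fuzzy observation
    [1.5, 2.4, 4.0],
    [0.8, 1.9, 3.0],
])
fuzzy_mean = obs.mean(axis=0)  # component-wise fuzzy sample mean
print("fuzzy sample mean (l, m, r):", fuzzy_mean)

def membership(x, l, m, r):
    # triangular membership function
    if l < x <= m:
        return (x - l) / (m - l)
    if m < x < r:
        return (r - x) / (r - m)
    return 1.0 if x == m else 0.0

print("membership of 2.2 in the mean:", membership(2.2, *fuzzy_mean))
```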
The study seeks to apply a data-mining technique, logistic regression, to inherent risk, using financial ratios and technical analysis and then applying indicators of financial fraud. Major scandals at exposed companies and failures of the audit process have shocked the community and damaged the auditor's integrity; the cause is financial fraud practiced by companies and not discovered by the auditor. Such fraud involves an intentional act, carried out by management or staff, that aims at personal gain and harms the interests of others; all frauds take place in the presence of motives and factors that help them occur.
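A minimal sketch of the kind of model the study names, assuming logistic regression over standardized financial-ratio features; the feature names, the simulated data, and the labeling rule are illustrative assumptions, not the study's actual variables.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# illustrative financial-ratio features (names are assumptions):
# e.g., debt ratio, return on assets, receivables turnover, standardized
rng = np.random.default_rng(3)
n = 200
X = rng.normal(size=(n, 3))
logit = 1.5 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(0, 0.5, n)
y = (logit > 0).astype(int)          # 1 = fraud indicator present

model = LogisticRegression().fit(X, y)
print("coefficients:", model.coef_)
print("fraud probability for one firm:", model.predict_proba(X[:1])[0, 1])
```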
This paper presents a hybrid approach for solving the null-values problem; it hybridizes rough set theory with an intelligent swarm algorithm. The proposed approach is a supervised learning model: a large set of complete data, called the learning data, is used to find the decision rule sets that are then used in solving the incomplete-data problem. The intelligent swarm algorithm used for feature selection is the bees algorithm, a heuristic search algorithm combined with rough set theory as the evaluation function. Another feature-selection algorithm, ID3, is also presented; it works as a statistical algorithm rather than an intelligent one. The two approaches are compared on their performance in null-value estimation.
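To make the evaluation function concrete, here is a minimal sketch of the rough-set dependency degree scored over candidate feature subsets; an exhaustive search over a toy decision table stands in for the bees algorithm, and the data are invented for illustration.

```python
import numpy as np
from itertools import combinations

def dependency(X, y, subset):
    # rough-set dependency degree: the fraction of rows whose equivalence
    # class (rows identical on the chosen features) is consistent in y
    keys = [tuple(row) for row in X[:, subset]]
    consistent = 0
    for k in set(keys):
        idx = [i for i, kk in enumerate(keys) if kk == k]
        if len(set(y[i] for i in idx)) == 1:
            consistent += len(idx)
    return consistent / len(y)

# toy decision table; exhaustive search stands in for the bees algorithm
X = np.array([[0, 1, 0], [0, 1, 1], [1, 0, 0], [1, 0, 1], [0, 0, 0]])
y = np.array([0, 0, 1, 1, 0])
best = max(
    (s for r in range(1, X.shape[1] + 1)
       for s in combinations(range(X.shape[1]), r)),
    key=lambda s: (dependency(X, y, list(s)), -len(s)),  # prefer smaller subsets
)
print("selected features:", best, "dependency:", dependency(X, y, list(best)))
```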
Referral (recommendation) techniques are normally employed in internet business applications. Existing frameworks recommend items to a particular client according to the client's preferences and prior high ratings. A number of methods, such as collaborative filtering and content-based methodologies, dominate the architectural design of referral frameworks. Many referral schemes are domain-specific and cannot be deployed in a general-purpose setting. This study proposes a two-dimensional (User × Item)-space multimode referral scheme for settings with an enormous client base but few articles on offer. Additionally, the design of the referral scheme is anchored on the users and the articles, as rated by a particular client, and is a combination of affinity measures.
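Since the abstract names collaborative filtering over a (User × Item) space, the following is a minimal sketch of user-based collaborative filtering with cosine similarity on co-rated items; the rating matrix and neighborhood size k are illustrative assumptions, not the paper's scheme.

```python
import numpy as np

# toy (user x item) rating matrix; 0 marks an unrated item
R = np.array([
    [5, 3, 0, 1],
    [4, 0, 0, 1],
    [1, 1, 0, 5],
    [0, 1, 5, 4],
], dtype=float)

def cosine(u, v):
    mask = (u > 0) & (v > 0)          # compare only co-rated items
    if not mask.any():
        return 0.0
    return float(u[mask] @ v[mask] /
                 (np.linalg.norm(u[mask]) * np.linalg.norm(v[mask])))

def predict(R, user, item, k=2):
    # user-based CF: weight neighbors' ratings by similarity to the target user
    sims = np.array([
        cosine(R[user], R[v]) if v != user and R[v, item] > 0 else 0.0
        for v in range(R.shape[0])
    ])
    top = np.argsort(sims)[-k:]       # k most similar raters of this item
    if sims[top].sum() == 0:
        return 0.0
    return float(sims[top] @ R[top, item] / sims[top].sum())

print("predicted rating for user 1, item 2:", predict(R, user=1, item=2))
```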
This paper describes the use of a microcomputer as a laboratory instrument system. The system focuses on measuring three weather variables: temperature, wind speed, and wind direction. The instrument is a type of data acquisition system; this paper deals with the design and implementation of a data acquisition system based on a personal computer (Pentium) using the Industry Standard Architecture (ISA) bus. The design involves mainly a hardware implementation, together with the software programs used for testing, measurement, and control. The system can display the required information transferred and processed from the external field. The software was written in the Visual Basic language with the Microsoft Foundation Classes.
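The paper's software is Visual Basic over ISA-bus hardware; purely as a sketch of the acquisition cycle it describes (read the three weather channels, process, display), here is a minimal polling loop in Python, where read_channel is a hypothetical stand-in for the ISA-bus register reads and the simulated values are invented.

```python
import random
import time

def read_channel(ch):
    # hypothetical stand-in for an ISA-bus register read; returns fake values
    return {"temp_c": 25 + random.random(),
            "wind_speed_ms": 5 + random.random(),
            "wind_dir_deg": random.uniform(0, 360)}[ch]

def acquire(n_samples=3, period_s=1.0):
    # one acquisition cycle per period: sample all channels, then display
    for _ in range(n_samples):
        sample = {ch: round(read_channel(ch), 2)
                  for ch in ("temp_c", "wind_speed_ms", "wind_dir_deg")}
        print(sample)                  # display/processing step
        time.sleep(period_s)

acquire()
```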