Purpose: The research aims to estimate models representing phenomena that follow the logic of circular (angular) data, accounting for the 24-hour periodicity in measurement. Theoretical framework: The regression model is developed to account for the periodic nature of the circular scale, considering the periodicity in the dependent variable y, the explanatory variables x, or both. Design/methodology/approach: Two estimation methods were applied: a parametric model, represented by the Simple Circular Regression (SCR) model, and a nonparametric model, represented by the Nadaraya-Watson (NW) circular regression model. The analysis used real data from 50 patients at Al-Kindi Teaching Hospital in Baghdad. Findings: The Mean Circular Error…
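A minimal sketch of the Nadaraya-Watson idea on a circular predictor, assuming hours are mapped to radians (theta = 2*pi*hours/24) and a von Mises kernel; the function name and the concentration parameter kappa are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def nw_circular(theta_train, y_train, theta_eval, kappa=4.0):
    """Nadaraya-Watson regression with a von Mises kernel for a circular
    predictor: weights depend only on angular distance, so the fit wraps
    correctly across the 24-hour (2*pi) boundary."""
    # kernel weight between each evaluation angle and each training angle
    w = np.exp(kappa * np.cos(theta_eval[:, None] - theta_train[None, :]))
    return (w @ y_train) / w.sum(axis=1)
```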
In recent years many researchers have developed methods to estimate the self-similarity and long-memory parameter best known as the Hurst parameter. In this paper, we compare nine different methods. Most of them use the slope of deviations to estimate the Hurst parameter, such as Rescaled Range (R/S), Aggregate Variance (AV), and Absolute Moments (AM); others depend on a filtration technique, such as Discrete Variations (DV), Variance versus Level using wavelets (VVL), and Second-Order Discrete Derivative using Wavelets (SODDW). The comparison was carried out by a simulation study to find the most efficient method through MASE. The results of the simulation experiments showed that the performance of the methods…
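As an illustration of the slope-based family listed above, a minimal rescaled-range sketch; the Hurst estimate is the slope of log(R/S) against log(block size), and the dyadic block-size grid is an illustrative choice:

```python
import numpy as np

def hurst_rs(x):
    """Classical rescaled-range (R/S) estimate of the Hurst parameter:
    regress log(R/S) on log(n) over dyadic block sizes n."""
    n = len(x)
    log_n, log_rs = [], []
    for s in 2 ** np.arange(3, int(np.log2(n)) + 1):
        chunks = x[: n // s * s].reshape(-1, s)
        dev = np.cumsum(chunks - chunks.mean(axis=1, keepdims=True), axis=1)
        r = dev.max(axis=1) - dev.min(axis=1)   # range of cumulative deviations
        sd = chunks.std(axis=1)
        log_n.append(np.log(s))
        log_rs.append(np.log((r[sd > 0] / sd[sd > 0]).mean()))
    return np.polyfit(log_n, log_rs, 1)[0]      # slope = Hurst estimate
```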
In some cases, researchers need to know the causal effect of a treatment, that is, the extent of its effect on the sample, in order to decide whether to continue the treatment or stop it because it is of no use. The local weighted least squares method was used to estimate the parameters of the fuzzy regression discontinuity model, and the local polynomial method was used to estimate the bandwidth. Data were generated with sample sizes (75, 100, 125, 150) over 1000 replications. An experiment was conducted at the Innovation Institute for remedial lessons in 2021 on 72 students participating in the institute, and the data were collected. Those who received the treatment had an increase in their score after…
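For orientation, a minimal sharp regression-discontinuity sketch using local weighted least squares on each side of the cutoff with a triangular kernel; the fuzzy-design and bandwidth-selection details are not reproduced, and the fixed bandwidth h is a hypothetical input:

```python
import numpy as np

def srd_effect(x, y, cutoff, h):
    """Sharp regression-discontinuity sketch: fit a weighted local line on
    each side of the cutoff within bandwidth h; the jump between the two
    intercepts at the cutoff estimates the treatment effect."""
    def side_fit(mask):
        u = x[mask] - cutoff
        w = np.clip(1 - np.abs(u) / h, 0, None)   # triangular kernel weights
        X = np.column_stack([np.ones_like(u), u])
        beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y[mask]))
        return beta[0]                            # fitted value at the cutoff
    near = np.abs(x - cutoff) <= h
    return side_fit(near & (x >= cutoff)) - side_fit(near & (x < cutoff))
```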
In this paper, Bayes estimators for the shape and scale parameters of the Gamma distribution under the entropy loss function have been obtained, assuming Gamma and Exponential priors for the shape and scale parameters, respectively. Moment and maximum likelihood estimators and Lindley's approximation have been used effectively in the Bayesian estimation. Based on the Monte Carlo simulation method, these estimators are compared by their mean squared errors (MSEs). The results show that the performance of the Bayes estimator under the entropy loss function is better than the other estimates in all cases.
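A minimal Monte Carlo skeleton of this kind of MSE comparison, assuming a Gamma sample with known shape; the shrinkage-style second estimator is purely illustrative and does not reproduce the entropy-loss Bayes estimator or Lindley's approximation:

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_mse(true_scale=2.0, shape=3.0, n=50, reps=1000):
    """Monte Carlo comparison of two scale estimators for Gamma(shape, scale):
    the MLE (shape known) versus an illustrative shrinkage estimate.
    Returns the empirical MSE of each."""
    est_mle, est_shrink = [], []
    for _ in range(reps):
        x = rng.gamma(shape, true_scale, size=n)
        est_mle.append(x.mean() / shape)          # MLE of scale, shape known
        est_shrink.append(x.sum() / (n * shape + 1))  # illustrative shrinkage
    mse = lambda e: np.mean((np.array(e) - true_scale) ** 2)
    return mse(est_mle), mse(est_shrink)
```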
Regression analysis is a cornerstone of statistics and mostly depends on the ordinary least squares method. As is well known, however, that method requires several conditions in order to operate accurately, and its results can otherwise be unreliable; moreover, the failure of certain conditions can make it impossible to complete the analysis. Among those conditions is the multicollinearity problem, which we detect among the independent variables using the Farrar-Glauber test, in addition to the requirement that the data be linear; when the latter condition fails, one resorts to…
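The first stage of the Farrar-Glauber procedure is a chi-square test on the determinant of the regressors' correlation matrix R, with statistic chi2 = -[n - 1 - (2p + 5)/6] ln|R| on p(p - 1)/2 degrees of freedom; a minimal sketch:

```python
import numpy as np

def farrar_glauber_chi2(X):
    """Farrar-Glauber overall multicollinearity test:
    chi2 = -(n - 1 - (2p + 5) / 6) * ln|R|, where R is the correlation
    matrix of the p regressors. Large values indicate collinearity."""
    n, p = X.shape
    R = np.corrcoef(X, rowvar=False)
    sign, logdet = np.linalg.slogdet(R)
    chi2 = -(n - 1 - (2 * p + 5) / 6) * logdet
    df = p * (p - 1) // 2
    return chi2, df
```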
The use of parametric models and their estimation methods requires that many initial conditions be met for those models to represent the population under study, prompting researchers to look for more flexible models, represented by nonparametric models.
In this study, the most important and most widespread estimators of the nonlinear regression function were investigated, namely Nadaraya-Watson and local polynomial regression, which are among the types of nonlinear…
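A minimal local polynomial (degree-one) sketch for a single evaluation point, assuming a Gaussian kernel and a hypothetical fixed bandwidth h; the kernel and bandwidth choices used in the study are not shown here:

```python
import numpy as np

def local_linear(x_train, y_train, x0, h=0.5):
    """Local polynomial (degree-1) estimate of m(x0): weighted least
    squares of y on (x - x0) with Gaussian kernel weights, bandwidth h."""
    u = (x_train - x0) / h
    w = np.exp(-0.5 * u ** 2)                   # Gaussian kernel weights
    X = np.column_stack([np.ones_like(x_train), x_train - x0])
    beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y_train))
    return beta[0]                              # intercept = fit at x0
```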
The region-based association analysis has been proposed to capture the collective behavior of sets of variants by testing the association of each set, instead of individual variants, with the disease. Such an analysis typically involves a list of unphased multiple-locus genotypes with potentially sparse frequencies in cases and controls. To tackle the problem of the sparse distribution, a two-stage approach was proposed in the literature: in the first stage, haplotypes are computationally inferred from genotypes, followed by a haplotype coclassification; in the second stage, the association analysis is performed on the inferred haplotype groups. If a haplotype is unevenly distributed between the case and control samples, this haplotype is labeled…
In this paper, we introduce three robust fuzzy estimators of a location parameter based on Buckley's approach in the presence of outliers. These estimators were compared using the variance of fuzzy numbers as a criterion, and all of them outperformed Buckley's estimate. Among them, the fuzzy median was best for small and medium sample sizes, while for large sample sizes the fuzzy trimmed mean was best.
Survival analysis is a type of data analysis that describes the time until the occurrence of an event of interest, such as death or another event that determines what happens to the phenomenon studied. There may be more than one endpoint for the event, in which case they are called competing risks. The purpose of this research is to apply the dynamic approach in the analysis of discrete survival time in order to estimate the effect of covariates over time, as well as to model the nonlinear relationship between the covariates and the discrete hazard function through the use of the multinomial logistic model and the multivariate Cox model. For the purpose of conducting the estimation process for both the discrete…
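The standard device behind discrete-time hazard modeling is the person-period expansion, after which an ordinary binary (or, with competing risks, multinomial) logistic fitter estimates the hazard; a minimal sketch with illustrative field names:

```python
def person_period(durations, events, covariates):
    """Expand (duration, event, x) survival records into person-period form:
    one row per subject per interval survived, with a 0/1 indicator marking
    the interval in which the event occurred (0 throughout if censored)."""
    rows = []
    for t, d, x in zip(durations, events, covariates):
        for j in range(1, t + 1):
            event_now = 1 if (d == 1 and j == t) else 0
            rows.append({"interval": j, "event": event_now, "x": x})
    return rows   # feed to any binary or multinomial logistic fitter
```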
Bivariate time series modeling and forecasting have become a promising field of applied studies in recent times. For this purpose, the linear autoregressive moving average with exogenous variables (ARMAX) model has been the most widely used technique over the past few years for modeling and forecasting this type of data. The most important assumptions of this model are linearity and homoscedasticity of the random error variance of the fitted model. In practice, these two assumptions are often violated, so the autoregressive conditional heteroscedasticity (ARCH) and generalized ARCH (GARCH) models with exogenous variables…
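The heteroscedastic part reduces to the GARCH(1,1) conditional variance recursion; a minimal sketch with an illustrative initialization (the exogenous terms of the GARCH-X variant are omitted):

```python
import numpy as np

def garch11_sigma2(eps, omega, alpha, beta):
    """GARCH(1,1) conditional variance recursion:
    sigma2[t] = omega + alpha * eps[t-1]**2 + beta * sigma2[t-1]."""
    s2 = np.empty(len(eps))
    s2[0] = eps.var()                 # a common, illustrative initialization
    for t in range(1, len(eps)):
        s2[t] = omega + alpha * eps[t - 1] ** 2 + beta * s2[t - 1]
    return s2
```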
In this paper, eight methods for generating the initial value were compared, together with the impact of these methods on estimating the parameter of an autoregressive model. Three of the most popular estimation methods, the ones most commonly used by researchers, were applied: the maximum likelihood method, Burg's method, and the least squares method. This was done by simulating a first-order autoregressive model through the design of a number of simulation experiments with different sample sizes.
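A minimal sketch of the simulation setup such experiments rest on: generate an AR(1) series, discard a burn-in so the influence of the initial value fades, then estimate phi; the burn-in length and the least-squares estimator are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_ar1(phi, n, burn=100):
    """Simulate y[t] = phi * y[t-1] + e[t], dropping a burn-in period so
    the choice of initial value has little effect on the retained series."""
    e = rng.standard_normal(n + burn)
    y = np.zeros(n + burn)
    for t in range(1, n + burn):
        y[t] = phi * y[t - 1] + e[t]
    return y[burn:]

y = simulate_ar1(0.7, 200)
phi_hat = (y[1:] @ y[:-1]) / (y[:-1] @ y[:-1])   # least-squares estimate of phi
```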
Use of least squares and restricted least squares in estimating the first-order autoregressive AR(1) parameter (a simulation study)
This research studies the multilevel model (the partial pooling model), one of the most important and most widely used models in the analysis of data whose observations take a hierarchical or structural form. Partial pooling models were used, and their (fixed and random) parameters were estimated using the full maximum likelihood (FML) method. A comparison of the relative merit of these models was carried out in the applied part, which included the…
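The core of partial pooling fits in a few lines: each group mean is shrunk toward the grand mean by a precision weight. A minimal sketch assuming the variance components are known (the FML estimation used in the research is not reproduced):

```python
import numpy as np

def partial_pool(group_means, group_sizes, sigma2_within, tau2_between):
    """Partial-pooling (shrinkage) estimates of group means: each group mean
    is pulled toward the grand mean, with less shrinkage for larger groups."""
    m = np.asarray(group_means, dtype=float)
    n = np.asarray(group_sizes, dtype=float)
    grand = np.average(m, weights=n)
    w = tau2_between / (tau2_between + sigma2_within / n)   # reliability weight
    return w * m + (1 - w) * grand
```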
This paper assesses the impact of changes and fluctuations in bank deposits on the money supply in Iraq. The research constructs an error correction model (ECM) using monthly time series data from 2010 to 2015. The analysis begins with the Phillips-Perron unit root test to ascertain the stationarity of the time series and the Engle-Granger cointegration test to examine the existence of a long-term relationship. Nonparametric regression functions are estimated using two methods: smoothing spline and M-smoothing. The results indicate that the M-smoothing approach is the most effective, achieving the shortest adjustment period and the highest adjustment ratio for short-term disturbances, thereby facilitating a return…
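For orientation, a minimal two-step Engle-Granger/ECM sketch: the lagged level-regression residual enters the differenced regression, and its coefficient is the speed of adjustment; the smoothing spline and M-smoothing variants are not reproduced here:

```python
import numpy as np

def engle_granger_ecm(y, x):
    """Two-step Engle-Granger sketch: (1) regress y on x in levels and keep
    the residual as the error-correction term; (2) regress dy on dx and the
    lagged residual, whose coefficient is the speed of adjustment."""
    X = np.column_stack([np.ones_like(x), x])
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    ect = y - X @ b                              # long-run disequilibrium
    dy, dx = np.diff(y), np.diff(x)
    Z = np.column_stack([np.ones_like(dx), dx, ect[:-1]])
    g = np.linalg.lstsq(Z, dy, rcond=None)[0]
    return g[2]                                  # adjustment coefficient
```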
Using the bird swarm algorithm to solve queueing models, with a practical application
The accelerating change in the pattern of life has generated uncertainty in financial decision-making for any phenomenon in general, and for economic activity in particular. This calls for statistical tools as a scientific approach that helps describe and analyze such phenomena quantitatively and then forecast them, in an attempt to probe the uncertainty that shrouds the future as an unknown of which everyone is apprehensive. The investment decision-maker, the capital owner, and other speculators and dealers in the financial markets have…
This book is intended as a textbook for an undergraduate course in multivariate analysis and is designed for use in the semester system. To achieve its goals, the book is divided into the following chapters (as in the first edition, 2019). Chapter One introduces matrix algebra. Chapter Two is devoted to the solution of linear equation systems, together with quadratic forms and characteristic roots and vectors. Chapter Three discusses partitioned matrices and how to obtain the inverse, Jacobian, and Hessian matrices. Chapter Four deals with the multivariate normal distribution (MVN). Chapter Five is concerned with joint, marginal, and conditional normal distributions, independence, and correlations, while revised new chapters have been added (as the current…
In this paper, some commonly used hierarchical clustering techniques have been compared. A comparison was made between the agglomerative hierarchical clustering technique and k-means-type techniques, including the standard k-means technique, a variant of k-means, and bisecting k-means. Although hierarchical clustering is considered one of the best clustering methods, it has limited usage due to its time complexity. The results, calculated from an analysis of the characteristics of the clustering algorithms and the nature of the data, showed that the bisecting k-means technique is the best compared with the other methods used.
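A minimal bisecting k-means sketch: the largest cluster is repeatedly split with plain 2-means until k clusters remain; the split-selection rule and initialization are illustrative choices:

```python
import numpy as np

def two_means(X, iters=20, seed=0):
    """Plain 2-means, used here only as the splitting step."""
    rng = np.random.default_rng(seed)
    c = X[rng.choice(len(X), 2, replace=False)]
    for _ in range(iters):
        lab = ((X[:, None, :] - c[None, :, :]) ** 2).sum(-1).argmin(1)
        c = np.array([X[lab == j].mean(0) for j in (0, 1)])
    return lab

def bisecting_kmeans(X, k):
    """Bisecting k-means: repeatedly split the largest cluster with
    2-means until k clusters remain; returns index arrays per cluster."""
    clusters = [np.arange(len(X))]
    while len(clusters) < k:
        i = max(range(len(clusters)), key=lambda j: len(clusters[j]))
        idx = clusters.pop(i)                   # split the largest cluster
        lab = two_means(X[idx])
        clusters += [idx[lab == 0], idx[lab == 1]]
    return clusters
```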
Multiple linear regression is concerned with studying and analyzing the relationship between a dependent variable and a set of explanatory variables, from which the values of the variables are predicted. In this paper, the multiple linear regression model with three covariates was studied in the presence of autocorrelated errors, with the random error following the exponential distribution. Three methods were compared (generalized least squares, the robust M method, and the robust Laplace method). We carried out simulation studies and computed the mean squared error with sample sizes (15, 30, 60, 100). We then applied the best method to real experimental data representing the varieties of…
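Under AR(1) errors, the generalized least squares step often reduces to the Cochrane-Orcutt quasi-difference; a minimal sketch assuming rho is known (a feasible version would estimate it from OLS residuals):

```python
import numpy as np

def gls_ar1(X, y, rho):
    """GLS under AR(1) errors via the Cochrane-Orcutt quasi-difference:
    transform y and each column of X by v[t] - rho * v[t-1], then run OLS."""
    ys = y[1:] - rho * y[:-1]
    Xs = X[1:] - rho * X[:-1]
    return np.linalg.lstsq(Xs, ys, rcond=None)[0]
```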
Statistical forecasting methods were applied with the intention of constructing a model to predict the number of old-aged people in retirement homes in Iraq, based on monthly data on old-aged people in Baghdad and the governorates, except the Kurdistan region, from 2016 to 2019. Using the Box-Jenkins methodology, the stationarity of the series was examined, the appropriate model order was determined, the parameters were estimated, their significance was tested, the adequacy of the model was checked, and then the best model was used for prediction. The best forecasting model according to the criteria (Normalized BIC, MAPE, RMSE) is ARIMA(0, 1, 2).
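Fitting the selected model is a short exercise in statsmodels; a minimal sketch in which the 12-step horizon is an illustrative choice:

```python
# requires the statsmodels package
from statsmodels.tsa.arima.model import ARIMA

def forecast_arima_012(y, steps=12):
    """Fit the ARIMA(0,1,2) model selected above and forecast ahead."""
    result = ARIMA(y, order=(0, 1, 2)).fit()
    return result.forecast(steps=steps)
```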
This research addresses the spatial autoregressive model and the spatial error model in an attempt to provide a practical guide that illustrates the importance of spatial analysis, with particular emphasis on the importance of using spatial regression models, each of which incorporates spatial dependence whose presence or absence is tested with Moran's test. Ignoring this dependence may lead to the loss of important information about the phenomenon, which is ultimately reflected in the power of the estimated statistical indicator. These…
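Moran's test is built on the statistic I = (n / S0) * (z'Wz) / (z'z), where z holds the mean-centered values, W is the spatial weights matrix, and S0 is the sum of all weights; a minimal sketch:

```python
import numpy as np

def morans_i(values, W):
    """Moran's I statistic for spatial autocorrelation:
    I = (n / S0) * (z' W z) / (z' z), with z the centered values."""
    z = np.asarray(values, dtype=float)
    z = z - z.mean()
    n, S0 = len(z), W.sum()
    return (n / S0) * (z @ W @ z) / (z @ z)
```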
The issue of the penalized regression model has received considerable attention for variable selection, playing an essential role in dealing with high-dimensional data. The arctangent (Atan) penalty has recently been used as an efficient method for both estimation and variable selection. However, the Atan penalty is very sensitive to outliers in the response variable or to heavy-tailed error distributions, while least absolute deviation (LAD) is a good way to obtain robustness in regression estimation. The specific objective of this research is to propose a robust Atan estimator combining these two ideas at once. Simulation experiments and real-data applications show that the proposed LAD-Atan estimator…
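For reference, one parameterization of the Atan penalty found in the literature is p(t) = lambda * (gamma + 2/pi) * arctan(t / gamma); treating that form as an assumption, a sketch of the penalized LAD criterion such an estimator would minimize (the optimizer itself is out of scope here):

```python
import numpy as np

def atan_penalty(beta, lam, gam):
    """Arctangent penalty, one parameterization from the literature
    (treated here as an assumption): lam * (gam + 2/pi) * arctan(|b|/gam)."""
    return lam * (gam + 2 / np.pi) * np.arctan(np.abs(beta) / gam)

def lad_atan_objective(beta, X, y, lam, gam):
    """LAD loss plus Atan penalty: the criterion a LAD-Atan estimator
    minimizes over beta (any nonsmooth optimizer could be applied)."""
    return np.abs(y - X @ beta).sum() + atan_penalty(beta, lam, gam).sum()
```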
In this research, one of the nonlinear regression models, BoxBOD, is studied. It is characterized by nonlinear parameters, and the difficulty of the model lies in estimating them because of this nonlinearity. Its parameters were estimated by some traditional methods, namely the nonlinear least squares method and the maximum likelihood method, and by one of the methods of artificial intelligence, the genetic algorithm. This algorithm was based on two types of objective functions, one the sum of squared errors and the other the likelihood function. For comparison among the methods used in the research, the mean squared error was adopted as the criterion, and for the purpose of data generation…
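The BoxBOD mean function is y = b1 * (1 - exp(-b2 * x)), the classic NIST test problem, so the genetic algorithm searches this two-parameter space for the smallest error sum of squares. A toy sketch with illustrative population size, bounds, and mutation scale, using truncation selection and Gaussian mutation rather than the exact operators of the study:

```python
import numpy as np

rng = np.random.default_rng(2)

def boxbod(x, b1, b2):
    """BoxBOD mean function: y = b1 * (1 - exp(-b2 * x))."""
    return b1 * (1 - np.exp(-b2 * x))

def ga_fit(x, y, pop=50, gens=200):
    """Toy genetic algorithm minimizing the error sum of squares:
    truncation selection (keep the fittest half) plus Gaussian mutation."""
    lo, hi = np.array([1.0, 0.01]), np.array([500.0, 2.0])   # search box
    P = rng.uniform(lo, hi, size=(pop, 2))
    for _ in range(gens):
        sse = ((y - boxbod(x[None, :], P[:, :1], P[:, 1:])) ** 2).sum(axis=1)
        P = P[np.argsort(sse)][: pop // 2]                   # fittest half
        kids = P + rng.normal(0, 0.05, P.shape) * (hi - lo)  # mutated copies
        P = np.clip(np.vstack([P, kids]), lo, hi)
    sse = ((y - boxbod(x[None, :], P[:, :1], P[:, 1:])) ** 2).sum(axis=1)
    return P[sse.argmin()]
```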
Least squares (LS) analysis often fails completely when outliers are present in the phenomena under study, as OLS loses its properties and thereby its status as the Best Linear Unbiased Estimator (BLUE), owing to the adverse effect outliers have on the results of the statistical analysis of the data: their presence causes serious disruption in data analysis when traditional methods are used. To remedy this problem, new statistical methods have been developed that are not affected by…
This book includes four main chapters: 1. Indefinite Integral. 2. Methods of Integration. 3. Definite Integral. 4. Multiple Integral. It also contains many examples and exercises intended to develop the student's ability to think correctly when solving mathematical questions.
Artificial neural networks (ANNs) are powerful and effective tools in time-series applications. The first aim of this paper is to identify better and more efficient ANN models (back propagation, radial basis function (RBF) networks, and recurrent neural networks) for capturing linear and nonlinear time-series behavior. The second aim is to find accurate estimators, since convergence sometimes gets stuck in local minima, one of the problems that can bias tests of the robustness of ANNs in time-series forecasting. To determine the best or optimal ANN models, the forecast skill score (SS) was employed to measure the efficiency of the ANN models' performance. The mean square error and…
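The supervised framing shared by all three network types is to regress the series on its own lags; a minimal sketch using scikit-learn's MLPRegressor as a stand-in, where the lag order, hidden size, and holdout length are illustrative:

```python
# requires the scikit-learn package
import numpy as np
from sklearn.neural_network import MLPRegressor

def lagged(y, p):
    """Build a design matrix of p lags for one-step-ahead forecasting."""
    X = np.column_stack([y[i : len(y) - p + i] for i in range(p)])
    return X, y[p:]

series = np.sin(np.linspace(0, 20, 300))          # stand-in series
X, target = lagged(series, p=5)
net = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
net.fit(X[:-20], target[:-20])                    # train on all but a holdout
pred = net.predict(X[-20:])                       # one-step-ahead forecasts
```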