The study of economic growth indicators is of fundamental importance in assessing the effectiveness of economic development plans, and it plays a major role in determining appropriate economic policies so that the factors driving growth dynamics in Iraq can be used optimally, especially over a given period of time. Gross domestic product (GDP) at current prices, which forms part of the national accounts, is an integrated body of statistics that allows policy makers to determine whether the economy is expanding, to evaluate economic activity and its efficiency, and to gauge the overall size of the economy. The research aims …
There is a need to detect and investigate the causes of marsh pollution and to submit an accurately evaluated statistical study to the competent authorities. To achieve this goal, factor analysis was applied to a sample of marsh-water pollutants, namely: electrical conductivity (EC), power of hydrogen (pH), temperature (T), turbidity (TU), total dissolved solids (TDS), and dissolved oxygen (DO). A sample of (44) sites was drawn and examined in the laboratories of the Iraqi Ministry of Environment, and the results were obtained using the SPSS program. The most important recommendation was to increase the pumping of additional …
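Purely as an illustration (not the authors' SPSS workflow), a minimal factor-analysis sketch in Python on six hypothetical water-quality variables might look like this; the variable names mirror the abstract, but the data are simulated:

```python
# Illustrative sketch only: factor analysis of water-quality indicators.
# Variable names mirror the abstract (EC, pH, T, TU, TDS, DO); the data are simulated.
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_sites = 44                                  # sample size reported in the abstract
latent = rng.normal(size=(n_sites, 2))        # two hypothetical latent pollution factors
loadings = rng.normal(size=(2, 6))
X = latent @ loadings + rng.normal(scale=0.5, size=(n_sites, 6))
df = pd.DataFrame(X, columns=["EC", "pH", "T", "TU", "TDS", "DO"])

Z = StandardScaler().fit_transform(df)        # standardise before factoring
fa = FactorAnalysis(n_components=2, random_state=0).fit(Z)
print(pd.DataFrame(fa.components_.T, index=df.columns,
                   columns=["Factor1", "Factor2"]).round(2))
```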
Examining skewness makes researchers more aware of the importance of accurate statistical analysis. Most phenomena contain a certain degree of skewness, which gives rise to what is called "asymmetry" and, consequently, to the importance of the skew-normal family. The epsilon skew normal distribution ESN(μ, σ, ε) is one of the probability distributions that provide a more flexible model, because its skewness parameter allows it to range from the normal to a skewed distribution. Theoretically, estimating the parameters of a linear regression model whose error mean is not zero is a major challenge, since there is no explicit formula to calculate …
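A minimal sketch of the ESN density, under one common parameterization (the exact convention used in the paper may differ), is shown below; setting ε = 0 recovers the ordinary normal density:

```python
# Minimal sketch of the epsilon-skew-normal density ESN(mu, sigma, eps),
# under one common parameterization (the paper's convention may differ).
import numpy as np
from scipy.stats import norm

def esn_pdf(x, mu=0.0, sigma=1.0, eps=0.0):
    """ESN density: the two halves around mu are rescaled by (1 + eps) and (1 - eps)."""
    x = np.asarray(x, dtype=float)
    left = norm.pdf((x - mu) / (sigma * (1.0 + eps))) / sigma
    right = norm.pdf((x - mu) / (sigma * (1.0 - eps))) / sigma
    return np.where(x < mu, left, right)

grid = np.linspace(-5, 5, 7)
print(esn_pdf(grid, mu=0.0, sigma=1.0, eps=0.4).round(4))  # eps=0 gives the normal pdf
```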
... Show MoreThis research aims to provide insight into the Spatial Autoregressive Quantile Regression model (SARQR), which is more general than the Spatial Autoregressive model (SAR) and Quantile Regression model (QR) by integrating aspects of both. Since Bayesian approaches may produce reliable estimates of parameter and overcome the problems that standard estimating techniques, hence, in this model (SARQR), they were used to estimate the parameters. Bayesian inference was carried out using Markov Chain Monte Carlo (MCMC) techniques. Several criteria were used in comparison, such as root mean squared error (RMSE), mean absolute percentage error (MAPE), and coefficient of determination (R^2). The application was devoted on dataset of poverty rates acro
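For reference, the three comparison criteria named above can be computed as in this small sketch; the "observed" and "fitted" values are simulated, not the study's poverty data:

```python
# Illustrative computation of the comparison criteria named in the abstract.
import numpy as np

def rmse(y, yhat):
    return np.sqrt(np.mean((y - yhat) ** 2))

def mape(y, yhat):
    return np.mean(np.abs((y - yhat) / y)) * 100.0   # assumes no zero observations

def r_squared(y, yhat):
    ss_res = np.sum((y - yhat) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return 1.0 - ss_res / ss_tot

rng = np.random.default_rng(1)
y = rng.normal(10, 2, size=100)          # simulated "observed" values
yhat = y + rng.normal(0, 0.5, size=100)  # simulated model fit
print(rmse(y, yhat), mape(y, yhat), r_squared(y, yhat))
```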
This book is intended as a textbook for an undergraduate course in financial statistics in the Department of Financial Sciences and Banking, and it is designed for use in a semester system. To achieve the goals of the book, it is divided into the following chapters. Chapter One introduces basic concepts. Chapter Two is devoted to frequency distributions and data representation. Chapter Three discusses measures of central tendency (all types of means, the mode, and the median). Chapter Four deals with measures of dispersion (standard deviation, variance, and coefficient of variation). Chapter Five is concerned with correlation and regression analysis, while Chapter Six is concerned with testing hypotheses (one-population mean test, two "independent" populations …
This book is the second edition of a textbook intended for undergraduate and postgraduate courses in mathematical statistics. To achieve the goals of the book, it is divided into the following chapters. Chapter One reviews events and probability. Chapter Two is devoted to random variables of both types, discrete and continuous, with definitions of the probability mass function, probability density function, and cumulative distribution function. Chapter Three discusses mathematical expectation and its special forms, such as moments, the moment generating function, and other related topics. Chapter Four deals with some special discrete distributions (Discrete Uniform, Bernoulli, Binomial, Poisson, Geometric, Negative …
This research aims to overcome the problem of dimensionality through nonparametric regression methods that reduce the root mean squared error (RMSE). The projection pursuit regression (PPR) method was used, which is one of the dimension reduction methods that overcome the curse of dimensionality. PPR is a statistical technique concerned with finding the most important projections in multidimensional data, and, with the finding of each projection, …
A binary stream cipher cryptosystem can be used to encrypt/decrypt many types of digital files, especially those that can be considered huge data files, such as images. To guarantee that the encryption and decryption processes take a reasonable time, the stream cipher key generator must act quickly without affecting the complexity or randomness of the output key binary sequences. In this paper, we enlarge the output sequence from a binary sequence to a digital sequence over a larger field, obtaining a byte sequence, and we then test this new sequence not only as a binary sequence but also as a byte sequence. Hence, the new output sequence must be tested in the new mathematical field. This is done by changing the base of the …
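Purely as an illustration (not the paper's generator or its test suite), the following sketch packs a simulated binary keystream into bytes and applies a simple chi-square frequency test over the 256-symbol byte alphabet:

```python
# Illustrative sketch: pack a binary keystream into bytes and run a basic
# chi-square frequency test over the 256-symbol alphabet. The keystream here
# is simulated; a real cryptosystem would supply its own generator.
import numpy as np
from scipy.stats import chisquare

rng = np.random.default_rng(42)
bits = rng.integers(0, 2, size=8 * 10_000, dtype=np.uint8)  # hypothetical keystream
bytes_seq = np.packbits(bits)                                # 8 bits -> 1 byte

observed = np.bincount(bytes_seq, minlength=256)
stat, p_value = chisquare(observed)                          # uniform expected frequencies
print(f"chi-square = {stat:.2f}, p-value = {p_value:.4f}")
```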
Software-Defined Networking (commonly referred to as SDN) is a newer paradigm that develops the concept of a software-driven network by separating the data and control planes, and it can address many traditional networking problems. However, this architecture is subject to various security threats. One of these issues is the distributed denial of service (DDoS) attack, which is difficult to contain in this kind of software-based network. Several security solutions have been proposed recently to secure SDN against DDoS attacks. This paper aims to analyze and discuss machine learning-based systems for securing SDN networks against DDoS attacks. The results indicate that machine learning algorithms can be used to detect DDoS …
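As an illustration of the machine-learning approach only, the sketch below trains a classifier on simulated flow-level features (packet rate, byte rate, flow duration); the feature names and data are hypothetical and are not taken from any specific SDN dataset:

```python
# Illustrative sketch only: training a classifier on simulated flow features
# to flag DDoS-like traffic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(7)
normal = rng.normal(loc=[100, 5e4, 2.0], scale=[30, 1e4, 0.5], size=(500, 3))
attack = rng.normal(loc=[900, 4e5, 0.3], scale=[100, 5e4, 0.1], size=(500, 3))
X = np.vstack([normal, attack])
y = np.r_[np.zeros(500), np.ones(500)]       # 0 = benign, 1 = DDoS-like

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))
```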
The challenge of incorporating usability evaluation values and practices into the agile development process is not only persistent but also systemic. Notable contributions by researchers have attempted to isolate and close the gaps between the two fields, with the aim of developing usable software. However, there is currently no reference model that specifies where and how usability activities need to be considered in the agile development process. This paper therefore proposes a model for identifying appropriate usability evaluation methods alongside the agile development process. By using this model, the development team can apply usability evaluations at the right time and in the right place to get the necessary feedback from end users. Verification …
The purpose of this paper is to model and forecast white oil prices over the period (2012-2019) using GARCH-class volatility models. After showing that the squared returns of white oil exhibit significant long memory in volatility, fractional GARCH models are estimated and the mean and volatility of the return series are forecasted by quasi-maximum likelihood (QML) as a traditional method, while the competing approach uses machine learning via Support Vector Regression (SVR). The results identify the most appropriate model among many others for forecasting volatility, based on the lowest values of the Akaike and Schwarz information criteria together with the significance of the parameters. In addition, the residuals …
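The sketch below is illustrative only: it fits a plain GARCH(1,1) model with the `arch` package and an SVR on lagged squared returns as a simple machine-learning competitor. The return series is simulated, not the white-oil data, and fractional (FIGARCH-type) specifications would be set up analogously:

```python
# Illustrative sketch only: GARCH(1,1) fit plus an SVR forecast of squared returns.
import numpy as np
from arch import arch_model
from sklearn.svm import SVR

rng = np.random.default_rng(3)
returns = rng.standard_t(df=5, size=1000) * 0.8        # hypothetical daily returns (%)

am = arch_model(returns, mean="Constant", vol="GARCH", p=1, q=1)
res = am.fit(disp="off")
print(res.aic, res.bic)                                 # criteria used for model choice
print(res.forecast(horizon=5).variance.iloc[-1])        # 5-step variance forecast

# SVR on lagged squared returns as a simple machine-learning competitor
sq = returns ** 2
X, y = sq[:-1].reshape(-1, 1), sq[1:]
svr = SVR(kernel="rbf", C=1.0, epsilon=0.01).fit(X, y)
print(svr.predict(sq[-1].reshape(1, -1)))               # one-step-ahead forecast
```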
This paper studies two stratified quantile regression models, of the marginal and conditional varieties. We estimate the quantile functions of these models using two nonparametric smoothing methods, smoothing splines (B-splines) and kernel regression (Nadaraya-Watson). The estimates are obtained by solving the nonparametric quantile regression problem, that is, by minimizing the quantile regression objective functions and using the varying-coefficient model approach. The main goal is to compare the estimators of the two nonparametric methods and adopt the better of the two.
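For illustration of one of the two smoothers named above, here is a minimal Nadaraya-Watson kernel regression sketch with a Gaussian kernel on simulated data; it is not the quantile-regression version used in the paper:

```python
# Minimal Nadaraya-Watson kernel regression sketch (Gaussian kernel).
import numpy as np

def nadaraya_watson(x0, x, y, h):
    """Kernel-weighted local average of y at the point(s) x0 with bandwidth h."""
    x0 = np.atleast_1d(x0)
    w = np.exp(-0.5 * ((x0[:, None] - x[None, :]) / h) ** 2)   # Gaussian weights
    return (w @ y) / w.sum(axis=1)

rng = np.random.default_rng(5)
x = np.sort(rng.uniform(0, 10, 200))
y = np.sin(x) + rng.normal(0, 0.3, 200)
grid = np.linspace(0, 10, 5)
print(nadaraya_watson(grid, x, y, h=0.5).round(3))
```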
This research studies the multilevel model (the partial pooling model), which is one of the most widely used and applied models for analyzing data in which the observations take a hierarchical or structural form. Partial pooling models were used, and the (fixed and random) parameters of these models were estimated using the full maximum likelihood (FML) method. A comparison of the relative merits of these models was carried out in the applied part, which included the …
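As an illustrative sketch only (hypothetical variable names and simulated data), a two-level partial pooling model with a random intercept can be fitted by full maximum likelihood in statsmodels by setting `reml=False`:

```python
# Illustrative sketch: random-intercept (partial pooling) model fitted by full ML.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(11)
groups = np.repeat(np.arange(20), 15)                 # 20 clusters, 15 observations each
group_effect = rng.normal(0, 1.0, 20)[groups]         # random intercepts
x = rng.normal(size=groups.size)
y = 2.0 + 0.5 * x + group_effect + rng.normal(0, 0.8, groups.size)
data = pd.DataFrame({"y": y, "x": x, "group": groups})

model = smf.mixedlm("y ~ x", data, groups=data["group"])
result = model.fit(reml=False)                        # full ML rather than REML
print(result.summary())
```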
Estimating the ordinary regression model requires several assumptions to be satisfied, such as linearity. One problem occurs when the regression curve is partitioned into two (or more) parts which are then joined at threshold point(s); this situation is regarded as a violation of the linearity of regression. The multiphase regression model has therefore received increasing attention as an alternative approach that describes the change in the behavior of the phenomenon through threshold point estimation. The maximum likelihood estimator (MLE) has been used for both the model and the threshold point estimation. However, the MLE is not resistant to violations such as the existence of outliers or a heavy-tailed error distribution. The main goal of this …
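A simple way to picture threshold estimation in a two-phase model is the grid search below, which minimizes the residual sum of squares over candidate thresholds on simulated data; it is not the robust procedure studied in the paper:

```python
# Illustrative sketch: estimating a single threshold (change point) in a
# two-phase linear regression by a grid search that minimizes the SSE.
import numpy as np

rng = np.random.default_rng(2)
x = np.sort(rng.uniform(0, 10, 300))
true_tau = 6.0
y = 1.0 + 0.5 * x + 2.0 * np.maximum(x - true_tau, 0) + rng.normal(0, 0.4, x.size)

def sse_at(tau):
    # Design matrix for a continuous two-phase (broken-stick) model at threshold tau
    X = np.column_stack([np.ones_like(x), x, np.maximum(x - tau, 0)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.sum((y - X @ beta) ** 2)

grid = np.linspace(1, 9, 161)
tau_hat = grid[np.argmin([sse_at(t) for t in grid])]
print(f"estimated threshold: {tau_hat:.2f} (true value {true_tau})")
```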
The repeated measurement design is called a completely randomized block design for repeated measurements when each subject receives all of the different treatments; in this case the subject is considered a block. Several nonparametric methods have been considered, such as the Friedman test (1937), the Koch test (1969), and the Kepner and Robinson test (1988), for when the assumption of normally distributed data is not satisfied, as well as the F test when the assumptions of the analysis of variance are satisfied, where the observations within blocks are assumed to be equally correlated. The purpose of this paper is to summarize the results of a simulation study comparing these methods, as well as to present the suggested method …
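As a small illustration of one of the tests named above, the Friedman test for a repeated-measures layout (subjects as blocks, treatments as columns) is available in scipy; the data here are simulated:

```python
# Illustrative sketch only: Friedman test on a simulated repeated-measures layout.
import numpy as np
from scipy.stats import friedmanchisquare

rng = np.random.default_rng(9)
n_subjects, n_treatments = 12, 3
block = rng.normal(0, 1, (n_subjects, 1))                    # subject (block) effects
effects = np.array([0.0, 0.5, 1.0])                          # treatment effects
scores = block + effects + rng.normal(0, 1, (n_subjects, n_treatments))

stat, p_value = friedmanchisquare(*(scores[:, j] for j in range(n_treatments)))
print(f"Friedman chi-square = {stat:.3f}, p-value = {p_value:.4f}")
```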
A mixture model is used to model data that come from more than one component. In recent years, it has become an effective tool for drawing inferences about the complex data we encounter in real life, and it can serve as a powerful confirmatory tool for classifying observations based on the similarities among them. In this paper, several mixture regression-based methods were applied under the assumption that the data come from a finite number of components, and these methods were compared according to their results in estimating the component parameters. Observation membership was also inferred and assessed for these methods. The results showed that the flexible mixture model outperformed the others …
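To make the idea concrete, here is a minimal EM sketch for a two-component mixture of linear regressions on simulated data; it is a generic textbook construction, not the specific "flexible mixture model" or the other packages compared in the paper:

```python
# Minimal EM sketch for a two-component mixture of linear regressions.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)
n = 400
x = rng.uniform(0, 10, n)
z = rng.integers(0, 2, n)                                  # true (hidden) memberships
y = np.where(z == 0, 1 + 2 * x, 8 - 1 * x) + rng.normal(0, 0.7, n)
X = np.column_stack([np.ones(n), x])

beta = np.array([[0.0, 1.0], [5.0, -0.5]])                 # initial values
sigma = np.array([1.0, 1.0])
pi_k = np.array([0.5, 0.5])

for _ in range(100):
    # E-step: posterior membership probabilities
    dens = np.stack([pi_k[k] * norm.pdf(y, X @ beta[k], sigma[k]) for k in range(2)], axis=1)
    r = dens / dens.sum(axis=1, keepdims=True)
    # M-step: weighted least squares per component
    for k in range(2):
        W = r[:, k]
        XtW = X.T * W
        beta[k] = np.linalg.solve(XtW @ X, XtW @ y)
        sigma[k] = np.sqrt(np.sum(W * (y - X @ beta[k]) ** 2) / W.sum())
    pi_k = r.mean(axis=0)

print("estimated coefficients:", beta.round(2))
print("mixing proportions:", pi_k.round(2))
```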
In this paper, the effect of changes in bank deposits on the money supply in Iraq was studied by estimating an error correction model (ECM) for monthly time series data over the period (2010-2015). The Phillips-Perron test was used to test stationarity, and the Engle-Granger procedure was used to test for cointegration. Cubic spline and local polynomial estimators were used to estimate the regression function. The results show that the local polynomial estimator was better than the cubic spline at the first level of cointegration.
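As a hedged illustration of the two preliminary tests mentioned above, the sketch below runs a Phillips-Perron unit-root test (from the `arch` package) and an Engle-Granger cointegration test (from statsmodels) on simulated monthly series standing in for the deposits and money-supply data:

```python
# Illustrative sketch only: Phillips-Perron and Engle-Granger tests on simulated data.
import numpy as np
from arch.unitroot import PhillipsPerron
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(6)
n = 72                                          # six years of monthly observations
common_trend = np.cumsum(rng.normal(0, 1, n))   # shared stochastic trend
deposits = common_trend + rng.normal(0, 0.5, n)
money_supply = 0.8 * common_trend + rng.normal(0, 0.5, n)

print(PhillipsPerron(deposits).summary())       # unit-root (non-stationarity) check
t_stat, p_value, _ = coint(money_supply, deposits)
print(f"Engle-Granger t-statistic = {t_stat:.3f}, p-value = {p_value:.4f}")
```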
This article aims to estimate the stock return rate of the private banking sector, for two banks, by adopting a partial linear model based on Arbitrage Pricing Theory (APT), using wavelet and kernel smoothers. The results show that the wavelet method is the best. They also show that the market portfolio and the inflation rate have an adverse effect on the rate of return, while the money supply has a direct (positive) effect.
The aim of this paper is to use a single-index model in developing and adjusting the Fama-MacBeth procedure. This adjustment is carried out with a penalized smoothing spline regression technique (SIMPLS), and the smoothing parameter of this technique is selected by two generalized cross-validation methods, Generalized Cross Validation Grid (GGCV) and Generalized Cross Validation Fast (FGCV). Due to the two-step nature of the Fama-MacBeth model, this estimation generates four estimators: SIMPLS(FGCV)-SIMPLS(FGCV), SIMPLS(FGCV)-SIMPLS(GGCV), SIMPLS(GGCV)-SIMPLS(FGCV), and SIMPLS(GGCV)-SIMPLS(GGCV). The three-factor Fama-French model, with its market risk premium, size factor, and value factor, and their implications for excess stock returns and portfolio return …
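For orientation, the classical two-step Fama-MacBeth procedure (time-series betas, then period-by-period cross-sectional regressions) is sketched below on simulated data; it does not implement the SIMPLS single-index adjustment proposed above:

```python
# Illustrative sketch of the classical two-step Fama-MacBeth procedure.
import numpy as np

rng = np.random.default_rng(8)
T, N, K = 120, 25, 3                                  # months, assets, factors
factors = rng.normal(0, 1, (T, K))
true_betas = rng.normal(1, 0.3, (N, K))
returns = factors @ true_betas.T + rng.normal(0, 1, (T, N))

# Step 1: time-series regressions of each asset on the factors -> beta estimates
F = np.column_stack([np.ones(T), factors])
beta_hat = np.linalg.lstsq(F, returns, rcond=None)[0][1:].T       # shape (N, K)

# Step 2: cross-sectional regression of returns on betas at each date
B = np.column_stack([np.ones(N), beta_hat])
lambdas = np.array([np.linalg.lstsq(B, returns[t], rcond=None)[0] for t in range(T)])
risk_premia = lambdas.mean(axis=0)[1:]                            # average factor premia
print("estimated factor risk premia:", risk_premia.round(3))
```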
... Show MoreAn adaptive fuzzy weighted linear regression model in which the output is based
on the position and entropy of quadruple fuzzy numbers had dealt with. The solution
of the adaptive models is established in terms of the iterative fuzzy least squares by
introducing a new suitable metric which takes into account the types of the influence
of different imprecisions. Furthermore, the applicability of the model is made by
attempting to estimate the fuzzy infant mortality rate in Iraq using a selective set of
inputs.
There are many sources of uncertainty that may affect statistical reasoning. Traditional methods cannot deal with all kinds of uncertainty sources, which has led many researchers to extend them; studies continue to this day, proposing hypotheses that build a common understanding and reach new solutions through methods that combine traditional and modern treatments of the sources of uncertainty. The aim of the current study is to develop the adaptive fuzzy linear regression model for the case in which inaccurate data are the source of uncertainty. Specifically, the …
This paper deals with Bayesian estimation of the parameters of the Gamma distribution under a generalized weighted loss function, based on Gamma and Exponential priors for the shape and scale parameters, respectively. The moment and maximum likelihood estimators, together with Lindley's approximation, are used in the Bayesian estimation. Based on a Monte Carlo simulation, these estimators are compared in terms of their mean squared errors (MSEs).
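The simulation idea can be illustrated with the small Monte Carlo sketch below, which compares moment and maximum likelihood estimators of the Gamma shape parameter by MSE; the Bayesian and Lindley-approximation estimators of the paper are not reproduced here:

```python
# Illustrative Monte Carlo sketch: MSE comparison of moment vs. ML estimators
# of the Gamma shape parameter.
import numpy as np
from scipy.stats import gamma

rng = np.random.default_rng(10)
true_shape, true_scale, n, reps = 2.0, 1.5, 50, 1000
mom_est, mle_est = [], []

for _ in range(reps):
    x = rng.gamma(true_shape, true_scale, size=n)
    mom_est.append(x.mean() ** 2 / x.var(ddof=1))              # method-of-moments shape
    shape_hat, _, _ = gamma.fit(x, floc=0)                     # ML fit with loc fixed at 0
    mle_est.append(shape_hat)

mse = lambda est: np.mean((np.asarray(est) - true_shape) ** 2)
print(f"MSE (moments) = {mse(mom_est):.4f}, MSE (MLE) = {mse(mle_est):.4f}")
```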
This paper assesses the impact of changes and fluctuations in bank deposits on the money supply in Iraq. The research constructs an error correction model (ECM) using monthly time series data from 2010 to 2015. The analysis begins with the Phillips-Perron unit root test to ascertain the stationarity of the time series and the Engle-Granger cointegration test to examine the existence of a long-term relationship. Nonparametric regression functions are estimated using two methods: smoothing splines and M-smoothing. The results indicate that the M-smoothing approach is the most effective, achieving the shortest adjustment period and the highest adjustment ratio for short-term disturbances, thereby facilitating a return …
Background: Hypertension is a major global health concern that increases the risk of cardiovascular disease. Understanding the impact of age and treatment type on blood pressure control is essential for optimizing therapeutic strategies. Aim: This study aims to assess how different treatment types and patient age influence blood pressure control in hypertensive patients. Methodology: A binary logistic regression model was employed to analyze data from 48 patients diagnosed with hypertension. The study investigated the impact of two treatment regimens and patient age on the likelihood of achieving optimal blood pressure levels. The statistical significance of the findings was evaluated using chi-square tests and p-values. Results: The …
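A hedged illustration of the modelling step follows: a binary logistic regression of blood-pressure control on age and treatment type, fitted on simulated records for 48 hypothetical patients rather than the study's clinical data:

```python
# Illustrative sketch only: binary logistic regression on simulated patient data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(12)
n = 48
age = rng.integers(35, 80, n)
treatment = rng.integers(0, 2, n)                       # 0 = regimen A, 1 = regimen B
logit = -2.0 + 0.03 * (70 - age) + 1.0 * treatment      # hypothetical true effects
controlled = rng.binomial(1, 1 / (1 + np.exp(-logit)))
data = pd.DataFrame({"controlled": controlled, "age": age, "treatment": treatment})

model = smf.logit("controlled ~ age + treatment", data=data).fit(disp=False)
print(model.summary())
print(np.exp(model.params))                             # odds ratios
```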
Dimension reduction and variable selection are very important topics in multivariate statistical analysis. When two or more predictor variables are linked by complete or incomplete regression relationships, a problem of multicollinearity occurs, which violates one of the basic assumptions of the ordinary least squares method and yields incorrect estimates. Several methods have been proposed to address this problem, including partial least squares (PLS), which is used to reduce the dimension of the regression analysis by applying linear transformations that convert a set of highly correlated variables into a set of new independent and unrelated …
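As an illustration of the PLS idea only (simulated collinear predictors, not the study's data), the sketch below fits a partial least squares regression whose components sidestep the multicollinearity among the original variables:

```python
# Illustrative sketch only: PLS regression on simulated collinear predictors.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(13)
n = 100
base = rng.normal(size=(n, 2))
# Build five predictors that are strongly correlated with one another
X = np.column_stack([base[:, 0], base[:, 0] + 0.05 * rng.normal(size=n),
                     base[:, 1], base[:, 1] + 0.05 * rng.normal(size=n),
                     base.sum(axis=1)])
y = 3 * base[:, 0] - 2 * base[:, 1] + rng.normal(0, 0.5, n)

pls = PLSRegression(n_components=2).fit(X, y)
print("R^2 on the training data:", round(pls.score(X, y), 3))
print("coefficients:", pls.coef_.ravel().round(3))
```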
In this research, several estimators of the hazard function are introduced. These estimators are based on a nonparametric method, namely kernel estimation for censored data with varying bandwidths and boundary kernels. Two types of bandwidth are used: local and global. Moreover, four types of boundary kernel are used, namely rectangular, Epanechnikov, biquadratic, and triquadratic, and the proposed function was employed with all of the kernel functions. Two different simulation techniques are also used in two experiments to compare these estimators. In most cases, the results show that the local bandwidth is the best for all types of boundary kernel functions …
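For intuition only, a generic kernel-smoothed hazard estimate for right-censored data can be built from Nelson-Aalen increments with a global Epanechnikov kernel, as in the sketch below; it is not the boundary-corrected estimator proposed in the paper:

```python
# Illustrative sketch: kernel-smoothed hazard from Nelson-Aalen increments,
# global bandwidth, Epanechnikov kernel, simulated right-censored data.
import numpy as np

rng = np.random.default_rng(14)
n = 200
lifetimes = rng.exponential(scale=5.0, size=n)       # true hazard = 0.2
censoring = rng.exponential(scale=8.0, size=n)
times = np.minimum(lifetimes, censoring)
events = (lifetimes <= censoring).astype(float)      # 1 = observed event, 0 = censored

order = np.argsort(times)
t_sorted, d_sorted = times[order], events[order]
at_risk = np.arange(n, 0, -1)                        # subjects still at risk at each time
increments = d_sorted / at_risk                      # Nelson-Aalen jump sizes

def epanechnikov(u):
    return np.where(np.abs(u) <= 1, 0.75 * (1 - u ** 2), 0.0)

def hazard(t, h=1.0):
    return np.sum(epanechnikov((t - t_sorted) / h) / h * increments)

for t in (2.0, 4.0, 6.0):
    print(f"hazard estimate at t={t}: {hazard(t):.3f} (true value 0.2)")
```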
Artificial intelligence algorithms have been used in many scientific fields in recent years. We suggest employing the tabu search algorithm to find the best estimate of the semi-parametric regression function when there are measurement errors in both the explanatory variables and the dependent variable; such errors, rather than exact measurements, appear frequently in fields such as sport, chemistry, the biological sciences, medicine, and epidemiological studies.
The current research provides an overall comparative analysis of the estimation of Meixner process parameters via the wavelet packet transform. Of particular relevance, it compares the moment method and the wavelet packet estimator for the four parameters of the Meixner process. The research focuses on finding the best threshold value, using the square-root-log and modified square-root-log methods with wavelet packets in the presence of noise, to enhance the efficiency and effectiveness of denoising the financial asset market signal. In this regard, a simulation study compares the performance of moment estimation and wavelet packets for different sample sizes. The results show that wavelet packets …
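As a hedged illustration of the square-root-log (universal) thresholding step only, the sketch below denoises a simulated signal with PyWavelets wavelet packets; the signal is not a Meixner-process asset path, and the modified threshold rule of the paper is not reproduced:

```python
# Illustrative sketch only: wavelet-packet denoising with the universal
# (sqrt-log) threshold via PyWavelets, on a simulated signal.
import numpy as np
import pywt

rng = np.random.default_rng(15)
n = 1024
t = np.linspace(0, 1, n)
clean = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 12 * t)
noisy = clean + rng.normal(0, 0.3, n)

level = 4
wp = pywt.WaveletPacket(data=noisy, wavelet="db4", mode="symmetric", maxlevel=level)
detail_like = [node.data for node in wp.get_level(level, order="natural")][1:]
sigma = np.median(np.abs(np.concatenate(detail_like))) / 0.6745   # rough noise scale
threshold = sigma * np.sqrt(2 * np.log(n))                        # sqrt-log (universal) rule

for node in wp.get_level(level, order="natural"):
    if node.path != "a" * level:                                  # keep approximation intact
        node.data = pywt.threshold(node.data, threshold, mode="soft")

denoised = wp.reconstruct(update=True)[:n]
print("RMSE before:", np.sqrt(np.mean((noisy - clean) ** 2)).round(4))
print("RMSE after: ", np.sqrt(np.mean((denoised - clean) ** 2)).round(4))
```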