The aim of this research is to apply a robust estimation technique based on trimming, since maximum likelihood (ML) analysis often fails when outliers are present in the studied phenomenon: the maximum likelihood estimator (MLE) loses its advantages under the adverse influence of outliers. To address this problem, new statistical methods have been developed that are not unduly affected by outliers; such methods are said to be robust or resistant. The maximum trimmed likelihood (MTL) estimator is therefore a good alternative for obtaining more acceptable and comparable results, and weights can be used to increase the efficiency of the resulting estimates and to strengthen the estimation through the maximum weighted trimmed likelihood (MWTL). …
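For reference, a standard formulation of the MTL and MWTL objectives (the trimming constant h and the weights w_i are the usual textbook quantities, not values taken from this abstract) is:

```latex
\hat{\theta}_{\mathrm{MTL}}
  = \arg\max_{\theta} \sum_{i=1}^{h} \log f\bigl(y_{\nu(i)};\theta\bigr),
\qquad
\hat{\theta}_{\mathrm{MWTL}}
  = \arg\max_{\theta} \sum_{i=1}^{h} w_{\nu(i)}\,\log f\bigl(y_{\nu(i)};\theta\bigr),
```

where the permutation ν(·) orders the log-likelihood contributions decreasingly, so only the h ≤ n best-fitting observations enter the sum and the remaining n − h suspected outliers are trimmed.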
A large number of researchers have attempted to identify the functional relationship between fertility on the one hand and the economic and social characteristics of the population on the other, together with the strength of each effect. This research therefore aims to monitor and analyze temporal and spatial changes in fertility levels in recent decades, to estimate fertility levels in Iraq for the period 1977-2011, and then to forecast fertility in Iraq at the national level (excluding the Kurdistan region) for the period 2012-2031. To achieve this goal, the Lee-Carter model was used to estimate and forecast fertility rates, a model that has often been …
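As a point of reference, the Lee-Carter structure, written here for age-specific fertility rates f_{x,t} (the standard form of the model, not reproduced from the truncated abstract), is:

```latex
\log f_{x,t} = a_x + b_x k_t + \varepsilon_{x,t},
```

where a_x is the average age profile, b_x the age-specific sensitivity, and k_t the period index; forecasting proceeds by fitting a time-series model (typically a random walk with drift) to k_t.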
There is a need to detect and investigate the causes of marsh pollution and to present an accurately evaluated statistical study to the competent authorities. To achieve this goal, factor analysis was applied to a sample of marsh-water pollutant measurements: electrical conductivity (EC), power of hydrogen (pH), temperature (T), turbidity (TU), total dissolved solids (TDS), and dissolved oxygen (DO). A sample of 44 sites was drawn and examined in the laboratories of the Iraqi Ministry of Environment, and the results were obtained using the SPSS program. The most important recommendation was to increase the pumping of additional …
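A minimal sketch of the same analysis outside SPSS, assuming the 44 site measurements sit in a CSV with the six pollutant columns (the file and column names are hypothetical):

```python
import pandas as pd
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("marsh_sites.csv")            # 44 rows, one per sampled site
cols = ["EC", "PH", "T", "TU", "TDS", "DO"]
X = StandardScaler().fit_transform(df[cols])   # standardize before factoring

fa = FactorAnalysis(n_components=2, random_state=0).fit(X)
loadings = pd.DataFrame(fa.components_.T, index=cols,
                        columns=["Factor1", "Factor2"])
print(loadings)                                # which pollutants load on which factor
```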
The purpose of this article is to remove noise from a signal by studying wavelet transforms and showing which of them are most effective for processing and analysis. We outline several transformation techniques, among them the discrete wavelet transform, together with the methodology for applying them to denoise a signal based on the threshold value and the threshold functions: the lifting transform, the wavelet transform, and the discrete wavelet packet transform. A comparison between them was made using the AMSE criterion, and the best was selected. When the aforementioned techniques were applied to real data represented by prices, it became evident that the lifting …
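A minimal sketch of threshold-based wavelet denoising with PyWavelets (soft thresholding with the universal threshold; the wavelet choice and decomposition level are assumptions, not taken from the article):

```python
import numpy as np
import pywt

def wavelet_denoise(signal, wavelet="db4", level=4):
    """Discrete wavelet transform, soft-threshold the detail coefficients, invert."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # noise scale from finest level
    thr = sigma * np.sqrt(2.0 * np.log(len(signal)))      # universal threshold
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)
```

Comparing techniques by AMSE then amounts to averaging np.mean((clean - denoised) ** 2) over simulation replications.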
The wavelet shrinkage estimator is an attractive technique for estimating nonparametric regression functions, but it is very sensitive to correlation in the errors. In this research, a low-degree polynomial model was used to address the boundary problem in wavelet shrinkage, in addition to using flexible, level-dependent threshold values for the case of correlated errors, since these treat the coefficients at each level separately, unlike universal threshold values, which treat all levels simultaneously; the methods considered were VisuShrink, the False Discovery Rate (FDR) method, Improved Thresholding, and SureShrink. The study was conducted on real monthly data representing the rates of theft crimes …
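A minimal sketch of the level-by-level alternative under correlated errors (each resolution level gets its own noise estimate and threshold; the details are illustrative assumptions rather than the paper's exact procedure):

```python
import numpy as np
import pywt

def levelwise_denoise(signal, wavelet="db4", level=4):
    """Level-dependent soft thresholding: one threshold per detail level."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    for j in range(1, len(coeffs)):
        c = coeffs[j]
        sigma_j = np.median(np.abs(c)) / 0.6745           # per-level noise scale
        thr_j = sigma_j * np.sqrt(2.0 * np.log(len(c)))   # per-level threshold
        coeffs[j] = pywt.threshold(c, thr_j, mode="soft")
    return pywt.waverec(coeffs, wavelet)
```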
This paper studies two stratified quantile regression models, of the marginal and the conditional varieties. We estimate the quantile functions of these models using two nonparametric methods, smoothing splines (B-splines) and kernel regression (Nadaraya-Watson). The estimates are obtained by solving the nonparametric quantile regression problem, that is, by minimizing the quantile regression objective function and using the varying-coefficient-model approach. The main goal is to compare the estimators from the two nonparametric methods and to adopt the better of them.
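A minimal sketch of the kernel (Nadaraya-Watson-style) side, using the fact that the minimizer of a kernel-weighted pinball loss is the weighted quantile (the bandwidth and kernel here are illustrative assumptions):

```python
import numpy as np

def local_quantile(x, y, x0, tau=0.5, h=1.0):
    """Local-constant conditional tau-quantile at x0 via a kernel-weighted quantile."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)     # Gaussian kernel weights
    order = np.argsort(y)
    cdf = np.cumsum(w[order]) / np.sum(w)      # weighted empirical CDF over sorted y
    return y[order][np.searchsorted(cdf, tau)]
```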
The research aims to identify the impact of using an electronic participatory learning strategy based on internet programs on learning some basic basketball skills among first-year intermediate-school students, in accordance with the prescribed curriculum. The research sample was selected purposively from first-year intermediate-school students. As for the research problem, the researchers observed weakness in students' levels of basketball skills, which prompted them to devise appropriate solutions by using a participatory learning strategy. The researchers hypothesized statistically significant differences between the pre- and post-tests, in favor of the post-tests, individually and in favor of …
In this research, fuzzy nonparametric methods based on several smoothing techniques were applied to real data from the Iraqi stock market, specifically data on the Baghdad Company for Soft Drinks for the year 2016 (1/1/2016-31/12/2016). A sample of 148 observations was obtained in order to construct a model of the relationship between the stock prices (low, high, modal) and the traded value. Comparing the goodness-of-fit (G.O.F.) results of the three techniques, the lowest value of this criterion was achieved by the K-nearest-neighbor smoother with the Gaussian function.
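A minimal crisp (non-fuzzy) sketch of the winning smoother, k-nearest neighbours with a Gaussian kernel weight; the fuzzification of inputs and outputs used in the paper is omitted, and k is an assumption:

```python
import numpy as np

def knn_gaussian(x, y, x0, k=10):
    """k-NN smoother: Gaussian weights scaled by the k-th neighbour distance."""
    d = np.abs(x - x0)
    idx = np.argsort(d)[:k]                    # the k nearest observations
    h = max(d[idx].max(), 1e-12)               # local bandwidth
    w = np.exp(-0.5 * (d[idx] / h) ** 2)
    return np.sum(w * y[idx]) / np.sum(w)
```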
Poverty is a highly substantial phenomenon that shapes the future of societies and governments and the way they deal with education, health, and the economy; poverty sometimes takes multidimensional forms through education and health. This research aims to study multidimensional poverty in Iraq using penalized regression methods to analyze big data sets from demographic surveys collected by the Central Statistical Organization in Iraq. We chose a classical penalized regression method, represented by ridge regression, and another penalized method, the Smooth Integration of Counting and Absolute Deviation (SICA), to analyze big data sets related to the different forms of poverty in Iraq. Euclidean distance …
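For reference, the two penalties being compared have the following standard forms from the penalized-regression literature (λ > 0 and, for SICA, a shape parameter a > 0):

```latex
\hat{\beta}_{\mathrm{ridge}}
  = \arg\min_{\beta}\; \|y - X\beta\|_2^2 + \lambda \sum_{j} \beta_j^2,
\qquad
p^{\mathrm{SICA}}_{\lambda,a}\!\left(|\beta_j|\right)
  = \lambda\,\frac{(a+1)\,|\beta_j|}{a + |\beta_j|},
```

where the SICA penalty bridges the L0 penalty (as a → 0+) and the L1 penalty (as a → ∞), so it can set coefficients exactly to zero, unlike ridge.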
A binary stream cipher cryptosystem can be used to encrypt/decrypt many types of digital files, especially those that can be considered huge data files, such as images. To guarantee that the encryption and decryption processes take a reasonable time, the stream cipher key generator must act quickly without affecting the complexity or randomness of the output key sequences. In this paper, we increase the size of the output sequence from a binary sequence to a digital sequence over the field, obtaining a byte sequence; we then test this new sequence not only as a binary sequence but also as a …-sequence, so the new output sequence has to be tested in the new mathematical field. This is done by changing the base of the …
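A minimal sketch of the mechanical part, packing a binary keystream into bytes and running a simple uniformity test over the 256 byte values (the paper's actual randomness test battery is not specified here; this chi-square check is an illustrative stand-in):

```python
import numpy as np
from scipy.stats import chisquare

def bits_to_bytes(bits):
    """Pack a 0/1 keystream into a byte sequence, 8 bits per symbol."""
    bits = np.asarray(bits, dtype=np.uint8)
    bits = bits[: (len(bits) // 8) * 8]        # drop the incomplete tail
    return np.packbits(bits)

def byte_frequency_test(key_bytes):
    """Chi-square test of uniformity over the 256 possible byte values."""
    counts = np.bincount(key_bytes, minlength=256)
    return chisquare(counts)                   # (statistic, p-value)
```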
The general aim of the experimental design in this paper was to estimate the effects of the different treatments on the responses by statistical methods; the estimates must avoid bias, and the random errors must be minimized as much as possible. We used multivariate analysis of variance (MANOVA) to analyze a designed experiment with several responses. Three fertilizers (mineral, humic, micro-elements) were applied in a yellow maize experiment conducted as a completely randomized design (CRD). We tested four responses (chlorophyll in the leaf, total yield in t/ha, leaf area in cm2, and plant height in cm) jointly to test for significant differences between treatments. The partial correlations are between chlorophyll in the leaf and total …
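A minimal sketch of the same test in Python with statsmodels (the file and column names are hypothetical):

```python
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

# one row per CRD plot: the treatment factor plus the four measured responses
df = pd.read_csv("maize_crd.csv")   # columns: fertilizer, chlorophyll, yield_t_ha, leaf_area_cm2, height_cm

manova = MANOVA.from_formula(
    "chlorophyll + yield_t_ha + leaf_area_cm2 + height_cm ~ fertilizer", data=df
)
print(manova.mv_test())             # Wilks' lambda, Pillai's trace, etc.
```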
In this article, we develop a new loss function, a generalization of the linear exponential (LINEX) loss function obtained by weighting the LINEX function. We derive the scale parameter, reliability, and hazard functions based on upper record values of the Lomax distribution (LD). To study the small-sample performance of the proposed loss function, a Monte Carlo simulation is used to compare the maximum likelihood estimator, the Bayesian estimator under the LINEX loss function, and the Bayesian estimator under the squared error (SE) loss function. The results show that the modified method is the best for estimating the scale parameter, reliability, and hazard functions.
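For reference, the baseline LINEX loss being weighted has the standard form (the weighted variant itself is the article's contribution and is not reproduced here):

```latex
L(\Delta) = b\left(e^{c\Delta} - c\Delta - 1\right),
\qquad \Delta = \hat{\theta} - \theta,\; b > 0,\; c \neq 0,
```

with the corresponding Bayes estimator \hat{\theta} = -(1/c) ln E[e^{-c\theta} | data]; the sign of c controls whether over- or under-estimation is penalized more heavily.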
This research deals with the use of a number of statistical methods, such as the kernel method, watershed, histogram, and cubic spline, to improve the contrast of digital images. The results, evaluated using the RMSE and NCC criteria, have proven that the spline method is the most accurate compared with the other statistical methods.
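A minimal sketch of one of the listed baselines (histogram equalization) together with the two evaluation metrics, for an 8-bit grayscale image held as a NumPy array (an illustration, not the paper's implementation):

```python
import numpy as np

def hist_equalize(img):
    """Global histogram equalization for an 8-bit grayscale image."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum() / img.size                 # cumulative distribution of grey levels
    return (255 * cdf[img]).astype(np.uint8)

def rmse(a, b):
    return np.sqrt(np.mean((a.astype(float) - b.astype(float)) ** 2))

def ncc(a, b):
    a = a.astype(float) - a.mean()
    b = b.astype(float) - b.mean()
    return np.sum(a * b) / np.sqrt(np.sum(a ** 2) * np.sum(b ** 2))
```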
In light of the pandemic that has swept the world, the use of e-learning in educational institutions has become an urgent necessity for continued knowledge communication with students. Educational institutions can benefit from the free tools that Google provides, among them Google Classroom, which is characterized by its ease of use; however, the efficiency of using Google Classroom is affected by several variables not clearly examined in previous studies. This study aimed to investigate the use of Google Classroom as an e-learning management system and the factors affecting the performance of students and lecturers. The data of this study were collected from 219 faculty members and students at the College of Administration …
This research deals with a shrinkage method for principal components similar to the one used in multiple regression, the Least Absolute Shrinkage and Selection Operator (LASSO). The goal here is to build uncorrelated linear combinations from only a subset of the explanatory variables, which may suffer from a multicollinearity problem, instead of taking the whole set of, say, K variables. The shrinkage forces some coefficients to equal zero after placing a restriction on them through a "tuning parameter", say t, which balances bias against variance on the one hand and keeps the percentage of variance explained by these components acceptable on the other. This is shown via the MSE criterion in the regression case and the percentage of explained variance …
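A minimal sketch of the idea with scikit-learn: form the (uncorrelated) principal components and let an L1 penalty zero out some of them (X, y, and the penalty value are hypothetical; this illustrates the mechanism, not the paper's exact constraint t):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import Lasso
from sklearn.preprocessing import StandardScaler

def sparse_pc_regression(X, y, alpha=0.1):
    """Regress y on principal components, shrinking some PC coefficients to zero."""
    Z = PCA().fit_transform(StandardScaler().fit_transform(X))
    model = Lasso(alpha=alpha).fit(Z, y)       # alpha plays the role of the tuning parameter
    kept = np.flatnonzero(model.coef_)         # components that survived the shrinkage
    return model, kept
```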
The Dagum regression model, introduced to address limitations in traditional econometric models, provides enhanced flexibility for analyzing data characterized by heavy tails and asymmetry, as is common in income and wealth distributions. This paper develops and applies the Dagum model, demonstrating its advantages over other distributions such as the log-normal and gamma distributions. The model's parameters are estimated using maximum likelihood estimation (MLE) and the method of moments (MoM). A simulation study evaluates both methods' performance across various sample sizes, showing that MoM tends to offer more robust and precise estimates, particularly in small samples. These findings provide valuable insights into the analysis …
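For reference, the three-parameter Dagum distribution underlying the model has the standard form:

```latex
F(x) = \left[\,1 + \left(\frac{x}{b}\right)^{-a}\right]^{-p},
\qquad x > 0,\; a, b, p > 0,
```

where b is a scale parameter and a, p are shape parameters; MLE maximizes the sum of log-densities, while MoM matches sample moments to their theoretical counterparts.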
Inferential methods for statistical distributions have attracted a high level of interest in recent years. However, in real life, data can follow more than one distribution, and mixture models must then be fitted to such data. One of these is the finite mixture of Rayleigh distributions, which is widely used in modelling lifetime data in many fields, such as medicine, agriculture, and engineering. In this paper, we propose new Bayesian frameworks by assuming conjugate priors for the squares of the component parameters. We use this prior distribution in the classical Bayesian, Metropolis-Hastings (MH), and Gibbs sampler methods. The performance of these techniques was assessed on data generated from two- and three-component mixtures …
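For reference, the k-component Rayleigh mixture density has the standard form:

```latex
f(x) = \sum_{j=1}^{k} \pi_j\, \frac{x}{\theta_j^{2}}\,
       \exp\!\left(-\frac{x^{2}}{2\theta_j^{2}}\right),
\qquad x > 0,\; \pi_j \ge 0,\; \sum_{j=1}^{k}\pi_j = 1,
```

where θ_j is the scale of component j. An inverse-gamma-type prior on θ_j² is the usual conjugate choice that keeps the full conditionals tractable for the Gibbs sampler (stated here as background, not taken from the truncated abstract).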
Semiparametric methods combine parametric and nonparametric methods; they are important in most studies that by their nature demand greater rigor in the procedure of accurate statistical analysis aimed at obtaining efficient estimators. The partial linear regression model is the most popular type of semiparametric model; it consists of a parametric component and a nonparametric component. Estimating the parametric component so that it has certain properties depends on the assumptions made about it, and in the absence of those assumptions the parametric component suffers from several problems, for example multicollinearity (the explanatory variables are interrelated to each other). To treat this problem we use …
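For reference, the partial linear regression model has the standard form:

```latex
y_i = x_i^{\top}\beta + g(t_i) + \varepsilon_i, \qquad i = 1, \dots, n,
```

where x_i^⊤β is the parametric component, g(·) is an unknown smooth function forming the nonparametric component, and multicollinearity arises when the columns of the design matrix are nearly linearly dependent, inflating the variance of \hat{\beta}.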
Software-Defined Networking (SDN) is a newer paradigm that develops the concept of a software-driven network by separating the data and control planes, and it can handle traditional network problems. However, this excellent architecture is subject to various security threats. One of these issues is the distributed denial-of-service (DDoS) attack, which is difficult to contain in this kind of software-based network. Several security solutions have been proposed recently to secure SDN against DDoS attacks. This paper aims to analyze and discuss machine-learning-based systems for securing SDN networks against DDoS attacks. The results indicate that machine-learning algorithms can be used to detect DDoS attacks …
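A minimal sketch of the kind of detector such studies evaluate: a supervised classifier trained on per-flow statistics (the dataset path, feature columns, and choice of random forest are illustrative assumptions, not this survey's recommendation):

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# hypothetical flow-statistics dataset: one row per flow, label 1 = DDoS traffic
df = pd.read_csv("sdn_flows.csv")
X, y = df.drop(columns=["label"]), df["label"]

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(Xtr, ytr)
print(classification_report(yte, clf.predict(Xte)))
```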
Radiation therapy plays an important role in improving breast cancer cases. To obtain an appropriate estimate of the number of radiation doses given to the patient after tumor removal, some nonparametric regression methods were compared. The kernel method with the Nadaraya-Watson estimator was used to find the estimated regression function for smoothing the data, with the smoothing parameter h chosen by the normal scale method (NSM), the least squares cross-validation method (LSCV), and the golden rate method (GRM). These methods were compared by simulation for samples of three sizes; NSM proved best according to the average mean squared error criterion, and LSCV proved best according to the average mean absolute …
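A minimal sketch of the Nadaraya-Watson estimator with a normal-scale bandwidth (the familiar 1.06 σ n^(-1/5) normal-reference rule is used here as an illustrative stand-in for the paper's NSM):

```python
import numpy as np

def nadaraya_watson(x, y, grid, h):
    """Nadaraya-Watson regression estimate on a grid, Gaussian kernel."""
    K = np.exp(-0.5 * ((grid[:, None] - x[None, :]) / h) ** 2)
    return (K @ y) / K.sum(axis=1)

def h_normal_scale(x):
    """Normal-reference (rule-of-thumb) bandwidth."""
    return 1.06 * np.std(x, ddof=1) * len(x) ** (-0.2)
```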
The Gumbel distribution has been dealt with carefully by researchers and statisticians. There are traditional methods for estimating the two parameters of the Gumbel distribution, namely maximum likelihood, the method of moments, and, more recently, the resampling method called the jackknife. However, these methods suffer from some mathematical difficulties when solved analytically. Accordingly, there are other non-traditional methods, like the nearest-neighbors principle used in computer science, especially in artificial intelligence algorithms, including the genetic algorithm, the artificial neural network algorithm, and others that may be classified as meta-heuristic methods. Moreover, this nearest-neighbors principle has useful statistical features …
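A minimal sketch of the two traditional baselines in SciPy: numerical maximum likelihood via gumbel_r.fit, and closed-form method-of-moments estimates from the Gumbel mean/variance formulas (the simulated sample is an illustration):

```python
import numpy as np
from scipy.stats import gumbel_r

x = gumbel_r.rvs(loc=10.0, scale=2.0, size=200, random_state=0)  # simulated sample

loc_mle, scale_mle = gumbel_r.fit(x)           # numerical maximum likelihood

# method of moments: Var = pi^2 * scale^2 / 6,  Mean = loc + gamma * scale
scale_mom = x.std(ddof=1) * np.sqrt(6.0) / np.pi
loc_mom = x.mean() - 0.57721566 * scale_mom    # gamma = Euler-Mascheroni constant
```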
This book is intended as a textbook for an undergraduate course in multivariate analysis and is designed for use in a semester system. To achieve the goals of the book, it is divided into the following chapters. Chapter One introduces matrix algebra. Chapter Two is devoted to the solution of linear equation systems, with quadratic forms and characteristic roots and vectors. Chapter Three discusses partitioned matrices and how to obtain inverse, Jacobian, and Hessian matrices. Chapter Four deals with the multivariate normal distribution (MVN). Chapter Five concerns joint, marginal, and conditional normal distributions, independence, and correlations. Many solved examples are included in this book, in addition to a variety of unsolved related problems …
In this article we study a single stochastic process model for evaluating asset pricing and stocks, one of the Lévy models, which depends on the so-called Brownian subordinator, specifically the normal inverse Gaussian (NIG) process. This article aims to estimate the parameters of this model using two methods (MME, MLE) and then to employ those parameter estimates in studying stock returns and evaluating asset pricing for both the United Bank and the Bank of the North, whose data were taken from the Iraq Stock Exchange. The results showed a preference for MLE over MME based on the comparison criterion of the average squared error …
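A minimal sketch of the MLE side with SciPy's normal-inverse-Gaussian distribution (the returns file is hypothetical; MME would instead match the NIG moment formulas to sample moments):

```python
import numpy as np
from scipy.stats import norminvgauss

r = np.loadtxt("bank_returns.txt")             # hypothetical daily log-returns
a, b, loc, scale = norminvgauss.fit(r)         # numerical maximum likelihood
loglik = np.sum(norminvgauss.logpdf(r, a, b, loc=loc, scale=scale))
```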
Chemical pollution is a very important issue from which people suffer, and it often affects the health of society and the health of future generations. Consequently, it must be studied in order to discover suitable models and find descriptions that predict its behavior in the forthcoming years. Chemical pollution data in Iraq have great scope and manifold sources and kinds, which makes them big data that need to be studied using novel statistical methods. The research focuses on using a proposed nonparametric procedure (NP method) to develop an OCMT test procedure to estimate the parameters of a linear regression model with a large volume of data (big data) comprising many indicators associated with chemical …
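A minimal sketch of the one-covariate-at-a-time idea behind OCMT: regress the response on each candidate regressor separately and keep those whose t-statistics clear a multiple-testing critical value (a simplified single-stage illustration, not the paper's proposed NP procedure):

```python
import numpy as np
from scipy import stats

def ocmt_screen(X, y, p_star=0.05):
    """Keep columns of X whose marginal t-statistic exceeds a Bonferroni-style cutoff."""
    n, p = X.shape
    crit = stats.norm.ppf(1.0 - p_star / (2.0 * p))
    keep = []
    for j in range(p):
        Z = np.column_stack([np.ones(n), X[:, j]])     # intercept + one regressor
        beta = np.linalg.lstsq(Z, y, rcond=None)[0]
        resid = y - Z @ beta
        s2 = resid @ resid / (n - 2)
        se = np.sqrt(s2 * np.linalg.inv(Z.T @ Z)[1, 1])
        if abs(beta[1] / se) > crit:
            keep.append(j)
    return keep
```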
In this research, one of the approximation techniques, known as the Weighted Average of Rounded Points (WARPing) method, was used to estimate carbon dioxide (CO2) emissions in Iraq with the Nadaraya-Watson and local linear estimators for the period from 1990 to 2019, after which the performance of those estimators was compared. The results proved the superiority of the local linear estimator over the Nadaraya-Watson method, particularly at the boundary points, despite the performance …
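A minimal sketch of the local linear estimator, whose better boundary behaviour is the point of the comparison (the bandwidth and kernel are illustrative; the WARPing binning step is omitted):

```python
import numpy as np

def local_linear(x, y, x0, h):
    """Local linear fit at x0: weighted least squares on (1, x - x0)."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)     # Gaussian kernel weights
    Z = np.column_stack([np.ones_like(x), x - x0])
    A = Z.T @ (w[:, None] * Z)
    b = Z.T @ (w * y)
    return np.linalg.solve(A, b)[0]            # intercept = fitted value at x0
```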