Regression discontinuity (RD) is a design in which a defined group is exposed to a treatment. Its distinctive feature is that the study population is classified into two groups based on a specific threshold, or cutoff point, determined in advance according to the terms and requirements of the study. Accordingly, attention was focused on finding a solution to the issue of workers' retirement, and on proposing a scenario built around the idea of granting an end-of-service reward to fill the gap (the discontinuity point) had it not been granted. The regression discontinuity method has been used to estimate the effect of the end-of-service reward at the cutoff for insured workers, as well as t…
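The sharp RD idea described above can be sketched numerically: fit a separate regression line on each side of the cutoff and take the jump between the two fitted values at the cutoff as the treatment-effect estimate. This is a minimal illustration on simulated data, not the paper's actual retirement data; the cutoff of 5 and the jump of 3.0 are invented for the example.

```python
import random

def ols(xs, ys):
    """Simple least-squares line fit: returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sxy / sxx
    return my - b * mx, b

def rd_estimate(x, y, cutoff):
    """Sharp RD: fit a line on each side of the cutoff and take the
    difference of the two fitted values at the cutoff itself."""
    left = [(xi, yi) for xi, yi in zip(x, y) if xi < cutoff]
    right = [(xi, yi) for xi, yi in zip(x, y) if xi >= cutoff]
    a_l, b_l = ols([p[0] for p in left], [p[1] for p in left])
    a_r, b_r = ols([p[0] for p in right], [p[1] for p in right])
    return (a_r + b_r * cutoff) - (a_l + b_l * cutoff)

random.seed(0)
x = [random.uniform(0, 10) for _ in range(400)]
# hypothetical outcome: jumps by 3.0 (the treatment effect) at the cutoff of 5
y = [0.5 * xi + (3.0 if xi >= 5 else 0.0) + random.gauss(0, 0.5) for xi in x]
print(round(rd_estimate(x, y, 5.0), 2))
```

The recovered jump should sit close to the true effect of 3.0 built into the simulation.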
In this paper, a statistical model of the Saudi financial market is built using GARCH models, which account for price volatility during trading periods. The effect of the random-error distribution of the time series on the accuracy of the statistical model is also studied, where two statistical distributions are considered: the normal distribution and the t distribution. Application to real data shows that the best model for the Saudi market is the GARCH(1,1) model with Student's t-distributed random errors.
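To make the GARCH(1,1)-with-t-errors specification concrete, here is a minimal simulation sketch (not the paper's Saudi-market data or its fitted parameters; omega, alpha, beta, and nu below are illustrative values): the conditional variance follows sigma_t^2 = omega + alpha*r_{t-1}^2 + beta*sigma_{t-1}^2 and the innovations are standardised Student-t.

```python
import random, math

def simulate_garch11(n, omega, alpha, beta, nu, seed=1):
    """Simulate returns r_t = sigma_t * e_t with
    sigma_t^2 = omega + alpha*r_{t-1}^2 + beta*sigma_{t-1}^2,
    where e_t is standardised Student-t with nu degrees of freedom."""
    rng = random.Random(seed)

    def std_t():
        # Student-t as normal / sqrt(chi2/nu), rescaled to unit variance
        z = rng.gauss(0, 1)
        chi2 = sum(rng.gauss(0, 1) ** 2 for _ in range(nu))
        return (z / math.sqrt(chi2 / nu)) * math.sqrt((nu - 2) / nu)

    var = omega / (1 - alpha - beta)  # start at the unconditional variance
    r_prev, returns = 0.0, []
    for _ in range(n):
        var = omega + alpha * r_prev ** 2 + beta * var
        r_prev = math.sqrt(var) * std_t()
        returns.append(r_prev)
    return returns

rets = simulate_garch11(5000, omega=0.05, alpha=0.1, beta=0.85, nu=6)
sample_var = sum(r * r for r in rets) / len(rets)
# sample variance should sit roughly near omega/(1-alpha-beta) = 1.0
print(round(sample_var, 2))
```

In practice one would estimate (omega, alpha, beta, nu) by quasi maximum likelihood rather than fixing them as above.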
In this research, classic dynamic programming (CDP) and fuzzy dynamic programming (FDP) are used to control the inventory of a single item over N periods, in order to minimize the total cost and determine the required quantity in the principal Rusafa warehouse of the Ministry of Commerce. A comparison between the two techniques shows that the fuzzy total cost is less than the classic total cost.
In this paper, the error distribution function of the single index model is estimated by the empirical distribution function and by the kernel distribution function. The refined minimum average variance estimation (RMAVE) method is used to estimate the single index model. Simulation experiments compare the two estimators of the error distribution function for different sample sizes; the results show that the kernel distribution function outperforms the empirical distribution function.
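The two competing estimators of an error distribution function can be sketched directly: the empirical distribution function is a step function counting observations, while the kernel (smoothed) estimator averages Gaussian kernel CDFs. This is a generic illustration on simulated errors, not the paper's single-index residuals; the bandwidth h=0.3 is an arbitrary choice for the example.

```python
import math, random

def edf(data):
    """Empirical distribution function F_n(x) = #{x_i <= x} / n."""
    n = len(data)
    return lambda x: sum(1 for v in data if v <= x) / n

def kernel_cdf(data, h):
    """Smoothed CDF estimator: average of Gaussian kernel CDFs
    Phi((x - x_i) / h), where Phi is the standard normal CDF."""
    phi = lambda z: 0.5 * (1 + math.erf(z / math.sqrt(2)))
    n = len(data)
    return lambda x: sum(phi((x - xi) / h) for xi in data) / n

random.seed(2)
errs = [random.gauss(0, 1) for _ in range(300)]
F_edf, F_ker = edf(errs), kernel_cdf(errs, h=0.3)
# the true CDF of standard normal errors at 0 is 0.5
print(round(F_edf(0.0), 3), round(F_ker(0.0), 3))
```

Both estimates should land near 0.5; the kernel version is smooth in x, which is the property the paper exploits.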
Survival analysis is widely applied to data described by the length of time until the occurrence of an event of interest, such as death or other important events. The purpose of this paper is to use a dynamic methodology, which provides a flexible framework especially for the analysis of discrete survival time, to estimate the time-varying effects of covariates on the survival of dialysis patients with kidney failure until death occurs. The estimation process is based entirely on the Bayesian approach, using two estimation methods: maximum a posteriori (MAP) estimation involving iteratively weighted Kalman filter smoothing (IWKFS) in combination with the expectation maximization (EM) algorithm, while the other …
The logistic regression model is one of the oldest and most common regression models; it is a statistical method used to describe and estimate the relationship between a dependent random variable and explanatory random variables. Several methods are used to estimate this model, including the bootstrap method, an estimation method based on the principle of sampling with replacement: a resample of (n) elements is drawn randomly with replacement from the (N) original data points. It is a computational method used to assess the accuracy of statistical estimates, and for this reason it was used here to obtain more accurate estimates. The ma…
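The bootstrap mechanics described above can be sketched for a one-predictor logistic regression: resample (x, y) pairs with replacement, refit the model on each resample, and use the spread of the refitted coefficients to gauge their accuracy. The tiny gradient-ascent fitter below is a stand-in for whatever estimation routine the paper used, and all data are simulated for illustration.

```python
import random, math

def fit_logistic(x, y, lr=0.5, iters=400):
    """One-predictor logistic regression via plain gradient ascent
    on the log-likelihood; returns (intercept, slope)."""
    b0, b1, n = 0.0, 0.0, len(x)
    for _ in range(iters):
        g0 = g1 = 0.0
        for xi, yi in zip(x, y):
            p = 1 / (1 + math.exp(-(b0 + b1 * xi)))
            g0 += yi - p
            g1 += (yi - p) * xi
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

def bootstrap_slope(x, y, B, seed=3):
    """Draw B resamples with replacement, refit, and collect slopes;
    their spread estimates the slope's sampling variability."""
    rng = random.Random(seed)
    n, slopes = len(x), []
    for _ in range(B):
        idx = [rng.randrange(n) for _ in range(n)]
        slopes.append(fit_logistic([x[i] for i in idx], [y[i] for i in idx])[1])
    return slopes

random.seed(3)
x = [random.uniform(-2, 2) for _ in range(100)]
# hypothetical true model: intercept 0.5, slope 1.2
y = [1 if random.random() < 1 / (1 + math.exp(-(0.5 + 1.2 * xi))) else 0 for xi in x]
slopes = bootstrap_slope(x, y, B=30)
mean_slope = sum(slopes) / len(slopes)
print(round(mean_slope, 2))
```

The bootstrap mean slope should land in the neighbourhood of the true slope 1.2, and the sample of slopes gives a direct accuracy measure (e.g. a percentile interval).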
The purpose of this paper is to model and forecast the price of white oil over the period (2012-2019) using GARCH-class volatility models. After showing that the squared returns of white oil exhibit significant long memory in volatility, fractional GARCH models for the return series are estimated, and the mean and volatility are forecast by quasi maximum likelihood (QML) as the traditional method, while the competing approach uses machine learning, namely support vector regression (SVR). Results identify the best model among the candidates for forecasting volatility, based on the lowest values of the Akaike information criterion and the Schwarz information criterion together with significant parameters. In addition, the residuals …
The research considers the spatial autoregressive model (SAR) and the spatial error model (SEM) in an attempt to provide practical evidence of the importance of spatial analysis, with a particular focus on spatial regression models that incorporate spatial dependence, whose presence or absence can be tested using Moran's test. Ignoring this dependence may lead to the loss of important information about the phenomenon under study, which is ultimately reflected in the power of the statistical estimation, as these models are the link between the usual regression models and time-series models. Spatial analysis had …
Survival analysis is the analysis of data in the form of times, measured from a time origin until the occurrence of an end event. In medical research, the time origin is the date on which a subject or patient is enrolled in a study, such as a clinical trial comparing two or more drugs. If the end point is the death of the patient or the loss of the subject to follow-up, the resulting data are called survival times; if the end point is not death, the resulting data are called time-…
In this research, one of the nonlinear regression models, BoxBOD, is studied. The difficulty of this model lies in estimating its parameters, which are nonlinear. Its parameters were estimated by two traditional methods, nonlinear least squares and maximum likelihood, and by an artificial intelligence method, the genetic algorithm. The genetic algorithm was run with two types of fitness function: the sum of squared errors and the likelihood function. For comparison among the methods used in the research, the mean squared error was taken as the criterion, and for the purpose of data ge…
To obtain good estimates with more accurate results, we must choose an appropriate estimation method. Most of the equations in the classical methods are nonlinear, and finding analytical solutions to such equations is very difficult; some estimators are inefficient because of problems in solving these equations. In this paper, we estimate the survival function of censored data using one of the most important artificial intelligence algorithms, the genetic algorithm, to obtain optimal estimates of the parameters of the two-parameter Weibull distribution, which in turn yields optimal estimates of the survival function. The genetic algorithm is employed within the method of moments, the least squares method, and the weighted …
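A toy version of the approach above can be sketched: a small genetic algorithm searches for the Weibull (shape, scale) pair whose survival curve S(t) = exp(-(t/scale)^shape) best matches the empirical survival curve. This is a generic GA with truncation selection, averaging crossover, and Gaussian mutation, fitted to simulated (uncensored) data; it is not the paper's specific algorithm, encoding, or fitness function.

```python
import random, math

def weibull_survival(t, shape, scale):
    """Weibull survival function S(t) = exp(-(t/scale)^shape)."""
    return math.exp(-((t / scale) ** shape))

def fit_weibull_ga(times, pop_size=40, gens=60, seed=4):
    """Toy GA: individuals are (shape, scale) pairs; fitness is the
    squared distance between the model survival curve and the
    empirical survival curve at the observed times."""
    rng = random.Random(seed)
    s = sorted(times)
    n = len(s)
    emp = [(t, 1 - (i + 1) / (n + 1)) for i, t in enumerate(s)]

    def sse(ind):
        k, lam = ind
        return sum((weibull_survival(t, k, lam) - p) ** 2 for t, p in emp)

    pop = [(rng.uniform(0.2, 5), rng.uniform(0.2, 5)) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=sse)
        parents = pop[: pop_size // 2]                      # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = (0.5 * (a[0] + b[0]), 0.5 * (a[1] + b[1]))  # crossover
            child = (max(0.05, child[0] + rng.gauss(0, 0.1)),   # mutation
                     max(0.05, child[1] + rng.gauss(0, 0.1)))
            children.append(child)
        pop = parents + children
    return min(pop, key=sse)

random.seed(4)
data = [random.weibullvariate(2.0, 1.5) for _ in range(200)]  # scale 2, shape 1.5
k_hat, lam_hat = fit_weibull_ga(data)
print(round(k_hat, 2), round(lam_hat, 2))
```

The recovered pair should sit near the true (shape, scale) of (1.5, 2.0); censoring, as treated in the paper, would require weighting the fitness accordingly.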
In this paper, we design a new, efficient stream-cipher cryptosystem that depends on a chaotic map to encrypt and decrypt different types of digital images. The designed encryption system passed all the basic efficiency criteria (randomness, MSE, PSNR, histogram analysis, and key space) applied both to the key extracted from the random generator and to the digital images after the encryption process.
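The core idea of a chaos-based stream cipher can be sketched with the well-known logistic map x <- r*x*(1-x): the map's orbit is quantised into keystream bytes, which are XORed with the plaintext. This is a generic illustration, not the paper's actual map or key schedule, and the key values below are arbitrary; a real design would also need the security analyses the abstract lists.

```python
def logistic_keystream(x0, r, n, burn=100):
    """Keystream bytes from the chaotic logistic map x <- r*x*(1-x);
    x0 in (0,1) and r (close to 4) act as the secret key."""
    x = x0
    for _ in range(burn):           # discard the transient
        x = r * x * (1 - x)
    out = []
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(int(x * 256) % 256)
    return bytes(out)

def xor_cipher(data, key=(0.613, 3.9999)):
    """Stream cipher: XOR the data with the chaotic keystream.
    Encryption and decryption are the same operation."""
    ks = logistic_keystream(key[0], key[1], len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

msg = b"grayscale image bytes would go here"
enc = xor_cipher(msg)
dec = xor_cipher(enc)
print(dec == msg)  # True: decrypting with the same key recovers the plaintext
```

Because XOR is its own inverse, a second pass with the same key recovers the original bytes exactly.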
Artificial intelligence algorithms have been used in recent years in many scientific fields. We suggest employing the TABU search algorithm to find the best estimate of the semiparametric regression function with measurement errors in both the explanatory variables and the dependent variable; such measurement errors, rather than exact measurements, appear frequently in fields such as sport, chemistry, the biological sciences, medicine, and epidemiological studies.
The main contribution of this research is to describe how to analyze the complex service systems with queueing characteristics found in Baghdad General Teaching Hospital using network techniques, namely the Q-GERT technique, an abbreviation of Queuing theory - Graphical Evaluation and Review Technique. The flow of patients through the system will be traced and, after applying this approach, the system will be represented as a probabilistic network diagram and solved …
Nonparametric methods are used for data that contain outliers. Their main purpose here is to locate the median in the multivariate regression model, which is difficult because of the presence of more than one dimension, the dispersion of values, and the growing volume of data on the studied phenomenon. The genetic-algorithm Minimum Weighted Covariance Determinant estimator (MWCD) was applied and compared with a back-propagation multilayer neural network for estimating the median location, based on the minimum Mahalanobis distance and the smallest determinant of the variance matrix, with the Minimum Covariance Determinant (MCD) as one of the most robust nonparametric methods. The stud…
In this research, a simple agricultural experiment is studied in terms of the effect of out-of-control noise, arising from several causes including environmental conditions, on the observations of agricultural experiments. The discrete wavelet transform is used, specifically the Coiflets transform of order 1 to 2 and the Daubechies transform of order 2 to 3, at two decomposition levels, (J-4) and (J-5), applying the hard, soft, and non-negative threshold rules. The wavelet transformation methods are compared using real data from an experiment of 26 observations, implemented in a MATLAB program. The researcher concluded that …
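The denoising pipeline the abstract describes (transform, threshold the detail coefficients, invert) can be sketched with the simplest wavelet, the Haar transform, and the standard hard and soft threshold rules; the paper itself uses Coiflets and Daubechies filters in MATLAB, so this is only a minimal stand-in on an invented step signal.

```python
import math

def haar_level1(signal):
    """One level of the Haar wavelet transform: approximation and
    detail coefficients (signal length must be even)."""
    a = [(signal[2*i] + signal[2*i+1]) / math.sqrt(2) for i in range(len(signal)//2)]
    d = [(signal[2*i] - signal[2*i+1]) / math.sqrt(2) for i in range(len(signal)//2)]
    return a, d

def inverse_haar_level1(a, d):
    """Invert one level of the Haar transform."""
    out = []
    for ai, di in zip(a, d):
        out.append((ai + di) / math.sqrt(2))
        out.append((ai - di) / math.sqrt(2))
    return out

def hard_threshold(coeffs, lam):
    """Keep coefficients whose magnitude exceeds lambda; zero the rest."""
    return [c if abs(c) > lam else 0.0 for c in coeffs]

def soft_threshold(coeffs, lam):
    """Shrink every coefficient toward zero by lambda."""
    return [math.copysign(max(abs(c) - lam, 0.0), c) for c in coeffs]

noisy = [1.0, 1.1, 0.9, 1.0, 5.0, 5.1, 4.9, 5.0]   # step signal + small noise
a, d = haar_level1(noisy)
denoised = inverse_haar_level1(a, soft_threshold(d, 0.1))
print([round(v, 2) for v in denoised])
# [1.05, 1.05, 0.95, 0.95, 5.05, 5.05, 4.95, 4.95]
```

The small detail coefficients (the noise) are removed while the step structure survives; multi-level decompositions such as the paper's (J-4) and (J-5) simply repeat the transform on the approximation coefficients.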
Given the circumstances experienced by our country, which led to many crises, the most important being the fuel crisis, queueing theory (the theory of waiting lines) was used to address this crisis, given the direct and essential role this issue plays in daily life.
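The abstract does not state which queueing model was fitted, so as an illustration of the kind of calculation queueing theory provides, here are the closed-form steady-state measures of the simplest single-server model, M/M/1; the arrival and service rates below are hypothetical numbers for a fuel station.

```python
def mm1_metrics(lam, mu):
    """Steady-state M/M/1 measures (requires lam < mu):
    utilisation rho = lam/mu, mean number in system L = rho/(1-rho),
    mean time in system W = 1/(mu - lam)."""
    rho = lam / mu
    L = rho / (1 - rho)
    W = 1 / (mu - lam)
    return rho, L, W

# hypothetical fuel queue: 8 cars/hour arrive, the pump serves 10/hour
rho, L, W = mm1_metrics(8, 10)
print(rho, round(L, 2), round(W * 60), "minutes")  # 0.8 4.0 30 minutes
```

Even this toy model makes the policy trade-off visible: as the arrival rate approaches the service rate, L and W grow without bound, which is exactly the congestion a fuel crisis produces.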
In this research we attempt to shed light on one of the methods of estimating the structural parameters of systems of linear simultaneous equations, a method that provides consistent estimates which sometimes differ from those obtained by the other traditional methods under the general form of K-CLASS estimators. This method is known as Limited Information Maximum Likelihood (LIML), or the Least Variance Ratio (LVR) method, which, according to formula (14.2), represents the other face of the LIML method and is well known for estimating the parameters of equa…
In this research, the researcher addresses the concept of uncertainty in terms of its types and the theories for treating and measuring it; three types are taken up: indeterminacy, volatility, and inconsistency.
This book is the second edition of a textbook intended for undergraduate/postgraduate courses in mathematical statistics. To achieve the goals of the book, it is divided into the following chapters. Chapter One introduces a review of events and probability. Chapter Two is devoted to random variables of both types, discrete and continuous, with definitions of the probability mass function, probability density function, and cumulative distribution function. Chapter Three discusses mathematical expectation and its special forms, such as moments, the moment generating function, and other related topics. Chapter Four deals with some special discrete distributions: (Discrete Uniform, Bernoulli, Binomial, Poisson, Geometric, Negative Binomial, …
Grey system theory is a multidisciplinary scientific approach that deals with systems having partially unknown information (small samples and uncertain information). Grey modelling, an important component of the theory, gives successful results with a limited amount of data. Grey models are divided into two types: univariate and multivariate. The univariate grey model with a first-order differential equation, GM(1,1), is the cornerstone of the theory; it is considered the time-series prediction model, but it does not take relative factors into account. The traditional multivariate grey model GM(1,M) takes those factors into account, but it has a complex structure and some defects in modelling mechanism, parameter estimation, and m…
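The GM(1,1) model the abstract calls the cornerstone of grey theory is compact enough to sketch in full: accumulate the series, fit the grey differential equation x0(k) + a*z1(k) = b by least squares on the background values z1, and invert the accumulation to forecast. The five-point series below is invented test data with roughly 10% growth, not data from the paper.

```python
import math

def gm11(x0, steps=1):
    """Grey model GM(1,1): accumulate the series, fit
    x0(k) = -a*z1(k) + b by least squares, then predict
    future values of the original series."""
    n = len(x0)
    x1 = [sum(x0[:i + 1]) for i in range(n)]              # accumulated series
    z1 = [0.5 * (x1[i] + x1[i - 1]) for i in range(1, n)]  # background values
    y = x0[1:]
    # 2x2 normal equations for the line y = -a*z + b
    m = n - 1
    szz, sz = sum(z * z for z in z1), sum(z1)
    szy, sy = sum(z * v for z, v in zip(z1, y)), sum(y)
    det = m * szz - sz * sz
    a = -(m * szy - sz * sy) / det
    b = (szz * sy - sz * szy) / det
    # time response: x1_hat(k) = (x0(1) - b/a) * e^{-a(k-1)} + b/a
    x1_hat = lambda k: (x0[0] - b / a) * math.exp(-a * (k - 1)) + b / a
    preds = [x1_hat(k + 1) - x1_hat(k) for k in range(n, n + steps)]
    return a, b, preds

series = [100, 110, 121, 133.1, 146.41]   # ~10% growth per period
a, b, preds = gm11(series, steps=1)
print(round(preds[0], 1))
```

For this nearly exponential series the one-step forecast should land close to the true continuation of about 161.1, which is why GM(1,1) is valued for small-sample trend prediction.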
Survival and reliability analysis is among the most important topics and methods of vital statistics at the present time because of its importance in various demographic, medical, industrial, and engineering fields. This research focuses on generating random sample data from the generalized gamma (GG) probability distribution using the inverse transformation method (ITM). Because the distribution function involves the incomplete gamma integral, classical estimation becomes more difficult, so a numerical approximation method is illustrated and the survival function is then estimated. The survival function was estimated by Monte Carlo simulation. The entropy method was used for the …
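The inverse transformation method is easiest to see on a distribution with a closed-form quantile function; the generalized gamma studied in the paper has none (hence the numerical approximation the abstract mentions), so the sketch below uses the exponential distribution as a stand-in: if U ~ Uniform(0,1), then X = -ln(1-U)/lambda is Exponential(lambda), and Monte Carlo simulation then checks the survival function.

```python
import math, random

def inverse_transform_exponential(lam, n, seed=5):
    """Inverse transform method: X = -ln(1-U)/lam follows an
    Exponential(lam) distribution when U ~ Uniform(0,1)."""
    rng = random.Random(seed)
    return [-math.log(1 - rng.random()) / lam for _ in range(n)]

def survival_exponential(t, lam):
    """S(t) = P(X > t) = exp(-lam * t) for the exponential model."""
    return math.exp(-lam * t)

sample = inverse_transform_exponential(0.5, 20000)
# Monte Carlo estimate of S(2) versus its exact value exp(-1) = 0.368
emp_surv = sum(1 for x in sample if x > 2.0) / len(sample)
print(round(emp_surv, 3), round(survival_exponential(2.0, 0.5), 3))
```

For the generalized gamma the same recipe applies, except that the quantile U -> F^{-1}(U) must be inverted numerically (e.g. by bisection on the incomplete gamma CDF).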
The aim of this research is to use a robust technique based on trimming, as maximum likelihood (ML) analysis often fails when the studied phenomenon contains outliers: the MLE loses its advantages because of the bad influence of the outliers. To address this problem, new statistical methods have been developed that are not affected by outliers, that is, methods possessing robustness or resistance. Maximum trimmed likelihood (MTL) is therefore a good alternative for achieving more acceptable and comparable results, while weights can be used to increase the efficiency of the resulting estimates and the strength of the estimation, using maximum weighted trimmed likelihood (MWTL). In order to perform t…
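The trimming idea can be sketched for the simplest case, a normal mean: for Gaussian likelihood, the maximum trimmed likelihood estimate over subsets of size h is the mean of the h observations with the smallest within-subset sum of squares, and that optimal subset is always a contiguous window of the sorted data. This is an illustrative special case on simulated contaminated data, not the paper's MTL/MWTL procedure.

```python
import random

def trimmed_mean_mle(data, h):
    """Maximum trimmed likelihood for a normal mean: scan all
    contiguous windows of h sorted observations, pick the one with
    the smallest within-window sum of squares (highest likelihood),
    and return its mean. Outliers fall outside the chosen window."""
    s = sorted(data)
    best_mean, best_sse = None, float("inf")
    for i in range(len(s) - h + 1):
        w = s[i:i + h]
        m = sum(w) / h
        sse = sum((v - m) ** 2 for v in w)
        if sse < best_sse:
            best_sse, best_mean = sse, m
    return best_mean

random.seed(6)
clean = [random.gauss(10, 1) for _ in range(90)]
outliers = [random.gauss(60, 1) for _ in range(10)]   # 10% gross contamination
data = clean + outliers
plain_mean = sum(data) / len(data)          # pulled far from 10 by the outliers
robust_mean = trimmed_mean_mle(data, h=80)  # ignores the contaminated tail
print(round(plain_mean, 1), round(robust_mean, 1))
```

The untrimmed mean is dragged toward the outliers (to roughly 15 here), while the trimmed estimate stays near the true centre of 10; MWTL additionally reweights the retained observations to recover efficiency.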
A large number of researchers have attempted to identify the functional relationship between fertility on one side and the economic and social characteristics of the population on the other, along with the strength of the effect of each. This research aims to monitor and analyze changes in the level of fertility over time and space in recent decades, to estimate fertility levels in Iraq for the period (1977-2011), and then to forecast the level of fertility in Iraq at the national level (except for the Kurdistan region) for the period (2012-2031). To achieve this goal, the Lee-Carter model was used to estimate and predict fertility rates, as this model has often been familiar …
There is a need to detect and investigate the causes of pollution of the marshes, to present an accurately evaluated statistical study, and to submit it to the competent authorities. To achieve this goal, factor analysis was applied, and results were obtained from a selected sample of marsh-water pollutants: electrical conductivity (EC), power of hydrogen (pH), temperature (T), turbidity (TU), total dissolved solids (TDS), and dissolved oxygen (DO). A sample of (44) sites was drawn and examined in the laboratories of the Iraqi Ministry of Environment, and the results were obtained using the SPSS program. The most important recommendation was to increase the pumping of addit…
The generalized additive model (GAM) has been considered a multivariate smoother that appeared recently in nonparametric regression analysis. This research is therefore devoted to the mixed situation, i.e. phenomena whose behaviour changes from linear (with known functional form), represented in the parametric part of the model, to nonlinear (with unknown functional form, here a smoothing spline), represented in the nonparametric part. Furthermore, we propose a robust semiparametric GAM estimator and compare it with two other existing techniques.