In this paper, we propose a method that uses continuous wavelets to study the multivariate fractional Brownian motion through the deviations of the transformed random process, yielding an efficient estimator of the Hurst exponent based on eigenvalue regression of the covariance matrix. Simulation experiments show that the proposed estimator is efficient in terms of bias, but its variance increases as the signal moves from short to long memory, and the MASE increases accordingly. The estimation proceeds by computing the eigenvalues of the variance-covariance matrix of Meyer's continuous wavelet detail coefficients.
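As an illustration of the underlying idea (not the authors' Meyer-wavelet, eigenvalue-based implementation), the sketch below estimates the Hurst exponent from the scale-by-scale variances of Haar wavelet detail coefficients. It is run on ordinary Brownian motion, for which H = 0.5 is known; the variance of the level-j details grows like 2^(j(2H+1)), so a log-variance regression recovers H.

```python
import math
import random
import statistics

def haar_detail_variances(x, levels):
    # Iteratively split the signal into Haar approximation/detail parts
    # and record the variance of the detail coefficients at each level.
    approx = list(x)
    variances = []
    for _ in range(levels):
        n = len(approx) // 2
        detail = [(approx[2*i+1] - approx[2*i]) / math.sqrt(2) for i in range(n)]
        approx = [(approx[2*i+1] + approx[2*i]) / math.sqrt(2) for i in range(n)]
        variances.append(statistics.pvariance(detail))
    return variances

def estimate_hurst(x, levels=6):
    # For fBm, Var(detail at level j) ~ C * 2**(j*(2H+1)), so the
    # least-squares slope of log2(variance) against j equals 2H + 1.
    v = haar_detail_variances(x, levels)
    js = range(1, levels + 1)
    ys = [math.log2(val) for val in v]
    mj = sum(js) / levels
    my = sum(ys) / levels
    slope = (sum((j - mj) * (y - my) for j, y in zip(js, ys))
             / sum((j - mj) ** 2 for j in js))
    return (slope - 1) / 2

random.seed(0)
path, level = [], 0.0
for _ in range(2 ** 14):          # ordinary Brownian motion: H = 0.5
    level += random.gauss(0, 1)
    path.append(level)

print(round(estimate_hurst(path), 2))  # close to 0.5
```

The multivariate, covariance-eigenvalue version studied in the paper generalizes this univariate log-variance regression.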
In this paper, wavelets were used to study the multivariate fractional Brownian motion through the deviations of the random process, in order to find an efficient estimator of the Hurst exponent. Simulation experiments show that the proposed estimator performs efficiently. The estimation exploits the stationarity of the detail coefficients of the wavelet transform, since the variance of these coefficients exhibits power-law behavior. We use two wavelet filters (Haar and db5) to minimize the mean square error of the model.
The goal of the study is to find the best model for forecasting the exchange rate of the US dollar against the Iraqi dinar by analyzing the time series with the Box-Jenkins approach, one of the most significant methods in the statistical sciences used for such analysis. The dollar exchange rate is considered one of the most important indicators of the relative health of a country's economy, and it is the measure most closely watched, analyzed, and manipulated by the government. Several factors affect the exchange rate, most importantly the money supply, the interest rate, domestic inflation, and the balance of payments. The data for the research represent the exchange rate…
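A minimal sketch of one step of the Box-Jenkins workflow (estimation of a tentatively identified model), not the exchange-rate model of the study: fitting an AR(1) by the Yule-Walker equations, where the coefficient estimate is the lag-1 autocovariance divided by the variance. A full analysis would also involve differencing, ACF/PACF identification, and residual diagnostics.

```python
import random

def fit_ar1(series):
    # Yule-Walker estimate of the AR(1) coefficient:
    # phi_hat = lag-1 autocovariance / variance.
    n = len(series)
    mean = sum(series) / n
    c0 = sum((v - mean) ** 2 for v in series) / n
    c1 = sum((series[t] - mean) * (series[t-1] - mean) for t in range(1, n)) / n
    return c1 / c0

random.seed(1)
phi_true, x = 0.7, 0.0
sample = []
for _ in range(5000):                 # simulate x_t = 0.7 x_{t-1} + e_t
    x = phi_true * x + random.gauss(0, 1)
    sample.append(x)

phi_hat = fit_ar1(sample)
print(round(phi_hat, 2))  # close to 0.7
```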
Survival analysis is widely applied to data describing the lifetime of an item until the occurrence of an event of interest, such as death or another event under study. The purpose of this paper is to use a dynamic approach within deep-learning neural networks: a dynamic neural network that suits the nature of discrete survival data with time-varying effects. This network is trained with the Levenberg-Marquardt (L-M) algorithm, and the method is called the Proposed Dynamic Artificial Neural Network (PDANN). A comparison was then made with a method that relies entirely on the Bayesian methodology, the Maximum A Posteriori (MAP) method, which was carried out using numerical algorithms…
Topics in financial statistical analysis have recently gained wide currency, at the level of individuals, public and private companies, stock and securities markets (bourses), and the economies of whole countries. This followed the realization, by researchers and students of economic and financial phenomena of all kinds, of the importance of quantitative analysis in general and statistical analysis in particular, which prompted us to write the first edition of this book in…
This book includes three main chapters: 1. Functions and their derivatives. 2. Minimum, maximum, and inflection points. 3. Partial derivatives. It also contains many examples and exercises intended to develop the student's ability to reason correctly when solving mathematical problems.
In recent years, Bitcoin has become the most widely used blockchain platform in business and finance. The goal of this work is to find a viable prediction model that incorporates, and perhaps improves on, a combination of available models. The techniques utilized in this paper include exponential smoothing, ARIMA, artificial neural network (ANN) models, and forecast combination models. The study's most notable finding is that artificial intelligence models improve the results of combined prediction models. The second key finding is that a strong combination forecasting model should be used, one that responds to the multiple fluctuations in the Bitcoin time series and improves the error. Based on the results, the prediction accuracy…
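One standard way to combine point forecasts, sketched below with made-up numbers (the paper does not specify its combination weights): weight each model inversely to its historical mean squared error, so the more accurate model dominates.

```python
def combine_forecasts(errors_a, errors_b):
    # Weight each model inversely to its in-sample mean squared error.
    mse_a = sum(e * e for e in errors_a) / len(errors_a)
    mse_b = sum(e * e for e in errors_b) / len(errors_b)
    w_a = (1 / mse_a) / (1 / mse_a + 1 / mse_b)
    return w_a, 1 - w_a

# Model A had smaller past errors, so it receives the larger weight.
w_a, w_b = combine_forecasts([0.5, -0.5, 0.4], [2.0, -1.5, 1.0])
combined = w_a * 101.0 + w_b * 104.0   # combine two point forecasts
print(round(w_a, 2), round(combined, 2))
```

The combined forecast lands close to model A's value, reflecting its better track record.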
Estimating multivariate location and scatter with both affine equivariance and positive breakdown has always been difficult. A well-known estimator that satisfies both properties is the Minimum Volume Ellipsoid (MVE) estimator. Computing the exact MVE is often not feasible, so one usually resorts to an approximate algorithm. In the regression setup, algorithms for positive-breakdown estimators like Least Median of Squares typically recompute the intercept at each step to improve the result; this approach is called intercept adjustment. In this paper we show that a similar technique, called location adjustment, can be applied to the MVE. For this purpose we use the Minimum Volume Ball (MVB). In order…
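A crude sketch of the Minimum Volume Ball idea (one naive approximation, not the paper's algorithm): try each data point as a candidate center and keep the one whose ball covering h points has the smallest radius. Taking h larger than n/2 gives the positive-breakdown behavior, as the demonstration with far outliers shows.

```python
import random

def mvb_center(points, h):
    # Approximate Minimum Volume Ball: try each data point as a center
    # and keep the one whose h-th nearest neighbour is closest.
    best_center, best_radius = None, float("inf")
    for c in points:
        dists = sorted(((p[0]-c[0])**2 + (p[1]-c[1])**2) ** 0.5 for p in points)
        radius = dists[h - 1]          # radius of the ball covering h points
        if radius < best_radius:
            best_center, best_radius = c, radius
    return best_center

random.seed(2)
# 40 "good" points around (0, 0) plus 10 far outliers around (50, 50).
good = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(40)]
bad = [(random.gauss(50, 1), random.gauss(50, 1)) for _ in range(10)]
data = good + bad

center = mvb_center(data, h=26)        # h > n/2 for positive breakdown
mean = (sum(p[0] for p in data) / 50, sum(p[1] for p in data) / 50)
print(center, mean)                    # center stays near (0, 0); mean is dragged
```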
The repeated measurements design is called a complete randomized block design for repeated measurements when the subject is given all the different treatments; in this case the subject is considered a block. Several nonparametric methods were considered, such as the Friedman test (1937), the Koch test (1969), and the Kepner & Robinson test (1988), for when the assumption that the data follow a normal distribution is not satisfied, as well as the F test for when the assumptions of the analysis of variance are satisfied, where the observations within blocks are assumed to be equally correlated. The purpose of this paper is to summarize the results of a simulation study comparing these methods and to present the suggested method…
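For concreteness, a small self-contained version of the Friedman statistic (ties ignored), evaluated on made-up data where one treatment always ranks highest, so the statistic reaches its maximum for three blocks and three treatments:

```python
def friedman_statistic(blocks):
    # blocks: list of blocks, each a list of k treatment responses.
    # Rank within each block, then compare treatment rank sums:
    # chi2 = 12 / (n k (k+1)) * sum(R_j^2) - 3 n (k+1).
    n, k = len(blocks), len(blocks[0])
    rank_sums = [0.0] * k
    for block in blocks:
        order = sorted(range(k), key=lambda j: block[j])
        for rank, j in enumerate(order, start=1):
            rank_sums[j] += rank          # no tie handling in this sketch
    s = sum(r * r for r in rank_sums)
    return 12.0 * s / (n * k * (k + 1)) - 3.0 * n * (k + 1)

# Three subjects (blocks), three treatments; the third always scores highest.
data = [[1.2, 2.5, 3.9],
        [0.8, 1.9, 2.7],
        [1.1, 2.2, 3.5]]
print(friedman_statistic(data))  # 6.0, the maximum for n = 3, k = 3
```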
Circular data (circular observations) are periodic data measured on the unit circle in radians or degrees. Because of their cyclical nature they are fundamentally different from linear data, which fit the mathematical representation of the usual linear regression model. Circular data arise in a wide variety of scientific, medical, economic, and social fields. Several methods exist for estimating circular regression models, both parametric and nonparametric; accordingly, the thesis employed three circular regression models, two parametric and one nonparametric: (DM), (MLE), and a circular shrinkage model…
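The cyclical nature mentioned above is easy to see with the mean direction, which averages points on the unit circle rather than the raw angles:

```python
import math

def circular_mean(angles_deg):
    # Mean direction = atan2(mean sine, mean cosine), reported in [0, 360).
    s = sum(math.sin(math.radians(a)) for a in angles_deg)
    c = sum(math.cos(math.radians(a)) for a in angles_deg)
    return math.degrees(math.atan2(s, c)) % 360

# The arithmetic mean of 350 and 10 degrees is 180 -- the opposite
# direction -- while the circular mean is (numerically) 0 degrees.
print(circular_mean([350, 10]))  # ~0 (mod 360)
```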
Production planning is a difficult task requiring great effort and much time, especially since it is a dynamic activity whose basic variables change continuously over time. For this reason it calls for one of the operations research techniques, dynamic programming, which is a powerful tool for decision making in planning and in controlling production, with a direct effect on the cost of the production process and on inventory control.
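A toy dynamic program in this spirit (hypothetical costs and demands, not the abstract's application): the state is (period, starting inventory), the decision is how much to produce, and the recursion trades setup and unit production costs against holding costs.

```python
import functools

def min_production_cost(demand, capacity, setup, unit, hold):
    # Dynamic program over (period, starting inventory): choose a production
    # quantity each period, pay setup + unit costs, and pay holding cost on
    # inventory carried to the next period.
    @functools.lru_cache(maxsize=None)
    def best(t, inventory):
        if t == len(demand):
            return 0.0
        costs = []
        for q in range(0, capacity + 1):
            if inventory + q < demand[t]:
                continue                  # demand must be met each period
            produce = setup + unit * q if q > 0 else 0.0
            carry = inventory + q - demand[t]
            costs.append(produce + hold * carry + best(t + 1, carry))
        return min(costs)
    return best(0, 0)

# Two periods, demands 2 and 3, capacity 4 per period,
# setup cost 3, unit cost 1, holding cost 1 per unit per period.
print(min_production_cost([2, 3], capacity=4, setup=3.0, unit=1.0, hold=1.0))  # 11.0
```

Producing exactly to demand in each period (cost 5 + 6 = 11) beats building inventory ahead, since holding costs outweigh the saved setup here.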
This article explores the importance of estimating a semiparametric regression function. We suggest a new estimator alongside other combined estimators and then compare them using simulation. The simulation results show that the suggested estimator is the best for the first and second models, whereas for the third model the Burman and Chaudhuri (B&C) estimator is the best.
In this research we analyze some indicators, and their classifications, related to the teaching process and the scientific level of graduate studies at the university, using analysis of variance of ranked data for repeated measurements instead of the ordinary analysis of variance. We reach several conclusions about the important classifications for each indicator that affect the teaching process.
In this paper, fuzzy logic and the trapezoidal intuitionistic fuzzy number are presented, together with some properties of the trapezoidal intuitionistic fuzzy number and a semiparametric logistic regression model that uses it. The output (dependent) variable sometimes cannot be restricted to only two cases (response/non-response, or success/failure) and may take more than two responses, especially in medical studies; we therefore use a semiparametric logistic regression model whose output variable is a trapezoidal intuitionistic fuzzy number.
The model was estimated using simulation…
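For intuition, here is the membership function of an ordinary trapezoidal fuzzy number with hypothetical parameters (an intuitionistic fuzzy number, as used in the paper, additionally carries a non-membership function, omitted in this sketch):

```python
def trapezoid_membership(x, a, b, c, d):
    # Membership function of a trapezoidal fuzzy number (a, b, c, d):
    # rises linearly on [a, b], equals 1 on [b, c], falls on [c, d].
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

# A fuzzy "moderate response" with core [4, 6] and support (2, 8).
for x in (2, 3, 5, 7, 8):
    print(x, trapezoid_membership(x, 2, 4, 6, 8))
```

Points in the core get full membership 1, points outside the support get 0, and the shoulders interpolate linearly (e.g. x = 3 gets 0.5).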
In this research, covariance estimates were used to estimate the population mean in stratified random sampling with combined regression estimators. Combined regression estimators employing robust variance-covariance matrix estimates were compared with those employing the traditional variance-covariance matrix estimates when estimating the regression parameter, through two criteria: relative efficiency (RE) and mean squared error (MSE). We found that the robust variance-covariance matrix estimates (MCD, MVE) significantly improved the quality of the combined regression estimators by reducing the effect of outliers when estimating the regression parameter. In addition, the results of the simulation study proved…
The main problem when dealing with fuzzy data is that a model representing the data cannot be formed by the Fuzzy Least Squares Estimator (FLSE) method, which gives false estimates, invalidating the method, when multicollinearity is present. To overcome this problem, the Fuzzy Bridge Regression Estimator (FBRE) method was relied upon to estimate a fuzzy linear regression model with triangular fuzzy numbers. Moreover, multicollinearity in fuzzy data can be detected using the Variance Inflation Factor when the input variables of the model are crisp and the output variable and parameters are fuzzy. The results were compared using…
In this research, several estimators closely related to the hazard function are introduced, using a nonparametric method, namely the kernel function, for censored data with varying bandwidth and boundary kernels. Two types of bandwidth are used: local bandwidth and global bandwidth. Moreover, four types of boundary kernel are used, namely rectangle, Epanechnikov, biquadratic, and triquadratic, and the proposed function was employed with all kernel functions. Two different simulation techniques are used in two experiments to compare these estimators. In most cases, the results show that the local bandwidth is the best for all types of boundary kernel functions…
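To show the kernel machinery in its simplest form, the sketch below is a kernel density estimate with the Epanechnikov kernel and a single global bandwidth (the paper's setting, hazard estimation with local bandwidths and boundary correction, builds on the same ingredients):

```python
import random

def epanechnikov(u):
    # Epanechnikov kernel: 0.75 * (1 - u^2) on (-1, 1), else 0.
    return 0.75 * (1 - u * u) if abs(u) < 1 else 0.0

def kde(x, data, bandwidth):
    # Kernel density estimate with one global bandwidth.
    n = len(data)
    return sum(epanechnikov((x - xi) / bandwidth) for xi in data) / (n * bandwidth)

random.seed(3)
sample = [random.gauss(0, 1) for _ in range(2000)]

# Sanity check: the estimate should integrate to about 1 over a wide grid.
grid = [-5 + 0.01 * i for i in range(1001)]
area = sum(kde(x, sample, bandwidth=0.4) * 0.01 for x in grid)
print(round(area, 2))  # ~1.0
```

A local bandwidth would replace the constant `bandwidth` with a value depending on `x`; a boundary kernel would modify `epanechnikov` near the support's edge.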
In this paper, we build a fuzzy classification system for classifying the nutritional status of children under 5 years old in Iraq using the Mamdani method, based on input variables such as weight and height. Classifying nutritional status is a difficult challenge in the medical field because of uncertainty and ambiguity in the variables and attributes that determine the nutritional-status categories of children. These categories are relied upon in medical diagnosis to determine the types of malnutrition problems and to identify the groups suffering from malnutrition, so as to determine the risks faced by each group of children. Malnutrition in children is one of the most…
In this article we study variance estimators for the normal distribution when the mean is unknown, based on combining, through the cumulative function, the unbiased estimator and the Bayes estimator of the variance. The methods used include the double-stage shrunken estimator, which attains higher efficiency for the variance estimator of the normal distribution with unknown mean by using two small samples of equal size.
The analysis of survival and reliability is currently among the important topics and methods of biostatistics because of its importance in various demographic, medical, industrial, and engineering fields. This research focuses on generating random sample data from the generalized gamma (GG) probability distribution using the inverse transformation method (ITM). Because the distribution function of the GG involves the incomplete gamma integral, classical estimation becomes more difficult, so a numerical approximation method must be illustrated before the survival function can be estimated. The survival function was estimated by Monte Carlo simulation. The entropy method was used for the…
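The GG quantile function has no closed form, which is why the paper needs numerical approximation; the sketch below instead illustrates the inverse transformation method on the Weibull distribution, a special case of the generalized gamma family whose quantile function is explicit, and checks the Monte Carlo survival estimate against the true survival function:

```python
import math
import random

def weibull_itm(u, shape, scale=1.0):
    # Inverse-transform sampling: solve S(t) = u for t, where
    # S(t) = exp(-(t/scale)**shape) for the Weibull distribution
    # (a special case of the generalized gamma family).
    return scale * (-math.log(u)) ** (1.0 / shape)

random.seed(4)
shape = 1.5
# 1 - random.random() lies in (0, 1], so log(u) is always defined.
sample = [weibull_itm(1.0 - random.random(), shape) for _ in range(10000)]

# Monte Carlo survival estimate at t = 1 versus the true value.
t = 1.0
s_hat = sum(1 for v in sample if v > t) / len(sample)
s_true = math.exp(-t ** shape)
print(round(s_hat, 3), round(s_true, 3))  # both near exp(-1) = 0.368
```

For the full GG distribution, `weibull_itm` would be replaced by a numerical inversion of the incomplete-gamma CDF.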
In this research, we address the challenges associated with the inventory system for seasonal allergy medications. The focus was on determining the optimal demand by calculating the Economic Order Quantity (EOQ) and achieving the lowest cost within the Pharmaceutical Industries and Medical Supplies company in Samarra. The primary objective was to meet the demand for seasonal allergy medications efficiently by identifying the optimal demand for the drugs. The study covered two seasonal allergy medications, Samatifen syrup and VALIAPAM 2 tablets. The lowest cost was calculated using two methods: the multi-item production model without shortage, with restrictions; the solution was obtained using the…
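The basic EOQ calculation referenced above, with hypothetical figures (not the company's data): the optimal order quantity balances ordering cost against holding cost.

```python
import math

def eoq(annual_demand, order_cost, holding_cost):
    # Classic economic order quantity: Q* = sqrt(2 * D * S / H),
    # where D = demand per period, S = cost per order,
    # H = holding cost per unit per period.
    return math.sqrt(2.0 * annual_demand * order_cost / holding_cost)

# Hypothetical: 1000 units/year, 50 per order, 2 per unit-year.
q = eoq(1000, 50, 2)
print(round(q, 2))  # 223.61
```

Shortage-free multi-item models with restrictions, as in the study, extend this by optimizing several such quantities under shared constraints.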
This article aims to estimate the stock return rate in the private banking sector, for two banks, by adopting a partial linear model based on Arbitrage Pricing Theory (APT), using wavelet and kernel smoothers. The results show that the wavelet method is the best. They also show an adverse effect of the market portfolio and the inflation rate on the rate of return, and a direct effect of the money supply.
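A minimal kernel smoother of the kind mentioned above (a Nadaraya-Watson estimator with a Gaussian kernel on synthetic data, not the paper's banking data or its partial linear specification):

```python
import math
import random

def nadaraya_watson(x, xs, ys, bandwidth):
    # Nadaraya-Watson estimator: a kernel-weighted local average of y.
    weights = [math.exp(-0.5 * ((x - xi) / bandwidth) ** 2) for xi in xs]
    return sum(w * y for w, y in zip(weights, ys)) / sum(weights)

random.seed(5)
xs = [random.random() for _ in range(500)]
ys = [2.0 * x + random.gauss(0, 0.1) for x in xs]   # true curve: y = 2x

fit = nadaraya_watson(0.5, xs, ys, bandwidth=0.05)
print(round(fit, 2))  # close to the true value 2 * 0.5 = 1.0
```

In a partial linear model, a smoother like this handles the nonparametric component while the remaining regressors enter linearly.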
The logistic regression model is one of the oldest and most common regression models, and it is a statistical method used to describe and estimate the relationship between a dependent random variable and explanatory random variables. Several methods are used to estimate this model, including the bootstrap method, an estimation method based on the principle of sampling with replacement: a resample of (n) elements is drawn randomly, with replacement, from the (N) original observations. It is a computational method used to assess the accuracy of estimated statistics, and for this reason it was used here to find more accurate estimates. The main…
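The resampling principle described above can be sketched in a few lines; here the bootstrap standard error of a sample mean is compared with the textbook formula s / sqrt(n) (the paper applies the same principle to logistic regression coefficients):

```python
import random
import statistics

def bootstrap_se(data, stat, n_boot=2000, seed=6):
    # Resample with replacement, recompute the statistic each time,
    # and report the standard deviation of the replicates.
    rng = random.Random(seed)
    reps = []
    for _ in range(n_boot):
        resample = [rng.choice(data) for _ in data]
        reps.append(stat(resample))
    return statistics.stdev(reps)

random.seed(6)
data = [random.gauss(10, 2) for _ in range(100)]
se_boot = bootstrap_se(data, statistics.mean)
se_formula = statistics.stdev(data) / len(data) ** 0.5   # s / sqrt(n)
print(round(se_boot, 3), round(se_formula, 3))  # the two agree closely
```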
This book is intended as a textbook for an undergraduate course in financial statistics in the Department of Financial Sciences and Banking, designed for use in the semester system. To achieve its goals, the book is divided into the following chapters. Chapter one introduces basic concepts. Chapter two covers frequency distributions and data representation. Chapter three discusses measures of central tendency (all types of means, the mode, and the median). Chapter four deals with measures of dispersion (standard deviation, variance, and coefficient of variation). Chapter five covers correlation and regression analysis, while chapter six covers hypothesis testing (one-population mean test, two-"independent"-populations…
Abstract: This research aims to overcome the problem of dimensionality through nonparametric regression methods that reduce the root mean square error (RMSE). The projection pursuit regression (PPR) method was used, one of the dimension-reduction methods that overcome the curse of dimensionality. PPR is a statistical technique concerned with finding the most important projections in multidimensional data, and with finding each projection…
In this paper, the effect of changes in bank deposits on the money supply in Iraq was studied by estimating an error correction model (ECM) for monthly time-series data for the period 2010-2015. The Phillips-Perron test was used to test for stationarity, and the Engle-Granger procedure was used to test for cointegration. Cubic spline and local polynomial estimators were used to estimate the regression function. The results show that the local polynomial estimator outperformed the cubic spline at the first level of cointegration.
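The two-step Engle-Granger/ECM logic can be sketched on synthetic cointegrated series (random walks standing in for deposits and money supply; not the Iraqi data, and omitting the Phillips-Perron unit-root test): first estimate the long-run relation, then regress the change in y on the lagged disequilibrium, whose coefficient should be negative if y corrects back toward equilibrium.

```python
import random

def ols_slope(ys, xs):
    # Simple least-squares slope of y on x (with intercept).
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

random.seed(7)
x, xs, ys = 0.0, [], []
for _ in range(2000):
    x += random.gauss(0, 1)                   # "deposits": a random walk
    xs.append(x)
    ys.append(2.0 * x + random.gauss(0, 1))   # "money supply", cointegrated

# Step 1: long-run relation y = b*x; residuals measure disequilibrium.
b = ols_slope(ys, xs)
z = [y - b * xv for y, xv in zip(ys, xs)]

# Step 2: ECM -- regress the change in y on the lagged disequilibrium.
dy = [ys[t] - ys[t - 1] for t in range(1, len(ys))]
gamma = ols_slope(dy, z[:-1])
print(round(b, 2), round(gamma, 2))  # b near 2; gamma negative (error correction)
```

A negative, significant `gamma` is the error-correction signature: deviations from the long-run relation are pulled back in the next period.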