In this research, we study the inverse Gompertz (IG) distribution and estimate its survival function using three methods: maximum likelihood, least squares, and percentile estimators. The best estimation method was then selected. The least squares method was found to be the best for estimating the survival function, because it has the lowest integrated mean squared error (IMSE) for all sample sizes.
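The comparison described above can be sketched in a small simulation. The code below is a minimal illustration, not the paper's actual study design: it assumes one common parameterization of the IG CDF, F(x) = exp(-(θ/β)(e^(β/x) - 1)), fits θ and β by least squares against empirical plotting positions, and measures the squared error of the fitted CDF against the true one on a grid (the survival estimate is simply 1 minus the fitted CDF). All parameter values and names are illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit

# One common inverse Gompertz CDF parameterization (assumed, for illustration):
# F(x) = exp(-(theta/beta) * (exp(beta/x) - 1))
def ig_cdf(x, theta, beta):
    return np.exp(-(theta / beta) * np.expm1(beta / x))

def ig_sample(n, theta, beta, rng):
    # Inverse-transform sampling: solve F(x) = u for x.
    u = rng.uniform(size=n)
    return beta / np.log1p(-(beta / theta) * np.log(u))

rng = np.random.default_rng(0)
theta, beta, n = 1.0, 0.5, 200
x = np.sort(ig_sample(n, theta, beta, rng))

# Least squares: fit the model CDF to empirical plotting positions.
pos = (np.arange(1, n + 1) - 0.375) / (n + 0.25)
(theta_hat, beta_hat), _ = curve_fit(ig_cdf, x, pos, p0=[1.0, 1.0],
                                     bounds=(1e-6, np.inf))

# Integrated (mean) squared error of the fitted CDF over a grid;
# the survival function estimate is 1 - ig_cdf(grid, theta_hat, beta_hat).
grid = np.linspace(np.quantile(x, 0.05), np.quantile(x, 0.95), 100)
imse = np.mean((ig_cdf(grid, theta_hat, beta_hat)
                - ig_cdf(grid, theta, beta)) ** 2)
```

In a full study this loop would be repeated over many replications and sample sizes, averaging the squared error to obtain the IMSE used to rank the estimators.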
This paper is concerned with pre-test single and double stage shrunken estimators for the mean (θ) of a normal distribution when a prior estimate (θ0) of the actual value (θ) is available, using specified shrinkage weight factors ψ(·) as well as a pre-test region (R). Expressions for the bias B(·), mean squared error MSE(·), efficiency EFF(·), and expected sample size E(n|θ) of the proposed estimators are derived. Numerical results are presented and conclusions are drawn about the selection of the different constants appearing in these expressions. Comparisons of the suggested estimators with the classical estimators, in the sense of bias and relative efficiency, are given. Furthermore, comparisons with earlier existing works are drawn.
The Pareto distribution is used in many economic, financial, and social applications. It is used to study income and wealth, settlement patterns in cities and villages, and the sizes of oil wells, as well as in the field of communication, through the speed of downloading files from the Internet according to their sizes. It is also used in mechanical engineering as one of the model distributions for failure, stress, and durability. Given the practical importance of this distribution on the one hand, and the scarcity of sources and statistical research that deal with it on the other, this research addresses some of its statistical characteristics, such as the derivation of its mathematical function and probability density …
Inventory, or inventories, are stocks of goods held for future use or sale. The demand for a product is the number of units that will need to be removed from inventory for use or sale during a specific period. If the demand for future periods can be predicted with considerable precision, it is reasonable to use an inventory rule that assumes all predictions will always be completely accurate. This is the case in which we say that demand is deterministic.
The timing of an order can be periodic (placing an order every fixed number of days) or perpetual (placing an order whenever the inventory declines to a fixed reorder level).
In this research we discuss how to formulate inventory …
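The deterministic-demand setting described above is the one in which the classical economic order quantity (EOQ) model applies. The sketch below is a textbook illustration under standard EOQ assumptions (constant demand rate, fixed ordering cost, linear holding cost), not necessarily the exact model formulated in the paper; all numbers are hypothetical.

```python
import math

def eoq(demand_rate, order_cost, holding_cost):
    """Classical economic order quantity: the order size that minimizes
    total ordering plus holding cost per unit time under deterministic,
    constant demand."""
    return math.sqrt(2 * demand_rate * order_cost / holding_cost)

# Hypothetical inputs: 1200 units/year demand, $50 per order, $2/unit/year holding.
q = eoq(demand_rate=1200, order_cost=50, holding_cost=2)
cycle_length = q / 1200  # reorder interval, in the same time unit as demand
```

Under these assumptions the optimal policy is perpetual in form: order `q` units whenever inventory reaches zero (or a reorder level accounting for lead time).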
The transfer function model is one of the basic concepts in time series analysis. This model is used in the case of multivariate time series. The design of this model depends on the data available in the time series and on other information in the series, so the representation of the transfer function model depends on the representation of the data. In this research, the transfer function has been estimated using nonparametric methods, represented by two approaches: local linear regression and the cubic smoothing spline method. The semiparametric approach is represented by the semiparametric single index model, with four proposals. The goal of this research is to compare the capabilities of the above-mentioned m…
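One of the nonparametric smoothers named above, local linear regression, can be sketched briefly. This is a generic illustration of the estimator (kernel-weighted least squares with an intercept and slope at each evaluation point), not the paper's transfer-function implementation; the Gaussian kernel, bandwidth, and test function are assumptions.

```python
import numpy as np

def local_linear(x0, x, y, h):
    """Local linear estimate at x0: weighted least squares fit of a line
    to (x, y), with Gaussian kernel weights of bandwidth h centered at x0."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    X = np.column_stack([np.ones_like(x), x - x0])
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return beta[0]  # intercept = fitted value at x0

# Illustrative data: a sine signal with Gaussian noise.
rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 2 * np.pi, 200))
y = np.sin(x) + rng.normal(0, 0.2, 200)
fit = np.array([local_linear(x0, x, y, h=0.3) for x0 in x])
```

The cubic smoothing spline plays a similar role but penalizes the integrated squared second derivative instead of using local kernel weights.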
Research Summary
It highlights the importance of assessing the demand for money function in Iraq through an understanding of the relationship between this function and the variables affecting it, by examining the stability of the function and the extent of its influence on the Iraqi dinar exchange rate, in order to determine its contribution to the monetary policy of the Iraqi economy, as well as by studying the behavior of the demand for money function in Iraq and analyzing the determinants of the demand for money for the period 1991-2013 and the impact of these determinants on the demand for money in Iraq.
The problem that we face is how to estimate the total demand for money in …
In this paper, the transfer function model in time series was estimated using different methods: a parametric method, represented by the conditional likelihood function, as well as nonparametric methods, namely local linear regression and the cubic smoothing spline method. This research aims to compare these approaches for the nonlinear transfer function model using simulation, studying two models as the output variable and one model as the input variable, in addition t…
Markov chains are an application of stochastic models in operations research, supporting the analysis and optimization of processes with random events and transitions. The method deployed to obtain the transient solution of a Markov chain problem is an important part of this process. The present paper introduces a novel ordinary differential equation (ODE) approach to solve the Markov chain problem. The probability distribution of a continuous-time Markov chain with an infinitesimal generator at a given time is considered, which is the solution of the Chapman-Kolmogorov differential equation. This study presents a one-step second-derivative method with better accuracy for solving the first-order initial value problem …
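The transient-solution problem described above can be made concrete. For a continuous-time Markov chain with infinitesimal generator Q and initial distribution p0, the Chapman-Kolmogorov (forward) equation is dp/dt = pQ, whose solution is p(t) = p0·exp(Qt). The sketch below uses a generic ODE integrator rather than the paper's one-step second-derivative method, and compares it with the matrix-exponential solution; the 3-state generator is a made-up example.

```python
import numpy as np
from scipy.linalg import expm
from scipy.integrate import solve_ivp

# Hypothetical 3-state infinitesimal generator (each row sums to zero).
Q = np.array([[-2.0,  2.0,  0.0],
              [ 1.0, -3.0,  2.0],
              [ 0.0,  4.0, -4.0]])
p0 = np.array([1.0, 0.0, 0.0])  # start in state 0 with probability 1
t = 0.5

# Closed-form transient distribution via the matrix exponential.
p_exact = p0 @ expm(Q * t)

# Numerical solution of the Chapman-Kolmogorov ODE dp/dt = p Q.
sol = solve_ivp(lambda s, p: p @ Q, (0.0, t), p0, rtol=1e-10, atol=1e-12)
p_num = sol.y[:, -1]
```

The paper's contribution lies in replacing the generic integrator here with a more accurate one-step second-derivative scheme for this initial value problem.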
Abstract
This research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the downhill simplex algorithm. Two data cases were considered: the first assumed the original data (non-polluted), while the second assumed data contamination. Simulation experiments were conducted for different sample sizes, different initial parameter values, and different levels of contamination. The downhill simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution in cases of both natural and contaminated data.
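The downhill simplex (Nelder-Mead) algorithm maximizes a likelihood without derivatives, which is why it pairs naturally with complicated compound distributions. The sketch below illustrates the idea on a plain two-parameter Weibull likelihood rather than the four-parameter exponential Weibull-Poisson compound used in the paper; the data, starting point, and parameter values are all illustrative.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative data: Weibull with shape k = 2 and scale lam = 1.5.
rng = np.random.default_rng(0)
data = rng.weibull(2.0, 500) * 1.5

def neg_log_lik(params):
    """Negative Weibull log-likelihood; +inf outside the valid region
    so the simplex is pushed back toward positive parameters."""
    k, lam = params
    if k <= 0 or lam <= 0:
        return np.inf
    z = data / lam
    return -np.sum(np.log(k / lam) + (k - 1) * np.log(z) - z**k)

# Downhill simplex (Nelder-Mead): derivative-free minimization.
res = minimize(neg_log_lik, x0=[1.0, 1.0], method="Nelder-Mead")
k_hat, lam_hat = res.x
```

For the compound distribution, only `neg_log_lik` changes; the simplex machinery is identical, which is what makes the method robust when the likelihood is awkward to differentiate.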
This paper proposes a new encryption method. It combines two cipher algorithms, DES and AES, to generate hybrid keys. This combination strengthens the proposed W-method by generating highly randomized keys. Two points represent the reliability of any encryption technique. The first is key generation: our approach merges 64 bits of DES with 64 bits of AES to produce 128 bits as a root key for the remaining 15 keys. This complexity increases the level of the ciphering process. Moreover, it shifts the operation only one bit to the right. The second is the nature of the encryption process: it includes two keys and mixes one round of DES with one round of AES to reduce the running time. The W-method deals with …
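The key-schedule idea sketched in the abstract (merge two 64-bit halves into a 128-bit root key, then derive each subsequent key by a one-bit right shift) can be illustrated as follows. This is our reading of the description, not the paper's actual W-method implementation; the two 64-bit constants are placeholders, and we interpret the one-bit right shift as a rotation so no key material is lost.

```python
# Hypothetical 64-bit halves standing in for material taken from DES and AES.
des_half = 0x0123456789ABCDEF
aes_half = 0xFEDCBA9876543210

# 128-bit root key: DES half in the high 64 bits, AES half in the low 64 bits.
root = (des_half << 64) | aes_half

def rotate_right_128(k, n=1):
    """Rotate a 128-bit integer right by n bits (interpretation of the
    abstract's 'shift one bit to the right')."""
    mask = (1 << 128) - 1
    return ((k >> n) | (k << (128 - n))) & mask

# Root key plus 15 derived round keys, each one bit to the right of the last.
keys = [root]
for _ in range(15):
    keys.append(rotate_right_128(keys[-1]))
```

Each of the 16 keys would then feed the mixed DES/AES round structure the abstract describes.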