Entropy, defined as a measure of uncertainty, is transformed using the cumulative distribution function and the reliability function of the Burr Type-XII distribution. For data that suffer from volatility, a probability distribution is built for each failure in a sample once the limiting conditions of a probability function are satisfied. A formula is derived for the probability distribution obtained by applying the new entropy transform to the continuous Burr Type-XII distribution; the new function is tested and found to satisfy the conditions of a probability distribution, and its mean and cumulative distribution function are derived so that data can be generated for simulation experiments. The parameters of the probability distribution extracted from the failure-distribution formula are then estimated by the maximum likelihood method, White's method and a mixed estimation method, and the estimators are compared using the mean squared error (MSE) criterion in simulation experiments with different sample sizes and different values of the scale and shape parameters. The results reveal that the mixed estimator is the best for the shape parameter, while the White estimator is the best for the scale parameter.
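A minimal simulation sketch of the MSE comparison described above, assuming the standard two-parameter Burr Type-XII with shape parameters c and k (scipy's burr12) and using only the maximum-likelihood estimator; the White and mixed estimators from the paper are not reproduced here, and all parameter values and sample sizes are illustrative assumptions.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
true_c, true_k = 2.0, 3.0              # assumed true shape parameters
replications = 200

for n in (25, 50, 100):                # assumed sample sizes
    est = np.empty((replications, 2))
    for r in range(replications):
        sample = stats.burr12.rvs(true_c, true_k, size=n, random_state=rng)
        # fix location at 0 and scale at 1 so only the two shapes are estimated
        c_hat, k_hat, _, _ = stats.burr12.fit(sample, floc=0, fscale=1)
        est[r] = (c_hat, k_hat)
    mse = ((est - np.array([true_c, true_k])) ** 2).mean(axis=0)
    print(f"n={n:4d}  MSE(c_hat)={mse[0]:.4f}  MSE(k_hat)={mse[1]:.4f}")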
This paper deals with defining the Burr-XII distribution and how to obtain its p.d.f. and CDF, since this distribution is a failure distribution compounded from two failure models, the Gamma model and the Weibull model. Some equipment may have many important parts whose probability distributions may be of different types, so the Burr distribution, through its different compound formulations, is found to be the best model to study, and its parameters are estimated in order to compute the mean time to failure. Burr-XII is considered here rather than other models because it is used to model a wide variety of phenomena, including crop prices, household income, option market price distributions, risk and travel time. It has two shape parameters.
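For reference, under the standard two-shape-parameter form (the symbols c and k below are an assumption, since the abstract is truncated before naming them), the Burr Type-XII density, distribution function and reliability function are:

f(x; c, k) = c k x^{c-1} (1 + x^{c})^{-(k+1)},  x > 0,
F(x; c, k) = 1 - (1 + x^{c})^{-k},
R(x; c, k) = 1 - F(x; c, k) = (1 + x^{c})^{-k}.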
The origin of this technique lies in the analysis of François Quesnay (1694-1774), the leader of the Physiocratic school, presented in his Tableau Économique. The method was developed by Karl Marx in his analysis of the relationships between the departments of production and the nature of these relations in his models. The modern form of this type of economic analysis is credited to the Russian-American economist Wassily Leontief. This analytical model is commonly used in developing economic plans in developing countries (1, p. 86). There are several types of input-output models, such as the static model, the dynamic model, regional models, and so on. However, this research is confined to the open static model, which has found wide areas of practical application.
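As a small illustration of the open static model mentioned above, the sketch below solves the Leontief balance (I - A)x = d for gross output x given a technical-coefficients matrix A and a final-demand vector d; the three-sector numbers are invented for the example, not drawn from the research.

import numpy as np

A = np.array([[0.10, 0.30, 0.15],      # inter-industry coefficients a_ij:
              [0.20, 0.05, 0.25],      # input from sector i needed per unit
              [0.05, 0.10, 0.10]])     # of output of sector j
d = np.array([100.0, 150.0, 80.0])     # final demand by sector

x = np.linalg.solve(np.eye(3) - A, d)  # gross output: (I - A) x = d
print("required gross output by sector:", np.round(x, 2))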
This paper deals with the estimation of the reliability function and one shape parameter of the two-parameter Burr-XII distribution when the other shape parameter is known (taking the values 0.5, 1, 1.5) and the initial value is 1, while different sample sizes (n = 10, 20, 30, 50) are used. The results depend on an empirical study through simulation experiments applied to compare four methods of estimation, as well as computing the reliability function. The mean squared error results indicate that the jackknife estimator is better than the other three estimators for all sample sizes and parameter values.
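A minimal sketch of a jackknife-corrected estimate of one Burr-XII shape parameter when the other is known, with the reliability function evaluated at the estimate. With the shape c fixed, the ML estimator of the remaining shape k has the closed form k_hat = n / sum(log(1 + x_i^c)); the parameter values, sample size and mission time below are illustrative assumptions, and this is not presented as the paper's exact estimator.

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
c_known, k_true, n = 1.5, 2.0, 30
x = stats.burr12.rvs(c_known, k_true, size=n, random_state=rng)

def k_mle(sample, c):
    # closed-form MLE of k when c is known
    return len(sample) / np.log1p(sample ** c).sum()

k_hat = k_mle(x, c_known)
# leave-one-out estimates and the standard jackknife bias correction
loo = np.array([k_mle(np.delete(x, i), c_known) for i in range(n)])
k_jack = n * k_hat - (n - 1) * loo.mean()

t = 1.0                                   # mission time (illustrative)
print("ML estimate:", k_hat, "jackknife estimate:", k_jack)
print("estimated reliability R(t):", (1 + t ** c_known) ** (-k_jack))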
The theory of probabilistic programming may be conceived in several different ways. As a method of programming, it analyses the implications of probabilistic variations in the parameter space of a linear or nonlinear programming model. The generating mechanism of such probabilistic variations in economic models may be incomplete information about changes in demand, production and technology; specification errors in the econometric relations presumed for different economic agents; uncertainty of various sorts; and the consequences of imperfect aggregation or disaggregation of economic variables. In this research we discuss the probabilistic programming problem when the coefficient b_i is a random variable.
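One common way to handle a random right-hand-side coefficient b_i is the chance-constrained formulation; assuming b_i is normally distributed with known mean and variance and a required satisfaction probability alpha, each probabilistic constraint P(a_i'x <= b_i) >= alpha has the deterministic equivalent a_i'x <= mu_i + sigma_i * Phi^{-1}(1 - alpha). The sketch below uses invented numbers and illustrates the general technique, not the paper's specific model.

import numpy as np
from scipy.optimize import linprog
from scipy.stats import norm

alpha = 0.95
c = np.array([-3.0, -5.0])                 # maximise 3x1 + 5x2 (linprog minimises)
A = np.array([[1.0, 0.0],
              [0.0, 2.0],
              [3.0, 2.0]])
mu = np.array([4.0, 12.0, 18.0])           # means of the random b_i
sigma = np.array([0.5, 1.0, 1.5])          # standard deviations of the b_i

b_det = mu + sigma * norm.ppf(1 - alpha)   # deterministic-equivalent right-hand side
res = linprog(c, A_ub=A, b_ub=b_det, bounds=[(0, None), (0, None)])
print("optimal x:", res.x, "objective:", -res.fun)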
Inventory, or inventories, are stocks of goods held for future use or sale. The demand for a product is the number of units that will need to be removed from inventory for use or sale during a specific period. If the demand for future periods can be predicted with considerable precision, it is reasonable to use an inventory rule that assumes all predictions will always be completely accurate. This is the case in which we say that demand is deterministic.
The timing of an order can be periodic (placing an order every fixed number of days) or perpetual (placing an order whenever the inventory declines to a given reorder level).
In this research we discuss how to formulate inventory models.
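As one concrete illustration of formulating an inventory problem with deterministic demand, the sketch below computes the classic economic-order-quantity (EOQ) solution; the EOQ model and all cost and demand figures are assumptions for the example rather than results from this research.

import math

D = 1200.0   # annual demand (units/year), assumed
K = 50.0     # fixed cost per order, assumed
h = 2.5      # holding cost per unit per year, assumed

q_star = math.sqrt(2 * K * D / h)          # optimal order quantity Q* = sqrt(2KD/h)
cycle_days = 365 * q_star / D              # time between orders (periodic view)
total_cost = K * D / q_star + h * q_star / 2
print(f"Q* = {q_star:.1f} units, order every {cycle_days:.1f} days, "
      f"annual ordering+holding cost = {total_cost:.2f}")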
Abstract
In this research we estimate the survival function for data that suffer from the disturbance and confusion of the Iraq Household Socio-Economic Survey (IHSES II 2012), for five-year age groups following the Generalized Gamma (GG) distribution. Two methods were used for estimation and fitting, namely the Principle of Maximum Entropy (POME) and a bootstrap method with nonparametric kernel smoothing, to overcome the mathematical problems plaguing the integrals contained in this distribution, in particular the integral of the incomplete gamma function, along with the traditional Maximum Likelihood (ML) method. The methods were then compared on the basis of the
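A minimal sketch of the traditional ML route mentioned above: fitting a Generalized Gamma distribution and evaluating the survival function with scipy's gengamma, which handles the incomplete-gamma integral internally; the simulated data, parameter values and age points are illustrative assumptions, and the POME and bootstrap-kernel steps are not reproduced here.

import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# simulated stand-in for duration/age-group data (assumed, not the survey data)
data = stats.gengamma.rvs(a=2.0, c=1.5, scale=10.0, size=200, random_state=rng)

# maximum-likelihood fit with location fixed at zero (durations start at 0)
a_hat, c_hat, _, scale_hat = stats.gengamma.fit(data, floc=0)

ages = np.array([5, 15, 25, 35, 45, 55])   # illustrative five-year group midpoints
surv = stats.gengamma.sf(ages, a_hat, c_hat, loc=0, scale=scale_hat)
for t, s in zip(ages, surv):
    print(f"S({t:2d}) = {s:.3f}")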
In order to achieve overall balance in the economy, equilibrium must be achieved in the different markets at the same time (the commodity market, the money market, the labour market, the balance of payments and the public budget). No model has yet been provided from which overall equilibrium in the economy can be determined, given the difficulty of finding the interrelationships between all these markets and putting them into an applied form that allows equilibrium in all markets to be identified at once.
One of the best models that have dealt with this subject is the (LM-BP-IS) model, which studies equilibrium in the commodity market, the money market and the balance of payments. Given the importance of this issue, this research tries to shed light on the reality
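As a toy illustration of the kind of simultaneous-equilibrium calculation such a framework involves, the sketch below solves a small linear IS-LM system for income and the interest rate and then evaluates the balance-of-payments position at that point; all behavioural coefficients are invented and the equations are a textbook simplification, not the model used in the research.

import numpy as np

# behavioural parameters (assumed)
c0, c1, T = 40.0, 0.8, 50.0      # consumption: C = c0 + c1*(Y - T)
i0, i1    = 60.0, 8.0            # investment:  I = i0 - i1*r
G         = 100.0                # government spending
nx0, m    = 30.0, 0.1            # net exports: NX = nx0 - m*Y
Ms_P, k, h = 150.0, 0.5, 10.0    # money market: Ms/P = k*Y - h*r
kf        = 20.0                 # capital inflow sensitivity to r

# IS: (1 - c1 + m)*Y + i1*r = c0 - c1*T + i0 + G + nx0
# LM: k*Y - h*r = Ms/P
A = np.array([[1 - c1 + m,  i1],
              [k,          -h]])
b = np.array([c0 - c1 * T + i0 + G + nx0, Ms_P])
Y, r = np.linalg.solve(A, b)

bp = (nx0 - m * Y) + kf * r      # current account plus capital account
print(f"equilibrium income Y = {Y:.1f}, interest rate r = {r:.2f}, BP = {bp:.1f}")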
The internet, unlike other traditional means of communication, has a flexibility that stimulates the user and allows him to develop it. Perhaps the reason for the superiority of the internet over other traditional means of communication is the possibility of change and of moving from one stage to another in a short period. This means that the internet is able to move from use to the development of use, and then to the development of means and to innovation, since innovation on the internet is a logical product of the interaction of the user with the network. The internet invests in all proposals and ideas and does not ignore any of them, even the simple ones. This is represented in social networking sites, which in fact reflect personal emotions
Video steganography has become a popular option for protecting secret data from hacking attempts and common attacks on the internet. However, when whole video frames are used to embed secret data, this may lead to visual distortion. This work is an attempt to hide a sensitive secret image inside the moving objects in a video, based on separating the objects from the background of each frame and selecting and arranging them according to object size for embedding the secret image. The XOR technique with reversed bits is used between the secret image bits and the detected moving object bits for embedding. The proposed method provides more security and imperceptibility, since the moving objects are used for embedding, so it is difficult to notice the
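A minimal sketch of one reading of the XOR-with-reversed-bits embedding on an already-detected moving-object region (object detection and frame handling are omitted): the secret bits are reversed, XORed with the current least-significant bits of the object's pixels, and written back into those LSBs; extraction as written assumes access to the original object pixels, which may differ from the paper's exact scheme, and all data are toy values.

import numpy as np

rng = np.random.default_rng(3)
obj_pixels = rng.integers(0, 256, size=64, dtype=np.uint8)   # flattened object region
secret_bits = rng.integers(0, 2, size=64, dtype=np.uint8)    # secret image bits (toy)

def embed(pixels, bits):
    cover_lsb = pixels & 1
    payload = bits[::-1] ^ cover_lsb          # reversed secret bits XOR cover LSBs
    return (pixels & 0xFE) | payload          # overwrite the LSB plane

def extract(stego_pixels, original_pixels):
    cover_lsb = original_pixels & 1
    payload = stego_pixels & 1
    return (payload ^ cover_lsb)[::-1]        # undo the XOR, then un-reverse

stego = embed(obj_pixels, secret_bits)
assert np.array_equal(extract(stego, obj_pixels), secret_bits)
print("max pixel change:", np.abs(stego.astype(int) - obj_pixels.astype(int)).max())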