The estimation of the reliability function depends on the accuracy of the data used to estimate the parameters of the probability distribution. Because some data exhibit skewness, estimating the parameters and calculating the reliability function in the presence of such skewness requires a distribution that is flexible in dealing with that data. This is the case for the data of Diyala Company for Electrical Industries, where a positive skewness was observed in the data collected from the Power and Machinery Department, which called for a distribution that can handle such data and for methods that accommodate this problem and lead to accurate estimates of the reliability function. The research aims to use the method of moments to estimate the reliability function of the truncated skew-normal distribution, as this distribution is a parametric distribution characterized by flexibility in dealing with data that is normally distributed but shows some skewness over the values defined in the sample space. This means that a cut (truncation) is made on the left side of the skew-normal distribution, and a new distribution is derived from the original skew distribution that preserves the characteristics of the skew-normal distribution function. Real data representing the operating times of three machines until failure were also collected from the Capacity Department of Diyala Company for Electrical Industries. The results showed that the machines under study have a good reliability index and that the machines can be relied upon at a high rate if they continue to work under the same current working conditions.
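As a rough illustration of the last step, the sketch below (Python with SciPy) evaluates the reliability function of a left-truncated skew-normal distribution after matching its first three moments to a sample. The truncation point `A`, the simulated "operating times", and the starting values are all hypothetical, and the numerical moment matching shown here is only a stand-in for the paper's method-of-moments derivation.

```python
import numpy as np
from scipy import stats, integrate, optimize

A = 0.0  # hypothetical left-truncation point of the operating-time data

def trunc_sf(t, shape, loc, scale):
    """Reliability R(t) = P(T > t | T > A) of the left-truncated skew normal."""
    return (stats.skewnorm.sf(t, shape, loc=loc, scale=scale)
            / stats.skewnorm.sf(A, shape, loc=loc, scale=scale))

def trunc_moment(k, shape, loc, scale, upper):
    """k-th raw moment of the truncated skew normal, by numerical integration."""
    tail = stats.skewnorm.sf(A, shape, loc=loc, scale=scale)
    integrand = lambda x: x**k * stats.skewnorm.pdf(x, shape, loc=loc, scale=scale) / tail
    value, _ = integrate.quad(integrand, A, upper)
    return value

# hypothetical failure-time sample standing in for the machine operating times
rng = np.random.default_rng(1)
data = stats.skewnorm.rvs(4.0, loc=100.0, scale=30.0, size=200, random_state=rng)
data = data[data >= A]
upper = 5.0 * data.max()                       # generous finite integration bound
m = [np.mean(data**k) for k in (1, 2, 3)]      # first three sample moments

def moment_equations(theta):
    shape, loc, log_scale = theta
    scale = np.exp(log_scale)                  # keep the scale positive
    return [trunc_moment(k, shape, loc, scale, upper) - m[k - 1] for k in (1, 2, 3)]

start = [1.0, data.mean(), np.log(data.std())]
sol = optimize.fsolve(moment_equations, x0=start)
shape_hat, loc_hat, scale_hat = sol[0], sol[1], np.exp(sol[2])

# estimated reliability at a few operating times
print(trunc_sf(np.array([80.0, 120.0, 160.0]), shape_hat, loc_hat, scale_hat))
```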
In this paper an estimator of the reliability function for the Pareto distribution of the first kind has been derived, and a simulation study using the Monte Carlo method was then carried out to compare the Bayes estimator of the reliability function with the maximum likelihood estimator of this function. It has been found that the Bayes estimator was better than the maximum likelihood estimator for all sample sizes using the integral mean square error (IMSE).
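For context, a minimal Monte Carlo sketch of this kind of comparison is given below (Python/NumPy), assuming the Pareto scale x_m is known and a Gamma(a0, b0) prior on the shape parameter; the prior, the true parameter values, and the grid over which the IMSE integral is approximated are illustrative choices, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical settings: known scale x_m, true shape alpha, Gamma(a0, b0) prior
x_m, alpha_true = 1.0, 2.0
a0, b0 = 1.0, 1.0                       # Gamma(shape a0, rate b0) prior on alpha (assumed)
n, reps = 30, 1000
t_grid = np.linspace(x_m, 5.0, 100)
dt = t_grid[1] - t_grid[0]
R_true = (x_m / t_grid) ** alpha_true   # true reliability R(t) = (x_m / t)^alpha

def r_mle(sample, t):
    alpha_hat = len(sample) / np.sum(np.log(sample / x_m))
    return (x_m / t) ** alpha_hat

def r_bayes(sample, t):
    # posterior of alpha is Gamma(n + a0, rate b0 + T); the posterior mean of
    # R(t) = exp(-alpha * ln(t / x_m)) follows from the gamma moment generating function
    T = np.sum(np.log(sample / x_m))
    return (1.0 + np.log(t / x_m) / (b0 + T)) ** (-(len(sample) + a0))

imse_mle = imse_bayes = 0.0
for _ in range(reps):
    u = rng.random(n)
    sample = x_m * (1.0 - u) ** (-1.0 / alpha_true)      # inverse-CDF sampling
    imse_mle   += np.sum((r_mle(sample, t_grid)   - R_true) ** 2) * dt
    imse_bayes += np.sum((r_bayes(sample, t_grid) - R_true) ** 2) * dt

print("IMSE (MLE):  ", imse_mle / reps)
print("IMSE (Bayes):", imse_bayes / reps)
```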
In this research, we dealt with the study of the non-homogeneous Poisson process, which is one of the most important statistical topics with a role in scientific development, as it is related to events that occur in reality and are modeled as Poisson processes, because the occurrence of such an event is related to time, whether time changes or remains stable. Our research clarifies the non-homogeneous Poisson process and uses one of the models of these processes, an exponentiated Weibull model containing three parameters (α, β, σ), as a function to estimate the time rate of occurrence of earthquakes in Erbil Governorate, as the governorate is adjacent to two countries.
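The exponentiated-Weibull ingredients can be written down directly. The sketch below (with hypothetical parameter values) evaluates the exponentiated-Weibull hazard, one natural candidate for the time-varying rate λ(t) of a non-homogeneous Poisson process, though the paper's exact intensity model may differ.

```python
import numpy as np

def ew_cdf(t, alpha, beta, sigma):
    """Exponentiated-Weibull CDF: F(t) = [1 - exp(-(t/sigma)^beta)]^alpha."""
    return (1.0 - np.exp(-(t / sigma) ** beta)) ** alpha

def ew_pdf(t, alpha, beta, sigma):
    """Exponentiated-Weibull density obtained by differentiating the CDF."""
    z = (t / sigma) ** beta
    return (alpha * beta / sigma) * (t / sigma) ** (beta - 1) * np.exp(-z) \
           * (1.0 - np.exp(-z)) ** (alpha - 1)

def ew_hazard(t, alpha, beta, sigma):
    """Hazard (failure rate) h(t) = f(t) / (1 - F(t))."""
    return ew_pdf(t, alpha, beta, sigma) / (1.0 - ew_cdf(t, alpha, beta, sigma))

# hypothetical parameter values for a time-varying occurrence rate
alpha, beta, sigma = 1.5, 0.8, 10.0
times = np.linspace(0.1, 50.0, 6)
print(ew_hazard(times, alpha, beta, sigma))
```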
A condensed study was conducted to compare the ordinary estimators, in particular the maximum likelihood estimator and the robust estimator, for estimating the parameters of the mixed model of order one, namely the ARMA(1,1) model.
A simulation study was carried out for varieties of the model using small, moderate, and large sample sizes, where some new results were obtained. The mean absolute percentage error (MAPE) was used as the statistical criterion for comparison.
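A minimal sketch of the maximum likelihood side of such a comparison, using statsmodels' ARIMA with order (1, 0, 1) and MAPE as the criterion, is shown below; the simulated series and the zero-threshold guard are illustrative, and the robust estimator is not reproduced here.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(42)

# simulate a hypothetical ARMA(1,1) series: y_t = phi*y_{t-1} + e_t + theta*e_{t-1}
phi, theta, n = 0.6, 0.3, 200
e = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + e[t] + theta * e[t - 1]

# maximum likelihood fit of ARMA(1,1) (ARIMA with d = 0)
res = ARIMA(y, order=(1, 0, 1)).fit()
fitted = res.fittedvalues

# MAPE as the comparison criterion (values too close to zero are skipped to
# avoid division blow-ups, a practical detail not spelled out in the abstract)
mask = np.abs(y) > 1e-6
mape = 100.0 * np.mean(np.abs((y[mask] - fitted[mask]) / y[mask]))
print(res.params)
print("MAPE:", mape)
```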
Survival analysis is the analysis of data that take the form of times from a time origin until the occurrence of an end event. In medical research, the time origin is the date of registration of the individual or patient in a study, such as a clinical trial comparing two or more types of medicine, and the endpoint is the death of the patient or the disappearance of the individual. The data resulting from this process are called survival times. If the end event is not death, the resulting data are called time-to-event data. That is, survival analysis is one of the statistical procedures for analyzing data when the variable of interest is the time to an event.
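As a small illustration of working with censored time-to-event data, the sketch below computes a Kaplan-Meier survival estimate by hand on hypothetical follow-up times; it is a generic example, not a method taken from the abstract.

```python
import numpy as np

def kaplan_meier(times, event):
    """Kaplan-Meier survival estimate.
    times : observed follow-up times; event : 1 = event occurred, 0 = censored."""
    times = np.asarray(times, dtype=float)
    event = np.asarray(event, dtype=int)
    surv, out = 1.0, []
    for t in np.unique(times):
        d = np.sum((times == t) & (event == 1))   # events at time t
        n = np.sum(times >= t)                    # number still at risk at time t
        if d > 0:
            surv *= 1.0 - d / n
            out.append((t, surv))
    return out

# hypothetical survival times (months); 0 marks a censored observation
t  = [6, 7, 10, 15, 19, 25, 25, 30]
ev = [1, 0,  1,  1,  0,  1,  1,  0]
for time, s in kaplan_meier(t, ev):
    print(f"S({time:g}) = {s:.3f}")
```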
In real situations, all observations and measurements are not exact numbers but are more or less non-exact, also called fuzzy. So, in this paper, we use approximate non-Bayesian computational methods to estimate the inverse Weibull parameters and reliability function with fuzzy data. The maximum likelihood and moment estimates are obtained as non-Bayesian estimates. The maximum likelihood estimators have been derived numerically based on two iterative techniques, namely the Newton-Raphson and the Expectation-Maximization techniques. In addition, the estimates of the parameters and reliability function are compared numerically through a Monte Carlo simulation study.
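A simplified sketch of the Newton-Raphson route is given below for crisp (non-fuzzy) data: the scale-type parameter λ is profiled out in closed form and the shape β is found by Newton-Raphson on the profile score with a numerically approximated derivative. The fuzzy-data weighting and the EM variant described in the paper are not reproduced, and the simulated data and starting values are illustrative.

```python
import numpy as np

def profile_score(beta, x):
    """d/d(beta) of the profile log-likelihood of the inverse Weibull
    F(t) = exp(-lam * t**(-beta)), with lam profiled out as n / sum(x**(-beta))."""
    xb = x ** (-beta)
    return len(x) / beta - np.sum(np.log(x)) \
           + len(x) * np.sum(xb * np.log(x)) / np.sum(xb)

def fit_inverse_weibull(x, beta0=1.0, tol=1e-8, max_iter=100):
    """Newton-Raphson for the shape beta (score derivative taken numerically),
    then the closed-form MLE for the scale-type parameter lam."""
    beta, h = beta0, 1e-5
    for _ in range(max_iter):
        g = profile_score(beta, x)
        dg = (profile_score(beta + h, x) - profile_score(beta - h, x)) / (2 * h)
        step = g / dg
        beta -= step
        if abs(step) < tol:
            break
    lam = len(x) / np.sum(x ** (-beta))
    return beta, lam

def reliability(t, beta, lam):
    """R(t) = 1 - F(t) = 1 - exp(-lam * t**(-beta))."""
    return 1.0 - np.exp(-lam * t ** (-beta))

# hypothetical crisp sample drawn from an inverse Weibull by inverse-CDF sampling
rng = np.random.default_rng(7)
true_beta, true_lam = 2.0, 4.0
x = (true_lam / -np.log(rng.random(200))) ** (1.0 / true_beta)

beta_hat, lam_hat = fit_inverse_weibull(x)
print(beta_hat, lam_hat, reliability(2.0, beta_hat, lam_hat))
```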
This paper discusses the reliability R of the (2+1) cascade model of the inverse Weibull distribution. Reliability is found when strength and stress are inverse Weibull random variables with an unknown scale parameter and a known shape parameter. Six estimation methods (maximum likelihood, moments, least squares, weighted least squares, regression, and percentiles) are used to estimate the reliability. The six estimation methods are compared in a simulation study carried out in MATLAB 2016, using two statistical criteria, the mean square error and the mean absolute percentage error, where it is found that the best of the six estimators is the maximum likelihood estimation method.
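The cascade reliability expression itself is not reproduced here; the sketch below only illustrates the comparison machinery (Monte Carlo replication, MSE, MAPE) for two of the six methods, applied to the inverse Weibull scale-type parameter with a known shape, all with hypothetical settings.

```python
import numpy as np
from math import gamma

rng = np.random.default_rng(3)
beta = 3.0            # known shape parameter (assumed)
lam_true = 2.0        # scale-type parameter to be estimated
n, reps = 50, 2000

def sample_inv_weibull(lam, size):
    # inverse-CDF sampling from F(x) = exp(-lam * x**(-beta))
    u = rng.random(size)
    return (lam / -np.log(u)) ** (1.0 / beta)

mle, mom = np.empty(reps), np.empty(reps)
for i in range(reps):
    x = sample_inv_weibull(lam_true, n)
    mle[i] = n / np.sum(x ** (-beta))                        # maximum likelihood
    mom[i] = (np.mean(x) / gamma(1.0 - 1.0 / beta)) ** beta  # method of moments (needs beta > 1)

for name, est in (("MLE", mle), ("Moments", mom)):
    mse  = np.mean((est - lam_true) ** 2)
    mape = 100.0 * np.mean(np.abs(est - lam_true) / lam_true)
    print(f"{name:8s} MSE = {mse:.4f}  MAPE = {mape:.2f}%")
```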
In this paper, the maximum likelihood estimates for the parameter of the two-parameter Weibull distribution are studied, as well as the White estimators and the Bain and Antle estimators, together with the Bayes estimator for the scale parameter. Simulation procedures are used to find the estimators and to compare them using the MSE. The methods are also applied to data for 20 patients suffering from headache disease.
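A minimal sketch of the maximum likelihood piece is shown below, using SciPy's generic numerical fit for the two-parameter Weibull and MSE over repeated samples; the White, Bain and Antle, and Bayes estimators are not reproduced, and all settings are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
true_shape, true_scale, n, reps = 1.5, 10.0, 20, 500

shape_hats = np.empty(reps)
for i in range(reps):
    data = stats.weibull_min.rvs(true_shape, scale=true_scale, size=n, random_state=rng)
    # generic numerical MLE from SciPy, with the location parameter fixed at 0
    shape_hats[i], _, _ = stats.weibull_min.fit(data, floc=0)

print("MSE of the shape MLE:", np.mean((shape_hats - true_shape) ** 2))
```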
Artificial intelligence algorithms have been used in recent years in many scientific fields. We suggest employing the Tabu search algorithm to find the best estimate of the semi-parametric regression function with measurement errors in the explanatory variables and the dependent variable, since measurements in fields such as sport, chemistry, the biological sciences, medicine, and epidemiological studies are frequently subject to measurement error rather than being exact.
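As a toy illustration of the Tabu idea, the sketch below runs a tabu search over a grid of kernel bandwidths for a Nadaraya-Watson smoother, using leave-one-out error as the objective; the data, neighbourhood, and tabu-list length are hypothetical, and the paper's handling of measurement errors is not attempted.

```python
import numpy as np

rng = np.random.default_rng(5)

# hypothetical data for the nonparametric part of a semiparametric fit:
# y = m(x) + noise, with m unknown; a kernel bandwidth is picked by tabu search
x = np.sort(rng.uniform(0, 10, 120))
y = np.sin(x) + rng.normal(scale=0.3, size=x.size)

def nw_loocv_error(h):
    """Leave-one-out error of a Nadaraya-Watson (Gaussian kernel) smoother."""
    err = 0.0
    for i in range(x.size):
        w = np.exp(-0.5 * ((x[i] - x) / h) ** 2)
        w[i] = 0.0                               # leave observation i out
        err += (y[i] - np.dot(w, y) / w.sum()) ** 2
    return err / x.size

# tabu search over a discrete grid of candidate bandwidths
candidates = np.round(np.arange(0.05, 2.05, 0.05), 2)
current = best = len(candidates) // 2            # start in the middle of the grid
best_err = nw_loocv_error(candidates[best])
tabu, tabu_len = [], 5

for _ in range(40):
    # neighbourhood: move one step left or right on the grid, skipping tabu moves
    neighbours = [j for j in (current - 1, current + 1)
                  if 0 <= j < len(candidates) and j not in tabu]
    if not neighbours:
        break
    errs = [nw_loocv_error(candidates[j]) for j in neighbours]
    current = neighbours[int(np.argmin(errs))]
    tabu.append(current)
    tabu = tabu[-tabu_len:]                      # fixed-length tabu list
    if min(errs) < best_err:
        best, best_err = current, min(errs)

print("chosen bandwidth:", candidates[best], "LOO error:", best_err)
```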
Iraq is facing water shortage problems due to various factors: globally (global warming), regionally (the GAP project), and locally (improper water resources management projects). In this research, the influence of global warming on the annual mean temperature, and in turn on the annual mean evapotranspiration, over more than three decades has been studied. The climate of Iraq is influenced by its location between the subtropical aridity of the Arabian desert areas and the subtropical humidity of the Arabian Gulf. The relative rise in temperatures in recent decades was the main factor in the decrease in relative humidity, which increased evapotranspiration values; hence a temperature-based method was utilized.
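The abstract does not name the temperature-based method; one widely used choice is Thornthwaite's formula, sketched below without the day-length correction factor and with hypothetical mean monthly temperatures.

```python
import numpy as np

def thornthwaite_pet(monthly_temp_c):
    """Monthly potential evapotranspiration (mm) by Thornthwaite's temperature-based
    formula, omitting the day-length/month-length correction factor."""
    t = np.clip(np.asarray(monthly_temp_c, dtype=float), 0.0, None)
    I = np.sum((t / 5.0) ** 1.514)                                   # annual heat index
    a = 6.75e-7 * I**3 - 7.71e-5 * I**2 + 1.792e-2 * I + 0.49239
    return 16.0 * (10.0 * t / I) ** a

# hypothetical mean monthly temperatures (deg C) for a station in Iraq
temps = [10, 12, 17, 23, 30, 35, 38, 37, 33, 26, 17, 11]
print(np.round(thornthwaite_pet(temps), 1))
```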
In this research we estimated the survival function for data, affected by the disturbances and turmoil in Iraq, from the Iraq Household Socio-Economic Survey (IHSES II 2012), for five-year age groups that follow the generalized gamma (GG) distribution. Two methods were used for estimation and fitting, namely the principle of maximum entropy (POME) and the bootstrap method with a nonparametric kernel smoothing function, to overcome the mathematical problems posed by the integrals contained in this distribution, in particular the incomplete gamma function, along with the use of the traditional maximum likelihood (ML) method. A comparison between the methods was then made.
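For the classical side of the comparison, the sketch below fits the generalized gamma by numerical maximum likelihood with SciPy and evaluates the survival function, which SciPy computes through the regularized incomplete gamma function; the POME and kernel-bootstrap routes are not reproduced, and the simulated data are illustrative.

```python
import numpy as np
from scipy import stats

# hypothetical survival times standing in for the five-year age-group data
rng = np.random.default_rng(2)
t = stats.gengamma.rvs(a=2.0, c=1.5, scale=3.0, size=300, random_state=rng)

# classical maximum likelihood fit of the generalized gamma (location fixed at 0)
a_hat, c_hat, _, scale_hat = stats.gengamma.fit(t, floc=0)

# survival function S(t), evaluated via the regularized incomplete gamma function
grid = np.array([1.0, 3.0, 5.0, 10.0])
print("S(t):", stats.gengamma.sf(grid, a_hat, c_hat, scale=scale_hat))
```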