The estimation of the reliability function depends on the accuracy of the data used to estimate the parameters of the probability distribution. Because some data exhibit skewness, estimating the parameters and calculating the reliability function in the presence of skew requires a distribution flexible enough to handle such data. In the data of Diyala Company for Electrical Industries, a positive skew was observed in the data collected from the Power and Machinery Department, which called for a distribution suited to these data and for methods that accommodate this problem and lead to accurate estimates of the reliability function. The research aims to use the method of moments to estimate the reliability function of the truncated skew-normal distribution, a parameterized distribution characterized by flexibility in dealing with data that are normally distributed but show some skewness. A cut (truncation) is made from the left side of the skew-normal distribution over the values defined in the sample space, and a new distribution is derived from the original skew distribution that preserves the characteristics of the skew-normal distribution function. Real data representing the operating times of three machines until failure were collected from the Capacity Department of Diyala Company for Electrical Industries. The results showed that the machines under study have a good reliability index and can be relied upon at a high rate if they continue to work under the same current working conditions.
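The left-truncated skew-normal setup described above can be sketched numerically. The following is a minimal illustration, not the paper's implementation: it assumes the standard Azzalini skew-normal density 2φ(x)Φ(αx), truncates it from the left at a point a, and evaluates the reliability function R(t) = P(X > t) by simple numerical integration. The shape α and cut point a are placeholders.

```python
import math

def phi(x):
    """Standard normal pdf."""
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def Phi(x):
    """Standard normal cdf via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def skew_normal_pdf(x, alpha):
    """Azzalini skew-normal density f(x) = 2*phi(x)*Phi(alpha*x)."""
    return 2.0 * phi(x) * Phi(alpha * x)

def tail_prob(a, alpha, n=2000):
    """P(X >= a) for the untruncated skew-normal, midpoint rule on [a, a+12]."""
    h = 12.0 / n
    return sum(skew_normal_pdf(a + (i + 0.5) * h, alpha) for i in range(n)) * h

def truncated_pdf(x, alpha, a):
    """Density of the skew-normal left-truncated at a (zero below a)."""
    return skew_normal_pdf(x, alpha) / tail_prob(a, alpha) if x >= a else 0.0

def reliability(t, alpha, a):
    """R(t) = P(X > t) under the left-truncated skew-normal."""
    return tail_prob(max(t, a), alpha) / tail_prob(a, alpha)
```

Truncating at a simply renormalizes the density by P(X >= a), so the reliability function is a ratio of two tail probabilities of the original skew-normal.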
Each phenomenon involves several variables. When studying these variables, we seek a mathematical formula for the joint distribution, and the copula is a useful and effective tool for measuring the amount of correlation. The survival function was used to measure the relationship of age with the level of creatinine remaining in a person's blood. The SPSS program was used to extract the influential variables from a group of variables using factor analysis, and then the Clayton copula function, which constructs joint bivariate distributions from the marginal distributions, was applied: the bivariate distribution was calculated, and then the survival function value was computed for a sample of size 50 drawn from Yarmouk Ho…
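The Clayton copula step can be illustrated with a small sketch. The marginal survival functions below are hypothetical exponentials standing in for the study's age and creatinine variables; only the copula formula itself, C(u, v) = (u^(-θ) + v^(-θ) − 1)^(-1/θ) for θ > 0, is standard.

```python
import math

def clayton_copula(u, v, theta):
    """Clayton copula C(u, v) = (u^-theta + v^-theta - 1)^(-1/theta), theta > 0."""
    return max(u ** -theta + v ** -theta - 1.0, 0.0) ** (-1.0 / theta)

# Hypothetical exponential marginal survival functions standing in for
# the age and blood-creatinine variables; the rates are illustrative only.
def surv_age(x):
    return math.exp(-0.02 * x)

def surv_creatinine(y):
    return math.exp(-0.8 * y)

def joint_survival(x, y, theta):
    """Joint survival obtained by coupling the two marginal survival
    functions through the Clayton copula."""
    return clayton_copula(surv_age(x), surv_creatinine(y), theta)
```

Larger θ induces stronger (lower-tail) dependence between the two variables; θ → 0 recovers independence.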
The wavelet shrinkage estimator is an attractive technique for estimating nonparametric regression functions, but it is very sensitive to correlation in the errors. In this research, a low-degree polynomial model was used to address the boundary problem in wavelet shrinkage, in addition to using flexible (level-dependent) threshold values in the case of correlated errors, which treat the coefficients at each level separately, unlike global threshold values that treat all levels simultaneously, such as the VisuShrink, False Discovery Rate, Improvement Thresholding, and SureShrink methods. The study was conducted on real monthly data representing the rates of theft crimes…
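As a rough illustration of the difference between a global threshold and level-dependent thresholds, the sketch below applies the VisuShrink universal threshold λ = σ√(2 log n) with soft thresholding, first to one block of coefficients and then level by level with separate noise scales. The data and σ values are made up; this is not the paper's procedure.

```python
import math

def soft_threshold(w, lam):
    """Soft-threshold a single wavelet coefficient: shrink toward zero by lam."""
    return math.copysign(max(abs(w) - lam, 0.0), w)

def visushrink(coeffs, sigma):
    """Universal (VisuShrink) threshold lam = sigma * sqrt(2 log n),
    applied uniformly to a block of detail coefficients."""
    n = len(coeffs)
    lam = sigma * math.sqrt(2.0 * math.log(n))
    return [soft_threshold(w, lam) for w in coeffs]

def levelwise_shrink(levels, sigmas):
    """Level-dependent shrinkage: each resolution level gets its own
    noise scale and hence its own threshold, which is the kind of
    flexible thresholding used when the errors are correlated."""
    return [visushrink(level, s) for level, s in zip(levels, sigmas)]
```

With correlated errors the noise level differs by resolution level, so estimating a separate σ per level (rather than one global σ) avoids over-smoothing the coarse levels.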
A reliability system of the multi-component stress-strength model R(s,k) is considered in the present paper, where the stress and strength are independent and non-identically distributed, following the Exponentiated Family Distribution (FED) with unknown shape parameter α, known scale parameter λ equal to two, and parameter θ equal to three. Different estimation methods of R(s,k) were introduced, corresponding to maximum likelihood and shrinkage estimators. Comparisons among the suggested estimators were made using a simulation study based on the mean squared error (MSE) criterion.
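A Monte Carlo sketch may make the model R(s,k) concrete: the system is reliable when at least s of its k strength components exceed the common stress. Since the exact form of the exponentiated family used in the paper is not reproduced here, the code substitutes the exponentiated exponential F(x) = (1 − e^(−λx))^α as an illustrative member of that family; all parameter values are placeholders.

```python
import math
import random

def rexp_exp(alpha, lam, rng):
    """Draw from the exponentiated exponential F(x) = (1 - e^{-lam*x})^alpha
    by inverting the cdf (a stand-in for the paper's exponentiated family)."""
    u = rng.random()
    return -math.log(1.0 - u ** (1.0 / alpha)) / lam

def mc_reliability(s, k, a_strength, a_stress, lam, n=20000, seed=1):
    """Monte Carlo estimate of R(s,k) = P(at least s of the k component
    strengths exceed the common stress)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        stress = rexp_exp(a_stress, lam, rng)
        exceed = sum(rexp_exp(a_strength, lam, rng) > stress for _ in range(k))
        if exceed >= s:
            hits += 1
    return hits / n
```

A sanity check: when stress and strengths are identically distributed, R(1,3) = 1 − E[F(Y)^3] = 3/4, which the simulation should approximate.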
The accuracy of the Moment Method for imposing no-slip boundary conditions in the lattice Boltzmann algorithm is investigated numerically using lid-driven cavity flow. Boundary conditions are imposed directly upon the hydrodynamic moments of the lattice Boltzmann equations, rather than the distribution functions, to ensure the constraints are satisfied precisely at grid points. Both single and multiple relaxation time models are applied. The results are in excellent agreement with data obtained from state-of-the-art numerical methods and are shown to converge with second order accuracy in grid spacing.
Abstract:
Maintenance and replacement is one of the operations research techniques concerned with the failures experienced by many production lines, which consist of sets of machines and equipment that are exposed to failure or work stoppages over their lifetime. This requires keeping the downtime of these machines and equipment as low as possible, by conducting maintenance at intervals, replacing a part of a machine, or replacing one of the machines in the production line. This research studies the failures that occur in some parts of one of the machines of the General Company for Vege…
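A standard way to formalize the maintenance-versus-replacement trade-off sketched above is the age-replacement policy, shown below with an assumed Weibull failure law. This is a textbook illustration, not the company's actual model; all costs and parameters are hypothetical.

```python
import math

def weibull_surv(t, shape, scale):
    """Weibull survival function S(t) = exp(-(t/scale)^shape)."""
    return math.exp(-((t / scale) ** shape))

def cost_rate(T, cp, cf, shape, scale, n=400):
    """Long-run cost per unit time of an age-replacement policy that
    preventively replaces at age T (cost cp) or on failure (cost cf):
    C(T) = (cp*S(T) + cf*F(T)) / E[min(X, T)], E[min(X,T)] = integral of S on [0,T]."""
    h = T / n
    mean_cycle = sum(weibull_surv((i + 0.5) * h, shape, scale) for i in range(n)) * h
    s = weibull_surv(T, shape, scale)
    return (cp * s + cf * (1.0 - s)) / mean_cycle

def best_T(cp, cf, shape, scale, grid):
    """Grid search for the cheapest preventive-replacement age."""
    return min(grid, key=lambda T: cost_rate(T, cp, cf, shape, scale))
```

With a wear-out failure law (shape > 1) and failure much costlier than preventive replacement, the optimal T is an interior point: replacing too early wastes parts, too late incurs failure costs.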
The log-logistic distribution is one of the important statistical distributions, as it can be applied in many fields, including biological experiments; its importance stems from the need to determine the survival function of those experiments. The research makes a comparison between the maximum likelihood method, the least squares method, and the weighted least squares method for estimating the parameters and survival function of the log-logistic distribution, using the comparison criteria MSE, MAPE, and IMSE. The research was applied to real data for breast cancer patients. The results showed that the maximum likelihood method is best in the case of estimating the paramete…
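The survival function at the heart of this comparison, together with an IMSE-style criterion like the one the abstract names, can be sketched as follows; the parameter values and evaluation grid are illustrative, not the study's data.

```python
import math

def loglogistic_survival(t, alpha, beta):
    """Log-logistic survival S(t) = 1 / (1 + (t/alpha)^beta), t > 0,
    with scale alpha and shape beta."""
    return 1.0 / (1.0 + (t / alpha) ** beta)

def imse(est_params, true_params, grid):
    """Integrated (discretized) mean squared error between an estimated
    and the true survival curve, one of the comparison criteria."""
    a1, b1 = est_params
    a0, b0 = true_params
    diffs = [(loglogistic_survival(t, a1, b1) -
              loglogistic_survival(t, a0, b0)) ** 2 for t in grid]
    return sum(diffs) / len(grid)
```

A handy property for checking estimates: S(α) = 1/2 exactly, so the scale parameter α is the median survival time.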
The purpose of this article was to identify and assess the importance of risk factors in the tendering phase of construction projects. The construction project cannot succeed without the identification and categorization of these risk elements. In this article, a questionnaire for likelihood and impact was designed and distributed to a panel of specialists to analyze risk factors. The risk matrix was also used to research, explore, and identify the risks that influence the tendering phase of construction projects. The probability and impact values assigned to a risk are used to calculate its score. A risk matrix is created by combining the probability and impact criteria. To determine the main risk elements for the tender phase of…
Ground-based active optical sensors (GBAOS) have been used successfully in agriculture to predict crop yield potential (YP) early in the season and to optimize N rates for optimal crop yield. However, the models were found to be weak or inconsistent due to environmental variation, especially rainfall. The objective of the study was to evaluate whether GBAOS could predict YP across multiple locations, soil types, cultivation systems, and rainfall differences. This study was carried out from 2011 to 2013 on corn (Zea mays L.) in North Dakota, and in 2017 on potatoes in Maine. Six N rates were used on 50 sites in North Dakota, and 12 N rates on two sites, one dryland and one irrigated, in Maine. The two active GBAOS used for this study were GreenSeeker and Holl…
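One common form of GBAOS yield-potential model divides the sensor NDVI reading by the number of days of growth (an in-season estimate of yield, INSEY) and feeds it into an exponential model. The sketch below follows that general form; the coefficients a and b are hypothetical, not the study's calibrations.

```python
import math

def insey(ndvi, days_of_growth):
    """In-season estimate of yield: sensor NDVI normalized by days of growth."""
    return ndvi / days_of_growth

def yield_potential(ndvi, days, a=0.6, b=250.0):
    """Exponential yield-potential model YP = a * exp(b * INSEY).
    a and b are hypothetical calibration coefficients; in practice they
    are refit per region, crop, and season."""
    return a * math.exp(b * insey(ndvi, days))
```

The environmental sensitivity the abstract describes shows up here as instability in a and b: a calibration fitted in one rainfall regime transfers poorly to another, motivating the multi-location evaluation.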