Estimation of the reliability function depends on the accuracy of the data used to estimate the parameters of the probability distribution. Because some data sets are skewed, a distribution flexible enough to handle such data is needed when estimating the parameters and computing the reliability function. This was the case with the data of Diyala Company for Electrical Industries: the data collected from the Power and Machinery Department showed positive skewness, which called for a distribution suited to such data and for methods that accommodate this problem and yield accurate estimates of the reliability function. The research aims to use the method of moments to estimate the reliability function of the truncated skew-normal distribution, a parametric distribution characterized by flexibility in handling data that are approximately normally distributed but exhibit some skewness. Values are restricted to the sample space by truncating the skew-normal distribution from the left, deriving from the original skew distribution a new distribution that preserves the characteristics of the skew-normal distribution function. Real data representing the operating times until failure of three machines were collected from the Capacity Department of Diyala Company for Electrical Industries. The results showed that the machines under study have a good reliability index and can be relied upon at a high rate if they continue to operate under the same current working conditions.
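The idea of left-truncating a skew-normal distribution and estimating its parameters by matching moments can be sketched as follows. This is an illustrative reconstruction, not the paper's own procedure: the truncation point `A`, the choice to hold the shape parameter fixed and match only the first two moments, and all numerical values are assumptions for the demonstration.

```python
import numpy as np
from scipy import stats, integrate, optimize

A = 0.0  # assumed left-truncation point (lifetimes cannot be negative)

def trunc_moment(order, shape, loc, scale):
    """k-th raw moment of the skew-normal left-truncated at A, by quadrature."""
    d = stats.skewnorm(shape, loc=loc, scale=scale)
    tail = d.sf(A)                         # P(X > A): normalising constant
    f = lambda x: x**order * d.pdf(x) / tail
    val, _ = integrate.quad(f, A, np.inf)
    return val

def reliability(t, shape, loc, scale):
    """R(t) = P(X > t | X > A) = S0(t) / S0(A) for the truncated model."""
    d = stats.skewnorm(shape, loc=loc, scale=scale)
    return d.sf(t) / d.sf(A)

# simulate "operating times" from a known truncated skew-normal
rng = np.random.default_rng(1)
shape_true, loc_true, scale_true = 3.0, 1.0, 2.0
raw = stats.skewnorm(shape_true, loc=loc_true, scale=scale_true).rvs(
    20000, random_state=rng)
sample = raw[raw > A]                      # rejection sampling = left truncation

# method of moments: match the first two truncated moments (shape held fixed)
m1, m2 = sample.mean(), (sample**2).mean()
def eqs(p):
    loc, scale = p
    return (trunc_moment(1, shape_true, loc, scale) - m1,
            trunc_moment(2, shape_true, loc, scale) - m2)
loc_hat, scale_hat = optimize.fsolve(eqs, x0=[1.0, 2.0])

R = reliability(np.array([0.5, 2.0, 4.0]), shape_true, loc_hat, scale_hat)
print(loc_hat, scale_hat, R)
```

By construction R(A) = 1 and R decreases toward zero, so the fitted curve can be read directly as the machines' survival probability at each operating time.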
The research aims to measure the technical and scale efficiency (SE) of the departments of the College of Administration and Economics at the University of Baghdad over a period of 8 years, from the academic year 2013-2014 to 2018-2019, using input- and output-oriented Data Envelopment Analysis (DEA), in order to maintain the college's distinguished competitive position and to identify and address weaknesses in performance. The research problem lies in diagnosing the specializations most accepted in the labor market and determining the reasons for students' reluctance to enter some departments. The Win4DEAp program was used to measure technical and scale efficiency (SE) and to rely on …
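The input-oriented efficiency score that DEA software such as Win4DEAp reports can be illustrated with a small linear program. This is a generic CCR (constant returns to scale) model, not the study's data: the four "departments" and their inputs/outputs below are made up for the demonstration.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_input_efficiency(X, Y, j0):
    """Input-oriented CCR efficiency of decision-making unit j0.
    X: inputs, shape (m, n_dmus); Y: outputs, shape (s, n_dmus).
    Solves: min theta  s.t.  X @ lam <= theta * X[:, j0],  Y @ lam >= Y[:, j0]."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.zeros(n + 1)
    c[0] = 1.0                                  # minimise theta
    A_in = np.hstack([-X[:, [j0]], X])          # X @ lam - theta*x0 <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y])   # -Y @ lam <= -y0
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([np.zeros(m), -Y[:, j0]]),
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[0]

# toy data: 2 inputs (e.g. staff, budget), 1 output, 4 hypothetical departments
X = np.array([[2.0, 4.0, 8.0, 4.0],
              [3.0, 1.0, 1.0, 4.0]])
Y = np.array([[1.0, 1.0, 1.0, 1.0]])
scores = [ccr_input_efficiency(X, Y, j) for j in range(4)]
print(scores)
```

Units on the frontier score 1.0; the fourth unit, which uses more of both inputs than the first for the same output, scores below 1, meaning it could radially shrink its inputs by that factor.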
This study dealt with management strategy as an independent variable and integrated industrial distribution as a dependent variable. The study aimed to find the integrated industrial distribution that fits the management strategy, providing for the needs of the firm on the one hand and reducing the cost of management, which is reflected in increased profits, on the other.
The researcher collected data from (130) decision makers in the corporation, using a questionnaire as the data-collection tool together with a set of statistical tools suited to the nature of the information; the data were processed using the SPSS statistical package (version 24). Based on the analysis of the sample's responses and the tests of correlation and …
Multicollinearity is one of the most common problems in regression analysis; it concerns strong internal correlation among the explanatory variables and appears especially in economics and applied research. Multicollinearity has negative effects on the regression model, such as inflated variances and unstable parameter estimates when the ordinary least squares (OLS) method is used. Therefore, other methods were used to estimate the parameters of the negative binomial model, including the ridge regression estimator and the Liu-type estimator. The negative binomial regression model is a nonlinear …
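The shrinkage idea behind the ridge and Liu estimators can be shown on the linear-model analogue (the paper applies them inside a negative binomial model, which additionally requires iteratively reweighted estimation; the closed forms below are the standard linear-model versions, and the shrinkage constants k and d are illustrative choices).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
z = rng.normal(size=n)
# two nearly collinear predictors plus one independent predictor
X = np.column_stack([z, z + 0.01 * rng.normal(size=n), rng.normal(size=n)])
y = X @ np.array([1.0, 1.0, 0.5]) + rng.normal(size=n)

XtX, Xty = X.T @ X, X.T @ y
I = np.eye(X.shape[1])

b_ols = np.linalg.solve(XtX, Xty)                    # unstable under collinearity
k, d = 1.0, 0.5                                      # illustrative constants
b_ridge = np.linalg.solve(XtX + k * I, Xty)          # ridge: (X'X + kI)^-1 X'y
b_liu = np.linalg.solve(XtX + I, (XtX + d * I) @ b_ols)  # Liu (1993) estimator

# both estimators shrink the coefficient vector, taming the variance blow-up
print(np.linalg.norm(b_ols), np.linalg.norm(b_ridge), np.linalg.norm(b_liu))
```

Both shrunken estimators trade a small bias for a large reduction in variance, which is exactly what makes them attractive when OLS-type estimates become unstable.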
It is noticeable that the initialization of architectural parameters has a great impact on the whole learning process, so knowing the mathematical properties of a dataset helps give a neural network architecture better expressivity and capacity. In this paper, five random samples of the Volve field dataset were taken. A training set was then specified and the persistent homology of the dataset was calculated to show the impact of data complexity on the selection of a multilayer perceptron regressor (MLPR) architecture, using a proposed method that provides a well-rounded strategy for computing data complexity. The method is a compound algorithm composed of the t-SNE method, an alpha-complex algorithm, and a persistence barcode …
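The simplest piece of a persistence barcode, the 0-dimensional bars, can be computed without any topology library: each point's component is born at scale 0 and dies at the single-linkage distance that merges it into another component. This is a toy illustration of the barcode concept, not the paper's alpha-complex pipeline, and the point cloud is invented.

```python
import numpy as np
from itertools import combinations

def h0_barcode(points):
    """Finite 0-dim persistence bars: process edges by increasing length
    (a Rips-style filtration) and record the scale at which each
    connected component dies, via union-find."""
    n = len(points)
    parent = list(range(n))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i
    edges = sorted((float(np.linalg.norm(points[i] - points[j])), i, j)
                   for i, j in combinations(range(n), 2))
    deaths = []
    for d, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            deaths.append(d)                # one component dies at scale d
    # n-1 finite bars; the surviving component's bar is [0, infinity)
    return deaths

pts = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1],   # tight cluster
                [5.0, 5.0], [5.1, 5.0]])              # distant cluster
print(h0_barcode(pts))
```

The long final bar (the large gap before the last death) is the barcode's signal that the data splits into two well-separated clusters, which is the kind of complexity information the paper feeds into architecture selection.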
Demographic research often needs modern statistical tools that are flexible and convenient enough to keep up with the kind of data available in Iraq, a country that has passed through periods of war, economic sanctions, and security instability. This research therefore proposes using nonparametric splines as a substitute for some components of the Lee-Carter model in estimating the age-specific fertility rate in Iraq over the period (1977-2011), and then forecasting for the period (2012-2031). This goal was achieved using a nonparametric decomposition of the model into singular-value components, and then estimating the effect of time …
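The singular-value decomposition at the heart of the Lee-Carter model can be sketched in a few lines. This demonstrates the standard decomposition log m(x,t) = a(x) + b(x)k(t) on synthetic rates, not the paper's Iraqi fertility data or its spline modification.

```python
import numpy as np

rng = np.random.default_rng(2)
ages, years = 7, 35                          # age groups x calendar years
# synthetic log-rates: age profile + declining period effect + noise
a_true = np.linspace(-3.0, -1.0, ages)
b_true = np.full(ages, 1.0 / ages)
k_true = np.linspace(2.0, -2.0, years)
logm = (a_true[:, None] + b_true[:, None] * k_true[None, :]
        + 0.01 * rng.normal(size=(ages, years)))

# Lee-Carter estimation: a(x) = row means, (b, k) = rank-1 SVD of the rest
a = logm.mean(axis=1)
U, s, Vt = np.linalg.svd(logm - a[:, None], full_matrices=False)
b = U[:, 0] / U[:, 0].sum()                  # identifiability: sum(b) = 1
k = s[0] * Vt[0] * U[:, 0].sum()             # period index; sum(k) = 0
recon = a[:, None] + b[:, None] * k[None, :]
print(np.abs(recon - logm).max())
```

The fitted k series is what the model extrapolates to produce forecasts; the paper's contribution is replacing parts of this decomposition with nonparametric splines.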
Abstract
Due to the lack of previous statistical studies of the behavior of payments, specifically health insurance payments, which represent the largest proportion of payments in the general insurance companies in Iraq, this study was selected and applied in the Iraqi Insurance Company.
In order to find a convenient model representing the health insurance payments, we initially identified two probability models using the EasyFit software:
first, a single lognormal distribution for the whole sample; the other, a compound Weibull for the two sub-samples (small payments and large payments). We focused on the compound …
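The two candidate models can be fitted with standard maximum-likelihood routines. The synthetic "payments", the split threshold of 1000, and the use of `scipy.stats` in place of EasyFit are all assumptions for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# synthetic "payments": many small claims plus a heavier right tail
small = stats.weibull_min(1.5, scale=200).rvs(800, random_state=rng)
large = stats.weibull_min(0.9, scale=5000).rvs(200, random_state=rng) + 1000
payments = np.concatenate([small, large])

# candidate 1: one lognormal for the whole sample (MLE, location fixed at 0)
shape, loc, scale = stats.lognorm.fit(payments, floc=0)

# candidate 2: compound model -- a Weibull per sub-sample around a threshold
thr = 1000.0                                   # assumed split point
p_small = (payments < thr).mean()              # mixing weight of the small part
w_small = stats.weibull_min.fit(payments[payments < thr], floc=0)
w_large = stats.weibull_min.fit(payments[payments >= thr], floc=0)
print(p_small, shape, w_small[0], w_large[0])
```

In practice the two candidates would then be compared with a goodness-of-fit test (e.g. Kolmogorov-Smirnov) on the same data, which is the kind of comparison that led the study to prefer the compound model.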
The aim of this study is to propose reliable equations to estimate the in-situ concrete compressive strength from non-destructive tests. Three equations were proposed: the first considers the rebound hammer number only, the second considers the ultrasonic pulse velocity only, and the third combines the rebound hammer number and the ultrasonic pulse velocity. The proposed equations were derived from non-linear regression analysis and calibrated with the test results of 372 concrete specimens compiled from the literature. The performance of the proposed equations was tested by comparing their strength estimations with those of related existing equations from the literature. Comparisons …
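A combined-equation calibration of this kind can be sketched with non-linear least squares. The power-law form fc = a·R^b·V^c, the parameter values, and the synthetic data are assumptions; the paper's actual calibrated equations and its 372-specimen database are not reproduced here.

```python
import numpy as np
from scipy.optimize import curve_fit

def combined_model(X, a, b, c):
    """Hypothetical power-law combining rebound number R and pulse velocity V."""
    R, V = X
    return a * R**b * V**c

rng = np.random.default_rng(4)
R = rng.uniform(20, 50, 60)                      # rebound numbers
V = rng.uniform(3.5, 5.0, 60)                    # ultrasonic pulse velocity, km/s
# synthetic strengths with 5% multiplicative scatter
fc = 0.1 * R**1.2 * V**1.5 * rng.normal(1.0, 0.05, 60)

popt, _ = curve_fit(combined_model, (R, V), fc, p0=[0.1, 1.0, 1.0])
pred = combined_model((R, V), *popt)
rmse = np.sqrt(np.mean((pred - fc) ** 2))
print(popt, rmse)
```

Comparing the RMSE of the combined fit against single-variable fits (R only, V only) is how one would verify the paper's premise that using both measurements improves the strength estimate.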
The depth of the causative source is one of the most important parameters in gravity investigation. The present study introduces a theoretical solution for the intersection point of the horizontal and vertical gradients of a gravity anomaly. Two constants are obtained for estimating the depth of the causative source of the anomaly: the first is 1.7807 for a spherical body and the second is 2.4142 for a horizontal cylinder. These constants were tested by estimating the depth in three actual cases, and good results were obtained. It is believed that constants derived on theoretical grounds are better than those obtained by empirical experimental studies.