This research presents reliability, defined as the probability that any part of a system performs its function within a specified time and under stated conditions. On the theoretical side, the reliability function and the cumulative failure distribution function are studied under the one-parameter Rayleigh distribution. The research aims to uncover factors that are often missed in reliability evaluation and that cause frequent machine interruptions, in addition to problems with the data. The research problem is that, although many methods exist for estimating the reliability function, most of them are unsuitable when the data contain anomalous or extreme values, or when the appropriate distribution of the data is unknown; such data therefore require methods that can handle these issues. Two estimation methods were used: the robust M-estimator method and the nonparametric kernel method, and the formulas of their estimators were derived. The estimators were compared by simulation, with experiments run at different sample sizes and each experiment repeated 1000 times. The results were compared using one of the most important statistical measures, the mean squared error (MSE). The robust M-estimator method proved to be the best. The estimated reliability function was shown to decrease gradually with time, which is consistent with the properties of this function.
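As a minimal sketch of the simulation design described in this abstract, the following compares an estimated Rayleigh reliability function against the true one over repeated samples, scoring with MSE. The maximum-likelihood estimator is used here as a stand-in for the paper's M-estimator and kernel estimators, and the parameter value, sample size, and time grid are illustrative assumptions.

```python
import numpy as np

def rayleigh_reliability(t, sigma):
    """Reliability R(t) = P(T > t) = exp(-t^2 / (2 sigma^2)) for Rayleigh(sigma)."""
    return np.exp(-t**2 / (2.0 * sigma**2))

rng = np.random.default_rng(0)
sigma_true = 2.0                      # assumed true scale parameter
t_grid = np.linspace(0.5, 5.0, 10)    # time points at which R(t) is evaluated
reps, n = 1000, 50                    # 1000 replications, illustrative sample size

mse = 0.0
for _ in range(reps):
    sample = rng.rayleigh(scale=sigma_true, size=n)
    # Maximum-likelihood estimate of sigma (a classical benchmark, not the M-estimator)
    sigma_hat = np.sqrt(np.mean(sample**2) / 2.0)
    mse += np.mean((rayleigh_reliability(t_grid, sigma_hat)
                    - rayleigh_reliability(t_grid, sigma_true))**2)
mse /= reps
print(f"mean squared error over {reps} replications: {mse:.6f}")
```

The same loop, run once per candidate estimator, yields the MSE table used to rank the methods; note that the estimated R(t) decreases monotonically in t, matching the property stated in the abstract.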
Linear regression is one of the most important statistical tools for determining the relationship between a response variable and one or more independent variables, and it is widely used across the sciences. Heteroscedasticity is one of the problems of linear regression, and its effect leads to inaccurate conclusions. The problem of heteroscedasticity may be accompanied by extreme outliers in the independent variables, known as high leverage points (HLPs); the presence of HLPs in a data set produces unrealistic estimates and misleading inferences. In this paper, we review some of the robust
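High leverage points as described above are conventionally flagged from the diagonal of the hat matrix. The sketch below is a standard diagnostic, not the paper's own method; the data, the injected extreme value, and the 2p/n cutoff are illustrative assumptions.

```python
import numpy as np

def leverage(x):
    """Diagonal of the hat matrix H = X (X'X)^{-1} X'; large h_ii flags high leverage."""
    X = np.column_stack([np.ones(len(x)), x])   # design matrix with intercept
    H = X @ np.linalg.inv(X.T @ X) @ X.T
    return np.diag(H)

rng = np.random.default_rng(1)
x = rng.normal(size=20)
x[0] = 10.0                  # one extreme value in the predictor (a high leverage point)

h = leverage(x)
p = 2                        # number of fitted parameters (intercept + slope)
cutoff = 2 * p / len(x)      # a common rule-of-thumb threshold
flagged = np.where(h > cutoff)[0]
print("flagged indices:", flagged)
```

The leverages always sum to p, so a point with h_ii far above p/n is carrying a disproportionate share of the fit.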
The present research aims to test the effect of cognitive complexity, as an independent variable, on organizational agility, as a response variable, among the leaders working at the headquarters of the Iraqi Petroleum Products Distribution Company.
The research concludes with a number of recommendations that contribute to organizational agility in the company, given the importance of this research to public organizations and their notable role in community organizations. The research was carried out on a random sample of 101 individuals out of a total of 308, representing the senior leaders in the company (general managers, heads of departments, and division officials). A questionnaire was used as information
Information security in data storage and transmission is increasingly important. At the same time, images are used in many procedures, so preventing unauthorized access to image data is crucial, and images must be encrypted to protect sensitive data and privacy. The methods and algorithms for masking or encoding images range from simple spatial-domain methods to frequency-domain methods, which are the most complex and reliable. In this paper, a new cryptographic system is proposed, based on a random-key-generator hybridization methodology that takes advantage of the properties of the Discrete Cosine Transform (DCT) to generate an indefinite set of random keys, and that takes advantage of the low-frequency region coefficients after the DCT stage to pass them to
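To make the DCT stage concrete, the following computes an orthonormal 2-D DCT of an 8x8 block and extracts the low-frequency corner of the coefficient matrix. This is a generic sketch of the transform only; the block size, the 4x4 low-frequency region, and how the coefficients feed the key generator are assumptions, not the paper's design.

```python
import numpy as np

def dct2_ortho(x):
    """Orthonormal 2-D DCT-II via its matrix form: C @ x @ C.T."""
    N = x.shape[0]
    n = np.arange(N)
    C = np.sqrt(2.0 / N) * np.cos(np.pi * (2 * n[None, :] + 1) * n[:, None] / (2 * N))
    C[0, :] = np.sqrt(1.0 / N)          # DC row has a different normalisation
    return C @ x @ C.T

rng = np.random.default_rng(2)
block = rng.integers(0, 256, size=(8, 8)).astype(float)  # stand-in for an image block

coeffs = dct2_ortho(block)
low_freq = coeffs[:4, :4]   # energy concentrates in this top-left (low-frequency) region
```

Because the transform is orthonormal, the coefficient energy equals the pixel energy, which is why the few low-frequency coefficients summarise most of the block.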
Demographic research often needs modern statistical tools that are flexible and convenient enough to keep up with the type of data available in Iraq, given the country's periods of war, economic sanctions, and security instability over time. This research therefore proposes using nonparametric splines as a substitute for some components of the analysis within the Lee-Carter model, in order to estimate age-specific fertility rates as the response variable in Iraq over the period 1977-2011, and then to forecast for the period 2012-2031. This goal was achieved using a nonparametric singular value decomposition of the components using the main deltoid, and then estimating the effect of time-s
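The Lee-Carter decomposition referred to above fits log rates as log m(x,t) = a_x + b_x k_t, with b_x and k_t obtained from the leading singular vectors of the centred rate matrix. The sketch below uses synthetic data and the standard SVD step only; the spline substitution proposed in the paper is not reproduced, and the matrix dimensions are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic matrix of log rates: rows = age groups, columns = years (1977-2011 -> 35 columns)
A, T = 7, 35
log_m = rng.normal(loc=-3.0, scale=0.3, size=(A, T))

# Lee-Carter structure: log m(x,t) ~ a_x + b_x * k_t
a = log_m.mean(axis=1)                       # age pattern a_x (row means)
centered = log_m - a[:, None]
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
b = U[:, 0] / U[:, 0].sum()                  # normalised so that sum_x b_x = 1
k = s[0] * Vt[0] * U[:, 0].sum()             # time index k_t (sums to 0 by construction)
fit = a[:, None] + np.outer(b, k)            # rank-1 reconstruction of the log rates
```

Forecasting then reduces to extrapolating the single time series k_t (e.g. by a random walk with drift), which is where the paper substitutes nonparametric splines.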
The project was applied to develop some physiological variables and skill performance using a device applied for the first time in Iraq on a sample of badminton players; this device holds a patent dated 7/2/2019. An experimental method was used with a sample of 12 players from the Al-Orthodoxy club. The experimental group applied the proposed technique, while the control group was instructed by the coach. The results of the research are attributed to the role of the proposed Fitlight-technology exercises: this technology increased the attention and focus of the sample and improved some physiological variables and the smash shot skill. It was concluded that the exercises using Fitlight technology helped to i
This paper aims to evaluate the reliability of a steel beam, represented by the probability of failure and the reliability index. The Monte Carlo Simulation Method (MCSM) and the First Order Reliability Method (FORM) are used for this purpose. These methods need two samples for each behaviour to be studied: the first for resistance (carrying capacity R) and the second for load effect (Q), which are the parameters of a limit state function. The Monte Carlo method is adopted to generate these samples based on the randomness and uncertainties in the variables. The variables considered are the beam cross-section dimensions, material properties, beam length, yield stress, and applied loads. Matlab software has be
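The Monte Carlo step described above reduces to sampling R and Q, evaluating the limit state g = R - Q, and counting failures. The sketch below (in Python rather than the paper's Matlab) uses assumed normal distributions and illustrative means and standard deviations, not the beam data from the paper.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000
# Illustrative distributions (assumed, not taken from the paper)
R = rng.normal(loc=200.0, scale=20.0, size=n)   # resistance (carrying capacity)
Q = rng.normal(loc=120.0, scale=15.0, size=n)   # load effect

g = R - Q                   # limit state function: failure when g < 0
pf = np.mean(g < 0)         # probability of failure
beta = g.mean() / g.std()   # reliability index (exact for normal R and Q)
print(f"Pf ~ {pf:.5f}, beta ~ {beta:.3f}")
```

With these inputs the analytic index is (200-120)/sqrt(20^2+15^2) = 3.2, so the simulated beta should land close to that; FORM would recover the same value here because both variables are normal.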
The Dagum regression model, introduced to address limitations of traditional econometric models, provides enhanced flexibility for analyzing data characterized by heavy tails and asymmetry, as is common in income and wealth distributions. This paper develops and applies the Dagum model, demonstrating its advantages over other distributions such as the log-normal and gamma distributions. The model's parameters are estimated using maximum likelihood estimation (MLE) and the method of moments (MoM). A simulation study evaluates both methods' performance across various sample sizes, showing that MoM tends to offer more robust and precise estimates, particularly in small samples. These findings provide valuable insights into the ana
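For reference, the (type I) Dagum distribution has CDF F(x) = (1 + (x/b)^(-a))^(-p), which inverts in closed form and so supports the kind of simulation study the abstract describes. The parameter values below are illustrative assumptions; the MLE and MoM fitting steps themselves are not reproduced here.

```python
import numpy as np

def dagum_cdf(x, a, b, p):
    """Dagum type I CDF: F(x) = (1 + (x/b)^(-a))^(-p), for x > 0."""
    return (1.0 + (x / b) ** (-a)) ** (-p)

def dagum_sample(u, a, b, p):
    """Inverse-CDF sampling: x = b * (u^(-1/p) - 1)^(-1/a), for u in (0, 1)."""
    return b * (u ** (-1.0 / p) - 1.0) ** (-1.0 / a)

rng = np.random.default_rng(5)
a, b, p = 3.0, 2.0, 1.5          # illustrative shape/scale/shape values (assumed)
u = rng.uniform(size=50_000)
x = dagum_sample(u, a, b, p)

med = np.median(x)
print(dagum_cdf(med, a, b, p))   # should sit near 0.5
```

Drawing many such samples at each sample size and refitting by MLE and MoM, then comparing the estimates' spread, is the structure of the simulation comparison reported in the paper.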
Interest in developing accurate automatic facial emotion recognition methodologies is growing rapidly, and it remains an ever-growing research field in computer vision, artificial intelligence, and automation. However, building an automated system that matches the human ability to recognize facial emotion is challenging, owing to the lack of an effective facial feature descriptor and the difficulty of choosing a proper classification method. In this paper, a geometry-based feature vector is proposed. For classification, three different types of methods are tested: statistical, artificial neural network (NN), and support vector machine (SVM). A modified K-Means clustering algorithm
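Since the abstract is cut off at the modified K-Means step, the sketch below shows only the standard Lloyd's K-Means on synthetic 2-D feature vectors; the paper's modification, the geometric features, and the data are not reproduced, and all values here are illustrative.

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Standard Lloyd's K-Means (the paper's modified variant is not reproduced)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)].copy()
    for _ in range(iters):
        # assign each point to its nearest center
        labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        # move each center to the mean of its assigned points
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers, labels

rng = np.random.default_rng(6)
# Two well-separated synthetic clusters standing in for feature vectors
X = np.vstack([rng.normal(0.0, 0.3, (50, 2)), rng.normal(5.0, 0.3, (50, 2))])
centers, labels = kmeans(X, k=2)
```

In a pipeline like the paper's, the cluster assignments (or distances to centers) would feed the statistical, NN, or SVM classifier being compared.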
In this article, we present a quasi-contraction mapping approach for the D iteration, and we prove that this iteration and the modified SP iteration have the same convergence rate. We also prove that the D iteration approach for quasi-contraction maps is faster than certain current leading iteration methods, such as Mann and Ishikawa. A numerical example is given as well.
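For context, the Mann and Ishikawa schemes named in the comparison can be sketched on a simple contraction; the D iteration itself is defined in the article and is not reproduced here, and the map T and the step parameters below are illustrative assumptions.

```python
import math

def T(x):
    """A simple contraction on the reals with a unique fixed point x* = T(x*)."""
    return 0.5 * math.cos(x)

def mann(x0, alpha=0.5, steps=50):
    """Mann iteration: x_{n+1} = (1 - alpha) x_n + alpha T(x_n)."""
    x = x0
    for _ in range(steps):
        x = (1 - alpha) * x + alpha * T(x)
    return x

def ishikawa(x0, alpha=0.5, beta=0.5, steps=50):
    """Ishikawa iteration: y_n = (1 - beta) x_n + beta T(x_n); x_{n+1} = (1 - alpha) x_n + alpha T(y_n)."""
    x = x0
    for _ in range(steps):
        y = (1 - beta) * x + beta * T(x)
        x = (1 - alpha) * x + alpha * T(y)
    return x

x_mann = mann(2.0)
x_ishi = ishikawa(2.0)
```

Tracking the error |x_n - x*| per step for each scheme is how such convergence-rate comparisons are typically illustrated numerically.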