This paper discusses the process of compounding two distributions using a new compounding procedure that connects a number of lifetime (continuous) distributions, where the number of these distributions is a random variable following one of the discrete distributions. Based on this procedure, the zero-truncated Poisson distribution was compounded with the Weibull distribution to produce a new three-parameter lifetime distribution, whose advantage is that its failure-rate function exhibits many shapes (increasing, decreasing, unimodal, bathtub). The properties of the resulting distribution are studied, including the expectation, variance, cumulative distribution function, reliability function, and failure-rate function. In addition, the parameters of the resulting distribution are estimated by three methods: maximum likelihood, minimum chi-square using the downhill simplex algorithm, and the percentile method. The methods were compared via the mean square error (MSE) in simulation experiments with different sample sizes (small, medium, large), whose results showed that the minimum chi-square method with the downhill simplex algorithm is the best for estimating the parameters and the probability function of the compound distribution.
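As a minimal sketch of the compounding step described above, assume the compound lifetime is the minimum of N Weibull lifetimes with N following a zero-truncated Poisson distribution; under that assumption the reliability function has the closed form S(t) = (exp(λ S_W(t)) − 1)/(exp(λ) − 1), where S_W is the Weibull survival function. All parameter values below are illustrative, not taken from the paper.

```python
import math

def weibull_sf(t, alpha, beta):
    """Weibull survival (reliability) function S_W(t) = exp(-(t/beta)^alpha)."""
    return math.exp(-((t / beta) ** alpha))

def compound_sf(t, lam, alpha, beta):
    """Reliability of the compound zero-truncated Poisson-Weibull lifetime,
    assuming T = min(X_1, ..., X_N), N ~ ZTP(lam), X_i ~ Weibull(alpha, beta)."""
    s = weibull_sf(t, alpha, beta)
    return (math.exp(lam * s) - 1.0) / (math.exp(lam) - 1.0)

def hazard(t, lam, alpha, beta, h=1e-6):
    """Failure-rate function h(t) = -d/dt log S(t), via a central difference."""
    s_lo = compound_sf(t - h, lam, alpha, beta)
    s_hi = compound_sf(t + h, lam, alpha, beta)
    return -(math.log(s_hi) - math.log(s_lo)) / (2.0 * h)

# At t = 0 every unit survives; reliability decays as t grows.
r0 = compound_sf(0.0, lam=1.5, alpha=2.0, beta=1.0)
r2 = compound_sf(2.0, lam=1.5, alpha=2.0, beta=1.0)
```

Varying `alpha` and `lam` in this sketch is what produces the different failure-rate shapes the abstract mentions.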
The penalized regression model has received considerable attention for variable selection, as it plays an essential role in dealing with high-dimensional data. The arctangent (Atan) penalty has recently been used as an efficient method for both estimation and variable selection. However, the Atan penalty is very sensitive to outliers in the response variable or to heavy-tailed error distributions, whereas least absolute deviation is a good method for achieving robustness in regression estimation. The specific objective of this research is to propose a robust Atan estimator combining these two ideas. Simulation experiments and real-data applications show that the p
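A hedged sketch of the combined objective described above: the least absolute deviations loss plus an arctangent-type penalty. The parameterization p(t) = λ(γ + 2/π)·arctan(|t|/γ) is one form of the Atan penalty reported in the literature and is an assumption here, as are all data and tuning values.

```python
import numpy as np

def atan_penalty(beta, lam, gamma):
    """Arctangent-type penalty; one common parameterization is
    p(t) = lam * (gamma + 2/pi) * arctan(|t| / gamma)."""
    return lam * (gamma + 2.0 / np.pi) * np.arctan(np.abs(beta) / gamma)

def lad_atan_objective(beta, X, y, lam, gamma):
    """Robust objective: least absolute deviations loss plus Atan penalty."""
    resid = y - X @ beta
    return np.abs(resid).sum() + atan_penalty(beta, lam, gamma).sum()

# Toy design with heavy-tailed (t, df=2) errors, the setting where LAD helps.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
true_beta = np.array([2.0, 0.0, -1.0])
y = X @ true_beta + rng.standard_t(df=2, size=50)

dense = lad_atan_objective(true_beta, X, y, lam=1.0, gamma=0.01)
zeros = lad_atan_objective(np.zeros(3), X, y, lam=1.0, gamma=0.01)
```

Minimizing this objective (e.g. by coordinate descent) would yield the robust Atan estimator; the sketch only evaluates it.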
The research studied and analyzed hybrid parallel-series systems of asymmetrical components by applying different simulation experiments to estimate the reliability function of those systems, using the maximum likelihood method as well as the standard Bayes method via both symmetrical and asymmetrical loss functions, assuming a Rayleigh distribution and an informative prior distribution. The simulation experiments included different sample sizes and default parameters, which were then compared with one another based on mean square errors. This was followed by the application of the standard Bayes method with the entropy loss function, which proved successful throughout the experimental side in finding the reliability fun
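A minimal sketch of evaluating the reliability of a hybrid parallel-series system with Rayleigh components, as studied above. The specific layout (two series branches joined in parallel) and all parameter values are illustrative assumptions.

```python
import math

def rayleigh_rel(t, sigma):
    """Rayleigh component reliability R(t) = exp(-t^2 / (2 sigma^2))."""
    return math.exp(-t * t / (2.0 * sigma * sigma))

def series_rel(rels):
    """A series block works only if every component works."""
    out = 1.0
    for r in rels:
        out *= r
    return out

def parallel_rel(rels):
    """A parallel block fails only if every component fails."""
    out = 1.0
    for r in rels:
        out *= (1.0 - r)
    return 1.0 - out

# Hybrid parallel-series system of asymmetrical components:
# two series branches with different Rayleigh scales, joined in parallel.
t = 1.0
branch1 = series_rel([rayleigh_rel(t, 1.0), rayleigh_rel(t, 2.0)])
branch2 = series_rel([rayleigh_rel(t, 1.5)])
system = parallel_rel([branch1, branch2])
```

Plugging estimated scale parameters (ML or Bayes) into `rayleigh_rel` is how the estimated system reliability function would be obtained.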
The question of the existence of a correlation between the parameters A and m of the Paris function is re-examined theoretically for a brittle material such as alumina ceramic (Al2O3) with different grain sizes. The existence of an exponential function that fits a good approximation to the majority of experimental data on crack-velocity versus stress-intensity-factor diagrams is investigated. The rate theory of crack growth was applied to data from alumina ceramic samples in region I and, making use of the values of the exponential function parameters, the crack growth rate theory parameters were estimated.
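The Paris function relates crack growth rate to stress intensity factor as da/dN = A·(ΔK)^m, so A and m are obtained together from one log-log fit, which is why their estimates can appear correlated. A sketch with synthetic data (all values illustrative, not from the alumina experiments):

```python
import numpy as np

# Paris law: da/dN = A * (dK)^m, i.e. log(da/dN) = log A + m * log(dK).
# Synthetic v-K data standing in for crack-velocity measurements.
rng = np.random.default_rng(1)
m_true, logA_true = 3.0, -11.0
dK = np.linspace(2.0, 6.0, 30)  # stress intensity factor range (arbitrary units)
log_rate = logA_true + m_true * np.log10(dK) + rng.normal(0.0, 0.05, 30)

# A least-squares fit on log-log axes recovers (m, log A) jointly.
m_fit, logA_fit = np.polyfit(np.log10(dK), log_rate, 1)
```

Because both parameters come from the same regression line, an error that tilts the slope m is compensated by a shift in log A, producing the A-m correlation the abstract examines.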
Objective: This research investigates real breast cancer data for Iraqi women, acquired manually from several Iraqi hospitals for the early detection of breast cancer. Data mining techniques are used to discover hidden knowledge, unexpected patterns, and new rules from the dataset, which contains a large number of attributes. Methods: Data mining techniques handle redundant or simply irrelevant attributes to discover interesting patterns. The dataset is processed via the Weka (Waikato Environment for Knowledge Analysis) platform. The OneR technique is used as a machine learning classifier to evaluate attribute worth according to the class value. Results: The evaluation is performed using
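OneR, the classifier named above, builds a one-level rule per attribute (each value maps to its majority class) and keeps the attribute with the fewest training errors. A self-contained sketch on toy records; the attribute and class names are invented for illustration and are not from the Iraqi dataset.

```python
from collections import Counter, defaultdict

def one_r(rows, labels, n_attrs):
    """OneR: for each attribute, build a one-level rule (value -> majority
    class) and keep the attribute whose rule makes the fewest errors."""
    best = None
    for a in range(n_attrs):
        table = defaultdict(Counter)
        for row, y in zip(rows, labels):
            table[row[a]][y] += 1
        rule = {v: c.most_common(1)[0][0] for v, c in table.items()}
        errors = sum(n for v, c in table.items()
                     for y, n in c.items() if y != rule[v])
        if best is None or errors < best[2]:
            best = (a, rule, errors)
    return best  # (attribute index, value -> class rule, training errors)

# Toy records: (tumor_size, family_history) -> diagnosis (hypothetical).
rows = [("large", "yes"), ("large", "no"), ("small", "no"), ("small", "yes")]
labels = ["malignant", "malignant", "benign", "benign"]
attr, rule, errs = one_r(rows, labels, n_attrs=2)
```

Here the first attribute separates the classes perfectly, so OneR selects it with zero training errors, which is the sense in which it scores attribute "worth" against the class value.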
A roundabout is a highway engineering concept meant to calm traffic, increase safety, reduce stop-and-go travel, reduce accidents and congestion, and decrease traffic delays. It is circular and facilitates one-way traffic flow around a central point. The first part of this study evaluated the principles and methods used to compare roundabout capacity methods under different traffic conditions and geometric configurations. These methods include gap-acceptance, empirical, and simulation-software methods. Previous studies mentioned in this research used various methods and other new models developed by several researchers. However, this paper's main aim is to compare different roundabout capacity models for acceptabl
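As a hedged illustration of the gap-acceptance family of capacity methods mentioned above, the classic exponential form c = v_c·exp(−v_c·t_c/3600)/(1 − exp(−v_c·t_f/3600)) relates entry capacity to circulating flow via a critical gap and follow-up time. The formula choice and all parameter values here are assumptions for illustration, not the specific models compared in the paper.

```python
import math

def gap_acceptance_capacity(vc, tc, tf):
    """Gap-acceptance entry capacity (veh/h) for one roundabout leg:
    vc = circulating flow (veh/h), tc = critical gap (s), tf = follow-up time (s).
    c = vc * exp(-vc*tc/3600) / (1 - exp(-vc*tf/3600))."""
    return vc * math.exp(-vc * tc / 3600.0) / (1.0 - math.exp(-vc * tf / 3600.0))

# Entry capacity falls as circulating flow rises (illustrative values).
c_low = gap_acceptance_capacity(vc=300.0, tc=4.5, tf=2.6)
c_high = gap_acceptance_capacity(vc=900.0, tc=4.5, tf=2.6)
```

Empirical and simulation-based models would replace this closed form with regression fits or microsimulation, which is the comparison the study undertakes.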
Two spectrophotometric techniques are presented for quantifying ceftazidime (CFT) in bulk medications and pharmaceutical formulations. The methods are simple, sensitive, selective, accurate, and efficient. The first method used an alkaline medium to convert ceftazidime to its diazonium salt, which is then coupled with the 1-naphthol (1-NPT) and 2-naphthol (2-NPT) reagents. The azo dyes produced were brown and red in color, with absorption maxima at λmax 585 and 545 nm, respectively. Beer's law was obeyed over the concentration range 3-40 µg·mL-1. For (CFT-1-NPT) and (CFT-2-NPT), the detection limits were 1.0096 and 0.8017 µg·mL-1, respec
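A sketch of the calibration arithmetic behind such a method: fit absorbance against concentration over the linear (Beer's law) range, then compute a detection limit, here with the common ICH-style LOD = 3.3·s/slope convention. The slope, blank standard deviation, and LOD convention are illustrative assumptions, not the paper's reported figures.

```python
import numpy as np

# Beer's law calibration: absorbance A = slope * c + intercept over the
# linear range; detection limit via LOD = 3.3 * s_blank / slope.
conc = np.array([3, 5, 10, 20, 30, 40], dtype=float)   # ug/mL (linear range)
absorb = 0.021 * conc + 0.004                          # idealized readings

slope, intercept = np.polyfit(conc, absorb, 1)
s_blank = 0.006                                         # assumed blank s.d.
lod = 3.3 * s_blank / slope                             # ug/mL
```

With real instrument readings the fit would carry scatter, and the reported LOD values (1.0096 and 0.8017 µg·mL-1) would come out of exactly this kind of calculation.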
The challenge of incorporating usability evaluation values and practices into the agile development process is not only persistent but also systemic. Notable contributions by researchers have attempted to isolate and close the gaps between the two fields with the aim of developing usable software, but there is currently no reference model that specifies where and how usability activities should be considered in the agile development process. This paper proposes a model for identifying appropriate usability evaluation methods alongside the agile development process. Using this model, the development team can apply usability evaluations at the right time and in the right place to get the necessary feedback from end-users. Verificatio
The current research presents an overall comparative analysis concerning the estimation of Meixner process parameters via the wavelet packet transform. Of noteworthy relevance, it compares the moment method and the wavelet packet estimator for the four parameters of the Meixner process. The research focuses on finding the best threshold value using the square root log and modified square root log methods with wavelet packets in the presence of noise, to enhance the efficiency and effectiveness of denoising the financial asset market signal. In this regard, a simulation study compares the performance of moment estimation and wavelet packets for different sample sizes. The results show that wavelet p
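The square root log thresholding referred to above can be read as the universal threshold λ = σ·√(2 ln n) applied to transform coefficients. A hedged sketch of that rule with soft thresholding on a synthetic sparse signal (numpy only; a real study would threshold actual wavelet packet coefficients):

```python
import numpy as np

def sqrt_log_threshold(coeffs, sigma):
    """'Square root log' (universal) threshold: lam = sigma * sqrt(2 ln n)."""
    n = coeffs.size
    return sigma * np.sqrt(2.0 * np.log(n))

def soft_threshold(coeffs, lam):
    """Shrink coefficients toward zero; zero out anything below the threshold."""
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - lam, 0.0)

rng = np.random.default_rng(2)
clean = np.zeros(1024)
clean[:8] = 10.0                       # a few large 'signal' coefficients
noisy = clean + rng.normal(0.0, 1.0, 1024)

lam = sqrt_log_threshold(noisy, sigma=1.0)
denoised = soft_threshold(noisy, lam)
```

The modified square root log variant mentioned in the abstract would adjust `lam`; the thresholding step itself is unchanged.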
In this paper, non-homogeneous Poisson processes were applied within one of the scientific and practical tools of operations research, queueing theory, where the operations are affected over time by a function with cyclic behavior, called the sinusoidal function. The M(t)/M/S model was chosen: a single queue with multiple service channels. The estimating measures (QLs, HOL, HOLr) were used to assess the delay a customer experiences before entering service, and they were compared to identify the best of them under overload conditions. Through the experiments
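The arrival side of such an M(t)/M/S model can be simulated by Lewis-Shedler thinning: generate candidate arrivals at a dominating constant rate and accept each with probability λ(t)/λ_max. The sinusoidal rate parameters below are illustrative assumptions, not the paper's values.

```python
import math
import random

def sinusoidal_rate(t, base=5.0, amp=3.0, period=24.0):
    """Cyclic arrival rate lambda(t) for a non-homogeneous Poisson process."""
    return base + amp * math.sin(2.0 * math.pi * t / period)

def thinning_arrivals(horizon, rate_fn, rate_max, seed=3):
    """Lewis-Shedler thinning: candidates at rate_max, each accepted with
    probability rate_fn(t) / rate_max (requires rate_fn <= rate_max)."""
    rng = random.Random(seed)
    t, arrivals = 0.0, []
    while True:
        t += rng.expovariate(rate_max)
        if t > horizon:
            return arrivals
        if rng.random() < rate_fn(t) / rate_max:
            arrivals.append(t)

# Two full 24-unit cycles; expected count = base * horizon = 240.
arrivals = thinning_arrivals(horizon=48.0, rate_fn=sinusoidal_rate,
                             rate_max=8.0)
```

Feeding these arrival times into an S-server exponential-service simulation would give the pre-service delays that the QLs, HOL, and HOLr measures estimate.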
In this study, the Bayesian method was used to estimate the scale parameter of the normal distribution, considering three different prior distributions: the square root inverted gamma (SRIG) distribution, a non-informative prior distribution, and the natural conjugate family of priors. The Bayesian estimation, based on the squared error loss function, was compared with classical methods for estimating the scale parameter of the normal distribution, such as maximum likelihood estimation and th
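As a hedged sketch of the conjugate case above: with the mean known and an inverse-gamma IG(a, b) prior on σ², the posterior is IG(a + n/2, b + SS/2), and under squared error loss the Bayes estimator is the posterior mean. The prior hyperparameters and data below are illustrative, not the study's settings.

```python
import numpy as np

def bayes_variance(x, mu, a, b):
    """Bayes estimate of the normal scale parameter sigma^2 (mean mu known)
    under an inverse-gamma IG(a, b) conjugate prior and squared error loss:
    posterior is IG(a + n/2, b + SS/2); the estimator is its mean."""
    n = x.size
    ss = np.sum((x - mu) ** 2)
    return (b + ss / 2.0) / (a + n / 2.0 - 1.0)

def mle_variance(x, mu):
    """Classical maximum likelihood estimate, for comparison."""
    return np.mean((x - mu) ** 2)

rng = np.random.default_rng(4)
x = rng.normal(loc=0.0, scale=2.0, size=200)   # true sigma^2 = 4
bayes_est = bayes_variance(x, mu=0.0, a=2.0, b=4.0)
mle_est = mle_variance(x, mu=0.0)
```

With a moderate sample the two estimates nearly coincide; the prior matters most at small n, which is where the study's MSE comparison across methods becomes informative.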