This paper defines the Burr-XII distribution and derives its p.d.f. and CDF. Burr-XII is a failure distribution that arises as a compound of two failure models, the Gamma model and the Weibull model. Some equipment contains many important parts whose probability distributions may be of different types, and the Burr family, through its different compound forms, was found to be the best model to study; its parameters are estimated in order to compute the mean time to failure. Burr-XII is considered here rather than other models because it is used to model a wide variety of phenomena, including crop prices, household income, option market price distributions, risk, and travel time. It has two shape parameters (α, r) and one scale parameter (λ), which is treated as known. The paper therefore defines the p.d.f. and CDF, derives the formula for the moments about the origin, and derives the moment estimators of the two shape parameters (α, r), as well as the maximum likelihood estimators and the percentile estimators; the scale parameter (λ) is not estimated, as it is assumed known. The three methods are compared through a simulation procedure using different sample sizes (n = 30, 60, 90) and different sets of initial values for (α, r, λ). It is observed that the moment estimators are the best, with percentages of (46%) and (42%) respectively, compared with the other estimators.
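The p.d.f., CDF, and moments about the origin described above can be sketched as follows; the parameter values are illustrative assumptions, not the paper's estimates, and SciPy's `burr12` (shapes `c`, `d`) uses the same parameterisation with c = α and d = r:

```python
import numpy as np
from scipy.special import gamma as G
from scipy.stats import burr12

# Illustrative shape parameters (alpha, r) and known scale lam.
alpha, r, lam = 2.0, 3.0, 1.5
x = np.linspace(0.1, 5.0, 200)

# Closed-form p.d.f. and CDF of Burr-XII:
# f(x) = (alpha*r/lam) (x/lam)^(alpha-1) (1 + (x/lam)^alpha)^(-(r+1))
# F(x) = 1 - (1 + (x/lam)^alpha)^(-r)
pdf = (alpha * r / lam) * (x / lam) ** (alpha - 1) \
      * (1 + (x / lam) ** alpha) ** (-(r + 1))
cdf = 1 - (1 + (x / lam) ** alpha) ** (-r)

# Agreement with SciPy's parameterisation (c=alpha, d=r).
assert np.allclose(pdf, burr12.pdf(x, c=alpha, d=r, scale=lam))
assert np.allclose(cdf, burr12.cdf(x, c=alpha, d=r, scale=lam))

# k-th moment about the origin (exists for k < alpha*r):
# E[X^k] = lam^k * Gamma(r - k/alpha) * Gamma(1 + k/alpha) / Gamma(r)
k = 1
m1 = lam ** k * G(r - k / alpha) * G(1 + k / alpha) / G(r)
```

Equating such sample moments to these theoretical moments gives the moment estimators compared in the simulation.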
Linear discriminant analysis and logistic regression are the most widely used multivariate statistical methods for analysing data with categorical outcome variables. Both are appropriate for the development of linear classification models. Linear discriminant analysis requires that the explanatory variables follow a multivariate normal distribution, while logistic regression makes no assumptions about the distribution of the explanatory data. Hence, logistic regression is assumed to be the more flexible and more robust method when these assumptions are violated.
In this paper we focus on the comparison between three forms for classifying data belonging …
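A minimal sketch of the kind of comparison described above, on synthetic data (all data and settings here are illustrative assumptions, not the paper's): two Gaussian classes, so LDA's multivariate-normal assumption holds, and both classifiers should perform similarly.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Two bivariate Gaussian classes with different means
# (LDA's normality assumption is satisfied by construction).
X = np.vstack([rng.normal(0, 1, (200, 2)), rng.normal(2, 1, (200, 2))])
y = np.repeat([0, 1], 200)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

# Fit both linear classifiers and compare test accuracy.
lda_acc = LinearDiscriminantAnalysis().fit(Xtr, ytr).score(Xte, yte)
log_acc = LogisticRegression().fit(Xtr, ytr).score(Xte, yte)
```

When the normality assumption is violated (e.g. heavy-tailed or categorical predictors), logistic regression typically degrades less, which is the robustness property the abstract refers to.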
One of the most important problems in statistical inference is the estimation of parameters and of the reliability parameter, together with interval estimation and hypothesis testing; here we estimate the two parameters of the exponential distribution and the reliability parameter in a stress-strength model.
This paper deals with estimating the scale parameter and the location parameter µ of the two-parameter exponential distribution, using the moments estimator and the maximum likelihood estimator. We also estimate the parameter R = Pr(X > Y), where X and Y are independent two-parameter exponential random variables.
Statistical properties of this distribution and its properti…
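As a sanity-check sketch of the stress-strength quantity R = Pr(X > Y), the one-parameter exponential case has the closed form R = λ_y / (λ_x + λ_y), which a Monte-Carlo estimate should reproduce; the rates below are illustrative assumptions, not values from the paper:

```python
import numpy as np

# Stress-strength reliability R = Pr(X > Y) for independent
# exponentials, one-parameter case (rates lx, ly are illustrative).
lx, ly = 1.0, 2.0
R_exact = ly / (lx + ly)  # closed form: ly / (lx + ly)

# Monte-Carlo check: numpy's exponential takes the mean (1/rate).
rng = np.random.default_rng(42)
x = rng.exponential(1 / lx, 100_000)
y = rng.exponential(1 / ly, 100_000)
R_mc = float(np.mean(x > y))
```

The two-parameter (location-shifted) case studied in the paper adds the location parameters to this calculation, but the simulation logic is the same.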
This paper presents a statistical study of a suitable distribution for rainfall in the provinces of Iraq, using two types of distributions for the period (2005-2015). The researcher proposed the log-normal distribution and the mixed exponential distribution; each province's data were tested against these distributions to determine the optimal distribution of rainfall in Iraq. The distribution is selected on the basis of the minimum values produced by several goodness-of-fit criteria, namely the consistent Akaike information criterion (CAIC), the Bayesian information criterion (BIC), and the Akaike information criterion (AIC). These were applied to the distributions to find the right distribution for the rainfall data of the provinces of Iraq, and the (maximum …
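The model-selection step described above can be sketched as follows: fit each candidate distribution by maximum likelihood, compute AIC = 2k − 2 ln L and BIC = k ln n − 2 ln L, and prefer the model with the smaller values. The data here are synthetic (an assumption for illustration), not the Iraqi rainfall series:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic positive "rainfall" sample, drawn from a log-normal.
data = rng.lognormal(mean=3.0, sigma=0.5, size=300)

def aic_bic(logL, k, n):
    # AIC = 2k - 2 ln L ;  BIC = k ln n - 2 ln L
    return 2 * k - 2 * logL, k * np.log(n) - 2 * logL

# Log-normal fitted by maximum likelihood (location fixed at 0 -> 2 params).
shape, loc, scale = stats.lognorm.fit(data, floc=0)
logL_ln = np.sum(stats.lognorm.logpdf(data, shape, loc, scale))
aic_ln, bic_ln = aic_bic(logL_ln, k=2, n=len(data))

# Plain exponential for comparison (location fixed at 0 -> 1 param).
loc_e, scale_e = stats.expon.fit(data, floc=0)
logL_e = np.sum(stats.expon.logpdf(data, loc_e, scale_e))
aic_e, bic_e = aic_bic(logL_e, k=1, n=len(data))
# The candidate with the smaller AIC/BIC is preferred.
```

The paper's mixed exponential candidate would slot in as a third fitted model with its own parameter count k.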
In this paper, the Monte-Carlo simulation method is used to compare the robust circular S estimator with the circular least squares method, both when the data contain no outliers and when an outlier is present. Contamination is introduced in two directions: first, contamination at high-leverage points, representing contamination in the circular independent variable; and second, contamination in the vertical variable, representing the circular dependent variable. Three comparison criteria are used: the median standard error (Median SE), the median of the mean squared errors (Median MSE), and the median of the mean cosines of the circular residuals (Median A(k)). It was concluded that the method of least squares is better than the …
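The Monte-Carlo design above can be illustrated with a simpler linear analogue; note this substitutes the Theil-Sen estimator for the circular S estimator (which is not implemented here), and all settings are illustrative assumptions, not the paper's:

```python
import numpy as np
from scipy.stats import theilslopes

# Linear analogue of the contamination experiment: compare least
# squares with the robust Theil-Sen (median-of-slopes) estimator
# under vertical-variable contamination, using the Median MSE criterion.
rng = np.random.default_rng(7)
true_slope, true_icpt = 2.0, 1.0
mse_ols, mse_ts = [], []

for _ in range(200):                       # Monte-Carlo replications
    x = rng.uniform(0, 10, 50)
    y = true_icpt + true_slope * x + rng.normal(0, 1, 50)
    y[:5] += 30                            # contaminate 10% of responses
    b_ols = np.polyfit(x, y, 1)[0]         # least squares slope
    b_ts = theilslopes(y, x)[0]            # robust slope
    mse_ols.append((b_ols - true_slope) ** 2)
    mse_ts.append((b_ts - true_slope) ** 2)

median_mse_ols = float(np.median(mse_ols))
median_mse_ts = float(np.median(mse_ts))
# Under contamination the robust estimator typically attains the
# smaller Median MSE, mirroring the paper's comparison logic.
```

In the circular setting, the residual-based criteria are replaced by the mean cosine of circular residuals, but the replication-and-median structure of the experiment is the same.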
The theory of probabilistic programming may be conceived in several different ways. As a programming method, it analyses the implications of probabilistic variations in the parameter space of a linear or nonlinear programming model. The mechanism generating such probabilistic variations in economic models may be incomplete information about changes in demand, production, and technology; specification errors in the econometric relations presumed for different economic agents; uncertainty of various sorts; or the consequences of imperfect aggregation or disaggregation of economic variables. In this research we discuss the probabilistic programming problem when the coefficient b_i is a random variable.
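One standard treatment of a random right-hand side b_i is chance-constrained programming (the paper's exact formulation may differ; this is a sketch under that assumption): require P(a_i·x ≤ b_i) ≥ p, which for normal b_i ~ N(μ_i, σ_i²) is equivalent to the deterministic constraint a_i·x ≤ μ_i + σ_i Φ⁻¹(1 − p). All numbers below are illustrative:

```python
import numpy as np
from scipy.optimize import linprog
from scipy.stats import norm

# Chance-constrained LP with random RHS b_i ~ N(mu_i, sigma_i^2):
#   P(a_i . x <= b_i) >= p   <=>   a_i . x <= mu_i + sigma_i * Phi^{-1}(1-p)
p = 0.95
mu_b = np.array([10.0, 8.0])
sigma_b = np.array([1.0, 0.5])
b_det = mu_b + sigma_b * norm.ppf(1 - p)   # tightened deterministic RHS

c = [-3.0, -2.0]                           # maximise 3*x1 + 2*x2
A = [[2.0, 1.0], [1.0, 2.0]]
res = linprog(c, A_ub=A, b_ub=b_det, bounds=[(0, None)] * 2)
```

The tightening μ − 1.645σ (at p = 0.95) shows how uncertainty in b_i shrinks the feasible region relative to the deterministic problem with b = μ.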
In this paper, some commonly used hierarchical clustering techniques are compared. A comparison was made between the agglomerative hierarchical clustering technique and the k-means family of techniques, which includes the standard k-means technique, a variant of k-means, and bisecting k-means. Although hierarchical clustering is considered one of the best clustering methods, it has limited usage due to its time complexity. The results, calculated from an analysis of the characteristics of the cluster algorithms and the nature of the data, showed that the bisecting k-means technique is the best compared with the other methods used.
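A minimal sketch of the agglomerative-vs-k-means comparison on synthetic data (illustrative assumptions throughout, not the paper's data or its exact algorithm variants), scoring both against known labels:

```python
import numpy as np
from sklearn.cluster import AgglomerativeClustering, KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import adjusted_rand_score

# Three well-separated Gaussian blobs with known ground-truth labels.
X, y_true = make_blobs(n_samples=300, centers=3, cluster_std=0.8,
                       random_state=0)

# Partitional k-means vs agglomerative hierarchical clustering.
labels_km = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
labels_ag = AgglomerativeClustering(n_clusters=3).fit_predict(X)

# Adjusted Rand index against the ground truth (1.0 = perfect).
ari_km = adjusted_rand_score(y_true, labels_km)
ari_ag = adjusted_rand_score(y_true, labels_ag)
```

The time-complexity point in the abstract is visible at scale: standard agglomerative clustering is roughly O(n² log n) in samples, while k-means-style methods are near-linear per iteration, which is why bisecting k-means is attractive on large data.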
This paper presents a comparison between denoising techniques using a statistical approach, principal component analysis with local pixel grouping (PCA-LPG); this procedure is iterated a second time to further improve the denoising performance. Other enhancement filters were also used: an adaptive Wiener low-pass filter applied to a grayscale image degraded by constant-power additive noise, based on statistics estimated from a local neighbourhood of each pixel; a median filter of the input noisy image, in which each output pixel contains the median value in the M-by-N neighbourhood around the corresponding pixel in the input image; a Gaussian low-pass filter; and an order-statistic filter.
Experimental results show that the LPG-PCA method …
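The comparison filters named above can be sketched as follows (PCA-LPG itself is not implemented here; the image and noise level are illustrative assumptions), measuring each filter's mean squared error against the clean image:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, median_filter
from scipy.signal import wiener

# Synthetic grayscale image corrupted by constant-power additive noise.
rng = np.random.default_rng(3)
clean = np.zeros((64, 64))
clean[16:48, 16:48] = 1.0                   # simple bright square
noisy = clean + rng.normal(0, 0.2, clean.shape)

den_wiener = wiener(noisy, mysize=5)        # adaptive Wiener low-pass (5x5)
den_median = median_filter(noisy, size=5)   # 5x5 median (order-statistic)
den_gauss = gaussian_filter(noisy, sigma=1) # Gaussian low-pass

def mse(img):
    return float(np.mean((img - clean) ** 2))

mse_noisy = mse(noisy)
# Each filter should reduce the MSE relative to the noisy input;
# their ranking depends on the noise level and image content.
```

The median filter here doubles as a simple order-statistic filter (it selects the 50th percentile of the neighbourhood), which is why it is robust to impulsive noise in particular.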
Some issues have become common in our society recently, after the Americans entered our country: some security agencies have taken to obtaining information from children and relying on their statements in interrogations, with serious consequences that may endanger the lives of innocent people.
There are, however, significant cases where children's testimony needs to be heard, such as when they are present in places where incidents are not witnessed by others, such as schools, or when they are witnesses to certain crimes.
I therefore undertook the study of this issue in the light of Sharia and law.