This research investigates the problem of estimating the reliability of the two-parameter Weibull distribution using the maximum likelihood method and White's method. The comparison is carried out through a simulation study based on three parameter choices, (α = 0.8, β = 0.9), (α = 1.2, β = 1.5), and (α = 2.5, β = 2), and sample sizes n = 10, 70, 150. The mean square error (MSE) is used as the statistical criterion for comparing the methods.
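The simulation comparison the abstract describes can be sketched as follows. This is an illustrative Python sketch, not the paper's code: the shape/scale parameterization R(t) = exp(-(t/β)^α), the mission time t0, and the replication count are assumptions.

```python
import math, random

def weibull_mle(data, tol=1e-8):
    """Solve the profile-likelihood equation for the Weibull shape (alpha)
    by bisection, then recover the scale (beta) in closed form."""
    logs = [math.log(t) for t in data]
    mean_log = sum(logs) / len(logs)

    def g(a):  # increasing in a; its root is the MLE of alpha
        s0 = sum(t ** a for t in data)
        s1 = sum((t ** a) * math.log(t) for t in data)
        return s1 / s0 - 1.0 / a - mean_log

    lo, hi = 1e-3, 100.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if g(mid) > 0:
            hi = mid
        else:
            lo = mid
    alpha = (lo + hi) / 2
    beta = (sum(t ** alpha for t in data) / len(data)) ** (1 / alpha)
    return alpha, beta

def reliability(t, alpha, beta):
    """Weibull reliability R(t) = exp(-(t/beta)^alpha)."""
    return math.exp(-((t / beta) ** alpha))

# Monte Carlo MSE of the estimated reliability at a fixed mission time t0
# (one of the abstract's parameter choices; t0 and reps are assumptions).
random.seed(1)
alpha_true, beta_true, n, t0, reps = 1.2, 1.5, 70, 1.0, 500
true_R = reliability(t0, alpha_true, beta_true)
sq_err = []
for _ in range(reps):
    # inverse-transform sampling of Weibull lifetimes
    sample = [beta_true * (-math.log(1 - random.random())) ** (1 / alpha_true)
              for _ in range(n)]
    a_hat, b_hat = weibull_mle(sample)
    sq_err.append((reliability(t0, a_hat, b_hat) - true_R) ** 2)
mse = sum(sq_err) / reps
```

The same loop, repeated per method and per (α, β, n) combination, yields the MSE table the abstract alludes to.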
This paper defines the Burr-XII distribution and shows how to obtain its p.d.f. and CDF. Burr-XII is a failure distribution that arises as a compound of two failure models, the Gamma model and the Weibull model. Some equipment has many important parts whose probability distributions may be of different types, and the Burr family, through its different compound forms, was found to be the best model to study; its parameters are estimated in order to compute the mean time to failure. Burr-XII is considered here rather than other models because it is used to model a wide variety of phenomena, including crop prices, household income, option market price distributions, risk, and travel time. It has two shape parameters.
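A minimal sketch of the Burr-XII p.d.f. and CDF, plus a Monte Carlo check of the Gamma/Weibull compounding the abstract mentions. The parameter names c and k and the mixing scheme are standard textbook choices, not taken from the paper.

```python
import math, random

def burr12_pdf(x, c, k):
    """Burr type-XII density: f(x) = c*k*x^(c-1) * (1 + x^c)^-(k+1), x > 0."""
    return c * k * x ** (c - 1) * (1 + x ** c) ** (-(k + 1))

def burr12_cdf(x, c, k):
    """Burr type-XII CDF: F(x) = 1 - (1 + x^c)^-k."""
    return 1 - (1 + x ** c) ** (-k)

# Numerical check that f is the derivative of F at an arbitrary point.
c, k = 2.0, 3.0
x0, h = 1.3, 1e-6
deriv = (burr12_cdf(x0 + h, c, k) - burr12_cdf(x0 - h, c, k)) / (2 * h)

# Compounding check: a Weibull survival exp(-lam * x^c) whose rate lam is
# Gamma(k, 1)-distributed mixes into the Burr-XII survival (1 + x^c)^-k.
random.seed(2)
N, hits = 20000, 0
for _ in range(N):
    lam = random.gammavariate(k, 1.0)        # Gamma mixing on the Weibull rate
    t = random.expovariate(lam) ** (1 / c)   # conditional Weibull lifetime
    if t > 1.0:
        hits += 1
emp_survival = hits / N                      # should be near (1 + 1)^-3 = 0.125
```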
The phenomenon of extreme values (maximum or rare values) is an important one, and two sampling techniques are used to deal with these extremes: the peaks-over-threshold (POT) technique and the annual maximum (AM) technique, fitting the (extreme value, Gumbel) distributions to the AM sample and the (generalized Pareto, exponential) distributions to the POT sample. The cross-entropy algorithm was applied in two of its forms: the first estimating with order statistics and the second with order statistics and the likelihood ratio; a third method is proposed by the researcher. The MSE criterion was used to compare the estimated parameters and the probability density function of each of the distributions.
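The two sampling techniques contrasted above can be illustrated as follows. This is a hypothetical sketch over synthetic exponential data; the block length, threshold, and variable names are assumptions, not the paper's data.

```python
import random

def annual_maxima(series, block_len):
    """AM technique: keep one maximum per block (one 'year'); the AM sample
    is then fitted with the Gumbel / extreme-value distributions."""
    return [max(series[i:i + block_len])
            for i in range(0, len(series) - block_len + 1, block_len)]

def peaks_over_threshold(series, u):
    """POT technique: keep the excesses over a high threshold u; the POT
    sample is then fitted with the generalized Pareto / exponential models."""
    return [x - u for x in series if x > u]

# Synthetic 10-"year" daily record from a unit-exponential parent.
random.seed(3)
daily = [random.expovariate(1.0) for _ in range(10 * 365)]
am_sample = annual_maxima(daily, 365)
pot_sample = peaks_over_threshold(daily, u=4.0)
```

Note the trade-off the two techniques embody: AM discards all but one observation per block, while POT keeps every sufficiently extreme observation, so the POT sample is usually larger for the same record.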
The effect of the concentration of colloidal nanomaterials on their optical-limiting behavior is reported in this paper. Colloids of silver nanoparticles in deionized water were chemically prepared at two concentrations (31 ppm and 11 ppm). Two cw lasers (a 473 nm blue DPSS laser and a 532 nm Nd:YAG laser) were used to compare the optical-limiting performance of the samples. A UV-visible spectrophotometer, transmission electron microscopy (TEM), and Fourier-transform infrared (FTIR) spectroscopy were used to characterize the samples. The nonlinear refractive index was calculated to be on the order of 10⁻⁹ cm²/W. The results demonstrate that the observed limiting response is significant at 532 nm.
Logistic regression is one of the statistical methods used to describe and estimate the relationship between a random response variable (Y) and explanatory variables (X). The dependent variable here is a binary response taking two values (one when a specific event occurs and zero when it does not), such as (injured/uninjured, married/unmarried). A large number of explanatory variables leads to the problem of multicollinearity, which makes the estimates inaccurate. The maximum likelihood method and the ridge regression method were used to estimate the binary-response logistic regression model, adopting the Jackknife technique.
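A small sketch of the binary-response model with a jackknife (leave-one-out) assessment of the slope estimate. This is a generic illustration, not the paper's estimator: the gradient-ascent maximum-likelihood fit, the single covariate, and the simulated data are all assumptions.

```python
import math, random

def fit_logistic(xs, ys, lr=0.5, iters=1500):
    """Fit P(Y=1|x) = 1/(1+exp(-(b0 + b1*x))) by gradient ascent on the
    log-likelihood (maximum likelihood for the binary-response model)."""
    b0 = b1 = 0.0
    n = len(xs)
    for _ in range(iters):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += y - p
            g1 += (y - p) * x
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

def jackknife_slope(xs, ys):
    """Leave-one-out (jackknife) mean and standard error of the slope."""
    n = len(xs)
    loo = [fit_logistic(xs[:i] + xs[i + 1:], ys[:i] + ys[i + 1:])[1]
           for i in range(n)]
    mean = sum(loo) / n
    var = (n - 1) / n * sum((b - mean) ** 2 for b in loo)
    return mean, math.sqrt(var)

# Simulated binary responses with true intercept 0.5 and slope 1.5.
random.seed(4)
xs = [random.gauss(0.0, 1.0) for _ in range(40)]
ys = [1 if random.random() < 1.0 / (1.0 + math.exp(-(0.5 + 1.5 * x))) else 0
      for x in xs]
b0_hat, b1_hat = fit_logistic(xs, ys)
jk_mean, jk_se = jackknife_slope(xs, ys)
```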
Background: The primary stability of the dental implant is a crucial factor determining the ability to initiate a temporary implant-supported prosthesis and the subsequent successful osseointegration, especially in maxillary non-molar sites. This study assessed the reliability of the insertion torque of dental implants by relating it to the implant stability quotient (ISQ) values measured by the Osstell device. Material and methods: This study included healthy, non-smoker patients with no history of diabetes or other metabolic or debilitating diseases that may affect bone healing, who had non-restorable fractured teeth and retained roots in the maxillary non-molar sites. Primary dental implant stability was evaluated using a torque ratchet.
This paper is concerned with preliminary-test double-stage shrinkage estimators of the variance (σ²) of the normal distribution when a prior estimate of the actual value (σ₀²) is available and the mean is unknown, using specified shrinkage weight factors ψ(·) in addition to a pre-test region (R).
Expressions for the bias, mean squared error [MSE(·)], relative efficiency [R.EFF(·)], expected sample size [E(n/σ²)], and percentage of the overall sample saved by the proposed estimator were derived. Numerical results were obtained (using the MathCAD program) and conclusions are drawn about the selection of the different constants included in the method.
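An illustrative single-stage analogue of the shrinkage idea, estimating the relative efficiency by Monte Carlo in the favourable case where the prior value is correct. The weight ψ, the pre-test region R, and all constants here are hypothetical; the paper's estimator is double-stage and its properties are derived analytically.

```python
import math, random

def s_squared(sample):
    """Unbiased sample variance s^2."""
    n = len(sample)
    m = sum(sample) / n
    return sum((x - m) ** 2 for x in sample) / (n - 1)

def shrink_var(sample, sigma0_sq, psi=0.5, r_lo=0.5, r_hi=2.0):
    """Illustrative preliminary-test shrinkage estimator of sigma^2: if
    s^2 / sigma0_sq lies in the pre-test region R = [r_lo, r_hi], shrink
    s^2 toward the prior value with weight psi; otherwise keep s^2."""
    s2 = s_squared(sample)
    if r_lo <= s2 / sigma0_sq <= r_hi:
        return psi * s2 + (1.0 - psi) * sigma0_sq
    return s2

# Monte Carlo relative efficiency R.EFF = MSE(s^2) / MSE(shrinkage)
# when the prior sigma0^2 equals the true variance.
random.seed(5)
sigma_sq, n, reps = 4.0, 15, 4000
sse_plain = sse_shrink = 0.0
for _ in range(reps):
    sample = [random.gauss(0.0, math.sqrt(sigma_sq)) for _ in range(n)]
    sse_plain += (s_squared(sample) - sigma_sq) ** 2
    sse_shrink += (shrink_var(sample, sigma_sq) - sigma_sq) ** 2
rel_eff = sse_plain / sse_shrink
```

When the prior is accurate, rel_eff exceeds 1 (the shrinkage estimator has smaller MSE); a badly wrong prior reverses this, which is exactly why the pre-test region is part of the construction.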
This investigation proposes an offline signature identification system utilizing rotation compensation based on the features saved in the database. The proposed system contains five principal stages: (1) data acquisition, (2) signature data file loading, (3) signature preprocessing, (4) feature extraction, and (5) feature matching. Feature extraction includes determination of the center-point coordinates and the rotation-compensation angle (θ), implementation of the rotation compensation, and determination of the discriminating features and a statistical condition. In this work, seven essential collections of features are utilized to acquire the characteristics: (i) density (D), (ii) average (A), (iii) s…
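The center-point and rotation-compensation steps can be sketched as follows. The paper's exact definition of θ is not given in the abstract, so the moment-based principal-axis angle is used here as an assumption; the point list stands in for a signature's foreground pixels.

```python
import math

def centroid(points):
    """Center point of the signature's foreground pixels."""
    n = len(points)
    return (sum(x for x, _ in points) / n, sum(y for _, y in points) / n)

def principal_angle(points):
    """Rotation-compensation angle theta from the second-order central
    moments (orientation of the principal axis)."""
    cx, cy = centroid(points)
    mu11 = sum((x - cx) * (y - cy) for x, y in points)
    mu20 = sum((x - cx) ** 2 for x, _ in points)
    mu02 = sum((y - cy) ** 2 for _, y in points)
    return 0.5 * math.atan2(2 * mu11, mu20 - mu02)

def rotate_about(points, center, theta):
    """Apply rotation compensation: rotate every point by -theta about the
    center, so matching is done in a rotation-normalized frame."""
    cx, cy = center
    c, s = math.cos(-theta), math.sin(-theta)
    return [(cx + c * (x - cx) - s * (y - cy),
             cy + s * (x - cx) + c * (y - cy)) for x, y in points]

# Demo: points along a 30-degree line; compensation should zero the angle.
pts = [(t * math.cos(math.radians(30.0)), t * math.sin(math.radians(30.0)))
       for t in range(-10, 11)]
theta = principal_angle(pts)
normalized = rotate_about(pts, centroid(pts), theta)
```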
Entropy, defined as a measure of uncertainty, has been transformed by using the cumulative distribution function and the reliability function of the Burr type-XII distribution. For data that suffer from volatility, a probability model is built for the failures of a sample once the conditions of a probability distribution function are satisfied. The formula of the new transformed-entropy distribution was derived and applied to the continuous Burr type-XII distribution; the new function was tested and found to satisfy the conditions of a probability density function. The mean and the cumulative function were derived so that they could be adopted in generating data for the purpose of implementing the simulation.
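A sketch of the two ingredients the abstract combines: generating Burr type-XII data by inverting its CDF, and estimating the (differential) entropy H = -E[ln f(X)] by Monte Carlo. This illustrates the quantities involved, not the paper's transformed-entropy formula.

```python
import math, random

def burr12_pdf(x, c, k):
    """Burr type-XII density f(x) = c*k*x^(c-1) * (1 + x^c)^-(k+1)."""
    return c * k * x ** (c - 1) * (1 + x ** c) ** (-(k + 1))

def burr12_inv_cdf(u, c, k):
    """Inverse of F(x) = 1 - (1 + x^c)^-k, used to generate Burr-XII data
    by the inverse-transform method."""
    return ((1 - u) ** (-1 / k) - 1) ** (1 / c)

def burr12_entropy_mc(c, k, n=100000, seed=6):
    """Differential entropy H = -E[ln f(X)] estimated by Monte Carlo."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = burr12_inv_cdf(rng.random(), c, k)
        total += math.log(burr12_pdf(x, c, k))
    return -total / n

# For c = k = 1 the density is (1+x)^-2 and H works out analytically to 2,
# which the Monte Carlo estimate should approach.
H_11 = burr12_entropy_mc(1.0, 1.0)
```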