In this paper, we derive estimators of the parameters, reliability function, and hazard function of a new mixed distribution (Rayleigh-Logarithmic) with two parameters and an increasing failure rate, using the Bayes method with a squared error loss function, a Jeffreys prior, and the conditional probability of the observed random variable. The main objective of this study is to assess the efficiency of the derived Bayesian estimators compared to the maximum likelihood estimators of these functions using Monte Carlo simulation under different Rayleigh-Logarithmic parameter values and sample sizes. The results show that the Bayes estimator is more efficient than the maximum likelihood estimator for all sample sizes, with an application.
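To illustrate the comparison design described above, the sketch below runs a small Monte Carlo experiment contrasting the maximum likelihood and Bayes (Jeffreys prior, squared error loss) estimators of a reliability function. It uses a plain one-parameter Rayleigh model as a stand-in, since the exact Rayleigh-Logarithmic mixture density is not reproduced here; the parameter values, sample sizes, and the `simulate_mse` helper are illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_mse(theta=2.0, t=1.0, n=50, reps=1000):
    """Monte Carlo MSE of the MLE and Bayes (Jeffreys prior, squared error loss)
    estimators of the Rayleigh reliability R(t) = exp(-t^2 / theta)."""
    true_R = np.exp(-t**2 / theta)
    mle_se, bayes_se = [], []
    for _ in range(reps):
        x = np.sqrt(rng.exponential(theta, size=n))  # X^2 ~ Exp(mean theta) gives Rayleigh X
        T = np.sum(x**2)                             # sufficient statistic
        mle_se.append((np.exp(-n * t**2 / T) - true_R)**2)   # plug-in MLE of R(t)
        bayes_se.append(((T / (T + t**2))**n - true_R)**2)   # posterior mean under Jeffreys prior
    return np.mean(mle_se), np.mean(bayes_se)

for n in (10, 25, 50, 100):
    mse_mle, mse_bayes = simulate_mse(n=n)
    print(n, round(mse_mle, 5), round(mse_bayes, 5))
```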
In this paper, we compare different parametric, nonparametric, and semiparametric estimators for the partial linear regression model (PLM). The parametric method is represented by ordinary least squares (OLS), and the nonparametric methods are represented by the cubic smoothing spline (C.S.S.) estimator and the Nadaraya-Watson (N.W.) estimator. We study three nonparametric regression models with sample sizes n = 40, 60, 100 and variances σ² = 0.5, 1, 1.5. The results for the first model show that the N.W. estimator for the PLM is the best, followed by the cubic smoothing spline estimator for the PLM, while the results of the second and third models show that the best estimator is the C.S.S. estimator, followed by the N.W. estimator for the PLM.
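As a concrete illustration of the nonparametric component mentioned above, the following sketch implements a basic Nadaraya-Watson kernel regression estimate with a Gaussian kernel. The bandwidth, the toy sine-curve data, and the error variance of 0.5 are assumptions for illustration only, not the models or settings used in the study.

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_eval, bandwidth=0.1):
    """Nadaraya-Watson kernel regression with a Gaussian kernel."""
    u = (x_eval[:, None] - x_train[None, :]) / bandwidth
    weights = np.exp(-0.5 * u**2)                  # Gaussian kernel weights
    return (weights @ y_train) / weights.sum(axis=1)

# toy example: noisy sine curve with n = 40 observations and error variance 0.5
rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0.0, 1.0, 40))
y = np.sin(2 * np.pi * x) + rng.normal(0.0, np.sqrt(0.5), 40)
grid = np.linspace(0.0, 1.0, 200)
fitted = nadaraya_watson(x, y, grid)
print(fitted[:5])
```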
In this research, the empirical Bayes method is used to estimate the affiliation parameter in clinical trials, and the result is compared with the moment estimate of this parameter using Monte Carlo simulation. We assume that the distribution of the observations is binomial, while the distribution of the unknown random parameter is the beta distribution. We conclude that the empirical Bayes method for the random affiliation parameter is more efficient, in terms of mean squared error (MSE), for the different sample sizes considered.
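A minimal sketch of the beta-binomial empirical Bayes idea described above is given below: the hyperparameters of the beta prior are moment-matched from the observed proportions, and each trial's probability is then shrunk toward the overall mean. The toy success/trial counts and the simple moment matching are illustrative assumptions, not the study's data or its exact estimator.

```python
import numpy as np

def empirical_bayes_beta_binomial(successes, trials):
    """Shrink each observed proportion toward the overall mean using a beta prior
    whose hyperparameters are moment-matched from the raw proportions."""
    p_hat = successes / trials
    m, v = p_hat.mean(), p_hat.var()
    common = m * (1.0 - m) / v - 1.0                       # crude moment matching for Beta(alpha, beta)
    alpha, beta = m * common, (1.0 - m) * common
    return (successes + alpha) / (trials + alpha + beta)   # posterior means of each p_i

# toy clinical-trial counts (hypothetical): successes out of patients in five arms
y = np.array([3.0, 7.0, 4.0, 10.0, 6.0])
n = np.array([20.0, 25.0, 18.0, 30.0, 22.0])
print(empirical_bayes_beta_binomial(y, n))
```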
This paper presents a hybrid software copy protection scheme. The scheme prevents illegal copying of software by producing a license key that is unique and easy to generate. This work employs the unique hard disk identifier of a personal computer, which can be obtained by software, to create a license key after processing it with the SHA-1 one-way hash function. Two main measures are used to evaluate the proposed method: complexity and processing time. SHA-1 ensures high complexity, denying hackers the ability to produce unauthorized copies, and many experiments have been executed using different sizes of software to measure the consumed time. The measures show high complexity and short execution time for the proposed scheme.
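The core step of the scheme, hashing a hardware identifier with SHA-1 to obtain a license key, can be sketched as follows. The placeholder disk serial and the uppercase hex formatting are assumptions for illustration; the authors' actual key derivation and disk-ID retrieval code are not reproduced here.

```python
import hashlib

def license_key_from_disk_id(disk_id: str) -> str:
    """Hash a hardware identifier with SHA-1 and use the hex digest as a license key."""
    return hashlib.sha1(disk_id.encode("utf-8")).hexdigest().upper()

# The hard disk serial would normally be read from the operating system
# (platform-dependent); a placeholder string is used here instead.
print(license_key_from_disk_id("WD-EXAMPLE-SERIAL-1234"))
```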
Porous silicon (PS) layers have been prepared from p-type silicon by the electrochemical etching method. The morphological properties of PS samples prepared with different current densities have been studied using atomic force microscopy (AFM), which shows that the porous layer has a sponge-like structure and that the average pore diameter of the PS layer increases as the etching current density increases. The X-ray diffraction (XRD) pattern indicates the nanocrystalline nature of the samples. The reflectivity of the sample surface decreases as the etching current density increases because the porosity of the surface increases. The photoluminescence (PL) intensity increases with increasing etching current density. The PL is affected by the relative humidity (RH) level, suggesting that the prepared layers can be used for humidity sensing.
The Gumbel distribution has been treated with great care by researchers and statisticians. There are traditional methods for estimating the two parameters of the Gumbel distribution, namely maximum likelihood, the method of moments, and, more recently, the re-sampling method called the jackknife. However, these methods suffer from some mathematical difficulties when solved analytically. Accordingly, there are other non-traditional methods, such as the nearest-neighbors principle used in computer science, and artificial intelligence algorithms, including the genetic algorithm and the artificial neural network algorithm, that may be classified as meta-heuristic methods. Moreover, the nearest-neighbors principle has useful statistical features.
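For reference, the method-of-moments estimates mentioned above have a simple closed form for the Gumbel distribution: β̂ = s·√6/π and μ̂ = x̄ − γβ̂, where γ is the Euler-Mascheroni constant, x̄ the sample mean, and s the sample standard deviation. A short sketch, checked on simulated data with assumed true values μ = 10 and β = 3, is given below.

```python
import numpy as np

EULER_GAMMA = 0.5772156649  # Euler-Mascheroni constant

def gumbel_moments(x):
    """Method-of-moments estimates of the Gumbel location (mu) and scale (beta)."""
    beta = x.std(ddof=1) * np.sqrt(6.0) / np.pi   # beta = s * sqrt(6) / pi
    mu = x.mean() - EULER_GAMMA * beta            # mu = mean - gamma * beta
    return mu, beta

# check against simulated data with assumed true values mu = 10, beta = 3
rng = np.random.default_rng(2)
sample = rng.gumbel(loc=10.0, scale=3.0, size=500)
print(gumbel_moments(sample))
```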
During the winter, in the industrial region (Shaikh Omer), lung cancer risk was assessed using a passive radon detector (CR-39) in twelve rooms of different workshops of two old factories at this site. The radon concentration ranged from 123.345 Bq/m3 to 328.985 Bq/m3 with an average of 244.19 ± 61.52 Bq/m3. The lung cancer risk ranged from 55.993 to 149.346 per million people, with an average of 110.855 per million people, which is lower than the recommended values (170-230 per million people), so there is no cancer risk to workers at these locations.
In this paper, simulation studies and applications of the New Weibull-Inverse Lomax (NWIL) distribution are presented. In the simulation studies, sample sizes of 30, 50, 100, 200, 300, and 500 were considered, with 1,000 replications of the experiment. NWIL is a fat-tailed distribution, and its higher moments are not easily derived except with some approximations; however, the estimates have high precision with low variances. Finally, the usefulness of the NWIL distribution is illustrated by fitting two data sets.
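A skeleton of such a simulation study, with the same sample sizes and 1,000 replications, is sketched below. Since the NWIL density is not reproduced here, a Lomax distribution is used as a heavy-tailed stand-in and the maximum likelihood estimator of its shape parameter is tracked; the true shape value and the bias/variance reporting are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulation_study(a=3.0, sizes=(30, 50, 100, 200, 300, 500), reps=1000):
    """Monte Carlo skeleton: for each sample size, estimate the bias and variance
    of the MLE of a Lomax shape parameter over `reps` replications."""
    results = {}
    for n in sizes:
        estimates = np.empty(reps)
        for r in range(reps):
            x = rng.pareto(a, size=n)                 # Lomax(shape a, scale 1) sample
            estimates[r] = n / np.log1p(x).sum()      # MLE of a with the scale fixed at 1
        results[n] = (estimates.mean() - a, estimates.var())
    return results

for n, (bias, var) in simulation_study().items():
    print(n, round(bias, 4), round(var, 4))
```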