This work deals with estimating the reliability function and one shape parameter of the two-parameter Burr-XII distribution, when the other shape parameter is known (taking the values 0.5, 1, 1.5) and the initial value of the estimated parameter is 1, while different sample sizes (n = 10, 20, 30, 50) are used. The results are based on an empirical study in which simulation experiments are applied to compare the four methods of estimation, as well as to compute the reliability function. The mean square error results indicate that the jackknife estimator is better than the other three estimators for all sample sizes and parameter values.
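As a hedged illustration of this kind of comparison (a minimal sketch, not the authors' code: the parameterization F(x) = 1 - (1 + x^c)^(-k), the jackknife form, and all simulation settings are assumptions, and only the plain MLE and its jackknife version are compared rather than all four methods):

```python
# Minimal Monte Carlo sketch: MLE of the Burr-XII shape parameter k (with the other
# shape parameter c known) versus its jackknife-corrected version, compared by MSE.
import numpy as np

rng = np.random.default_rng(42)

def rburr12(n, c, k):
    """Draw Burr-XII samples via the inverse CDF: F(x) = 1 - (1 + x^c)^(-k)."""
    u = rng.uniform(size=n)
    return ((1.0 - u) ** (-1.0 / k) - 1.0) ** (1.0 / c)

def mle_k(x, c):
    """MLE of k given known c: k_hat = n / sum(log(1 + x^c))."""
    return len(x) / np.sum(np.log1p(x ** c))

def jackknife_k(x, c):
    """Bias-corrected jackknife estimator built from leave-one-out MLEs."""
    n = len(x)
    loo = np.array([mle_k(np.delete(x, i), c) for i in range(n)])
    return n * mle_k(x, c) - (n - 1) * loo.mean()

def mse(estimator, n, c, k_true, reps=1000):
    est = np.array([estimator(rburr12(n, c, k_true), c) for _ in range(reps)])
    return np.mean((est - k_true) ** 2)

for n in (10, 20, 30, 50):
    for c in (0.5, 1.0, 1.5):
        print(n, c,
              "MLE:", round(mse(mle_k, n, c, 1.0), 4),
              "Jackknife:", round(mse(jackknife_k, n, c, 1.0), 4))
```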
YY Lazim, NAB Azizan, 2nd International Conference on Innovation and Entrepreneurship, 2014
Background: Orthodontic mini-implants are increasingly used in orthodontics, and bone density is a very important factor in the stabilization and success of mini-implants. The aim of this study was to examine the relationship among maximum bite force (MBF); body mass index (BMI); face width, height and type; and bone density, in an attempt to predict bone density from these variables and thereby eliminate the need for a CT scan, which poses a high radiation hazard to the patient. Materials and Methods: Computed tomographic (CT) images were obtained for 70 patients (24 males and 46 females) with an age range of 18-30 years. The maxillary and mandibular buccal cortical and cancellous bone densities were measured between the 2nd premolar and 1st molar at two levels from the alveol
Iraq's dependence on oil revenues to finance its development programs and growth rates makes the economy vulnerable to external forces, represented by fluctuations in crude oil prices in the global market, which are directly reflected in the performance and efficiency of the Iraqi economy.
The study set out to analyze the time series for the period (1988-2015) using econometric and statistical methods. Four econometric models were estimated to reach those targets. The results of the stationarity test showed that most variables are non-stationary at their original level but become stationary when the first differences are taken, while the result
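Purely as an illustration of such a stationarity check (a minimal sketch on simulated data; the adfuller call from statsmodels and every number below are assumptions, not the study's series or its actual tests):

```python
# Augmented Dickey-Fuller test on a level series and on its first difference,
# mirroring the pattern of non-stationarity in levels but stationarity in differences.
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(0)
# Simulated random-walk series standing in for an annual variable, 1988-2015 (28 obs)
level = np.cumsum(rng.normal(loc=1.0, scale=2.0, size=28))

for name, series in [("level", level), ("first difference", np.diff(level))]:
    stat, pvalue, *_ = adfuller(series, autolag="AIC")
    print(f"{name}: ADF statistic = {stat:.3f}, p-value = {pvalue:.3f}")
```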
This study investigates the impact of spatial resolution enhancement on supervised classification accuracy using Landsat 9 satellite imagery, achieved through pan-sharpening techniques leveraging Sentinel-2 data. Various methods were employed to synthesize a panchromatic (PAN) band from Sentinel-2 data, including dimension reduction algorithms and weighted averages based on correlation coefficients and standard deviation. Three pan-sharpening algorithms (Gram-Schmidt, Principal Components Analysis, Nearest Neighbour Diffusion) were employed, and their efficacy was assessed using seven fidelity criteria. Classification tasks were performed utilizing Support Vector Machine and Maximum Likelihood algorithms. Results reveal that specifi
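For context, a hedged sketch of one of the named techniques, component-substitution pan-sharpening via PCA (a generic textbook formulation with random stand-in rasters, not the paper's workflow or data):

```python
# PCA pan-sharpening: project the multispectral stack onto its principal components,
# replace the first component with the statistics-matched PAN band, invert the transform.
import numpy as np

def pca_pansharpen(ms, pan):
    """ms: (bands, H, W) multispectral resampled to the PAN grid; pan: (H, W)."""
    b, h, w = ms.shape
    X = ms.reshape(b, -1).astype(float)
    mean = X.mean(axis=1, keepdims=True)
    Xc = X - mean
    cov = Xc @ Xc.T / Xc.shape[1]
    vals, vecs = np.linalg.eigh(cov)      # eigenvalues in ascending order
    vecs = vecs[:, ::-1]                  # put the principal component first
    pcs = vecs.T @ Xc
    p = pan.reshape(-1).astype(float)     # match PAN mean/std to PC1, then substitute
    pcs[0] = (p - p.mean()) / p.std() * pcs[0].std() + pcs[0].mean()
    return (vecs @ pcs + mean).reshape(b, h, w)

# Toy example only; real use would pass co-registered Landsat 9 bands and the
# PAN band synthesized from Sentinel-2.
rng = np.random.default_rng(1)
print(pca_pansharpen(rng.random((4, 64, 64)), rng.random((64, 64))).shape)
```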
The basis of each individual's personality lies in the early years of his or her life. If the child's personality has been well organized, and if the motives have been fully expressed and effectively directed, the child will have a strong will, happy self-confidence and a strong personality. If there is a failure in the early years, the individual will be unable to meet his responsibilities in life and may become the victim of many psychological disorders. The family provides a learning process through which children acquire the customs, traditions, attitudes and values prevailing in their social environment. (Pre-and-after) play and its relationship to parenting methods of (democratic, bullying, overprotection, and neglect), which wi
The theory of probabilistic programming may be conceived in several different ways. As a method of programming, it analyses the implications of probabilistic variations in the parameter space of a linear or nonlinear programming model. The generating mechanism of such probabilistic variations in economic models may be incomplete information about changes in demand, production and technology, specification errors in the econometric relations presumed for different economic agents, uncertainty of various sorts, and the consequences of imperfect aggregation or disaggregation of economic variables. In this research we discuss the probabilistic programming problem when the coefficient b_i is a random variable.
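A hedged example of the standard chance-constrained treatment of a random right-hand side b_i (illustrative numbers only, assuming b_i is normally distributed; not necessarily the formulation adopted in this research):

```python
# Chance constraint P(a_i x <= b_i) >= alpha with b_i ~ N(mu_i, sigma_i^2) has the
# deterministic equivalent a_i x <= mu_i + sigma_i * Phi^{-1}(1 - alpha).
import numpy as np
from scipy.stats import norm
from scipy.optimize import linprog

alpha = 0.95
c = np.array([-3.0, -5.0])                    # maximize 3x1 + 5x2 by minimizing its negative
A = np.array([[1.0, 0.0], [0.0, 2.0], [3.0, 2.0]])
mu = np.array([4.0, 12.0, 18.0])              # means of the random right-hand sides b_i
sigma = np.array([0.5, 1.0, 1.5])             # standard deviations of b_i

b_det = mu + sigma * norm.ppf(1.0 - alpha)    # tightened deterministic right-hand side
res = linprog(c, A_ub=A, b_ub=b_det, bounds=[(0, None), (0, None)])
print("x* =", res.x, "objective =", -res.fun)
```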
Charge multipole Coulomb scattering form factors in the 48Ca nucleus have been reproduced using nuclear shell theory. The effective two-body nuclear interaction fpbm is used to construct the spin-orbit (LS) coupled vectors, with the harmonic oscillator (HO) potential as the single-particle wave function in the fp shell. The discarded spaces (core + higher configurations) are taken into account through the core-polarization effect, in which the accurate two-body Gogny potential couples the LS active model-space particles to particle-hole pairs of the discarded space at an excitation energy of 2ћω. The Gogny interaction has been selected because of its success in nuclear shell theory. The computed results were compared with th
Designing a sampling plan was, and still is, one of the most important subjects because it gives the lowest cost compared with other plans; the statistical distribution of the lifetime should be known in order to obtain the best estimators for the parameters of the sampling plan and thereby the best sampling plan.
This research deals with designing a sampling plan when the lifetime distribution follows the logistic distribution with ( ) as location and shape parameters; using this information helps us obtain the (number of groups, sample size) associated with rejecting or accepting the lot.
Experimental results for simulated data show the least number of groups and the sample size needed to reject or accept the lot with a certain probability of
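A minimal sketch of the design idea under stated assumptions (the group size, acceptance number, truncation time, logistic location/scale values, and consumer's risk below are all hypothetical, not the paper's design):

```python
# Smallest number of groups g (each of size r) in a truncated life test under a
# logistic lifetime distribution: accept the lot if at most c items fail before t0,
# and require the lot-acceptance probability to be at most the consumer's risk beta.
from math import comb
from scipy.stats import logistic

def accept_prob(n, c, p):
    """P(at most c failures among n items) with per-item failure probability p."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(c + 1))

def min_groups(r, c, t0, loc, scale, beta):
    """Smallest g with acceptance probability <= beta at the stated distribution."""
    p = logistic.cdf(t0, loc=loc, scale=scale)   # P(lifetime <= t0)
    g = 1
    while accept_prob(g * r, c, p) > beta:
        g += 1
    return g

# Illustrative values only: groups of r=5, accept with at most c=2 failures,
# test truncated at t0=1.0, logistic location 2.0 and scale 0.5, risk beta=0.10.
print(min_groups(r=5, c=2, t0=1.0, loc=2.0, scale=0.5, beta=0.10))
```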
In this paper, a preliminary test shrinkage estimator has been considered for estimating the shape parameter α of the Pareto distribution when the scale parameter equals the smallest loss and when a prior estimate α0 of α is available as an initial value from past experience or from acquaintance with similar cases. The proposed estimator is shown to have a smaller mean squared error in a region around α0 in comparison with the usual and existing estimators.
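A simulation sketch of the idea (a generic shrinkage weight and test level are assumed; this is not necessarily the paper's exact estimator):

```python
# Preliminary test shrinkage estimator for the Pareto shape parameter alpha with a
# known scale: if a chi-square test does not reject H0: alpha = alpha0, shrink the
# MLE toward the prior guess alpha0; otherwise keep the MLE. Compared by Monte Carlo MSE.
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(7)

def mle_alpha(x, xm):
    return len(x) / np.sum(np.log(x / xm))

def pts_alpha(x, xm, alpha0, k=0.5, delta=0.05):
    n = len(x)
    T = np.sum(np.log(x / xm))                  # 2*alpha*T ~ chi2(2n)
    lo, hi = chi2.ppf(delta / 2, 2 * n), chi2.ppf(1 - delta / 2, 2 * n)
    a_hat = n / T
    if lo <= 2 * alpha0 * T <= hi:              # prior guess alpha0 looks plausible
        return k * a_hat + (1 - k) * alpha0
    return a_hat

def mse(estimator, n, alpha_true, xm=1.0, reps=5000, **kw):
    draws = xm * (1 - rng.uniform(size=(reps, n))) ** (-1 / alpha_true)  # inverse-CDF Pareto
    est = np.array([estimator(x, xm, **kw) for x in draws])
    return np.mean((est - alpha_true) ** 2)

# The advantage appears in a region around the prior guess alpha0.
print("MLE:", mse(mle_alpha, n=20, alpha_true=2.0))
print("PTS:", mse(pts_alpha, n=20, alpha_true=2.0, alpha0=2.0))
```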