In this article we study the variance estimator of the normal distribution when the mean is unknown, drawing on the relationship between the unbiased estimator and the Bayes estimator of the variance. A double-stage shrunken estimator is used to obtain higher efficiency for the variance estimator of the normal distribution with unknown mean, based on two small samples of equal size.
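As a rough illustration of the idea, the following Python sketch shows one common form of a double-stage shrinkage estimator for the normal variance: a preliminary chi-square test on the first-stage sample decides whether to shrink toward a prior guess or to draw a second sample of equal size and pool. The shrinkage weight, test level, prior guess, and pooling rule are assumptions for illustration, not the estimator derived in the article.

```python
import numpy as np
from scipy import stats

def double_stage_shrunken_var(x1, draw_second_sample, sigma0_sq,
                              k=0.5, alpha=0.05):
    """Illustrative double-stage shrinkage estimator of the normal variance
    with unknown mean (weights, test level and pooling rule are assumptions).

    x1                 : first-stage sample (1-D array)
    draw_second_sample : callable returning a second sample of equal size
    sigma0_sq          : prior guess for the variance
    k                  : shrinkage weight given to the sample variance
    alpha              : level of the preliminary chi-square test
    """
    n1 = len(x1)
    s1_sq = np.var(x1, ddof=1)                 # unbiased first-stage variance

    # Preliminary chi-square test of H0: sigma^2 = sigma0^2
    test_stat = (n1 - 1) * s1_sq / sigma0_sq
    lo = stats.chi2.ppf(alpha / 2, df=n1 - 1)
    hi = stats.chi2.ppf(1 - alpha / 2, df=n1 - 1)

    if lo <= test_stat <= hi:
        # Stage 1: shrink the sample variance toward the prior guess
        return k * s1_sq + (1 - k) * sigma0_sq
    # Stage 2: draw a second sample of equal size and pool the two variances
    x2 = draw_second_sample()
    s2_sq = np.var(x2, ddof=1)
    return ((n1 - 1) * s1_sq + (len(x2) - 1) * s2_sq) / (n1 + len(x2) - 2)

# Example with simulated data (the prior guess sigma0^2 = 1.5 is hypothetical)
rng = np.random.default_rng(4)
x1 = rng.normal(10.0, 1.2, size=8)
print(double_stage_shrunken_var(x1, lambda: rng.normal(10.0, 1.2, size=8),
                                sigma0_sq=1.5))
```

The potential efficiency gain comes from the first stage: when the prior guess is close to the true variance, the estimator borrows strength from it rather than relying on the small sample alone.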
The increasing availability of computing power over the past two decades has been used to develop new techniques for optimizing solutions of estimation problems. Today's computational capacity and the widespread availability of computers have enabled the development of a new generation of intelligent computing techniques, such as the algorithm of interest here. This paper presents one of a new class of stochastic search algorithms, known as the Canonical Genetic Algorithm (CGA), for optimizing the maximum likelihood function; the strategy is composed of three main steps: recombination, mutation, and selection. The experimental design is based on simulating the CGA with different values, and the results are compared with those of the method of moments. Based on the MSE values obtained from both ...
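A minimal real-coded sketch of the three steps named above (selection, recombination, mutation), applied to maximizing a normal log-likelihood, is given below. The encoding, operator choices, and parameter settings (population size, mutation rate, seeding near the moment estimates) are assumptions for illustration and not the configuration used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def neg_log_likelihood(params, data):
    """Negative normal log-likelihood; the GA minimizes this fitness."""
    mu, log_sigma = params
    sigma = np.exp(log_sigma)                      # keep sigma positive
    return np.sum(np.log(sigma) + 0.5 * ((data - mu) / sigma) ** 2)

def canonical_ga(data, pop_size=50, n_gen=200, p_mut=0.1, mut_scale=0.1):
    """Real-coded canonical GA with selection, recombination and mutation."""
    # Seed the population near the moment estimates (illustrative choice)
    pop = np.column_stack([
        np.mean(data) + rng.normal(scale=1.0, size=pop_size),
        np.log(np.std(data)) + rng.normal(scale=0.5, size=pop_size),
    ])
    for _ in range(n_gen):
        fitness = np.array([neg_log_likelihood(p, data) for p in pop])
        # Selection: binary tournament, lower fitness (better likelihood) wins
        i, j = rng.integers(pop_size, size=(2, pop_size))
        parents = np.where((fitness[i] < fitness[j])[:, None], pop[i], pop[j])
        # Recombination: arithmetic crossover with a randomly permuted mate
        mates = parents[rng.permutation(pop_size)]
        w = rng.random((pop_size, 1))
        children = w * parents + (1 - w) * mates
        # Mutation: Gaussian perturbation applied with probability p_mut
        mask = rng.random(children.shape) < p_mut
        children += mask * rng.normal(scale=mut_scale, size=children.shape)
        pop = children
    best = min(pop, key=lambda p: neg_log_likelihood(p, data))
    return best[0], np.exp(best[1])                # (mu_hat, sigma_hat)

# Example: recover (mu, sigma) from a simulated normal sample
data = rng.normal(5.0, 2.0, size=200)
print(canonical_ga(data))
```

The same Monte Carlo data can be fed to the moment estimators, and the two sets of estimates compared by their MSE over repeated samples, as the experimental design describes.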
In this paper, the problem of point estimation for the two parameters of the logistic distribution has been investigated using a simulation technique. The ranked set sampling estimator, which is a non-Bayesian procedure, and the Lindley approximation estimator, which is a Bayesian method, were used to estimate the parameters of the logistic distribution. The two methods are compared using the mean square error (MSE) and the mean absolute percentage error (MAPE). Finally, the simulation technique was used to generate samples of many different sizes to compare the methods.
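A sketch of the kind of Monte Carlo harness such a comparison relies on is shown below. The estimator used here is a simple moment-style placeholder, since the paper's ranked set sampling and Lindley approximation estimators are not reproduced in this abstract; the true parameter values, sample size, and number of replications are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def moment_estimator(x):
    """Placeholder estimator of the logistic (location, scale); the paper's
    ranked-set-sampling and Lindley estimators would be plugged in here."""
    loc = np.mean(x)
    scale = np.std(x, ddof=1) * np.sqrt(3) / np.pi   # Var = pi^2 * scale^2 / 3
    return loc, scale

def monte_carlo(estimator, loc=2.0, scale=1.5, n=30, reps=5000):
    """MSE and MAPE of an estimator over repeated logistic samples."""
    estimates = np.array([estimator(rng.logistic(loc, scale, n))
                          for _ in range(reps)])
    truth = np.array([loc, scale])
    mse = np.mean((estimates - truth) ** 2, axis=0)
    mape = np.mean(np.abs(estimates - truth) / truth, axis=0) * 100
    return mse, mape

mse, mape = monte_carlo(moment_estimator)
print("MSE  (loc, scale):", mse)
print("MAPE (loc, scale):", mape)
```

Running the same loop for each competing estimator and each sample size gives the MSE and MAPE tables on which the comparison is based.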
One of the significant stages in computer vision is image segmentation, which is fundamental for different applications, for example robot control and military target recognition, as well as image analysis in remote sensing. Previous studies have dealt with improving the classification of all types of data, whether text, audio, or images; one of the most recent built a simple, effective, and high-accuracy model capable of classifying emotions from speech data, while several others dealt with improving textual clustering. In this study, we seek to improve image segmentation classification using a novel approach that depends on two methods used to segment the images. The first ...
This paper deals with a two-prey and stage-structured predator model with anti-predator behavior. Sufficient conditions that ensure the appearance of local and Hopf bifurcations of the system have been obtained. It is observed that near the predator-free, the second-prey-free, and the first-prey-free equilibrium points there are transcritical or pitchfork bifurcations and no saddle-node bifurcation, while near the coexistence equilibrium point there are transcritical, pitchfork, and saddle-node bifurcations. The Hopf bifurcation near the coexistence equilibrium point has also been studied. Further, numerical analysis has been used to validate the main results.
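Since the model equations are not reproduced in this abstract, the sketch below only illustrates the style of numerical validation mentioned: integrating a generic two-prey, stage-structured predator system with an anti-predator term and inspecting whether the trajectory settles near a coexistence equilibrium or oscillates. All equations and parameter values are assumptions, not the paper's model.

```python
import numpy as np
from scipy.integrate import solve_ivp

def generic_model(t, y, r1=1.0, r2=0.8, a1=0.5, a2=0.4, b=0.05,
                  e=0.4, m=0.3, d1=0.1, d2=0.2):
    """Generic two-prey / stage-structured-predator dynamics with an
    anti-predator term (equations and parameters are illustrative only)."""
    x1, x2, yj, ya = y            # prey 1, prey 2, juvenile and adult predator
    dx1 = r1 * x1 * (1 - x1) - a1 * x1 * ya
    dx2 = r2 * x2 * (1 - x2) - a2 * x2 * ya
    dyj = e * (a1 * x1 + a2 * x2) * ya - m * yj - d1 * yj - b * (x1 + x2) * yj
    dya = m * yj - d2 * ya
    return [dx1, dx2, dyj, dya]

# Long-run trajectory from an interior initial state; the final state hints at
# whether the solution approaches a coexistence equilibrium or keeps cycling.
sol = solve_ivp(generic_model, (0.0, 500.0), [0.5, 0.5, 0.1, 0.1], max_step=0.5)
print("state near t = 500:", np.round(sol.y[:, -1], 4))
```

Repeating such runs while sweeping a bifurcation parameter is the usual way to confirm numerically where the transcritical, saddle-node, and Hopf bifurcations occur.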
Background: Little is known about the asymmetry of children's dental arches; the purpose of this study was to verify the presence of asymmetry of the dental arches among Iraqi children in the mixed dentition stage. Materials and methods: The sample included 52 pairs of dental casts, 27 pairs belonging to males and 25 pairs to females. Three linear distances were measured on each side of the dental arch: the incisal-canine distance, the canine-molar distance, and the incisal-molar distance, which represent the dental arch segmental measurements, using digital sliding calipers accurate to 0.02 mm. Results: No significant side differences, with high correlation coefficients, were found between the right and left incisal-canine, canine-molar, and incisal-molar distances.
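For readers unfamiliar with how such side comparisons are usually tested, the sketch below runs a paired test and a right-left correlation on hypothetical measurements; the data, distance chosen, and test are illustrative assumptions and do not reproduce the study's analysis.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Hypothetical right/left incisal-canine distances (mm) for 52 casts,
# for illustration only; the study's actual measurements are not shown here.
right = rng.normal(9.0, 0.6, size=52)
left = right + rng.normal(0.0, 0.3, size=52)

t_stat, p_value = stats.ttest_rel(right, left)   # paired test of side difference
r, _ = stats.pearsonr(right, left)               # right-left correlation

print(f"paired t = {t_stat:.2f}, p = {p_value:.3f}, r = {r:.2f}")
```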
This research investigates the problem of estimating the reliability of the two-parameter Weibull distribution using the maximum likelihood method and the White method. The comparison is done through a simulation process depending on three choices of model parameters (α=0.8, β=0.9), (α=1.2, β=1.5), and (α=2.5, β=2), with sample sizes n = 10, 70, 150. The mean square error (MSE) is used as the statistical criterion for comparison among the methods.
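A sketch of the simulation loop for the maximum likelihood side of such a comparison is given below; here α is treated as the Weibull shape and β as the scale, the reliability is evaluated at an arbitrary time t0, and the number of replications is an illustrative choice. The White method is not reproduced, since its details are not given in this abstract.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

def reliability(t, shape, scale):
    """Weibull reliability R(t) = exp(-(t/scale)^shape)."""
    return np.exp(-(t / scale) ** shape)

def mse_of_mle_reliability(shape, scale, n, t0=1.0, reps=1000):
    """Monte Carlo MSE of the ML estimate of R(t0); t0 and reps are
    illustrative choices, not taken from the paper."""
    true_R = reliability(t0, shape, scale)
    errors = []
    for _ in range(reps):
        x = scale * rng.weibull(shape, size=n)
        # ML fit with the location fixed at zero (two-parameter Weibull)
        shape_hat, _, scale_hat = stats.weibull_min.fit(x, floc=0)
        errors.append(reliability(t0, shape_hat, scale_hat) - true_R)
    return np.mean(np.square(errors))

for shape, scale in [(0.8, 0.9), (1.2, 1.5), (2.5, 2.0)]:
    for n in (10, 70, 150):
        print(f"alpha={shape}, beta={scale}, n={n}: "
              f"MSE={mse_of_mle_reliability(shape, scale, n):.5f}")
```

Running the same loop for the competing estimator and tabulating the MSE values by parameter choice and sample size gives the comparison the research describes.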
The study aims to analyze the content of computer textbooks for the preparatory stage in terms of logical thinking. The researcher followed the descriptive analytical research approach (content analysis) and adopted the explicit idea during the analysis process. A content analysis tool designed around the mental processes employed during logical thinking was used to obtain the study results. The findings revealed that logical thinking skills formed (52%) of the fourth preparatory textbook and (47%) of the fifth preparatory textbook.