In this paper, we derived a Bayes estimator of the reliability function of the two-parameter Laplace distribution under a squared error loss function, using Jeffreys' prior and the conditional probability of the observed random variable. The main objective of this study is to assess the efficiency of the derived Bayes estimator relative to the maximum likelihood and moment estimators of this function, using Monte Carlo simulation under different Laplace distribution parameters and sample sizes. The results show that the Bayes estimator is more efficient than the maximum likelihood and moment estimators for all sample sizes.
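For context (this reference is not part of the original abstract), the quantities being estimated can be written out, assuming the usual location-scale parameterization with location mu and scale lambda; under squared error loss the Bayes estimator is the posterior mean of R(t):

```latex
% Two-parameter Laplace density and reliability (survival) function
\[
f(t;\mu,\lambda)=\frac{1}{2\lambda}\exp\!\left(-\frac{|t-\mu|}{\lambda}\right),
\qquad
R(t)=P(T>t)=
\begin{cases}
1-\dfrac{1}{2}\exp\!\left(\dfrac{t-\mu}{\lambda}\right), & t<\mu,\\[6pt]
\dfrac{1}{2}\exp\!\left(-\dfrac{t-\mu}{\lambda}\right), & t\ge\mu.
\end{cases}
\]
% Under squared error loss, the Bayes estimator is the posterior mean
\[
\hat{R}_{B}(t)=E\!\left[R(t)\mid \text{data}\right].
\]
```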
In this research, several estimators of the hazard function are introduced using a nonparametric method, namely kernel estimation for censored data with varying bandwidths and boundary kernels. Two types of bandwidth are used: a local bandwidth and a global bandwidth. Moreover, four boundary kernels are used, namely the Rectangular, Epanechnikov, Biquadratic and Triquadratic kernels, and the proposed function was employed with all of these kernel functions. Two different simulation techniques are also used in two experiments to compare these estimators. In most cases, the results show that the local bandwidth is the best for all the
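As a rough illustration of this kind of estimator (a sketch under assumed details, not the paper's exact formulation), the Python snippet below smooths the Nelson-Aalen increments of right-censored data with a fixed global bandwidth and an Epanechnikov kernel; the function names and the toy data are made up for the example:

```python
import numpy as np

def epanechnikov(u):
    """Epanechnikov kernel, support [-1, 1]."""
    return np.where(np.abs(u) <= 1, 0.75 * (1 - u**2), 0.0)

def kernel_hazard(t_grid, times, events, bandwidth):
    """Kernel-smoothed hazard for right-censored data.

    Smooths the Nelson-Aalen increments d_i / n_i at the ordered observed
    times with one fixed (global) bandwidth.  Illustrative only; the paper
    also considers local bandwidths and boundary-corrected kernels.
    """
    order = np.argsort(times)
    times, events = np.asarray(times)[order], np.asarray(events)[order]
    n = len(times)
    at_risk = n - np.arange(n)          # subjects still at risk at each ordered time
    increments = events / at_risk        # Nelson-Aalen jump sizes
    u = (t_grid[:, None] - times[None, :]) / bandwidth
    return (epanechnikov(u) * increments).sum(axis=1) / bandwidth

# toy usage: exponential lifetimes with random right-censoring
rng = np.random.default_rng(0)
life = rng.exponential(2.0, 200)
cens = rng.exponential(3.0, 200)
obs = np.minimum(life, cens)
delta = (life <= cens).astype(float)
grid = np.linspace(0.1, 4.0, 50)
hazard_hat = kernel_hazard(grid, obs, delta, bandwidth=0.5)
```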
Logistic regression is considered one of the statistical methods used to describe and estimate the relationship between a random response variable (Y) and explanatory variables (X), one of whose assumptions is the homogeneity of the variance. The dependent variable is a binary response taking two values (one when a specific event occurs and zero when it does not), such as injured/uninjured or married/unmarried. A large number of explanatory variables leads to the problem of multicollinearity, which makes the estimates inaccurate. The maximum likelihood method and the ridge regression method were used to estimate the binary-response logistic regression model by adopting the jackknife
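As a hedged illustration of the estimation problem described above (not the paper's data or exact procedure), the sketch below fits a binary-response logistic model by essentially unpenalized maximum likelihood and by ridge-penalized likelihood on synthetic, nearly collinear predictors; the jackknife step mentioned in the abstract is not shown:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic example: two nearly collinear explanatory variables make the
# maximum likelihood coefficients unstable; an L2 (ridge) penalty shrinks them.
rng = np.random.default_rng(1)
n = 300
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)        # almost a copy of x1 (multicollinearity)
x3 = rng.normal(size=n)
X = np.column_stack([x1, x2, x3])
p = 1.0 / (1.0 + np.exp(-(0.8 * x1 - 0.5 * x3)))
y = rng.binomial(1, p)                           # binary response: 1 = event, 0 = no event

ml_fit = LogisticRegression(C=1e6, max_iter=2000).fit(X, y)     # ~ plain maximum likelihood
ridge_fit = LogisticRegression(C=0.5, max_iter=2000).fit(X, y)  # ridge-penalized likelihood
print("ML coefficients:   ", ml_fit.coef_.round(3))
print("Ridge coefficients:", ridge_fit.coef_.round(3))
```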
Human interaction technology based on motion capture (MoCap) systems is a vital tool for human kinematics analysis, with applications in clinical settings, animation, and video games. We introduce a new method for analyzing and estimating dorsal spine movement using a MoCap system. The data captured by the MoCap system are processed and analyzed to estimate the motion kinematics of three primary regions: the shoulders, spine, and hips. This work contributes a non-invasive and anatomically guided framework that enables region-specific analysis of spinal motion, which could serve as a clinical alternative to invasive measurement techniques. The hierarchy of our model consists of five main levels: motion capture system settings, marker data
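For illustration only (the paper's actual processing pipeline is more involved and is not reproduced here), a minimal sketch of the kind of region-level quantity such a framework can produce: a trunk flexion-style angle computed from three assumed marker positions (a shoulder midpoint, a mid-spine marker, and a hip midpoint):

```python
import numpy as np

def segment_angle(upper, mid, lower):
    """Angle (degrees) at the mid marker between the upper and lower segments.

    Hypothetical helper: `upper`, `mid`, `lower` are 3D marker positions,
    e.g. a shoulder midpoint, a mid-spine marker, and a hip midpoint.
    """
    v1 = np.asarray(upper) - np.asarray(mid)
    v2 = np.asarray(lower) - np.asarray(mid)
    cos_a = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))

# toy frame: markers roughly on a vertical trunk with slight flexion
shoulders_mid = [0.0, 0.05, 1.45]
spine_marker  = [0.0, 0.00, 1.20]
hips_mid      = [0.0, 0.00, 0.95]
print(segment_angle(shoulders_mid, spine_marker, hips_mid))  # ~169 degrees, i.e. near-straight trunk
```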
In this paper, some commonly used hierarchical clustering techniques are compared. A comparison was made between the agglomerative hierarchical clustering technique and the k-means family of techniques, which includes the standard k-means, a variant of k-means, and bisecting k-means. Although hierarchical clustering is considered one of the best clustering methods, its use is limited due to its time complexity. The results, which are based on an analysis of the characteristics of the clustering algorithms and the nature of the data, showed that the bisecting k-means technique is the best compared to the other methods used.
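A minimal sketch of this kind of comparison on synthetic data, not the data or evaluation criteria used in the paper; it assumes scikit-learn >= 1.1, where BisectingKMeans is available, and uses the silhouette score purely as an example quality measure:

```python
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.cluster import AgglomerativeClustering, KMeans, BisectingKMeans
from sklearn.metrics import silhouette_score

# Synthetic data with four well-separated groups, only for illustration.
X, _ = make_blobs(n_samples=1500, centers=4, random_state=0)

models = {
    "agglomerative":     AgglomerativeClustering(n_clusters=4),
    "k-means":           KMeans(n_clusters=4, n_init=10, random_state=0),
    "bisecting k-means": BisectingKMeans(n_clusters=4, random_state=0),
}
for name, model in models.items():
    labels = model.fit_predict(X)
    print(f"{name:18s} silhouette = {silhouette_score(X, labels):.3f}")
```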
This research aims at a number of objectives, including developing the tax examination process and raising its efficiency without relying on the comprehensive examination method, by using some statistical methods in tax examination, and discussing the most important concepts related to the statistical methods used in tax examination, showing their importance and how they are applied. The research represents an applied study in the General Commission of Taxes. In order to achieve its objectives, the research used the descriptive (analytical) approach in its theoretical side, and in its practical side applied some statistical methods to a sample of the final accounts of a contracting company (limited) and the pharmaceutical industry (
An experiment was carried out in the vegetable field of the Horticulture Department / College of Agriculture / Baghdad University over three seasons (spring and autumn of 2005 and spring of 2007) to study the type of gene action for some traits of vegetative and flowering growth in summer squash crosses (4 x 3 = cross 1, 3 x 7 = cross 2, 3 x 4 = cross 3, 3 x 5 = cross 4, 5 x 1 = cross 5, 5 x 2 = cross 6). The study followed the generation mean analysis method, which included for each cross the populations (P1, P2, F1, F2, Bc1P1, Bc1P2) obtained by hybridization during the first and second seasons. The experimental comparison was performed in the second season (two crosses only) and the third season (four crosses) using an RCBD with three
An experiment was carried out in the vegetable field of the Horticulture Department / College of Agriculture / Baghdad University over three seasons (spring and autumn of 2005 and spring of 2007) to study the type of gene action for some traits of yield and its components in summer squash crosses (4 x 3 = cross 1, 3 x 7 = cross 2, 3 x 4 = cross 3, 3 x 5 = cross 4, 5 x 1 = cross 5, 5 x 2 = cross 6). The study followed the generation mean analysis method, which included for each cross the populations (P1, P2, F1, F2, Bc1P1, Bc1P2) obtained by hybridization during the first and second seasons. The experimental comparison was performed in the second season (two crosses only) and the third season (four crosses) using an RCBD with three repl
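As a hedged illustration of one standard step in generation mean analysis (not the authors' computations or data), the sketch below evaluates Mather's A, B and C scaling tests from the six generation means; significant departures from zero point to epistasis and motivate the full generation mean model:

```python
def scaling_tests(P1, P2, F1, F2, B1, B2):
    """Mather's A, B and C scaling tests computed from generation means.

    P1, P2 : parental means
    F1, F2 : first- and second-generation means
    B1, B2 : backcross means (F1 x P1 and F1 x P2)
    Values near zero are consistent with a simple additive-dominance model;
    clear departures indicate epistasis.
    """
    A = 2 * B1 - P1 - F1
    B = 2 * B2 - P2 - F1
    C = 4 * F2 - 2 * F1 - P1 - P2
    return A, B, C

# toy generation means for one hypothetical cross (not the paper's data)
print(scaling_tests(P1=42.0, P2=30.0, F1=40.5, F2=38.0, B1=41.0, B2=34.5))
```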
The aim of this paper is to design an artificial neural network as an alternative, accurate tool to estimate the concentration of cadmium in contaminated soils for any depth and time. First, fifty soil samples were collected from a phytoremediated contaminated site located in Qanat Aljaeesh in Baghdad city, Iraq. Second, a series of measurements were performed on the soil samples. The inputs are the soil depth, the time, and the soil parameters, while the output is the concentration of Cu in the soil at depth x and time t. Third, an ANN was designed and its performance was evaluated using a test data set, then applied to estimate the concentration of cadmium. The performance of the ANN technique was compared with the traditional laboratory inspecting
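A minimal sketch of the kind of network described above, using a small scikit-learn multilayer perceptron on synthetic (depth, time, soil property) data; the feature set, network size and data are assumptions for illustration and not the paper's fifty field samples:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# Hypothetical inputs: depth (cm), time (months), one soil property (pH);
# the target is a synthetic metal concentration with noise.
rng = np.random.default_rng(2)
n = 200
depth = rng.uniform(0, 60, n)
time = rng.uniform(0, 12, n)
ph = rng.uniform(6.5, 8.0, n)
X = np.column_stack([depth, time, ph])
conc = 5 + 0.05 * depth - 0.3 * time + 0.5 * ph + rng.normal(0, 0.3, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, conc, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 8),
                                   max_iter=5000, random_state=0))
model.fit(X_tr, y_tr)
print("test R^2:", r2_score(y_te, model.predict(X_te)))
```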
Summary
In this research, we examined factorial experiments and studied the significance of the main effects, the interaction of the factors, and their simple effects using the F test (ANOVA) to analyze the data of a factorial experiment. It is also known that the analysis of variance requires several assumptions to be met; therefore, when one of these conditions is violated, we transform the data in order to satisfy the conditions of the analysis of variance. However, it was noted that these transformations do not produce accurate results, so we resort to nonparametric tests or methods that serve as a solution or an alternative to the parametric tests; these methods
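As a rough sketch of the workflow described above (synthetic data, not the paper's experiments), the snippet below runs the usual two-way ANOVA F tests on a factorial data set with skewed errors and then repeats the analysis after a simple rank transformation, one common nonparametric-style alternative when the ANOVA assumptions fail:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Synthetic 2x3 factorial data with skewed (exponential) errors, so the
# normality assumption of the classical ANOVA is deliberately violated.
rng = np.random.default_rng(3)
a = np.repeat(["a1", "a2"], 30)
b = np.tile(np.repeat(["b1", "b2", "b3"], 10), 2)
y = rng.exponential(scale=1.0, size=60) + (a == "a2") * 0.8
df = pd.DataFrame({"A": a, "B": b, "y": y})

# classical two-way ANOVA on the raw data
anova = sm.stats.anova_lm(ols("y ~ C(A) * C(B)", data=df).fit(), typ=2)

# the same model fitted to the ranks of the response
df["rank_y"] = df["y"].rank()
rank_anova = sm.stats.anova_lm(ols("rank_y ~ C(A) * C(B)", data=df).fit(), typ=2)
print(anova, rank_anova, sep="\n\n")
```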