Background: No previous Iraqi study has estimated the post mortem interval (PMI), from the medico-legal point of view, based on the biochemical changes of the vitreous humor. Objectives: To find out the relationship between some biochemical changes in the vitreous humor and the post mortem interval, and to derive a new formula for estimating the PMI from these changes. Method: The study was conducted on one hundred twenty-two cases referred to the medico-legal institute in Sulaimani province during the period between the 1st of February and the 30th of July 2012. A complete classical autopsy was performed for each case; the vitreous humor was collected at autopsy from the posterior chamber of the eye, and the samples were immediately transported for biochemical analysis. Only crystal-clear vitreous humor was used for analysis. Results: With increasing postmortem interval, the vitreous humor potassium (K+) and calcium (Ca++) increased, and these changes were significantly correlated with the postmortem interval. Among the studied chemical components of the vitreous humor, potassium showed the best linear correlation with the postmortem interval within 40 hours after death, and the PMI can be estimated by the equation PMI = 3.36[K+] - 14.35, with a standard deviation of ±7.44 hours. Conclusion: The study showed that vitreous potassium can be used to estimate the PMI precisely, and a new formula, PMI = 3.36[K+] - 14.35, is proposed that can be used for up to 40 hours after death with a standard deviation of ±7.44 hours.
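The reported regression can be applied directly. Below is a minimal sketch, assuming [K+] is measured in mmol/L (the unit is not stated in the abstract) and that the formula is only applied within its stated 40-hour range; the function name and example value are hypothetical.

```python
def estimate_pmi_hours(vitreous_potassium: float) -> float:
    """Estimate the post mortem interval (hours) from vitreous humor potassium
    using the formula reported in the abstract: PMI = 3.36*[K+] - 14.35,
    valid up to about 40 hours, with a standard deviation of about +/- 7.44 hours."""
    return 3.36 * vitreous_potassium - 14.35

# Example: a vitreous [K+] of 10 (assumed mmol/L) gives a PMI of about 19.3 hours.
print(estimate_pmi_hours(10.0))
```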
In this paper we estimate the coefficients and the scale parameter of a linear regression model in which the residuals follow a type 1 extreme value distribution for largest values. This can be regarded as an improvement on the studies that use the distribution for smallest values. We study two estimation methods (OLS and MLE), resorting to the Newton-Raphson (NR) and Fisher scoring methods to obtain the MLE because of the difficulty of the usual approach. The relative efficiency criterion is considered alongside the statistical inference procedures for the type 1 extreme value regression model for largest values, including confidence intervals and hypothesis tests for both the scale parameter and the regression coefficients.
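As an illustrative sketch only (not the paper's derivation), the log-likelihood of such a model can be maximized numerically. Here a generic quasi-Newton optimizer stands in for the Newton-Raphson and Fisher scoring updates, and the data are synthetic and purely hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

# Sketch of MLE for a linear regression whose errors follow a type 1 extreme
# value (Gumbel) distribution for largest values: y = X @ beta + e, e ~ Gumbel(0, sigma).

def neg_log_likelihood(params, X, y):
    beta, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)              # keep the scale parameter positive
    z = (y - X @ beta) / sigma
    # Gumbel (largest values) log-density: -log(sigma) - z - exp(-z)
    return -np.sum(-np.log(sigma) - z - np.exp(-z))

def fit_gumbel_regression(X, y):
    p = X.shape[1]
    start = np.zeros(p + 1)                # beta = 0, log(sigma) = 0
    res = minimize(neg_log_likelihood, start, args=(X, y), method="BFGS")
    return res.x[:-1], np.exp(res.x[-1])   # beta_hat, sigma_hat

rng = np.random.default_rng(0)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
y = X @ np.array([2.0, 1.5]) + rng.gumbel(0.0, 0.8, size=200)
print(fit_gumbel_regression(X, y))
```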
In this article, we develop a new loss function as a modification of the linear exponential (LINEX) loss function, obtained by weighting the LINEX function. We derive estimators of the scale parameter, the reliability function and the hazard function based on upper record values of the Lomax distribution (LD). To study the small-sample performance of the proposed loss function, a Monte Carlo simulation is used to compare the maximum likelihood estimator, the Bayesian estimator under the LINEX loss function and the Bayesian estimator under the squared error (SE) loss function. The results show that the modified method is the best for estimating the scale parameter, the reliability function and the hazard function.
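For reference, the standard (unweighted) LINEX loss and the corresponding Bayes estimator take the well-known forms below; the weighted variant introduced in the article multiplies this loss by a weight function and is not reproduced here.

```latex
% Standard LINEX loss, with shape a != 0 and scale b > 0:
L(\hat{\theta}, \theta) = b \left( e^{a(\hat{\theta} - \theta)} - a(\hat{\theta} - \theta) - 1 \right)

% Bayes estimator of theta under LINEX loss (posterior expectation over theta):
\hat{\theta}_{\mathrm{LINEX}} = -\frac{1}{a} \ln E\left[ e^{-a\theta} \mid \text{data} \right]
```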
Shear and compressional wave velocities, coupled with other petrophysical data, are vital for determining the magnitudes of the dynamic moduli in geomechanical studies and hydrocarbon reservoir characterization. However, due to field practices and the high running cost, shear wave velocity may not be available in all wells. In this paper, a statistical multivariate regression method is presented to predict the shear wave velocity for the Khasib Formation in the Amara oil field, located in south-eastern Iraq, using well-log compressional wave velocity, neutron porosity and density. The accuracy of the proposed correlation has been compared with that of other correlations. The results show that the presented model provides accurate estimates of shear wave velocity.
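A minimal sketch of this kind of multivariate regression is shown below: predicting shear wave velocity (Vs) from compressional velocity (Vp), neutron porosity (NPHI) and bulk density (RHOB). The data, units and fitted coefficients are illustrative, not the paper's Khasib model.

```python
import numpy as np

def fit_vs_model(vp, nphi, rhob, vs):
    """Least-squares fit of Vs = a0 + a1*Vp + a2*NPHI + a3*RHOB."""
    X = np.column_stack([np.ones_like(vp), vp, nphi, rhob])
    coeffs, *_ = np.linalg.lstsq(X, vs, rcond=None)
    return coeffs

def predict_vs(coeffs, vp, nphi, rhob):
    X = np.column_stack([np.ones_like(vp), vp, nphi, rhob])
    return X @ coeffs

# Synthetic example data (km/s, fraction, g/cc), purely for illustration.
rng = np.random.default_rng(1)
vp = rng.uniform(3.0, 5.0, 100)
nphi = rng.uniform(0.05, 0.30, 100)
rhob = rng.uniform(2.2, 2.7, 100)
vs = 0.5 + 0.55 * vp - 1.2 * nphi + 0.1 * rhob + rng.normal(0, 0.05, 100)

coeffs = fit_vs_model(vp, nphi, rhob, vs)
print(coeffs, predict_vs(coeffs, vp[:3], nphi[:3], rhob[:3]))
```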
The objective of this study is to examine the properties of Bayes estimators of the shape parameter of the Power Function Distribution (PFD-I), using two different prior distributions for the parameter θ and different loss functions, and to compare them with the maximum likelihood estimators. In many practical applications we may have two different pieces of prior information about the prior distribution of the shape parameter of the Power Function Distribution, which influences the parameter estimation, so we used two different kinds of conjugate priors for the shape parameter θ of the Power Function Distribution.
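A minimal sketch, under simplifying assumptions, of the comparison described above: the MLE of the shape parameter θ of a power function distribution f(x) = θx^(θ-1), 0 < x < 1, versus a Bayes estimator under squared error loss with a conjugate Gamma(a, b) prior. The hyperparameters and sample size are illustrative; the paper's other priors and loss functions are not shown.

```python
import numpy as np

rng = np.random.default_rng(2)
theta_true, n = 2.5, 50
x = rng.power(theta_true, size=n)          # samples from theta * x**(theta - 1) on (0, 1)

T = -np.sum(np.log(x))                     # sufficient statistic
theta_mle = n / T                          # maximum likelihood estimator

a, b = 1.0, 1.0                            # illustrative Gamma(a, b) prior (rate b)
theta_bayes_se = (a + n) / (b + T)         # posterior mean = Bayes estimator under SE loss

print(theta_mle, theta_bayes_se)
```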
The most significant task in oil exploration is determining the reservoir facies, which are based mostly on the primary features of the rocks. Porosity, water saturation and shale volume, as well as the sonic log and bulk density, are the input data used in Interactive Petrophysics software to compute rock facies. These data are used to create 15 clusters and four groups of rock facies. Furthermore, accurate matching between core and well-log data is established by the neural network technique. In the current study, to evaluate the applicability of the cluster analysis approach, the rock facies results from 29 wells derived from cluster analysis were used to redistribute the petrophysical properties for six units of the Mishrif Formation.
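A minimal sketch of the clustering step described above is given below, grouping well-log samples into clusters that could later be merged into facies groups. The synthetic data, the k-means algorithm and the choice of 15 clusters are illustrative; the paper's workflow uses Interactive Petrophysics and a neural network for core/log matching, which are not reproduced here.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
logs = np.column_stack([
    rng.uniform(0.02, 0.30, 500),   # porosity (fraction)
    rng.uniform(0.10, 1.00, 500),   # water saturation (fraction)
    rng.uniform(0.00, 0.60, 500),   # shale volume (fraction)
    rng.uniform(55, 110, 500),      # sonic (us/ft)
    rng.uniform(2.1, 2.8, 500),     # bulk density (g/cc)
])

X = StandardScaler().fit_transform(logs)                 # put logs on a common scale
clusters = KMeans(n_clusters=15, n_init=10, random_state=0).fit_predict(X)
print(np.bincount(clusters))                             # samples per cluster
```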
A mathematical model has been introduced to investigate the effect of the nuclear reaction constant (A), the probability of BEC ground-state occupation Ωi, the number density of deuterons (d) nD, and the overall number of nuclei ND on the total nuclear d-d fusion rate (R). Under a steady state of the Bose-Einstein condensate, the postulates of quantum theory and Bose-Einstein theory were applied to evaluate the total nuclear d-d fusion rate for trapping in nickel metal. The total nuclear fusion rate trapping model predicts a strong relationship between the astrophysical S-factor and the masses of nickel. The reaction rate trapping model was tested on three reactions: d(d,p)T, d(d,n)³He and d(d,⁴He) with Q = 23.8 MeV, respectively. The reaction rate has been described ...
In this research, several estimators of the hazard function are introduced using a nonparametric method, namely kernel estimation for censored data with varying bandwidth and boundary kernels. Two types of bandwidth are used: local bandwidth and global bandwidth. Moreover, four types of boundary kernel are used, namely Rectangle, Epanechnikov, Biquadratic and Triquadratic, and the proposed function was employed with all kernel functions. Two different simulation techniques are also used in two experiments to compare these estimators. In most of the cases, the results show that the local bandwidth is the best for all types of boundary kernel functions.
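A minimal sketch of a kernel hazard estimator for right-censored data is shown below: the Nelson-Aalen increments are smoothed with an Epanechnikov kernel and a single global bandwidth. The paper additionally studies local bandwidths and several boundary kernels, which are not implemented here; the data and bandwidth are illustrative.

```python
import numpy as np

def epanechnikov(u):
    return 0.75 * (1.0 - u**2) * (np.abs(u) <= 1.0)

def kernel_hazard(t_grid, times, events, bandwidth):
    """Estimate h(t) on t_grid from observed times and event indicators (1 = event)."""
    order = np.argsort(times)
    times, events = times[order], events[order]
    n = len(times)
    at_risk = n - np.arange(n)                  # number still at risk at each ordered time
    increments = events / at_risk               # Nelson-Aalen jump sizes
    u = (t_grid[:, None] - times[None, :]) / bandwidth
    return (epanechnikov(u) * increments).sum(axis=1) / bandwidth

# Synthetic right-censored data: true hazard is constant and equal to 1.
rng = np.random.default_rng(4)
true_times = rng.exponential(1.0, 300)
censor = rng.exponential(1.5, 300)
times = np.minimum(true_times, censor)
events = (true_times <= censor).astype(float)
grid = np.linspace(0.1, 2.0, 20)
print(kernel_hazard(grid, times, events, bandwidth=0.3))
```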
This research deals with a shrinkage method for principal components similar to the one used in multiple regression, the Least Absolute Shrinkage and Selection Operator (LASSO). The goal is to build uncorrelated linear combinations from only a subset of the explanatory variables that may suffer from a multicollinearity problem, instead of taking the whole number, say (K), of them. The shrinkage forces some coefficients to equal zero by placing a restriction on them through a tuning parameter, say (t), which balances the amounts of bias and variance on the one hand and does not exceed the acceptable percentage of explained variance of these components on the other. This is shown by the MSE criterion in the regression case and by the percentage of explained variance.
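A minimal sketch of the idea is given below: principal components are built from correlated explanatory variables, and an L1 (lasso-type) penalty then shrinks the coefficients of some components exactly to zero, keeping only a subset of them. The penalty alpha below plays the role of the constraint (t); its value and the synthetic data are purely illustrative, and this is not the paper's exact procedure.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import Lasso

rng = np.random.default_rng(5)
n, k = 200, 8
Z = rng.normal(size=(n, 3))
X = Z @ rng.normal(size=(3, k)) + 0.1 * rng.normal(size=(n, k))  # collinear predictors
y = X[:, 0] - 2.0 * X[:, 3] + rng.normal(size=n)

scores = PCA(n_components=k).fit_transform(X)        # uncorrelated component scores
model = Lasso(alpha=0.1).fit(scores, y)              # L1 shrinkage on the scores
print(model.coef_)   # coefficients of weakly informative components are shrunk to (typically exactly) zero
```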
Survival analysis is one of the modern methods of analysis based on the fact that the dependent variable represents the time until the event of interest in the study. There are many survival models that deal with the impact of explanatory factors on the likelihood of survival, including the model proposed by the statistician David Cox, one of the most important and most common survival models. It consists of two functions: a parametric function that does not depend on survival time and a nonparametric function that does depend on survival time, which is why the Cox model is described as a semi-parametric model. The set of parametric models, in contrast, depends on the parameters of the time-to-event distribution, such as ...
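For reference, the standard form of the Cox proportional hazards model makes this decomposition explicit: the baseline hazard is the nonparametric, time-dependent part, while the exponential term in the covariates is the parametric part that does not depend on time.

```latex
% Cox proportional hazards model: nonparametric baseline hazard h_0(t)
% multiplied by a parametric function of the covariates x_1, ..., x_p.
h(t \mid x_1, \ldots, x_p) = h_0(t) \, \exp\!\big( \beta_1 x_1 + \beta_2 x_2 + \cdots + \beta_p x_p \big)
```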