Survival analysis is a modern method of analysis in which the dependent variable represents the time until the event of interest in the study. Many survival models deal with the effect of explanatory factors on the likelihood of survival, including the model proposed by David Cox, one of the most important and widely used survival models. It consists of two parts: a parametric part that does not depend on survival time and a nonparametric part that does, which is why the Cox model is classified as a semi-parametric model. In contrast, fully parametric models, such as the exponential, Weibull, and log-logistic models, depend on the parameters of the time-to-event distribution. Our research aims to adopt some Bayesian optimality criteria to obtain an optimal design for estimating the optimal survival time of patients with myocardial infarction, by constructing a parametric survival model based on the probability distribution of their survival times. Myocardial infarction is among the most serious diseases threatening human life and a main cause of death all over the world, and the survival duration of patients varies with the factor or factors causing the injury; many factors lead to the disease, such as diabetes, high blood pressure, high cholesterol, psychological stress, and obesity. The optimal survival time was therefore estimated by constructing a model of the relationship between the factors leading to the disease and patient survival time, and we found that the optimal survival time is 18 days.
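For reference, the Cox model described above is usually written as (standard notation, not taken from the paper itself):

\[
h(t \mid x) = h_0(t)\,\exp(\beta_1 x_1 + \dots + \beta_p x_p),
\]

where the baseline hazard \(h_0(t)\) is the nonparametric, time-dependent part and \(\exp(\beta^{\top}x)\) is the parametric part that does not depend on time. A fully parametric alternative such as the Weibull model instead specifies \(h(t \mid x) = \lambda\gamma t^{\gamma-1}\exp(\beta^{\top}x)\).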
A mixture model is used to model data that come from more than one component. In recent years it has become an effective tool for drawing inferences about the complex data we may encounter in real life, and it can also serve as a powerful confirmatory tool for classifying observations based on the similarities among them. In this paper, several mixture regression-based methods were applied under the assumption that the data come from a finite number of components. These methods were compared according to their results in estimating the component parameters, and observation membership was also inferred and assessed for each method. The results showed that the flexible mixture model outperformed the …
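For context, a finite mixture regression with \(K\) components is typically written as (generic notation; the specific components compared in the paper may differ):

\[
f(y \mid x) = \sum_{k=1}^{K} \pi_k\,\phi\!\left(y;\; x^{\top}\beta_k,\; \sigma_k^{2}\right), \qquad \sum_{k=1}^{K}\pi_k = 1,
\]

and an observation's component membership is usually assessed through the posterior probability \(\Pr(z_i = k \mid y_i, x_i) \propto \pi_k\,\phi(y_i;\, x_i^{\top}\beta_k,\, \sigma_k^{2})\).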
This study aimed to improve the microencapsulation technique by using layer-by-layer coating of the capsules, which provides the best protection for viable Lactobacillus casei in the extrusion method. A protein concentrate (80% protein) was used as the encapsulation material, with alginate and chitosan as coating layers. The results showed differences in encapsulation yield among the coating types. The stability of the free probiotic cells and of the three coated types was then studied against different concentrations of bile salts (0, 0.3, 0.5 and 0.7%) over different incubation periods (zero, two and three hours); the recorded …
In this research, we derive the Bayesian formulas and the expected Bayesian (E-Bayesian) estimation for the product system of the Atlas Company. The units of the system were examined with the help of the technical staff at the company, which provided real data on the system it manufactures; these data include the failed units in each drawn sample out of the total number of units manufactured by the company. We calculate the range for each estimator using the maximum likelihood estimator. We find that the E-Bayesian estimation is better than the Bayesian estimator for the different partial samples drawn from the product system after inspection by the …
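In the E-Bayesian literature (the abstract does not spell this out), the expected Bayesian estimator is usually defined as the Bayes estimator averaged over a hyperprior \(\pi(a,b)\) placed on the prior's hyperparameters:

\[
\hat{\theta}_{EB} = \iint \hat{\theta}_{B}(a,b)\,\pi(a,b)\,da\,db,
\]

which is the sense in which it can be compared with the ordinary Bayes estimator \(\hat{\theta}_{B}\).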
The common normal distribution was transformed through the generated Kummer beta model into the Kummer Beta Generalized Normal Distribution (KBGND). The distribution parameters and the hazard function were then estimated using the maximum likelihood (MLE) method, and these estimates were improved by employing the genetic algorithm. Simulation was used, assuming a number of models and different sample sizes. The main finding was that the standard maximum likelihood (MLE) method was the best in estimating the parameters of the KBGND according to the mean squared error (MSE) criterion, and in estimating the hazard function according to the integrated mean squared error (IMSE) criterion, compared with the genetic-algorithm-based estimates. While the pr…
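A minimal sketch of the general workflow described above (maximum likelihood refined by an evolutionary optimizer), using SciPy's differential evolution as a stand-in for the genetic algorithm and a simple normal log-likelihood as a placeholder; the actual KBGND density from the paper would replace the placeholder:

```python
import numpy as np
from scipy import optimize, stats

# Placeholder log-likelihood: the KBGND density would replace stats.norm.logpdf;
# a normal model is used here only to keep the sketch short and runnable.
def neg_log_likelihood(params, data):
    mu, log_sigma = params
    sigma = np.exp(log_sigma)                      # keeps the scale positive
    return -np.sum(stats.norm.logpdf(data, loc=mu, scale=sigma))

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.5, size=100)    # simulated sample

# Step 1: ordinary MLE from a local optimizer.
mle = optimize.minimize(neg_log_likelihood, x0=[0.0, 0.0], args=(data,))

# Step 2: refine/verify with a population-based (genetic-style) global search.
ga = optimize.differential_evolution(neg_log_likelihood,
                                     bounds=[(-10.0, 10.0), (-5.0, 5.0)],
                                     args=(data,), seed=0)

print("local MLE estimate:", mle.x)
print("evolutionary estimate:", ga.x)
```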
In this study a specially calibrated loop was used to determine the viable (live) count of bacteria, together with the plate count (pour plate) method and the drop plate method; the results showed significant agreement between the methods used.
This paper applies queuing theory together with the particle swarm algorithm (also known as swarm intelligence) to solve the queuing problem at the General Commission for Taxes / Karkh branch center, in the service stage of the calculators department, which consists of six employees. The single-service-channel queuing model M/M/1 was chosen according to the nature of the work of the department mentioned above, with the work divided among the employees according to the letters (alphabetical) system for each employee. The collected data consisted of times (arrival time, service time, departure time) …
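For reference, the steady-state performance measures of the M/M/1 model used here, with arrival rate \(\lambda\) and service rate \(\mu\) (\(\lambda < \mu\)), are:

\[
\rho = \frac{\lambda}{\mu}, \qquad
L = \frac{\rho}{1-\rho}, \qquad
L_q = \frac{\rho^{2}}{1-\rho}, \qquad
W = \frac{1}{\mu-\lambda}, \qquad
W_q = \frac{\lambda}{\mu(\mu-\lambda)},
\]

where \(L\) and \(L_q\) are the expected numbers of customers in the system and in the queue, and \(W\) and \(W_q\) the corresponding expected waiting times.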
Accuracy in segmenting multiple objects using geometric deformable models is sometimes not achieved, for reasons related to a number of parameters. In this research we study the effect of changing the parameter values on the behaviour of the geometric deformable model and determine their efficient values, as well as the relations that link these parameters to each other, relying on different case studies that include multiple objects differing in spacing, colour, and illumination. The segmentation results are found to be good for specific ranges of parameter values; that is, the success of geometric deformable models is confined to certain ranges of these parameter values.
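Geometric deformable models are typically implemented as level-set evolutions; a common form, given here as background rather than as the exact model of the paper, is the geodesic active contour

\[
\frac{\partial \phi}{\partial t} = g(I)\,\lvert\nabla\phi\rvert\left(\operatorname{div}\!\left(\frac{\nabla\phi}{\lvert\nabla\phi\rvert}\right) + c\right) + \nabla g \cdot \nabla\phi,
\]

where \(g(I)\) is an edge-stopping function of the image, the divergence term is the curvature (smoothing) force, and \(c\) is the balloon (expansion) parameter; these are the kinds of parameters whose values and interactions such a study examines.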
The compressive residual stresses generated by shot peening increase in direct proportion to the shot peening time (SPT). For each metal there is an optimum shot peening time (OST) that gives the optimum fatigue life. This paper experimentally optimizes the shot peening time of aluminium alloy 6061-T651, supported by analysis of variance (ANOVA).
Two configurations of fatigue test specimens were used, one without a notch (smooth) and the other with a notch radius of 1.25 mm, and each type was shot peened for different times. The OST was experimentally estimated to be 8 minutes, at which the surface stresses reached a maximum peak of -184.94 MPa.
A response surface methodology (RSM) is presented …
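The second-order model usually fitted in response surface methodology (standard form, not quoted from the paper) is

\[
y = \beta_0 + \sum_{i=1}^{k}\beta_i x_i + \sum_{i=1}^{k}\beta_{ii} x_i^{2} + \sum_{i<j}\beta_{ij} x_i x_j + \varepsilon,
\]

whose stationary point provides the optimum settings of the factors, here the shot peening time.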
Background: The ejection fraction has frequently been used for the assessment of left ventricular function, but it can be associated with errors, for which reason the myocardial performance index has been used as another parameter to measure left ventricular function.
Objective: To select another echocardiographic parameter for the assessment of myocardial function instead of the ejection fraction.
Methods: 160 patients were referred to the echocardiography unit in the period from December 2007 to August 2008 requesting assessment of left ventricular function. After clinical examination, routine blood tests, chest x-ray and electrocardiographic recording were completed. All patients were informed to come to this unit after …
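The myocardial performance (Tei) index referred to above is commonly computed from Doppler time intervals as (a standard definition, not quoted from the abstract)

\[
\text{MPI} = \frac{\text{IVCT} + \text{IVRT}}{\text{ET}},
\]

where IVCT and IVRT are the isovolumetric contraction and relaxation times and ET is the ejection time.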
In this paper, estimates have been made for the parameters and the reliability function of the transmuted power function (TPF) distribution using several estimation methods, as a proposed new technique for the White, percentile, least squares, weighted least squares and modified moments methods. Simulation was used to generate random data following the TPF distribution in three experiments (E1, E2, E3) of real parameter values, with sample sizes (n = 10, 25, 50 and 100), (N = 1000) replicated samples, and reliability times (t > 0). Comparisons have been made between the results obtained from the estimators using the mean squared error (MSE). The results showed the …
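As background on the distribution named above, the transmuted power function CDF is generally built from the power function baseline through the quadratic rank transmutation map (generic notation that may differ from the paper's):

\[
G(x) = \left(\frac{x}{\theta}\right)^{\alpha},\quad 0 < x < \theta, \qquad
F(x) = (1+\lambda)\,G(x) - \lambda\,G(x)^{2},\quad \lvert\lambda\rvert \le 1,
\]

with reliability function \(R(t) = 1 - F(t)\), which is what the various estimation methods are compared on.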