Survival analysis is a modern branch of statistical analysis in which the dependent variable represents the time until the event of interest occurs. Many survival models describe the impact of explanatory factors on the likelihood of survival. Among the most important and widely used is the model proposed by David Cox, which consists of two components: a parametric part that does not depend on survival time and a nonparametric part that does, so the Cox model is described as a semi-parametric model. In contrast, fully parametric models depend on the parameters of the time-to-event distribution, such as the Exponential, Weibull, and Log-logistic models. This research adopts some Bayesian optimality criteria to achieve an optimal design for estimating the optimal survival time of patients with myocardial infarction, by constructing a parametric survival model based on the probability distribution of the patients' survival times. Myocardial infarction is among the most serious diseases threatening human life and a leading cause of death worldwide, and the survival time of patients varies with the factor or factors causing the injury; many factors lead to the disease, such as diabetes, high blood pressure, high cholesterol, psychological stress, and obesity. We therefore estimated the optimal survival time by constructing a model of the relationship between these risk factors and patient survival time, and found that the estimated optimal survival time is 18 days.
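To make the parametric-model idea concrete, the following is a minimal sketch of maximum-likelihood estimation for the simplest member of the family named above, the exponential survival model, with right-censored observations. The data values are hypothetical illustrations only; they are not the paper's dataset, and the sketch does not reproduce the paper's Bayesian optimal design.

```python
import math

def exponential_mle(times, events):
    """MLE of the rate of an exponential survival model with right censoring.

    times  : observed durations (e.g. days)
    events : 1 if the event occurred, 0 if the observation was censored
    The closed-form estimate is (number of events) / (total time at risk).
    """
    return sum(events) / sum(times)

def survival_at(t, rate):
    """Exponential survival function S(t) = exp(-rate * t)."""
    return math.exp(-rate * t)

# Hypothetical survival times (days); 0 marks a censored observation.
times = [5, 12, 18, 23, 30, 41]
events = [1, 1, 1, 0, 1, 0]

rate = exponential_mle(times, events)
mean_survival = 1.0 / rate   # mean of an exponential distribution
```

For a Weibull or log-logistic model the likelihood has no closed form and would be maximized numerically instead.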
In this research we present a multi-objective assignment model with a fuzzy objective function and build a crisp integer programming model after removing the fuzziness from the objective-function data, converting the fuzzy data to crisp data with the Pascal triangular graded mean, which applies the weights of Pascal's triangle to the points of each triangular fuzzy number.
The data were defuzzified using Excel 2007, and the multi-objective assignment model was solved with the LINDO package to reach the optimal solution, which represents the minimum time in which a given number of employees can accomplish a number of tasks. The research also addressed some of the
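The two steps described above, defuzzification followed by solving the crisp assignment problem, can be sketched as follows. The weights (1, 2, 1) for the Pascal triangular graded mean are one common form taken from Pascal's triangle; the fuzzy times are hypothetical, and enumeration stands in for the LINDO solver, which the paper actually uses.

```python
from itertools import permutations

def pascal_graded_mean(tfn):
    """Defuzzify a triangular fuzzy number (a, b, c) with Pascal's
    triangle weights 1, 2, 1: (a + 2b + c) / 4."""
    a, b, c = tfn
    return (a + 2 * b + c) / 4

# Hypothetical fuzzy completion times for 3 employees x 3 tasks.
fuzzy_times = [
    [(2, 4, 6), (3, 5, 9), (6, 8, 10)],
    [(1, 3, 5), (4, 6, 8), (2, 5, 6)],
    [(5, 7, 9), (2, 3, 4), (3, 4, 7)],
]

# Step 1: convert the fuzzy data to crisp data.
crisp = [[pascal_graded_mean(t) for t in row] for row in fuzzy_times]

# Step 2: solve the assignment problem by enumeration (fine for small n;
# an integer-programming solver would be used for larger instances).
n = len(crisp)
best_cost, best_perm = min(
    (sum(crisp[i][p[i]] for i in range(n)), p)
    for p in permutations(range(n))
)
```

Here `best_perm[i]` is the task assigned to employee `i` and `best_cost` the minimum total time.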
This study investigated the optimization of the wear behavior of AISI 4340 steel based on the Taguchi method under various testing conditions. A neural network and the Taguchi design method were implemented to minimize the wear rate of 4340 steel. A back-propagation neural network (BPNN) was developed to predict the wear rate; in the predictive model, wear parameters such as sliding speed, applied load, and sliding distance were taken as the input variables for the AISI 4340 steel. An analysis of variance (ANOVA) was used to determine the parameters significantly affecting the wear rate. Finally, the Taguchi approach was applied to determine
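Since wear rate is a "smaller the better" characteristic in Taguchi analysis, the signal-to-noise ratio used to rank parameter settings is SN = -10 log10(mean(y^2)). A minimal sketch, with hypothetical wear-rate replicates that are not the study's measurements:

```python
import math

def sn_smaller_the_better(values):
    """Taguchi S/N ratio for a 'smaller the better' response such as
    wear rate: SN = -10 * log10(mean of squared observations)."""
    mean_sq = sum(v * v for v in values) / len(values)
    return -10 * math.log10(mean_sq)

# Hypothetical wear-rate replicates (mm^3/m) at two parameter settings.
setting_a = [0.012, 0.015, 0.013]
setting_b = [0.020, 0.022, 0.019]

sn_a = sn_smaller_the_better(setting_a)
sn_b = sn_smaller_the_better(setting_b)
# The setting with the larger S/N ratio gives the lower, less variable wear.
```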
The present study aimed to identify the extent to which the content of social and national studies courses in the educational stages of the Kingdom of Saudi Arabia includes interactive thinking maps. To achieve this goal, the researcher used the descriptive analytical approach, with a content-analysis card as the study tool, which included a list of the types of thinking maps. The study sample consisted of all social and national studies courses at the elementary and intermediate levels, namely (12) student books in their first and second parts. After verifying the validity and reliability of the tool, it was applied to the study sample, and the study reached conclusions, inc
The aim of this research is to use a robust technique based on trimming, since maximum likelihood (ML) analysis often fails when the studied phenomenon contains outliers: the maximum likelihood estimator (MLE) loses its desirable properties because of the bad influence of the outliers. To address this problem, new statistical methods have been developed that are not affected by outliers; such methods possess robustness, or resistance. The maximum trimmed likelihood (MTL) is therefore a good alternative that achieves more acceptable and comparable results, and weights can further be used to increase the efficiency of the resulting estimates and the strength of the estimation, using the maximum weighted trimmed likelihood (MWTL). In order to perform t
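A minimal one-dimensional sketch of the trimming idea: for a normal location parameter, the maximum trimmed likelihood estimate keeps the h observations that fit best, which for sorted data is the contiguous window with the smallest sum of squared deviations from its own mean. The sample below is hypothetical, not the paper's data, and the unweighted MTL is shown rather than the MWTL extension.

```python
def trimmed_normal_mean(data, h):
    """Maximum trimmed likelihood estimate of a normal location
    parameter: search the contiguous windows of the sorted sample of
    size h and return the mean of the window with minimal SSE."""
    xs = sorted(data)
    best_sse, best_mean = float("inf"), None
    for i in range(len(xs) - h + 1):
        window = xs[i:i + h]
        m = sum(window) / h
        sse = sum((x - m) ** 2 for x in window)
        if sse < best_sse:
            best_sse, best_mean = sse, m
    return best_mean

# Five regular observations plus one gross outlier.
sample = [10.1, 9.8, 10.3, 9.9, 10.0, 55.0]
estimate = trimmed_normal_mean(sample, h=5)   # ≈ 10.02; the outlier is trimmed away
```

The plain mean of this sample is pulled above 17 by the outlier, while the trimmed estimate stays near the bulk of the data.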
The Dwiridj valley is an important drainage basin lying east of Iraq. In this study we applied two mathematical models to the valley's three sub-basins to obtain more accurate estimates of runoff volume, peak discharge, and time to peak, using remote sensing and GIS technology. The application of both models showed that the maximum discharge of the Dwiridj valley is about (1052 m3/s) according to the (SCS-CN) equation and about (1370.2 m3/s) by the (GIUH) approach; comparison with the field records of the Department of Dams and Reservoirs indicates that the discharge from the (SCS-CN) equation is not as accurate as that of the (GIUH) approach, and that the runoff volume of the valley was
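The SCS-CN method referenced above converts storm rainfall to direct runoff depth through the curve number CN. A minimal sketch of the standard equations, with a hypothetical storm and curve number rather than the Dwiridj basin's actual parameters:

```python
def scs_cn_runoff(p_mm, cn):
    """SCS Curve Number direct runoff depth (mm).

    S  = 25400 / CN - 254            (potential maximum retention, mm)
    Q  = (P - 0.2 S)^2 / (P + 0.8 S)  for P > 0.2 S, else Q = 0
    """
    s = 25400.0 / cn - 254.0
    ia = 0.2 * s                      # initial abstraction
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm + 0.8 * s)

# Hypothetical storm: 80 mm of rainfall on a basin with CN = 85.
runoff = scs_cn_runoff(80.0, 85)
```

Runoff volume then follows by multiplying the depth Q by the basin area, and routing (e.g. via GIUH) gives the peak discharge.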
The Bayesian approach promises useful features for the classification-tree regression model: it takes advantage of prior information, assembles trees over all the explanatory variables together and at every stage, and yields posterior information at each node in the construction of the classification tree. Although Bayesian estimates are generally accurate, the logistic model remains a good competitor for binary responses through its flexibility and mathematical representation. Three methods of data processing are therefore carried out in this research, namely: the logistic model, the classification and regression tree model, and the Bayesian regression tree model
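Of the three competing methods, the logistic model is the simplest to sketch. The following fits a one-predictor logistic regression by gradient ascent on the log-likelihood; the toy data are illustrative only, and the paper's tree-based and Bayesian competitors are not reproduced here.

```python
import math

def fit_logistic(xs, ys, lr=0.1, epochs=2000):
    """Fit P(y = 1 | x) = sigmoid(b0 + b1 * x) by batch gradient
    ascent on the Bernoulli log-likelihood."""
    b0 = b1 = 0.0
    for _ in range(epochs):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += y - p            # gradient wrt intercept
            g1 += (y - p) * x      # gradient wrt slope
        b0 += lr * g0 / len(xs)
        b1 += lr * g1 / len(xs)
    return b0, b1

# Toy binary responses: larger x makes y = 1 more likely.
xs = [-2.0, -1.5, -1.0, -0.5, 0.5, 1.0, 1.5, 2.0]
ys = [0, 0, 0, 0, 1, 1, 1, 1]

b0, b1 = fit_logistic(xs, ys)    # b1 comes out positive, as expected
```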
Since high blood pressure is one of the serious human diseases that a person can develop without feeling it, and since it is caused by many factors, it became necessary to research this subject and to reduce these many factors to a few specific causes by studying them using factor analysis.
The researcher arrived at five factors that explain 71% of the total variation in the phenomenon under study, where overweight, heavy alcohol consumption, smoking, and lack of exercise are the causes most influential in the incidence of this disease.
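In factor analysis, the share of total variance explained by a factor is its eigenvalue of the correlation matrix divided by the number of variables. A minimal sketch using power iteration for the leading eigenvalue; the correlation matrix below is hypothetical, not the study's data.

```python
def leading_eigenvalue(mat, iters=500):
    """Leading eigenvalue of a symmetric positive matrix by power
    iteration with max-norm scaling."""
    n = len(mat)
    v = [1.0] * n
    lam = 0.0
    for _ in range(iters):
        w = [sum(mat[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(abs(x) for x in w)
        v = [x / lam for x in w]
    return lam

# Hypothetical correlation matrix for three risk factors.
corr = [
    [1.0, 0.6, 0.5],
    [0.6, 1.0, 0.4],
    [0.5, 0.4, 1.0],
]

lam1 = leading_eigenvalue(corr)
share = lam1 / 3.0   # proportion of total variance explained by factor 1
```

Repeating this after deflation (or using a full eigendecomposition) gives the cumulative percentage of variance, the quantity reported as 71% in the study.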
Statistical methods of forecasting have been applied with the intention of constructing a model to predict the number of old aged people in retirement homes in Iraq. They were based on the monthly data of old aged people in Baghdad and the governorates, except for the Kurdistan region, from 2016 to 2019. Using Box-Jenkins methodology, the stationarity of the series was examined, the appropriate model order was determined, the parameters were estimated, their significance was tested, the adequacy of the model was checked, and then the best model was used for prediction. The best model for forecasting according to the criteria (Normalized BIC, MAPE, RMSE) is ARIMA (0, 1, 2).
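An ARIMA(0, 1, 2) model is an MA(2) on the first differences: d_t = e_t + θ1 e_{t-1} + θ2 e_{t-2}. The following sketch shows the one-step forecast recursion; the counts and the θ coefficients are hypothetical illustrations (in practice the coefficients come from a fitted model, e.g. via statsmodels), and the drift constant is omitted for brevity.

```python
def arima_012_forecast(series, theta1, theta2):
    """One-step forecast from an ARIMA(0,1,2) model, i.e. an MA(2) on
    first differences: d_t = e_t + theta1*e_{t-1} + theta2*e_{t-2}.
    Residuals are built recursively assuming e_0 = e_{-1} = 0."""
    diffs = [b - a for a, b in zip(series, series[1:])]
    e_prev, e_prev2 = 0.0, 0.0
    for d in diffs:
        e = d - theta1 * e_prev - theta2 * e_prev2   # innovation at t
        e_prev2, e_prev = e_prev, e
    next_diff = theta1 * e_prev + theta2 * e_prev2   # E[d_{t+1}]
    return series[-1] + next_diff

# Hypothetical monthly counts of retirement-home residents.
counts = [120, 122, 121, 125, 127, 126, 130]
forecast = arima_012_forecast(counts, theta1=0.4, theta2=0.2)
```

Forecasts more than two steps ahead revert to the last level plus drift, since the MA(2) innovations have no further effect.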