Survival analysis is a modern method of analysis in which the dependent variable represents the time until the event of interest occurs. Many survival models deal with the impact of explanatory factors on the likelihood of survival, including the model proposed by the scientist David Cox, one of the most important and widely used survival models. The Cox model consists of two components: a parametric function that does not depend on survival time, and a nonparametric function that does depend on survival time, which is why the Cox model is described as semi-parametric. In contrast, fully parametric models depend on the parameters of the time-to-event distribution, such as the Exponential, Weibull and Log-logistic models. Our research aims to adopt some Bayesian optimality criteria to achieve an optimal design for estimating the optimal survival time of patients with myocardial infarction, by constructing a parametric survival model based on the probability distribution of the patients' survival times. Myocardial infarction is among the most serious diseases threatening human life and a leading cause of death worldwide, and the survival time of patients varies with the factor or factors causing the injury: many factors lead to the disease, such as diabetes, high blood pressure, high cholesterol, psychological stress and obesity. We therefore estimate the optimal survival time by constructing a model of the relationship between these risk factors and patient survival time, and we find that the optimal survival time is 18 days.
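As a minimal sketch of the parametric approach described above, the following fits a Weibull survival model by maximum likelihood and reads off the median survival time. The data here are synthetic and illustrative, not the study's patient data, and the shape/scale values are assumptions:

```python
import numpy as np
from scipy import stats

# Hypothetical survival times in days (illustrative only -- not the study's data).
rng = np.random.default_rng(42)
times = stats.weibull_min.rvs(c=1.5, scale=20, size=200, random_state=rng)

# Maximum-likelihood fit of a two-parameter Weibull model (location fixed at 0).
shape, loc, scale = stats.weibull_min.fit(times, floc=0)

# Median survival time of the fitted distribution: scale * (ln 2)^(1/shape).
median_t = scale * np.log(2) ** (1.0 / shape)
print(f"shape={shape:.2f}, scale={scale:.2f}, median survival = {median_t:.1f} days")
```

The same maximum-likelihood machinery applies to the Exponential and Log-logistic alternatives mentioned in the abstract, so candidate distributions can be compared on the observed survival times before the optimal-design step.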
This study discusses a biased estimator of the Negative Binomial regression model known as the Liu estimator. This estimator was used to reduce variance and overcome the problem of multicollinearity between explanatory variables. Other estimators were also considered, such as the ridge regression and maximum likelihood estimators. This research aims at theoretical comparisons between the new Liu estimator and the other estimators.
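To make the idea concrete, here is a minimal sketch of the Liu estimator in the familiar linear-model form, beta_d = (X'X + I)^(-1)(X'X + dI) beta_OLS with 0 < d < 1; the negative binomial version in the study replaces the OLS step with the maximum likelihood estimator, which is not shown here. All data and the choice d = 0.5 are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 3
X = rng.normal(size=(n, p))
X[:, 2] = X[:, 1] + 0.01 * rng.normal(size=n)   # induce strong multicollinearity
y = X @ np.array([1.0, 2.0, -1.0]) + rng.normal(size=n)

XtX = X.T @ X
beta_ols = np.linalg.solve(XtX, X.T @ y)        # unbiased but high-variance here

d = 0.5                                         # Liu biasing parameter; d = 1 recovers OLS
I = np.eye(p)
beta_liu = np.linalg.solve(XtX + I, (XtX + d * I) @ beta_ols)
```

The shrinkage introduced by d trades a small bias for a reduction in variance, which is the motivation for the estimator when X'X is ill-conditioned.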
Given the importance of increasing economic openness, transport companies currently face various issues, since openness requires importing different types of goods with different means of transport. These companies therefore pay great attention to reducing the total costs of transporting commodities from their sources to their destinations using a number of means of transport. Most private companies lack knowledge of operations research methods, especially transportation models, through which total costs can be reduced, hence the importance of and need for solving this problem. This research presents a proposed method based on the sums of the Total Costs (Tc) of rows and columns, in order to arrive at the initial solution.
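The abstract does not give the full procedure, so the following is only a sketch under assumptions: it computes the row and column total costs (Tc) that the proposed method aggregates, and, for comparison, builds a standard least-cost initial allocation on a small hypothetical balanced instance:

```python
import numpy as np

# Hypothetical balanced transportation instance (illustrative only).
cost = np.array([[4, 6, 8],
                 [5, 7, 6],
                 [6, 5, 7]], dtype=float)
supply = np.array([120.0, 150.0, 130.0])
demand = np.array([100.0, 160.0, 140.0])

# Row and column total costs (Tc), the quantities the proposed method sums.
row_tc = cost.sum(axis=1)
col_tc = cost.sum(axis=0)

# Standard least-cost method for an initial basic feasible solution.
c = cost.copy()
s, d = supply.copy(), demand.copy()
alloc = np.zeros_like(cost)
while s.sum() > 1e-9:
    i, j = np.unravel_index(np.argmin(c), c.shape)  # cheapest remaining cell
    q = min(s[i], d[j])
    alloc[i, j] = q
    s[i] -= q
    d[j] -= q
    if s[i] <= 1e-9:
        c[i, :] = np.inf                            # row exhausted
    if d[j] <= 1e-9:
        c[:, j] = np.inf                            # column exhausted
```

An initial solution such as `alloc` is then typically refined toward optimality with a method like MODI; the proposed Tc-based rule would replace the cheapest-cell selection step.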
Mixed-effects conditional logistic regression is evidently more effective in studying qualitative differences in longitudinal pollution data, as well as their implications for heterogeneous subgroups. This study shows that conditional logistic regression is a robust evaluation method for environmental studies, through the analysis of environmental pollution as a function of oil production and environmental factors. It has accordingly been established theoretically that the primary objective of model selection in this research is to identify the candidate model that is optimal for the conditional design. The candidate model should achieve generalizability, goodness of fit and parsimony, and establish an equilibrium between bias and variability.
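One common way to operationalize the balance between goodness of fit and parsimony described above is through information criteria; the abstract does not say which criteria the study used, so this sketch with hypothetical candidate models and log-likelihoods is an assumption:

```python
import math

def aic(loglik, k):
    # Akaike information criterion: 2k - 2*loglik; penalizes extra parameters.
    return 2 * k - 2 * loglik

def bic(loglik, k, n):
    # Bayesian information criterion: heavier penalty as the sample grows.
    return k * math.log(n) - 2 * loglik

# Hypothetical candidates: (maximized log-likelihood, number of parameters).
candidates = {"M1": (-520.3, 4), "M2": (-515.9, 7), "M3": (-515.1, 12)}
n = 300
scores = {m: (aic(ll, k), bic(ll, k, n)) for m, (ll, k) in candidates.items()}
best_aic = min(scores, key=lambda m: scores[m][0])
```

Note that AIC and BIC can disagree: here the richer M2 wins on AIC while the heavier sample-size penalty of BIC favours the most parsimonious M1, which is exactly the bias/variability trade-off the abstract refers to.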
Abstract
Travel time estimation and reliability measurement are important issues for improving the operational efficiency and safety of road traffic networks. The aim of this research is to estimate total travel time and analyse its distribution for three selected links on Palestine Arterial Street in Baghdad city. A higher buffer time index indicates worse reliability conditions. Link (2), from Bab Al-Mutham intersection to Al-Sakara intersection, produced a buffer index of about 36%, compared with 26% for Link (1), from Al-Mawall intersection to Bab Al-Mutham intersection, and 24% for Link (3). These results illustrate that reliability is worst for Link (2).
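The buffer time index compared above is the extra ("buffer") time a traveller must allow beyond the average trip, usually defined from the 95th-percentile travel time. A minimal sketch, with hypothetical travel times rather than the study's measurements:

```python
import numpy as np

def buffer_index(travel_times):
    """Buffer time index: (95th-percentile travel time - mean) / mean."""
    t = np.asarray(travel_times, dtype=float)
    return (np.percentile(t, 95) - t.mean()) / t.mean()

# Hypothetical link travel times in minutes (illustrative only).
link_times = [12, 13, 14, 15, 16, 22, 25]
print(f"Buffer index: {buffer_index(link_times):.0%}")
```

A buffer index of 36% means a trip with a 20-minute mean needs roughly 7 extra minutes of buffer to arrive on time on 95% of days, which is why larger values signal worse reliability.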
In this study, we focus on random coefficient estimation for the general regression and Swamy models of panel data. Using this type of data gives a better chance of obtaining better methods and better indicators. Entropy methods have been used to estimate the random coefficients of the general regression and Swamy models of panel data in two ways: the first is the maximum dual entropy method and the second is the generalized maximum entropy method; a comparison between them has been carried out by simulation to choose the optimal method.
The results were compared using mean squared error and mean absolute percentage error for different cases in terms of correlation values.
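The two comparison criteria used above are standard; a minimal sketch of both (the example values are illustrative, not the simulation's results):

```python
import numpy as np

def mse(actual, predicted):
    # Mean squared error: average squared deviation.
    a, p = np.asarray(actual, float), np.asarray(predicted, float)
    return np.mean((a - p) ** 2)

def mape(actual, predicted):
    # Mean absolute percentage error, in percent; requires actual values != 0.
    a, p = np.asarray(actual, float), np.asarray(predicted, float)
    return np.mean(np.abs((a - p) / a)) * 100

actual = [10.0, 20.0, 30.0]
predicted = [12.0, 18.0, 33.0]
```

MSE emphasizes large errors through squaring, while MAPE is scale-free, which is why reporting both gives a fuller picture across cases with different correlation values.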
Abstract
Objective(s): A descriptive study aimed to determine nurses' knowledge of chest physiotherapy techniques for patients with coronavirus disease and to examine the relationship between nurses' knowledge and their socio-demographic characteristics.
Methodology: The study was conducted in the isolation units of Al-Hussein teaching hospitals in Thi-Qar, Iraq, for the period from June 1st, 2022 to November 27th, 2022. A non-probability (purposive) sample comprised 41 nurses. A questionnaire consisting of two parts was used for data collection: the first part comprises socio-demographic features, and the second part includes a self-administered questionnaire sheet.
The objective of this study is to examine the properties of Bayes estimators of the shape parameter of the Power Function Distribution (PFD-I), using two different prior distributions for the parameter θ and different loss functions, and to compare them with the maximum likelihood estimators. In many practical applications we may have two different pieces of prior information about the prior distribution of the shape parameter of the Power Function Distribution, which influences the parameter estimation. We therefore use two different conjugate priors for the shape parameter θ of the Power Function Distribution.
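For the standard power function density f(x; θ) = θx^(θ-1) on (0, 1), the gamma prior is conjugate: with T = -Σ ln x_i, a Gamma(a, b) prior gives a Gamma(a + n, b + T) posterior, and under squared error loss the Bayes estimator is the posterior mean. A minimal sketch with synthetic data and hypothetical hyperparameters (the study's specific priors and loss functions may differ):

```python
import numpy as np

rng = np.random.default_rng(7)
theta_true = 2.0
n = 50
# Power function distribution on (0, 1): F(x) = x**theta, sampled by inversion.
x = rng.uniform(size=n) ** (1.0 / theta_true)

T = -np.log(x).sum()          # sufficient statistic
theta_mle = n / T             # maximum likelihood estimator of theta

# Conjugate Gamma(a, b) prior on theta -> posterior Gamma(a + n, b + T).
a, b = 2.0, 1.0               # hypothetical prior hyperparameters
theta_bayes = (a + n) / (b + T)   # posterior mean = Bayes estimator under squared error loss
```

As n grows, T dominates a and b, so the Bayes estimator converges to the MLE; the prior matters most in small samples, which is where the comparison of estimators is informative.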
Polish Academy of Sciences
NeighShrink is an efficient image denoising algorithm based on the discrete wavelet transform (DWT). Its disadvantage is that it uses a suboptimal universal threshold and an identical neighbouring window size in all wavelet subbands. Dengwen and Wengang proposed an improved method that can determine an optimal threshold and neighbouring window size for every subband by Stein's unbiased risk estimate (SURE). Its denoising performance is considerably superior to NeighShrink, and it also outperforms SURE-LET, an up-to-date denoising algorithm based on the SURE. In this paper, different wavelet transform families are used with this improved method; the results show that the Haar wavelet has the lowest performance among the tested wavelet families.
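To illustrate the NeighShrink rule itself, here is a minimal one-dimensional sketch using a hand-rolled single-level Haar transform and the universal threshold (the suboptimal choice the abstract criticizes); the SURE-optimized per-subband threshold and window of the improved method, and the 2-D image setting, are not reproduced here. The signal and noise level are assumptions:

```python
import numpy as np

def haar_dwt(x):
    # One-level orthonormal Haar transform: approximation and detail coefficients.
    x = np.asarray(x, dtype=float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return a, d

def haar_idwt(a, d):
    x = np.empty(a.size + d.size)
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def neighshrink(d, lam, win=3):
    # NeighShrink: shrink each coefficient by max(0, 1 - lam^2 / S^2), where
    # S^2 sums squared coefficients over a neighbouring window around it.
    pad = win // 2
    dp = np.pad(d, pad)
    s2 = np.array([np.sum(dp[k:k + win] ** 2) for k in range(d.size)])
    factor = np.clip(1.0 - lam ** 2 / np.maximum(s2, 1e-12), 0.0, None)
    return d * factor

rng = np.random.default_rng(1)
n = 256
t = np.linspace(0, 1, n)
clean = np.sin(2 * np.pi * 4 * t)
noisy = clean + 0.3 * rng.normal(size=n)

a, d = haar_dwt(noisy)
sigma = np.median(np.abs(d)) / 0.6745      # robust noise estimate from detail coefficients
lam = sigma * np.sqrt(2 * np.log(n))       # universal threshold (the suboptimal choice)
denoised = haar_idwt(a, neighshrink(d, lam))
```

The improved method replaces `lam` and `win` with per-subband values minimizing SURE, which is where its performance gain over this fixed-threshold version comes from.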
The research includes the evaluation of projects that were implemented and entered the trial operation period, in accordance with the evaluation criteria of cost, quality and time, to determine the size of the deviation gap for the sample of projects during the assessment years (2011, 2012, 2013 and 2014) for each of the three evaluation criteria, followed by calculating the size of the overall gap. Based on the research problem, deviations in the implementation of each project are determined by answering several questions that reveal the reasons for these deviations.
The importance of the research lies in its focus on the evaluation of projects received from the contractors executing the projects.
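The deviation-gap calculation described above can be sketched as a percentage deviation of actual from planned values per criterion, aggregated into an overall gap; the abstract does not define the exact formula, so this averaging scheme and all figures are illustrative assumptions:

```python
def deviation_pct(planned, actual):
    # Percentage deviation of actual from planned (positive = overrun/shortfall).
    return (actual - planned) / planned * 100

# Hypothetical figures for one project (illustrative only).
project = {
    "cost":    deviation_pct(planned=1_000_000, actual=1_150_000),  # 15% cost overrun
    "time":    deviation_pct(planned=24, actual=30),                # months: 25% delay
    "quality": deviation_pct(planned=100, actual=92),               # score: -8% shortfall
}

# One possible overall gap: mean absolute deviation across the three criteria.
overall_gap = sum(abs(v) for v in project.values()) / len(project)
```

Computing this per project and per assessment year yields the gap matrix from which the study's overall gap and the reasons for deviations are then examined.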