Survival analysis describes the time until the occurrence of an event of interest, such as death or another event that determines what happens to the phenomenon under study. There may be more than one endpoint for the event, in which case the setting is called competing risks. The purpose of this research is to apply the dynamic approach to the analysis of discrete survival time in order to estimate the effect of covariates over time, as well as to model the nonlinear relationship between the covariates and the discrete hazard function, using the multinomial logistic model and the multivariate Cox model. To estimate both the discrete hazard function and the time-dependent parameters, two Bayesian estimation methods based on dynamic modeling were used. The first is the Maximum A Posteriori (MAP) method, implemented numerically with Iteratively Weighted Kalman Filter Smoothing (IWKFS) in combination with the Expectation-Maximization (EM) algorithm. The second is the Hybrid Markov Chain Monte Carlo (HMCMC) method, using the Metropolis-Hastings (MH) algorithm and Gibbs sampling (GS). It was concluded that survival analysis that discretizes time into a set of intervals is more flexible, as it allows analyzing hazards and diagnosing effects that vary over time. The study was applied to the survival of patients on dialysis until either death from kidney failure or the competing event, kidney transplantation, occurred. The most important variables affecting the patient's cessation of dialysis were also identified for both events.
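As a concrete illustration of the discrete competing-risks setup described above, the minimal sketch below computes cause-specific discrete hazards from a multinomial logit. The covariates, coefficient values, and the two causes are illustrative assumptions, not the paper's fitted model.

```python
import numpy as np

def discrete_hazards(x, betas):
    """Cause-specific discrete hazards from a multinomial logit.

    x     : covariate vector for one subject in a given time interval
    betas : one coefficient vector per competing cause
    The reference category (the 1 in the denominator) is surviving the interval.
    """
    scores = np.array([b @ x for b in betas])
    exps = np.exp(scores)
    return exps / (1.0 + exps.sum())

# Illustrative covariates and coefficients for two competing causes:
# cause 1 = death on dialysis, cause 2 = kidney transplantation.
x = np.array([1.0, 0.5])                 # [intercept, standardized age]
betas = [np.array([-3.0, 0.8]),          # hypothetical, not the fitted model
         np.array([-2.5, -0.4])]
print(discrete_hazards(x, betas))        # cause-specific hazards this interval
```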
Machining residual stresses correlate closely with the cutting parameters and the tool geometry. This work investigates the effect of cutting speed, feed rate, and depth of cut on the surface residual stress of AISI 1045 steel after a face-milling operation. After each milling test, the residual stress on the surface of the workpiece was measured using the X-ray diffraction technique. Design of Experiments (DOE) software was employed, using the response surface methodology (RSM) technique with a central composite rotatable design, to build a mathematical model relating the input variables to the response. The results showed that both …
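To make the design concrete, the sketch below generates the 20-run coded central composite rotatable design for three factors and fits a full second-order response-surface model by least squares. The "true" coefficients and noise level are illustrative assumptions, not the paper's measured data.

```python
import itertools
import numpy as np

# Central composite rotatable design for 3 factors in coded units:
# 8 factorial points, 6 axial points at alpha = (2^3)^(1/4), 6 center points.
alpha = 2 ** 0.75
factorial = np.array(list(itertools.product([-1.0, 1.0], repeat=3)))
axial = np.vstack([sign * alpha * np.eye(3)[i]
                   for i in range(3) for sign in (-1.0, 1.0)])
X = np.vstack([factorial, axial, np.zeros((6, 3))])   # 20 runs

def second_order(X):
    """Full quadratic RSM model: intercept, linear, interaction, square terms."""
    v, f, d = X.T   # coded cutting speed, feed rate, depth of cut
    return np.column_stack([np.ones(len(X)), v, f, d,
                            v * f, v * d, f * d, v**2, f**2, d**2])

# Simulated residual-stress response (MPa) from hypothetical coefficients.
rng = np.random.default_rng(0)
true_coef = np.array([-160.0, 15.0, -20.0, 8.0, 5.0, 0.0, 3.0, -6.0, 4.0, 2.0])
y = second_order(X) @ true_coef + rng.normal(0.0, 3.0, len(X))

coef, *_ = np.linalg.lstsq(second_order(X), y, rcond=None)
print(coef)   # recovered second-order model coefficients
```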
In this work, a class of stochastically perturbed differential systems with standard Brownian motion, arising from an ordinary unperturbed differential system, is considered and studied. The necessary conditions for the existence of a unique solution of the stochastically perturbed semi-linear system of differential equations are suggested and supported by concluding remarks. Some theoretical results concerning the mean-square exponential stability of the nominal unperturbed deterministic differential system and its equivalent stochastically perturbed system, with the deterministic and stochastic process as random noise, have been stated and proved. The proofs of the obtained results are based on the stochastic quadratic Lyapunov function method.
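As a small numerical illustration of mean-square behavior under stochastic perturbation, the sketch below simulates a linear system with multiplicative Brownian noise by Euler-Maruyama and estimates E||X(t)||². The matrix, noise level, and step size are illustrative assumptions, not the system studied in the paper.

```python
import numpy as np

# Euler-Maruyama simulation of a stochastically perturbed linear system
# dX = A X dt + sigma X dW. A is Hurwitz, so the nominal system is stable;
# all values here are illustrative assumptions.
rng = np.random.default_rng(1)
A = np.array([[-1.0, 0.5],
              [0.0, -2.0]])
sigma, dt, steps, paths = 0.3, 1e-3, 5000, 200

x = np.tile([1.0, 1.0], (paths, 1))              # 200 sample paths from x0
for _ in range(steps):
    dw = rng.normal(0.0, np.sqrt(dt), (paths, 1))
    x = x + (x @ A.T) * dt + sigma * x * dw      # drift + multiplicative noise

print(np.mean(np.sum(x**2, axis=1)))  # empirical E||X(t)||^2, small if MS-stable
```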
... Show MoreIn this paper we show that if ? Xi is monotonically T2-space then each Xi is monotonically T2-space, too. Moreover, we show that if ? Xi is monotonically normal space then each Xi is monotonically normal space, too. Among these results we give a new proof to show that the monotonically T2-space property and monotonically normal space property are hereditary property and topologically property and give an example of T2-space but not monotonically T2-space.
Fractal image compression depends on representing an image using affine transformations. The main concern for research on the fractal image compression (FIC) algorithm is to decrease the encoding time needed to compress image data. The basic idea is that each portion of the image is similar to other portions of the same image, and many models have been developed around this process. The presence of fractals was initially noticed and handled using the Iterated Function System (IFS), which is used for encoding images. In this paper, fractal image compression and its variants are reviewed along with other techniques. A summarized review of contributions is given to assess the fulfillment of fractal image compression.
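To make the IFS encoding step concrete, the sketch below shows the core search of fractal encoding: for one range block, find the domain block and contractive affine map (scale s, offset o) minimizing the collage error. It assumes the domain blocks have already been downsampled to the range-block size; the code is a generic illustration, not a specific FIC implementation from the review.

```python
import numpy as np

def best_domain_match(range_block, domain_blocks):
    """Core search of fractal (IFS) encoding: for one range block R, find the
    domain block D and contractive affine map s*D + o minimizing the error.
    Assumes domain blocks were already downsampled to the range-block size."""
    r = range_block.ravel().astype(float)
    best = (np.inf, None, 0.0, 0.0)
    for idx, D in enumerate(domain_blocks):
        d = D.ravel().astype(float)
        var = d.var()
        # Least-squares scale and offset for s*d + o ~ r.
        s = (d - d.mean()) @ (r - r.mean()) / (len(d) * var) if var > 0 else 0.0
        s = float(np.clip(s, -1.0, 1.0))          # keep the map contractive
        o = r.mean() - s * d.mean()
        err = np.sum((s * d + o - r) ** 2)
        if err < best[0]:
            best = (err, idx, s, o)
    return best                                   # (error, index, scale, offset)

# Toy usage with random 4x4 blocks (a real encoder tiles the whole image).
rng = np.random.default_rng(0)
blocks = rng.integers(0, 256, size=(8, 4, 4))
print(best_domain_match(blocks[0], blocks[1:]))
```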
Regression models are among the most important models used in modern studies, especially in research and health studies, because of the important results they achieve. Two regression models were used: the Poisson regression model and the Conway-Maxwell-Poisson model. This study aimed to compare the two models and choose the better of them using simulation, at different sample sizes (n = 25, 50, 100) and with r = 1000 replications. MATLAB was used to conduct the simulation experiment. The results showed the superiority of the Poisson model according to the mean squared error (MSE) criterion and also the Akaike information criterion (AIC) for the same distribution.
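The study used MATLAB; as a minimal illustration in Python, one simulation replicate might look like the following, where the design and coefficients are illustrative assumptions. The Conway-Maxwell-Poisson fit would be scored with the same MSE and AIC criteria.

```python
import numpy as np
import statsmodels.api as sm

# One replicate (of r = 1000) at sample size n: generate Poisson counts from
# a log-linear model, fit Poisson regression, and record MSE and AIC.
rng = np.random.default_rng(42)
n, beta = 100, np.array([0.5, 0.3])    # hypothetical true coefficients

x = rng.normal(size=n)
X = sm.add_constant(x)
y = rng.poisson(np.exp(X @ beta))

fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
mse = np.mean((y - fit.mu) ** 2)       # mean squared error criterion
print(fit.params, mse, fit.aic)        # compare AIC across candidate models
```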
An integral transformation maps a function from its original function space into a transformed space in which the function is easily characterized and manipulated through integration. The two-parametric form of the SEE transformation and its basic properties are demonstrated in this study. The transform of a few fundamental functions, along with its time-derivative rule, is shown. It is demonstrated how the two-parametric SEE transformation can be used to solve linear differential equations, and a solution to the population growth rate equation is provided. These results can be compared with other Laplace-type transformations.
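As a minimal worked example of how such a transform solves the population growth equation, assume a Laplace-type derivative rule $T\{P'(t)\} = u\,\bar{P}(u) - P(0)$ (the SEE transform carries two parameters, suppressed here for brevity). Applying $T$ to $P'(t) = kP(t)$ with $P(0) = P_0$ gives

\[
u\,\bar{P}(u) - P_0 = k\,\bar{P}(u)
\;\Longrightarrow\;
\bar{P}(u) = \frac{P_0}{u-k}
\;\Longrightarrow\;
P(t) = P_0\,e^{kt},
\]

where the last step inverts the transform, $1/(u-k)$ being the image of $e^{kt}$ under a Laplace-type kernel.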
One of the wellbore instability problems in vertical wells in the Zubair oilfield is breakouts. Breakouts that exceed their critical limits produce problems such as lost circulation, which adds to non-productive time (NPT), thus increasing the loss in costs and in total revenues. In this paper, three of the available rock failure criteria (Mohr-Coulomb, Mogi-Coulomb, and Modified Lade) are used to study and predict the occurrence of breakouts. It is found that the breakout width in the Tanuma shaly formation exceeds the allowable breakout limit, as predicted using the Mohr-Coulomb criterion. An increase in the pore pressure was also predicted in the Tanuma shaly formation; thus, a new mud weight and casing pr…
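As an illustration of the kind of prediction involved, the sketch below combines the standard Kirsch hoop-stress solution at a vertical wellbore wall with the Mohr-Coulomb criterion to estimate breakout width. All stresses and rock properties are illustrative assumptions, not Zubair field data, and the paper's full workflow is not reproduced here.

```python
import numpy as np

# Breakout-width sketch at a vertical wellbore wall: Kirsch hoop stress plus
# the Mohr-Coulomb shear failure criterion. All values are hypothetical.
sH, sh = 45.0, 30.0                  # max / min horizontal in-situ stresses (MPa)
Pw, Pp = 15.0, 12.0                  # mud (wellbore) pressure, pore pressure (MPa)
UCS, phi = 60.0, np.radians(30.0)    # rock strength (MPa) and friction angle
q = (1 + np.sin(phi)) / (1 - np.sin(phi))        # Mohr-Coulomb slope

theta = np.radians(np.arange(0.0, 360.0, 0.5))   # angle from the sH direction
s_theta = sH + sh - 2 * (sH - sh) * np.cos(2 * theta) - Pw   # hoop stress
s_r = Pw                                         # radial stress at the wall

fails = (s_theta - Pp) >= UCS + q * (s_r - Pp)   # shear failure at the wall
width = fails.sum() * 0.5 / 2                    # degrees per breakout lobe
print(f"predicted breakout width ~ {width:.1f} deg per lobe")
```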
Nowadays, the power industry is changing from a centralized and vertically integrated form into regional, competitive, and functionally separate units. This is done with the future aims of increasing efficiency through better management and better employment of existing equipment, and of lowering the price of electricity for all types of customers while retaining a reliable system. This research aims to solve the optimal power flow (OPF) problem. The OPF is used to minimize the total generation fuel cost function. Optimal power flow may have a single-objective or a multi-objective function. In this thesis, an attempt is made to minimize the objective function while keeping the voltage magnitudes of all load buses, the real outp…
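A minimal sketch of the cost-minimization core of OPF follows: an economic dispatch with quadratic fuel costs, a power-balance constraint, and generator limits. The cost coefficients, demand, and limits are illustrative assumptions; a full OPF would add the network power-flow and bus-voltage constraints the thesis describes.

```python
import numpy as np
from scipy.optimize import minimize

# Quadratic fuel cost per generator: a*P^2 + b*P + c (hypothetical values).
a = np.array([0.004, 0.006, 0.009])   # $/MW^2 h
b = np.array([5.3, 5.5, 5.8])         # $/MWh
c = np.array([500.0, 400.0, 200.0])   # $/h
demand = 800.0                        # MW, network losses neglected

cost = lambda P: np.sum(a * P**2 + b * P + c)
res = minimize(cost, x0=np.full(3, demand / 3),
               bounds=[(100, 450), (100, 350), (100, 225)],
               constraints={"type": "eq", "fun": lambda P: P.sum() - demand})
print(res.x, cost(res.x))   # optimal dispatch (MW) and minimum fuel cost ($/h)
```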
In this paper, we deal with games with fuzzy payoffs, where there is uncertainty in the data. We use the trapezoidal membership function to transform the data into fuzzy numbers and utilize three different ranking-function algorithms. We then compare these three ranking algorithms, using trapezoidal fuzzy numbers, so that the decision maker can obtain the best gains.
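For illustration, the sketch below ranks trapezoidal fuzzy payoffs (a, b, c, d) with one common centroid-style ranking function R = (a + b + c + d) / 4. This is an illustrative choice, not necessarily one of the paper's three algorithms, and the payoffs are hypothetical.

```python
# Rank trapezoidal fuzzy numbers (a, b, c, d) by a centroid-style function.
def rank(tfn):
    a, b, c, d = tfn
    return (a + b + c + d) / 4.0

payoffs = {"strategy A": (2.0, 3.0, 4.0, 6.0),    # hypothetical fuzzy payoffs
           "strategy B": (1.0, 3.0, 4.5, 5.5)}
best = max(payoffs, key=lambda s: rank(payoffs[s]))
print({s: rank(t) for s, t in payoffs.items()}, "->", best)
```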
Digital forensics is the part of forensic science that covers crime related to computers and other digital devices. Academic studies have been interested in digital forensics for some time, with researchers aiming to establish a discipline based on scientific structures and to define a model reflecting their observations. This paper suggests a model to improve the whole investigation process and obtain accurate and complete evidence, and it adopts securing the digital evidence with cryptographic algorithms so that reliable evidence can be presented in a court of law. The paper also presents the main and basic concepts of the frameworks and models used in digital forensics investigation.
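One common way cryptography secures digital evidence is integrity hashing: record a digest at acquisition, then re-hash later to prove the evidence was not altered. The sketch below shows this with SHA-256; the file name is a hypothetical example, and the paper's model may combine hashing with encryption or digital signatures.

```python
import hashlib

def evidence_digest(path: str, chunk_size: int = 1 << 20) -> str:
    """SHA-256 digest of an evidence file, read in chunks to handle large images."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

acquired = evidence_digest("disk_image.dd")          # stored in custody log
assert evidence_digest("disk_image.dd") == acquired  # later integrity check
```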