DYNAMIC MODELING OF TIME-VARYING ESTIMATION FOR DISCRETE SURVIVAL ANALYSIS FOR DIALYSIS PATIENTS IN BASRAH, IRAQ

Survival analysis is widely applied to data described by the length of time until the occurrence of an event of interest, such as death or other important events. The purpose of this paper is to use the dynamic methodology, which provides a flexible approach to the analysis of discrete survival time, to estimate the time-varying effects of covariates for dialysis patients with kidney failure followed until death. The estimation process is entirely Bayesian and uses two methods: maximum a posteriori (MAP) estimation, carried out with iteratively weighted Kalman filter smoothing (IWKFS) combined with the expectation-maximization (EM) algorithm, and the hybrid Markov chain Monte Carlo (HMCMC) method. Two hazard function models were considered in the comparison: the logistic model and the discrete Cox model. Two criteria were used for the comparisons: the average mean square error (AMSE) and the cross-entropy error (CEE). All four combinations of estimation method and hazard model are examined through the numerical results, which show the superiority of the HMCMC method under both hazard models.
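For readers unfamiliar with the two hazard models named above, a common way of writing dynamic discrete-time hazard models is sketched below; the notation (time-varying coefficients γ_t, evolution covariance Q) is ours, and the paper's exact specification may differ.

```latex
% A common formulation of dynamic discrete-time hazard models (illustrative;
% the paper's exact specification may differ).  lambda_i(t) is the discrete
% hazard of subject i at period t; the coefficients evolve as a random walk,
% which is what a Kalman filter smoother or MCMC sampler estimates.
\[
\lambda_i(t) = P(T_i = t \mid T_i \ge t,\ x_i)
\]
\[
\text{logistic link: } \lambda_i(t) = \frac{\exp(\gamma_{0t} + x_i^{\top}\gamma_t)}{1 + \exp(\gamma_{0t} + x_i^{\top}\gamma_t)},
\qquad
\text{discrete Cox (grouped) link: } \lambda_i(t) = 1 - \exp\!\bigl(-\exp(\gamma_{0t} + x_i^{\top}\gamma_t)\bigr)
\]
\[
\gamma_t = \gamma_{t-1} + \xi_t, \qquad \xi_t \sim N(0, Q) \quad \text{(random-walk evolution of the time-varying effects)}
\]
```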

Publication Date
Tue Dec 01 2020
Journal Name
Gulf Economist
The Bayesian Estimation in Competing Risks Analysis for Discrete Survival Data under Dynamic Methodology with Application to Dialysis Patients in Basra/ Iraq

Survival analysis is a type of data analysis that describes the time until the occurrence of an event of interest, such as death or another event important for the phenomenon under study. When there is more than one possible endpoint, the setting is called competing risks. The purpose of this research is to apply the dynamic approach to the analysis of discrete survival time in order to estimate the effects of covariates over time, and to model the nonlinear relationship between the covariates and the discrete hazard function through the multinomial logistic model and the multivariate Cox model. For the purpose of conducting the estimation process for both the discrete …
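As a reference for the competing-risks construction described above, one standard multinomial-logit form of the cause-specific discrete hazards is sketched here; the notation is an assumption, not taken from the paper.

```latex
% Cause-specific discrete hazards with a multinomial-logit link for R
% competing risks, and the resulting overall survival function (illustrative
% notation; the paper's exact parameterization may differ).
\[
\lambda_{ir}(t) = P(T_i = t,\ R_i = r \mid T_i \ge t,\ x_i)
  = \frac{\exp(\gamma_{0tr} + x_i^{\top}\gamma_{tr})}
         {1 + \sum_{s=1}^{R} \exp(\gamma_{0ts} + x_i^{\top}\gamma_{ts})},
  \qquad r = 1, \dots, R
\]
\[
P(T_i > t \mid x_i) = \prod_{u \le t}\Bigl(1 - \sum_{r=1}^{R}\lambda_{ir}(u)\Bigr)
\]
```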

Publication Date
Fri Jan 01 2021
Journal Name
International Journal Of Agricultural And Statistical Sciences
DYNAMIC MODELING FOR DISCRETE SURVIVAL DATA BY USING ARTIFICIAL NEURAL NETWORKS AND ITERATIVELY WEIGHTED KALMAN FILTER SMOOTHING WITH COMPARISON

Survival analysis is widely applied to data describing the lifetime of an item until the occurrence of an event of interest, such as death or another event under study. The purpose of this paper is to use the dynamic approach within a deep learning neural network method, building a dynamic neural network that suits the nature of discrete survival data and time-varying effects. This neural network is trained with the Levenberg-Marquardt (L-M) algorithm, and the method is called the Proposed Dynamic Artificial Neural Network (PDANN). A comparison was then made with another method that relies entirely on the Bayesian methodology, the maximum a posteriori (MAP) method, which was carried out using numerical algorithms …
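The following sketch shows Levenberg-Marquardt training of a small network that outputs a discrete hazard from person-period data. It is a minimal illustration on simulated data under our own assumptions (hidden size H, simulated X and y, scipy's generic L-M solver), not the PDANN implementation.

```python
# A minimal sketch (not the authors' PDANN code): fit a tiny one-hidden-layer
# network to 0/1 event indicators with the Levenberg-Marquardt algorithm via
# scipy.optimize.least_squares(method="lm").
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))               # hypothetical covariates per person-period
y = (rng.random(200) < 0.3).astype(float)   # hypothetical 0/1 event indicators

H = 4                                       # hidden units (assumed)
n_par = 3 * H + H + H + 1                   # W1 (3xH), b1 (H), w2 (H), b2 (1)

def unpack(theta):
    W1 = theta[:3 * H].reshape(3, H)
    b1 = theta[3 * H:4 * H]
    w2 = theta[4 * H:5 * H]
    b2 = theta[5 * H]
    return W1, b1, w2, b2

def hazard(theta, X):
    W1, b1, w2, b2 = unpack(theta)
    hidden = np.tanh(X @ W1 + b1)                       # hidden layer
    return 1.0 / (1.0 + np.exp(-(hidden @ w2 + b2)))    # sigmoid output = hazard

def residuals(theta):
    return hazard(theta, X) - y     # L-M minimises the sum of squared residuals

theta0 = 0.1 * rng.normal(size=n_par)
fit = least_squares(residuals, theta0, method="lm")     # Levenberg-Marquardt
print("estimated hazards (first 5):", hazard(fit.x, X)[:5])
```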

Publication Date
Sun Dec 01 2019
Journal Name
Journal Of Economics And Administrative Sciences
Use Generalized Pareto Survival Models to Estimation Optimal Survival Time for Myocardial Infarction Patients

Survival analysis is one of the modern methods of analysis in which the dependent variable represents the time until the event of interest in the study. Many survival models deal with the impact of explanatory factors on the likelihood of survival, including the model proposed by the statistician David Cox, one of the most important and widely used survival models. It consists of two parts: a parametric function that does not depend on the survival time and a nonparametric function that does, which is why the Cox model is described as a semi-parametric model. There is also the set of parametric models that depend on the parameters of the time-to-event distribution, such as …
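For reference, the two building blocks the abstract mentions can be written out as follows: the semi-parametric Cox model and the survival function of the generalized Pareto distribution; the symbols (σ for scale, ξ for shape) are our notation.

```latex
% Cox proportional hazards model and the generalized Pareto survival function,
% written out for reference (illustrative notation).
\[
h(t \mid x) = h_0(t)\,\exp(x^{\top}\beta)
\quad \text{(nonparametric baseline } h_0(t)\text{ depends on time; } e^{x^{\top}\beta} \text{ does not)}
\]
\[
S_{\mathrm{GP}}(t) = \Bigl(1 + \frac{\xi t}{\sigma}\Bigr)^{-1/\xi}, \quad t \ge 0,\ \sigma > 0,
\qquad S_{\mathrm{GP}}(t) \to e^{-t/\sigma} \ \text{as } \xi \to 0 .
\]
```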

Publication Date
Tue Apr 20 2021
Journal Name
Ibn Al-haitham Journal For Pure And Applied Sciences
Parametric Models in Survival Analysis for Lung Cancer Patients

The aim of this study is to estimate the survival function for the data of lung cancer patients, using parametric models (Weibull, Gumbel, exponential, and log-logistic).

Comparisons between the proposed estimation methods were performed using the statistical indicators Akaike information criterion (AIC), corrected Akaike information criterion (AICc), and Bayesian information criterion (BIC), concluding that the Gumbel distribution model gives the best survival function for the lung cancer data. The expected values of the survival function under all estimation methods proposed in this study decrease gradually as the failure times of lung cancer patients increase, which means that there is an inverse relationship …
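A minimal sketch of this kind of comparison is shown below: candidate parametric distributions are fitted by maximum likelihood and ranked by AIC. It uses simulated, uncensored failure times and scipy's generic fitters, so it only illustrates the procedure; it is not a reproduction of the paper's analysis (which also used AICc and BIC on real patient data).

```python
# Fit candidate parametric survival distributions by maximum likelihood and
# rank them with AIC.  Censoring is ignored and the failure times are
# simulated, purely for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
t = rng.weibull(1.5, size=120) * 20.0        # hypothetical failure times (months)

candidates = {
    "exponential":  (stats.expon,       {"floc": 0}),
    "Weibull":      (stats.weibull_min, {"floc": 0}),
    "Gumbel":       (stats.gumbel_r,    {}),
    "log-logistic": (stats.fisk,        {"floc": 0}),
}

for name, (dist, kw) in candidates.items():
    params = dist.fit(t, **kw)                   # maximum-likelihood fit
    loglik = np.sum(dist.logpdf(t, *params))
    k = len(params) - len(kw)                    # count only the free parameters
    aic = 2 * k - 2 * loglik                     # smaller AIC = better fit
    print(f"{name:12s}  AIC = {aic:8.2f}")
```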

Publication Date
Fri Jan 01 2021
Journal Name
Int. J. Nonlinear Anal. Appl.
Analysis of a harvested discrete-time biological models

This work aims to analyze a three-dimensional discrete-time biological system: a prey-predator model with a constant harvesting amount, in which the predator species has a stage structure. The analysis is carried out by finding all possible equilibria and investigating their stability. In order to obtain an optimal harvesting strategy, we then suppose that harvesting occurs at a non-constant rate. Finally, numerical simulations are given to confirm the outcome of the mathematical analysis.
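A minimal simulation sketch of such a three-dimensional discrete-time prey-predator map, with stage structure in the predator and constant harvesting of the prey, is given below; the specific equations and parameter values are illustrative assumptions, not those of the paper.

```python
# Iterate a toy discrete-time prey / juvenile-predator / adult-predator map
# with a constant harvesting amount removed from the prey each step.
import numpy as np

r, K = 1.2, 10.0        # prey growth rate and carrying capacity (assumed)
a, b = 0.4, 0.3         # predation and conversion rates (assumed)
m, d = 0.5, 0.2         # juvenile maturation rate, adult death rate (assumed)
h = 0.3                 # constant harvesting amount taken from the prey

x, yj, ya = 5.0, 1.0, 1.0        # prey, juvenile predators, adult predators
for n in range(200):
    x_next  = max(x + r * x * (1 - x / K) - a * x * ya - h, 0.0)
    yj_next = max(yj + b * a * x * ya - m * yj, 0.0)
    ya_next = max(ya + m * yj - d * ya, 0.0)
    x, yj, ya = x_next, yj_next, ya_next

print(f"state after 200 steps: prey={x:.3f}, juveniles={yj:.3f}, adults={ya:.3f}")
```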

Publication Date
Thu Jun 30 2022
Journal Name
Journal Of Economics And Administrative Sciences
Estimation of Time of Survival Rate by Using Clayton Function for the Exponential Distribution with Practical Application

Each phenomenon involves several variables. By studying these variables, we find a mathematical formula for the joint distribution, and the copula is a useful tool for measuring the amount of dependence; here the survival function was used to measure the relationship between age and the level of creatinine remaining in a person's blood. The SPSS program was used to extract the influential variables from a group of variables by factor analysis, and then the Clayton copula function, which builds joint bivariate distributions from the marginal distributions, was applied: the bivariate distribution was calculated, and then the survival function value was calculated for a sample of size (50) drawn from Yarmouk Ho…
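For reference, the Clayton copula construction the abstract relies on can be written as follows, here with exponential survival marginals; θ, λ1 and λ2 are our notation.

```latex
% Clayton copula and the joint survival function it induces for two
% exponential marginals (theta > 0 controls the strength of dependence).
\[
C_{\theta}(u, v) = \bigl(u^{-\theta} + v^{-\theta} - 1\bigr)^{-1/\theta},
\qquad
S_1(t_1) = e^{-\lambda_1 t_1}, \quad S_2(t_2) = e^{-\lambda_2 t_2},
\]
\[
S(t_1, t_2) = C_{\theta}\bigl(S_1(t_1), S_2(t_2)\bigr)
            = \bigl(e^{\theta \lambda_1 t_1} + e^{\theta \lambda_2 t_2} - 1\bigr)^{-1/\theta},
\qquad \tau = \frac{\theta}{\theta + 2} \ \text{(Kendall's tau)} .
\]
```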

Publication Date
Fri Mar 01 2013
Journal Name
Journal Of Economics And Administrative Sciences
Robust Two-Step Estimation and Approximation Local Polynomial Kernel For Time-Varying Coefficient Model With Balance Longitudinal Data

In this research, a nonparametric technique is presented to estimate the time-varying coefficient functions for balanced longitudinal data, which are characterized by observations obtained from (n) independent subjects, each measured repeatedly at a set of (m) specific time points. Although the measurements are independent across different subjects, they are usually correlated within each subject; the applied technique is the local linear kernel (LLPK) technique. To avoid the problems of dimensionality and heavy computation, the two-step method has been used to estimate the coefficient functions using the former techniques. Since the two- …
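As a reference for the model and estimator named above, the time-varying coefficient model and the local linear, kernel-weighted least-squares problem solved at each target time t0 can be written as follows; the two-step refinement itself is not reproduced, and the notation is ours.

```latex
% Time-varying coefficient model for balanced longitudinal data, and the
% local linear estimator of beta(t) at a target time t0 with kernel K_h.
\[
y_{ij} = x_{ij}^{\top}\beta(t_j) + \varepsilon_{ij},
\qquad i = 1, \dots, n \ \text{(subjects)}, \quad j = 1, \dots, m \ \text{(time points)},
\]
\[
(\hat a, \hat b) = \arg\min_{a,\,b}
\sum_{i=1}^{n}\sum_{j=1}^{m}
\Bigl\{ y_{ij} - x_{ij}^{\top}\bigl(a + b\,(t_j - t_0)\bigr) \Bigr\}^{2} K_h(t_j - t_0),
\qquad \hat\beta(t_0) = \hat a .
\]
```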

Publication Date
Fri Jul 21 2023
Journal Name
Journal Of Engineering
Spectral Technique for Baud Time Estimation

A new approach for baud time (or baud rate) estimation of a random binary signal is presented. This approach utilizes the spectrum of the signal after nonlinear processing, in such a way that the estimation error can be reduced simply by increasing the number of processed samples instead of increasing the sampling rate. The spectrum of the processed signal is shown to give an accurate estimate of the baud time when there is no a priori information or any restrictive assumptions. The performance of the estimator for random binary square waves perturbed by white Gaussian noise and ISI is evaluated and compared with that of the conventional zero-crossing detector estimator.
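A minimal sketch of this spectral idea is given below: the waveform is smoothed, differentiated and squared (one possible nonlinear processing, used here purely for illustration; the paper's exact nonlinearity may differ), and the baud rate is read off the dominant spectral line. The sampling rate, baud rate and noise level are assumed values.

```python
# Simulate a noisy random NRZ waveform, apply a simple nonlinearity
# (smooth -> differentiate -> square) so symbol transitions form a pulse
# train at the symbol rate, then locate the baud-rate line in the spectrum.
import numpy as np

fs, baud, n_bits = 10_000.0, 400.0, 2_000      # assumed sampling rate / baud rate
sps = int(fs / baud)                           # samples per symbol
rng = np.random.default_rng(2)
signal = np.repeat(2 * rng.integers(0, 2, n_bits) - 1, sps).astype(float)
signal += 0.3 * rng.normal(size=signal.size)   # additive white Gaussian noise

smoothed = np.convolve(signal, np.ones(10) / 10, mode="same")   # short moving average
processed = np.diff(smoothed) ** 2                              # nonlinear processing
spectrum = np.abs(np.fft.rfft(processed - processed.mean()))
freqs = np.fft.rfftfreq(processed.size, d=1.0 / fs)

estimated_baud = freqs[np.argmax(spectrum[1:]) + 1]    # skip the DC bin
print(f"estimated baud rate ~ {estimated_baud:.1f} Bd (true value {baud:.0f} Bd)")
```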

Publication Date
Sun Jan 01 2006
Journal Name
Journal Of Engineering
SELF ORGANIZING FUZZY CONTROLLER FOR A NON-LINEAR TIME VARYING SYSTEM

This paper proposes a self-organizing fuzzy controller as an enhancement of the basic fuzzy controller. The adjustment mechanism provides explicit adaptation to tune and update the positions of the output membership functions of the fuzzy controller. Simulation results show that this controller is capable of controlling a nonlinear time-varying system, improving the system's performance so that it reaches the desired state in fewer samples.
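A minimal sketch of the adaptation idea is given below: a fuzzy controller whose output membership-function positions (singleton centres here) are shifted in proportion to each rule's firing strength and the current tracking error. The plant, gains and membership functions are illustrative assumptions, not the system used in the paper.

```python
# Toy self-organizing fuzzy controller: triangular input sets over the error,
# adaptable output singletons, and an update rule that moves each output
# centre in proportion to its firing strength and the tracking error.
import numpy as np

centres_e = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])   # input MF centres over the error
c_out     = np.zeros(5)                              # adaptable output singletons
gamma     = 0.4                                      # adaptation gain (assumed)

def firing(e, width=0.5):
    """Normalised triangular membership degrees of the error."""
    mu = np.clip(1.0 - np.abs(e - centres_e) / width, 0.0, 1.0)
    return mu / (mu.sum() + 1e-12)

y, setpoint = 0.0, 1.0
for k in range(100):
    a = 0.8 + 0.1 * np.sin(0.05 * k)     # slowly time-varying plant parameter
    e = setpoint - y
    mu = firing(np.clip(e, -1.0, 1.0))
    u = mu @ c_out                       # centre-of-singletons defuzzification
    c_out += gamma * mu * e              # self-organizing step: shift output MFs
    y = a * y + 0.5 * u                  # toy first-order plant update

print(f"final output y = {y:.3f} (setpoint {setpoint})")
```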

Publication Date
Wed Mar 27 2019
Journal Name
Iraqi Journal Of Science
Fuzzy Survival and Hazard Functions Estimation for Rayleigh distribution

In this article, the probability density function of the Rayleigh distribution is handled by estimating its scale parameter with the maximum likelihood method and the method of moments; the crisp survival function and crisp hazard function are then constructed, and an interval estimate for the scale parameter is obtained using a linear trapezoidal membership function. A newly proposed procedure is used to obtain fuzzy numbers for the scale parameter of the Rayleigh distribution by utilizing ( … ), and two algorithms based on ranking functions are applied to convert the fuzzy numbers into crisp numbers. Finally, the survival and hazard functions are computed using a real data application.
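For reference, the crisp quantities the fuzzy analysis starts from are the Rayleigh density, survival and hazard functions and the closed-form maximum-likelihood estimator of the scale parameter, written out below in our notation; the fuzzification and ranking steps of the paper are not reproduced.

```latex
% Rayleigh density, survival and hazard functions with scale parameter sigma,
% and the maximum-likelihood estimator of sigma^2 from a sample t_1, ..., t_n.
\[
f(t;\sigma) = \frac{t}{\sigma^{2}} \exp\!\Bigl(-\frac{t^{2}}{2\sigma^{2}}\Bigr), \qquad
S(t;\sigma) = \exp\!\Bigl(-\frac{t^{2}}{2\sigma^{2}}\Bigr), \qquad
h(t;\sigma) = \frac{f(t;\sigma)}{S(t;\sigma)} = \frac{t}{\sigma^{2}},
\]
\[
\hat\sigma^{2}_{\mathrm{MLE}} = \frac{1}{2n}\sum_{i=1}^{n} t_i^{2} .
\]
```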
