Survival analysis is widely applied to data describing the lifetime of an item until the occurrence of an event of interest, such as death or another event under study. The purpose of this paper is to use a dynamic approach within the deep learning neural network framework: a dynamic neural network suited to the nature of discrete survival data and time-varying effects. This network is trained with the Levenberg-Marquardt (L-M) algorithm, and the method is called the Proposed Dynamic Artificial Neural Network (PDANN). It is then compared with another method that depends entirely on the Bayesian methodology, called the Maximum A Posteriori (MAP) method, which is carried out using the Iteratively Weighted Kalman Filter Smoothing (IWKFS) algorithm in combination with the Expectation Maximization (EM) algorithm. The Average Mean Square Error (AMSE) and Cross Entropy Error (CEE) were used as comparison criteria. The methods and procedures were applied to data generated by simulation using different combinations of sample sizes and numbers of intervals.
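The training algorithm named in this abstract can be illustrated with a minimal sketch of one Levenberg-Marquardt update for a least-squares problem. The quadratic model and the synthetic data below are illustrative assumptions, not the paper's survival model or network:

```python
import numpy as np

def lm_step(theta, x, y, lam=1e-2):
    """One Levenberg-Marquardt update for fitting y = a*x^2 + b*x."""
    a, b = theta
    r = y - (a * x**2 + b * x)            # residuals
    J = np.column_stack([-x**2, -x])      # Jacobian of residuals w.r.t. (a, b)
    # Damped Gauss-Newton step: (J^T J + lam*I) d = -J^T r
    d = np.linalg.solve(J.T @ J + lam * np.eye(2), -J.T @ r)
    return theta + d

x = np.linspace(0.0, 1.0, 20)
y = 2.0 * x**2 + 0.5 * x                  # noiseless synthetic data
theta = np.array([0.0, 0.0])
for _ in range(20):
    theta = lm_step(theta, x, y)          # converges toward (2.0, 0.5)
```

The damping term `lam` interpolates between gradient descent (large `lam`) and Gauss-Newton (small `lam`), which is what makes L-M robust for network training.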
Astronomers have known since the invention of the telescope that atmospheric turbulence degrades celestial images. To compensate for the atmospheric aberrations of the observed wavefront, Adaptive Optics (AO) systems have been introduced. AO systems can be arranged in two configurations: closed-loop and open-loop. The aim of this paper is to model and compare the performance of both AO loop systems using one of the most recent adaptive optics simulation tools, the Object-Oriented Matlab Adaptive Optics (OOMAO) toolbox, and then to assess the performance of the closed- and open-loop systems by their capability to compensate for wavefront aberrations and improve image quality, as well as how they are affected by the observed optical bands (near-infrared band
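The control difference between the two loop configurations can be sketched with a toy model (this is not OOMAO, and the gain and error figures below are assumed values): a closed-loop integrator repeatedly measures the residual wavefront and drives it to zero, while an open-loop controller measures the incoming wavefront once and applies a feed-forward correction limited by calibration error.

```python
phase = 1.0    # static wavefront aberration (arbitrary units, assumed)
gain = 0.5     # integrator loop gain (assumed value)

# Closed loop: the correction accumulates on the measured residual,
# so the residual shrinks geometrically by (1 - gain) per iteration.
corr = 0.0
for _ in range(10):
    residual = phase - corr
    corr += gain * residual
closed_residual = phase - corr           # = phase * (1 - gain)**10

# Open loop: one feed-forward correction, limited here by an
# assumed 5% wavefront-sensor calibration error.
open_residual = phase - 0.95 * phase
```

The sketch shows why closed-loop operation can beat open-loop calibration accuracy: feedback suppresses even errors the sensor model does not capture, at the cost of loop bandwidth.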
Sequence covering array (SCA) generation has been an active research area in recent years. Unlike sequence-less covering arrays (CA), the order of the sequence varies in the test case generation process. This paper reviews the state of the art of SCA strategies. Earlier works reported that finding a minimal-size test suite is an NP-hard problem. In addition, most existing strategies for SCA generation have a high order of complexity due to the generation of all combinatorial interactions in a one-test-at-a-time fashion. Reducing the complexity by adopting a one-parameter-at-a-time approach to SCA generation is a challenging process. In addition, this reduction facilitates support for a higher strength of
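The one-test-at-a-time fashion mentioned above can be illustrated with a small greedy generator for a strength-2 sequence covering array, where every ordered pair of distinct events must occur (in that order, not necessarily adjacently) in at least one test sequence. The greedy rule and the 4-event example are illustrative, not a strategy from the surveyed literature:

```python
from itertools import permutations

def covered_pairs(seq):
    """All ordered event pairs that the test sequence `seq` covers."""
    return {(seq[i], seq[j])
            for i in range(len(seq)) for j in range(i + 1, len(seq))}

def greedy_sca(events):
    """One-test-at-a-time greedy generation of a strength-2 SCA."""
    uncovered = {(a, b) for a in events for b in events if a != b}
    tests = []
    while uncovered:
        # pick the permutation covering the most still-uncovered pairs
        best = max(permutations(events),
                   key=lambda p: len(covered_pairs(p) & uncovered))
        tests.append(best)
        uncovered -= covered_pairs(best)
    return tests

tests = greedy_sca([0, 1, 2, 3])   # a permutation plus its reverse suffice
```

Each added test maximizes marginal coverage, which is exactly the complexity driver the review points at: every greedy step scans the full combinatorial space of candidate sequences.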
In this article, the probability density function of the Rayleigh distribution is derived and its parameter is estimated by the maximum likelihood estimator method and the moment estimator method. The crisp survival function and crisp hazard function are then constructed, and an interval estimate for the scale parameter is found using a linear trapezoidal membership function. A new procedure is proposed to find fuzzy numbers for the scale parameter of the Rayleigh distribution. Two algorithms using ranking functions are applied to convert the fuzzy numbers into crisp numbers, and the survival and hazard functions are then computed using a real data application.
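The crisp (non-fuzzy) part of this procedure can be sketched directly: the maximum likelihood estimator of the Rayleigh scale parameter satisfies sigma_hat^2 = sum(x_i^2) / (2n), and the survival and hazard functions follow from it. The synthetic sample below is an assumption for illustration; the paper's fuzzy interval estimation and ranking steps are not reproduced:

```python
import numpy as np

# Synthetic Rayleigh sample (illustrative; not the paper's real data)
rng = np.random.default_rng(0)
sigma_true = 2.0
x = rng.rayleigh(scale=sigma_true, size=1000)

# Maximum likelihood estimate of the scale parameter
sigma_hat = np.sqrt(np.sum(x**2) / (2 * len(x)))

def survival(t):
    """S(t) = exp(-t^2 / (2 * sigma^2)) for the Rayleigh distribution."""
    return np.exp(-t**2 / (2 * sigma_hat**2))

def hazard(t):
    """h(t) = t / sigma^2, the increasing Rayleigh hazard rate."""
    return t / sigma_hat**2
```

Note that the Rayleigh hazard is linearly increasing in t, which is why the distribution is a common model for wear-out lifetimes.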
Survival and reliability analysis is among the vital statistical topics and methods at the present time because of its importance in various demographic, medical, industrial, and engineering fields. This research focuses on generating random samples from the generalized gamma (GG) probability distribution using the inverse transformation method (ITM). Because the GG cumulative distribution function involves the incomplete gamma integral, classical estimation becomes more difficult, so a numerical approximation method is illustrated and then used to estimate the survival function. The survival function was estimated by Monte Carlo simulation. The entropy method was used for the
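The inverse transformation method described above can be sketched as follows. For a generalized gamma density proportional to x**(d-1) * exp(-(x/a)**p), the CDF is the regularized incomplete gamma function P(d/p, (x/a)**p), so its numerical inverse yields the quantile function. The parameter values below are illustrative assumptions:

```python
import numpy as np
from scipy.special import gammainc, gammaincinv

def gg_inverse_transform(u, a, d, p):
    """Map uniforms u in (0,1) to GG(a, d, p) variates by inverting the CDF."""
    return a * gammaincinv(d / p, u) ** (1.0 / p)

a, d, p = 2.0, 3.0, 1.5                     # assumed parameter values
rng = np.random.default_rng(1)
u = rng.uniform(size=5000)
samples = gg_inverse_transform(u, a, d, p)

# Monte Carlo estimate of the survival function at t, vs. the exact value
t = 2.5
s_hat = np.mean(samples > t)
s_exact = 1.0 - gammainc(d / p, (t / a) ** p)
```

This mirrors the abstract's point: because the GG CDF has no closed-form inverse, the ITM must lean on a numerical routine (here `scipy.special.gammaincinv`), and the survival function is then checked by Monte Carlo.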
Enhancement of heat transfer in a tube heat exchanger is studied experimentally using discrete twisted tapes. Three positions were selected for inserting the turbulators along the tube section (horizontal, α = 0°; inclined, α = 45°; and vertical, α = 90°). The spacing between turbulators was fixed by distributing 5 pieces with a pitch ratio PR = 0.44. A constant heat flux was applied as a boundary condition around the tube test section for all experiments of this investigation, while the flow rate was treated as a variable factor (Reynolds number values varying from 5000 to 15000). The results s
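The Reynolds-number range quoted above fixes the bulk velocity for a given tube via Re = ρ·V·D/μ. A small sketch of that bookkeeping, with assumed air properties and an assumed tube diameter (the paper does not state them in this excerpt):

```python
# Fluid properties and geometry (assumed illustrative values, not from the paper)
rho = 1.2      # air density, kg/m^3
mu = 1.8e-5    # dynamic viscosity, Pa*s
D = 0.05       # tube inner diameter, m

def velocity_for_re(re):
    """Bulk velocity (m/s) needed to reach Reynolds number `re` in this tube."""
    return re * mu / (rho * D)

v_lo = velocity_for_re(5000)    # lower end of the tested range
v_hi = velocity_for_re(15000)   # upper end of the tested range
```

Under these assumptions the tested range corresponds to bulk velocities of roughly 1.5 to 4.5 m/s, i.e. fully turbulent pipe flow, which is where twisted-tape inserts are expected to enhance mixing.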
The objective of the study is to determine which has the better predictive ability, the logistic regression model or the linear discriminant function, first using the original data and then using principal components to reduce the dimensionality of the variables. The data come from a socio-economic household survey of Baghdad province in 2012 and comprise a sample of 615 observations with 13 variables, 12 of which are explanatory variables; the dependent variable is the number of workers and unemployed.
A comparison of the two methods above was conducted, and it showed that the logistic regression model performs better than the linear discriminant function.
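The kind of comparison described above can be sketched on synthetic data (the 2012 Baghdad survey data are not public; the sample size and feature count below simply mirror the abstract): fit a logistic regression model and a linear discriminant function on the same training split and compare held-out classification accuracy.

```python
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the survey: 615 observations, 12 explanatory variables
X, y = make_classification(n_samples=615, n_features=12, n_informative=6,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

logit = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
lda = LinearDiscriminantAnalysis().fit(X_tr, y_tr)

acc_logit = logit.score(X_te, y_te)   # held-out accuracy, logistic regression
acc_lda = lda.score(X_te, y_te)       # held-out accuracy, linear discriminant
```

On real data the ranking depends on whether the normality and equal-covariance assumptions behind the discriminant function hold, which is one reason logistic regression often wins on survey data.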
Under-reamed piles, defined by having one or more bulbs, have the potential for sizeable advantages over conventional straight-sided piles. Most studies on under-reamed piles have been conducted on the experimental side, while theoretical studies, such as the finite element method, have been mainly confined to conventional straight-sided piles. On the other hand, although several laboratory and experimental studies have been conducted to study the behavior of under-reamed piles, few numerical studies have been carried out to simulate the piles' performance. In addition, there is no research comparing and evaluating the behavior of these piles under dynamic loading. Therefore, this study aimed to numerically investigate the bearing capaci