Survival analysis is widely applied to data describing the lifetime of an item until the occurrence of an event of interest, such as death or another event under study. The purpose of this paper is to use a dynamic approach within deep learning, in which a dynamic neural network is built to suit the nature of discrete survival data with time-varying effects. This network is trained with the Levenberg-Marquardt (L-M) algorithm, and the method is called the Proposed Dynamic Artificial Neural Network (PDANN). A comparison was then made with a method that relies entirely on the Bayesian methodology, called the Maximum A Posteriori (MAP) method. The MAP method was carried out using numerical algorithms, namely the Iteratively Weighted Kalman Filter Smoothing (IWKFS) algorithm combined with the Expectation Maximization (EM) algorithm. Average Mean Square Error (AMSE) and Cross Entropy Error (CEE) were used as comparison criteria. The methods and procedures were applied to data generated by simulation using different combinations of sample sizes and numbers of intervals.
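As an illustrative aside, the two comparison criteria can be computed as below; the exact averaging over simulation replications in the paper may differ, so `amse` and `cee` are hedged, assumed definitions for binary discrete-time event indicators:

```python
import numpy as np

def amse(y_true, y_pred):
    # Mean of squared residuals; the paper's AMSE additionally
    # averages over simulation replications (assumed here).
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.mean((y_true - y_pred) ** 2))

def cee(y_true, p_pred, eps=1e-12):
    # Cross Entropy Error for binary event indicators against
    # predicted event probabilities (clipped for stability).
    y = np.asarray(y_true, dtype=float)
    p = np.clip(np.asarray(p_pred, dtype=float), eps, 1 - eps)
    return float(-np.mean(y * np.log(p) + (1 - y) * np.log(1 - p)))

# Toy comparison of two methods' predicted discrete-time hazards.
y = [0, 1, 0, 1, 1]
p_a = [0.1, 0.8, 0.2, 0.7, 0.9]
p_b = [0.3, 0.6, 0.4, 0.5, 0.7]
print(amse(y, p_a), amse(y, p_b))  # the smaller value wins
```

Lower AMSE and CEE indicate closer agreement between predicted and observed event behavior, which is how the two methods are ranked.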
Beef and chicken meat were used to obtain sarcoplasm, and the chicken sarcoplasm was used to prepare an antibody against it after injection into a rabbit. The antiserum activity was 1/32, as determined with the immune double diffusion test. A self-test showed that some antisera were able to react with beef sarcoplasm, which means the same proteins are shared between beef and chicken meat, and which makes it difficult to rely on this immune method to detect adulteration of beef with chicken meat. Therefore, the antibodies against beef sarcoplasm were removed from the serum by an immune absorption step to produce a serum specific to chicken sarcoplasm, which was used in the immune double diffusion test to qualitatively detect adulteration of beef with at least 5% chicken meat, and the
Nuclear structure of 29-34Mg isotopes toward the neutron dripline has been investigated using the shell model with Skyrme-Hartree-Fock calculations. In particular, proton, neutron, mass, and charge densities with their corresponding rms radii, neutron skin thicknesses, and inelastic electron scattering form factors are calculated for positive low-lying states. The deduced results are discussed for the transverse form factor and compared with the available experimental data. It has been confirmed that combining the shell model with the Hartree-Fock mean-field method with the Skyrme interaction can accommodate the nuclear excitation properties very well and can reach a highly descriptive and predictive power when investigating
Background: Several devices with different physical bases have been developed for the clinical measurement of corneal thickness. They are classified into 4 categories: Scheimpflug photography based, slit-scanning topography, optical coherence tomography (OCT) based, and ultrasound (US) based. Objective: To evaluate the precision of the new Scheimpflug-Placido disc corneal topography in the measurement of corneal thickness and to compare the measured values with those obtained by US pachymetry. Methods: The setting of this study is the Lasik center at the Eye Specialty Private Hospital, Baghdad, Iraq. Eyes of healthy subjects were examined with the Sirius topographer. Three consecutive measurements of central (CCT) and thinnest (TCT) corneal thicknesses were obtained
Baghdad and the other Iraqi provinces have witnessed many celebrations that have a significant effect on the souls of Arab and Islamic peoples in general, and of the Iraqi people especially, including the birthday and death anniversaries of the two al-Kadhimen Imams (peace be upon them) and others. From here the researcher began to study the visit to Imam al-Kadhim (peace be upon him) on 25 Rajab, the commemoration of his sacrifice, simply because it has religious, ideological, and cultural implications, represented in the greatest flow of visitors. The research problem appeared due to the clear difference in the number of visitors during a single day, besides the significant increase in the number of visitors through
Control charts are one of the scientific statistical tools used to control production. A chart always contains three lines, a central line and upper and lower limit lines, to control the quality of production, and it represents a set of numbers from which the production operation is finally judged to be under control or not, depending on the actual observations. Sometimes the calculated control charts are not accurate or reliable; therefore, fuzzy control charts are used instead of conventional process control charts, since this method is more sensitive, accurate, and economical, assisting the decision maker in controlling the operating system at an earlier time. In this project, a data set fr
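For context, the three lines of a conventional (crisp) control chart can be sketched as below. This is a generic 3-sigma individuals chart, not the paper's fuzzy variant, and the `k = 3` multiplier is an assumption:

```python
import statistics

def shewhart_limits(samples, k=3.0):
    """Return (LCL, CL, UCL) for an individuals chart:
    CL = sample mean, UCL/LCL = mean +/- k * population std."""
    cl = statistics.mean(samples)
    s = statistics.pstdev(samples)
    return cl - k * s, cl, cl + k * s

measurements = [10.1, 9.8, 10.0, 10.3, 9.9, 10.2]
lcl, cl, ucl = shewhart_limits(measurements)
# Points outside [LCL, UCL] signal an out-of-control process.
outliers = [x for x in measurements if not lcl <= x <= ucl]
print(cl, outliers)
```

A fuzzy chart replaces these crisp limits with membership grades, which is what makes it more sensitive near the boundaries.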
This work proposes a new video buffer framework (VBF) to achieve a favorable quality of experience (QoE) for video streaming in cellular networks. The proposed framework consists of three main parts: a client selection algorithm, a categorization method, and a distribution mechanism. The client selection algorithm, named the independent client selection algorithm (ICSA), is proposed to select the best clients, those with the least interfering effect on video quality, and to recognize clients' urgency based on buffer occupancy level. In the categorization method, each frame in the video buffer is given a specific number for better estimation of the playout outage probability, so the framework can efficiently handle many frames from different video
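Urgency recognition based on buffer occupancy could be sketched as follows; the field names and the simple min-occupancy rule are assumptions for illustration, not the paper's exact ICSA logic:

```python
def most_urgent(clients):
    """Pick the client whose playout buffer is closest to underrun,
    i.e. the lowest buffer occupancy ratio (assumed urgency rule)."""
    return min(clients, key=lambda c: c["buffered_frames"] / c["capacity"])

clients = [
    {"id": "A", "buffered_frames": 30, "capacity": 60},
    {"id": "B", "buffered_frames": 5, "capacity": 60},
    {"id": "C", "buffered_frames": 45, "capacity": 60},
]
print(most_urgent(clients)["id"])  # B has the emptiest buffer
```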
Energy efficiency is a significant aspect in designing robust routing protocols for wireless sensor networks (WSNs). A reliable routing protocol has to be energy efficient and adaptive to the network size. To achieve high energy conservation and data aggregation, there are two major techniques: clusters and chains. In the clustering technique, sensor networks are often divided into non-overlapping subsets called clusters. In the chain technique, sensor nodes are connected with their closest two neighbors, starting with the node farthest from the base station and ending with the node closest to the base station. Each technique has its own advantages and disadvantages, which motivates some researchers to come up with a hybrid routing algorithm
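The chain construction described above can be sketched with a greedy nearest-neighbor pass; this is a minimal PEGASIS-style illustration under assumed 2-D coordinates, not a specific protocol's implementation:

```python
import math

def build_chain(nodes, base):
    """Greedy chain construction: start from the node farthest from
    the base station, then repeatedly append the nearest
    not-yet-chained node. Nodes are (x, y) tuples."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    remaining = list(nodes)
    chain = [max(remaining, key=lambda n: dist(n, base))]
    remaining.remove(chain[0])
    while remaining:
        nxt = min(remaining, key=lambda n: dist(n, chain[-1]))
        chain.append(nxt)
        remaining.remove(nxt)
    return chain

base_station = (0, 0)
sensors = [(1, 1), (5, 5), (2, 2), (4, 4)]
print(build_chain(sensors, base_station))
# starts at (5, 5), the node farthest from the base station
```

Each node then transmits only to its chain neighbor, which is what yields the energy savings relative to direct transmission to the base station.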
In recent years, the migration of computational workloads to computational clouds has attracted intruders who target and exploit cloud networks both internally and externally. The investigation of such hazardous network attacks in the cloud network requires comprehensive network forensics methods (NFMs) to identify the source of the attack. However, cloud computing lacks NFMs that can identify network attacks affecting various cloud resources while disseminating through cloud networks. In this paper, the study is motivated by the need to determine the applicability of current cloud network forensics methods (C-NFMs) for the cloud networks of cloud computing. The applicability is evaluated based on strengths, weaknesses, opportunities, and threats (SWOT) analysis to outlook the cloud network. T
The auditory system can suffer from exposure to loud noise, and human health can be affected. Traffic noise is a primary contributor to noise pollution. To measure the noise levels, 3 variables were examined at 25 locations. It was found that the main factors that determine the increase in noise level are traffic volume, vehicle speed, and road functional class. The data were taken during three different periods per day so that they represent and cover the traffic noise of the city during heavy traffic flow conditions. Analysis of traffic noise prediction was conducted using a simple linear regression model to accurately predict the equivalent continuous sound level. The difference between the predicted and the measured noise shows that
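A linear regression of this kind can be sketched as below; the numbers are hypothetical for illustration, not the study's measurements, and the two predictors shown (volume and speed) are only a subset of the factors the study considers:

```python
import numpy as np

# Hypothetical illustrative data (not the study's measurements):
# hourly traffic volume (veh/h), mean speed (km/h), measured Leq (dBA).
volume = np.array([500.0, 800.0, 1200.0, 1500.0, 2000.0])
speed = np.array([40.0, 45.0, 50.0, 55.0, 60.0])
leq = np.array([62.0, 64.5, 67.0, 68.5, 70.5])

# Ordinary least squares fit: Leq = b0 + b1*volume + b2*speed
X = np.column_stack([np.ones_like(volume), volume, speed])
coef, *_ = np.linalg.lstsq(X, leq, rcond=None)

predicted = X @ coef
residuals = leq - predicted  # measured minus predicted noise level
print(np.round(residuals, 3))
```

The residuals play the role of the predicted-versus-measured difference that the study reports.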