Survival analysis is widely applied to data describing the lifetime of an item until the occurrence of an event of interest, such as death or another event under study. The purpose of this paper is to apply a dynamic approach within deep-learning neural networks, constructing a dynamic neural network suited to the nature of discrete survival data with time-varying effects. The network is trained with the Levenberg-Marquardt (L-M) algorithm, and the method is called the Proposed Dynamic Artificial Neural Network (PDANN). A comparison was then made with a method that relies entirely on the Bayesian methodology, the Maximum A Posteriori (MAP) method, carried out using the Iteratively Weighted Kalman Filter Smoothing (IWKFS) algorithm in combination with the Expectation Maximization (EM) algorithm. Average Mean Square Error (AMSE) and Cross Entropy Error (CEE) were used as comparison criteria. The methods and procedures were applied to data generated by simulation using different combinations of sample sizes and numbers of intervals.
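To make the two comparison criteria concrete, the following is a minimal Python sketch of AMSE and CEE for predicted discrete hazard probabilities; the array layout (subjects by intervals) and function names are illustrative assumptions, not the paper's code.

import numpy as np

def amse(y, p):
    # Average Mean Square Error over all subject-interval cells;
    # y holds 0/1 event indicators, p the predicted hazard probabilities.
    y, p = np.asarray(y, dtype=float), np.asarray(p, dtype=float)
    return float(np.mean((y - p) ** 2))

def cee(y, p, eps=1e-12):
    # Cross Entropy Error: the negative mean Bernoulli log-likelihood.
    y = np.asarray(y, dtype=float)
    p = np.clip(np.asarray(p, dtype=float), eps, 1 - eps)
    return float(-np.mean(y * np.log(p) + (1 - y) * np.log(1 - p)))

# Example: 3 subjects observed over 4 time intervals.
y = [[0, 0, 1, 0], [0, 0, 0, 1], [0, 1, 0, 0]]
p = [[0.1, 0.2, 0.6, 0.4], [0.1, 0.1, 0.2, 0.7], [0.2, 0.5, 0.3, 0.2]]
print(amse(y, p), cee(y, p))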
This work is motivated by the problem of controlling network overhead and reducing network latency, which can otherwise cause many unwanted loops under standard routing. It proposes three wireless routing protocols that build on the advantages of well-known wireless ad-hoc routing protocols such as Dynamic Source Routing (DSR), Optimized Link State Routing (OLSR), Destination-Sequenced Distance Vector (DSDV), and the Zone Routing Protocol (ZRP). The first proposed protocol is an enhanced destination-sequenced distance vector (E-DSDV) routing protocol, while the second is designed by combining the advantages of DSDV and ZRP, and we named it as
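Since the abstract is truncated before the remaining protocols are named, only the DSDV update rule the proposals build on can be illustrated here. The Python sketch below shows the standard destination-sequence-number comparison at the heart of DSDV (and hence of any E-DSDV variant); the data structures and names are illustrative assumptions, not the paper's implementation.

from dataclasses import dataclass

@dataclass
class Route:
    next_hop: str
    hops: int
    seq: int  # destination sequence number carried by the advertisement

def update_route(table, dest, adv):
    # Standard DSDV rule: prefer a newer sequence number; on a tie,
    # prefer the shorter path. This is what prevents routing loops.
    cur = table.get(dest)
    if cur is None or adv.seq > cur.seq or (adv.seq == cur.seq and adv.hops < cur.hops):
        table[dest] = adv

table = {}
update_route(table, "D", Route(next_hop="A", hops=3, seq=10))
update_route(table, "D", Route(next_hop="C", hops=2, seq=10))  # same seq, fewer hops: accepted
update_route(table, "D", Route(next_hop="B", hops=5, seq=8))   # stale seq: rejected
print(table["D"])  # Route(next_hop='C', hops=2, seq=10)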
This study investigated the impacts of big data, artificial intelligence (AI), and business intelligence (BI) on firms' e-learning and business performance in the Jordanian telecommunications industry. After the samples were checked, a total of 269 responses were collected. All of the information gathered throughout the investigation was analyzed using PLS software. The results show that a network of interconnections can improve both e-learning and corporate effectiveness. This research concluded that the integration of big data, AI, and BI has a positive impact on e-learning infrastructure development and organizational efficiency. The findings indicate that big data has a positive and direct impact on business performance, including Big
Wellbore instability problems cause nonproductive time, especially during drilling operations in shale formations. These problems include stuck pipe, caving, lost circulation, and tight hole, all of which require more time to treat and therefore incur additional costs. The extensive hole-collapse problem is considered one of the main challenges experienced when drilling in the Zubair shale formation; in turn, it causes nonproductive time and increases well drilling expenditure. In this study, geomechanical modeling was used to determine a suitable mud weight window to overcome these problems and improve drilling performance for well development. Three failure criteria, including Mohr–Coulomb, modified
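For reference, the Mohr–Coulomb criterion named above is the standard shear-failure condition; in textbook form (not the paper's calibrated model):

\[
\tau = c + \sigma_n \tan\phi ,
\qquad\text{equivalently}\qquad
\sigma_1 = \mathrm{UCS} + \sigma_3 \tan^2\!\left(\frac{\pi}{4} + \frac{\phi}{2}\right),
\]

where \(\tau\) is the shear stress on the failure plane, \(c\) the cohesion, \(\sigma_n\) the effective normal stress, \(\phi\) the internal friction angle, \(\sigma_1, \sigma_3\) the maximum and minimum effective principal stresses, and UCS the unconfined compressive strength. A mud weight window is then bounded below by the pressure needed to prevent shear (collapse) failure and above by the fracture pressure.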
In this work, a weighted Hölder function that approximates a Jacobi polynomial solving the second-order singular Sturm-Liouville equation is discussed. This is generally equivalent to the Jacobi translations and the moduli of smoothness. This paper aims to improve methods of approximation and to find upper and lower estimates for the degree of approximation in weighted Hölder spaces by modifying the moduli of continuity and smoothness. Moreover, some properties of the moduli of smoothness, with direct and inverse results, are considered.
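For orientation, the classical (unweighted) k-th modulus of smoothness has the standard form below; the paper's weighted version replaces the ordinary shift by the Jacobi translation, whose details are not given in the abstract.

\[
\omega_k(f,\delta)_p \;=\; \sup_{0<h\le\delta}\bigl\|\Delta_h^{k} f\bigr\|_p,
\qquad
\Delta_h^{k} f(x) \;=\; \sum_{i=0}^{k} (-1)^{k-i}\binom{k}{i}\, f(x+ih),
\]

and a function belongs to the Hölder class of order \(\alpha\) when \(\omega_1(f,\delta)\le C\delta^{\alpha}\); direct (Jackson-type) and inverse (Bernstein-type) theorems then bound the degree of approximation in terms of these moduli.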
In this study, we focus on random coefficient estimation for the general regression and Swamy models of panel data. Panel data offer a better chance of obtaining improved methods and indicators. Entropy methods were used to estimate the random coefficients of the general regression and Swamy panel data models in two ways: the first is maximum dual entropy and the second is general maximum entropy; a comparison between them was carried out by simulation to choose the optimal method.
The results were compared using mean square error and mean absolute percentage error across different cases in terms of correlation values.
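A minimal Python sketch of the two comparison criteria used in the simulation (generic textbook formulas; the variable names are illustrative):

import numpy as np

def mse(actual, predicted):
    # Mean square error between true and estimated coefficients.
    actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
    return float(np.mean((actual - predicted) ** 2))

def mape(actual, predicted):
    # Mean absolute percentage error (assumes no true value is zero).
    actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
    return float(np.mean(np.abs((actual - predicted) / actual)) * 100)

beta_true = [1.5, -0.8, 2.1]
beta_hat  = [1.4, -0.9, 2.3]
print(mse(beta_true, beta_hat), mape(beta_true, beta_hat))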
The bi-level programming problem is to minimize or maximize an objective function while another objective function is embedded within the constraints. This problem has received a great deal of attention in the programming community due to the proliferation of applications and the use of evolutionary algorithms in addressing it. Two non-linear bi-level programming methods are used in this paper. The goal is to reach the optimal solution through simulation, using the Monte Carlo method with different small and large sample sizes. The research concluded that the branch-and-bound algorithm is preferable for solving the non-linear bi-level programming problem because it gave better results.
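In its generic form (the paper's specific non-linear instances are not shown in the abstract), a bi-level program nests one optimization inside the constraints of another:

\[
\min_{x}\; F(x, y^{*})
\quad\text{s.t.}\quad G(x, y^{*}) \le 0,
\qquad
y^{*} \in \arg\min_{y}\;\{\, f(x,y) \;:\; g(x,y) \le 0 \,\},
\]

where \(F, G\) define the upper-level (leader) problem and \(f, g\) the lower-level (follower) problem; non-linearity in \(F\) or \(f\) is what makes branch-and-bound style enumeration attractive.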
This paper studies two stratified quantile regression models, of the marginal and conditional varieties. We estimate the quantile functions of these models using two nonparametric methods: smoothing splines (B-splines) and kernel regression (Nadaraya-Watson). The estimates are obtained by solving the nonparametric quantile regression problem, that is, by minimizing the quantile regression objective function using the varying-coefficient model approach. The main goal is to compare the estimators of the two nonparametric methods and to adopt the better of the two.
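The quantile regression objective mentioned above is, in its standard form, the check-function loss (written here generically, not in the paper's stratified notation):

\[
\hat{q}_\tau \;=\; \arg\min_{q}\; \sum_{i=1}^{n} \rho_\tau\bigl(y_i - q(x_i)\bigr),
\qquad
\rho_\tau(u) \;=\; u\bigl(\tau - \mathbf{1}\{u < 0\}\bigr),
\]

with \(q\) ranging over a B-spline basis expansion in the smoothing-spline case, or obtained by Nadaraya-Watson-type local weighting in the kernel case.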
This study aims to analyze the spatial distribution of the epidemic spread and the role of physical, social, and economic characteristics in that spread. A geographically weighted regression (GWR) model was built within a GIS environment using infection data from the Iraqi Ministry of Health records for the 10 months from March to December 2020. The factors adopted in this model are the size of urban interaction areas and human gatherings, the level of movement and accessibility, and the volume of public services and facilities that attract people. The results show that it would be possible to deal with each administrative unit in proportion to its circumstances, in light of the factors that appeared.
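The GWR model has the standard local-coefficient form (generic form; the paper's covariates are the factors listed above):

\[
y_i \;=\; \beta_0(u_i, v_i) \;+\; \sum_{k=1}^{p} \beta_k(u_i, v_i)\, x_{ik} \;+\; \varepsilon_i ,
\]

where \((u_i, v_i)\) are the coordinates of administrative unit \(i\) and each coefficient \(\beta_k(u_i, v_i)\) is estimated by weighted least squares with spatial-kernel weights, so the effect of each factor can vary across units.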
The evolution of the Internet of Things (IoT) has connected billions of heterogeneous physical devices to improve the quality of human life by collecting data from their environments. However, this creates a need to store huge amounts of data, requiring big storage and high computational capabilities. Cloud computing can be used to store such big data. The data of IoT devices are transferred using two types of protocols: Message Queuing Telemetry Transport (MQTT) and Hypertext Transfer Protocol (HTTP). This paper aims to build a high-performance and more reliable system through efficient use of resources. Thus, load balancing in cloud computing is used to dynamically distribute the workload across nodes to avoid overloading any individual resource.
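The abstract does not spell out the load-balancing policy; as one minimal baseline, a round-robin dispatcher spreads incoming messages evenly across nodes. The node names and message loop below are illustrative assumptions, not the paper's system.

from itertools import cycle

nodes = cycle(["node-1", "node-2", "node-3"])  # cloud worker nodes

def dispatch(message_id):
    # Hand each incoming MQTT/HTTP message to the next node in turn,
    # so no single node is overloaded.
    node = next(nodes)
    print(f"{message_id} -> {node}")
    return node

for i in range(6):
    dispatch(f"mqtt-msg-{i}")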
Image Fusion Using A Convolutional Neural Network