Abstract
People are exposed in everyday life, partly because of its pressures, to several types of heart disease arising from different factors. Therefore, in order to determine whether a given case ends in death or not, the outcome is modeled using the binary logistic regression model.
This research uses the binary logistic regression model, one of the most important nonlinear regression models and one applied extensively in statistical modeling, in the context of heart disease. The parameters of this model are then estimated using statistical estimation methods; a further problem appears in estimating these parameters, as well as when the number …
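For reference, the binary logistic regression model referred to here has the standard form (stated generically; the specific covariates used in this research are not listed in this excerpt):

$$\Pr(y_i = 1 \mid x_i) = \pi_i = \frac{\exp(x_i'\beta)}{1 + \exp(x_i'\beta)}, \qquad \operatorname{logit}(\pi_i) = \ln\frac{\pi_i}{1-\pi_i} = \beta_0 + \beta_1 x_{i1} + \cdots + \beta_k x_{ik},$$

where $y_i = 1$ if the case ends in death and $y_i = 0$ otherwise, and the parameters $\beta$ are typically estimated by maximum likelihood.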
Fossil fuel consumption is increasing globally, as it represents the main source of energy around the world, and sources of heavy oil are more abundant than light oil; different techniques have therefore been used to reduce the viscosity and increase the mobility of heavy crude oil. This study focuses on experimental tests, and on modeling with a Back Feed Forward Artificial Neural Network (BFF-ANN), of the dilution technique for reducing the viscosity of a heavy oil collected from the south-Iraq oil fields using organic solvents: organic diluents at different weight percentages (5, 10 and 20 wt.%) of n-heptane, toluene, and a mixture of different ratio …
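As a rough sketch of the kind of feed-forward network described above (the actual architecture, inputs, and measured viscosities are not given in this excerpt; the feature layout and numbers below are illustrative placeholders only):

```python
# Illustrative sketch: feed-forward ANN for diluted heavy-oil viscosity.
# Inputs assumed here: diluent wt% plus a one-hot diluent type; the
# viscosity values are placeholders, not data from the study.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Columns: [diluent wt%, is n-heptane, is toluene]
X = np.array([[5, 1, 0], [10, 1, 0], [20, 1, 0],
              [5, 0, 1], [10, 0, 1], [20, 0, 1]], dtype=float)
y = np.array([900.0, 520.0, 210.0, 850.0, 480.0, 190.0])  # viscosity in cP (made up)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(8, 8),
                                   max_iter=5000, random_state=0))
model.fit(X, y)
print(model.predict([[15.0, 0.0, 1.0]]))  # predicted viscosity at 15 wt% toluene
```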
In this paper, we propose a method to estimate missing values of the explanatory variables in a non-parametric multiple regression model and compare it with imputation by the arithmetic mean. The idea of the method is to exploit the causal relationship between the variables to obtain an efficient estimate of the missing value. We rely on the kernel estimate given by the Nadaraya–Watson estimator, and on Least Squares Cross-Validation (LSCV) to estimate the bandwidth, and we use a simulation study to compare the two methods.
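In their usual forms, the Nadaraya–Watson estimator and the LSCV bandwidth criterion used here are

$$\hat{m}_h(x) = \frac{\sum_{i=1}^{n} K\!\left(\frac{x - X_i}{h}\right) Y_i}{\sum_{i=1}^{n} K\!\left(\frac{x - X_i}{h}\right)}, \qquad \mathrm{LSCV}(h) = \frac{1}{n}\sum_{i=1}^{n}\bigl(Y_i - \hat{m}_{h,-i}(X_i)\bigr)^2,$$

where $K(\cdot)$ is the kernel, $h$ the bandwidth, and $\hat{m}_{h,-i}$ the leave-one-out estimator; $h$ is chosen to minimize $\mathrm{LSCV}(h)$.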
Some experiments need to be evaluated to determine whether they are useful enough to continue providing them or not. This is done through the fuzzy regression discontinuity model, where the Epanechnikov kernel and the triangular kernel were used to estimate the model on data generated in a Monte Carlo experiment, and the results obtained were compared. It was found that the Epanechnikov kernel has the smallest mean squared error.
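The two kernels compared take their standard forms,

$$K_{\text{Epanechnikov}}(u) = \tfrac{3}{4}\,(1 - u^2)\,\mathbf{1}_{\{|u| \le 1\}}, \qquad K_{\text{Triangular}}(u) = (1 - |u|)\,\mathbf{1}_{\{|u| \le 1\}}.$$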
Empirical and statistical methodologies have been established to acquire accurate permeability identification and reservoir characterization, based on rock type and reservoir performance. The identification of rock facies is usually done either by using core analysis to visually interpret lithofacies or indirectly from well-log data. The use of well-log data for traditional facies prediction is characterized by uncertainty and can be time-consuming, particularly when working with large datasets; machine learning can therefore be used to predict patterns more efficiently when applied to large data. Taking into account the electrofacies distribution, this work was conducted to predict permeability for the four wells FH1, FH2, F…
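The excerpt does not name the specific machine-learning algorithm; the sketch below is only one plausible facies-then-permeability workflow, with the log mnemonics, file name, and choice of random forests all assumed for illustration:

```python
# Hypothetical sketch: predict electrofacies from well logs, then use the
# predicted facies as an extra feature when predicting permeability.
# The file name, column names and the random-forest models are assumptions.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor

logs = pd.read_csv("well_logs.csv")          # e.g. GR, RHOB, NPHI, RT, facies, perm
features = ["GR", "RHOB", "NPHI", "RT"]

facies_clf = RandomForestClassifier(n_estimators=200, random_state=0)
facies_clf.fit(logs[features], logs["facies"])
logs["facies_pred"] = facies_clf.predict(logs[features])

perm_reg = RandomForestRegressor(n_estimators=200, random_state=0)
perm_reg.fit(logs[features + ["facies_pred"]], logs["perm"])
```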
Information from 54 Magnetic Resonance Imaging (MRI) brain tumor images (27 benign and 27 malignant) was collected and subjected to the multilayer perceptron artificial neural network available in the well-known IBM SPSS 17 software (Statistical Package for the Social Sciences). After many attempts, the automatic architecture was adopted in this research work. Thirteen shape and statistical characteristics of the images were considered. The neural network achieved 89.1 % correct classification for the training sample and 100 % correct classification for the test sample. The normalized importance of the considered characteristics showed that kurtosis accounted for 100 %, which means that this variable has a substantial effect …
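A minimal sketch of this kind of analysis is given below; the SPSS multilayer perceptron and its normalized-importance table are approximated here with scikit-learn equivalents, and the data are placeholders rather than the study's images:

```python
# Sketch: MLP classifier on 13 shape/statistical features per image, with a
# permutation-based stand-in for SPSS's "normalized importance" output.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(0)
X = rng.random((54, 13))            # 54 MRI cases x 13 features (placeholder values)
y = np.repeat([0, 1], 27)           # 0 = benign, 1 = malignant

clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0).fit(X, y)
imp = permutation_importance(clf, X, y, n_repeats=20, random_state=0)
normalized = 100 * imp.importances_mean / imp.importances_mean.max()  # top feature = 100 %
print(np.round(normalized, 1))
```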
In the literature, several correlations have been proposed for bubble size prediction in bubble columns. However, these correlations fail to predict bubble diameter over a wide range of conditions. Based on a data bank of around 230 measurements collected from the open literature, a correlation for bubble sizes in the homogeneous region of bubble columns was derived using Artificial Neural Network (ANN) modeling. The bubble diameter was found to be a function of six parameters: gas velocity, column diameter, orifice diameter, liquid density, liquid viscosity and liquid surface tension. Statistical analysis showed that the proposed correlation has an Average Absolute Relative Error (AARE) of 7.3 % and a correlation coefficient of 92.2 %. …
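The quoted error measure is conventionally defined as

$$\mathrm{AARE} = \frac{100\%}{N}\sum_{i=1}^{N}\left|\frac{d_{b,i}^{\mathrm{pred}} - d_{b,i}^{\mathrm{exp}}}{d_{b,i}^{\mathrm{exp}}}\right|,$$

and the ANN correlation amounts to a mapping $d_b = f(u_g, D_c, d_o, \rho_L, \mu_L, \sigma_L)$ over the six parameters listed (the symbol names here are ours, chosen for readability).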
This study uses an Artificial Neural Network (ANN) to examine the constitutive relationship for the residual tensile strength of Glass Fiber Reinforced Polymer (GFRP) at elevated temperatures. The objective is to develop an effective model and establish fire-performance criteria for concrete structures in fire scenarios. Multilayer networks that employ reactive error-distribution approaches can determine the residual tensile strength of GFRP from six input parameters, in contrast to previous mathematical models that used one or two inputs and disregarded the others. Multilayer networks employing reactive error-distribution technology assign a weight to each variable influencing the residual tensile strength of GFRP. Temperature …
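Schematically, such a multilayer network with one hidden layer of $m$ neurons relates the six inputs $x_1,\dots,x_6$ (temperature and the other influencing factors) to the residual tensile strength $f_u^{T}$ as

$$f_u^{T} = \sum_{j=1}^{m} w_j^{(2)}\,\varphi\!\left(\sum_{i=1}^{6} w_{ij}^{(1)} x_i + b_j^{(1)}\right) + b^{(2)},$$

with the weights and biases fitted by redistributing the prediction error back through the layers; the actual layer sizes and activation function $\varphi$ are not specified in this excerpt.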
The bandwidth requirements of telecommunication network users increased rapidly during the last decades, and optical access technologies must provide the bandwidth demanded by each user. Passive optical access networks (PONs) support a maximum data rate of 100 Gbps by using the Orthogonal Frequency Division Multiplexing (OFDM) technique in the optical access network. In this paper, optical broadband access networks employing techniques ranging from Time Division Multiplexing Passive Optical Networks (TDM PON) to Orthogonal Frequency Division Multiplexing Passive Optical Networks (OFDM PON) are presented. The architectures, advantages, disadvantages, and main parameters of these optical access networks are discussed and reported, which have many advantages …