In data mining, classification is a form of data analysis used to extract models describing important data classes. Two well-known classification algorithms are the Backpropagation Neural Network (BNN) and Naïve Bayes (NB). This paper investigates the performance of these two methods on the Car Evaluation dataset. A model was built for each algorithm and the results were compared. Our experimental results indicated that the BNN classifier yielded higher accuracy than the NB classifier but was less efficient, since it is time-consuming and difficult to analyze due to its black-box nature.
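A minimal sketch of the comparison described above, using scikit-learn stand-ins: `MLPClassifier` (a backpropagation network) and `CategoricalNB`. The synthetic categorical data below is illustrative only; the paper used the Car Evaluation dataset, which is not reproduced here, and the hidden-layer size is an assumption.

```python
# Hedged sketch: BNN (backpropagation MLP) vs. naive Bayes on synthetic
# categorical data standing in for the Car Evaluation dataset.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import CategoricalNB
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.integers(0, 4, size=(600, 6))        # six categorical attributes
y = (X.sum(axis=1) > 9).astype(int)          # toy two-class "acceptability"

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Naive Bayes: fast to train, directly interpretable per-category counts.
nb = CategoricalNB(min_categories=4).fit(X_tr, y_tr)
nb_acc = accuracy_score(y_te, nb.predict(X_te))

# Backpropagation network: usually higher accuracy, but slower and opaque.
bnn = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
bnn_acc = accuracy_score(y_te, bnn.fit(X_tr, y_tr).predict(X_te))
```

Comparing `nb_acc` and `bnn_acc` on held-out data mirrors the paper's accuracy comparison, while training wall-time would illustrate the efficiency trade-off.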
Artificial Neural Network (ANN) models are increasingly applied to the prediction and forecasting of wastewater treatment plant (WWTP) variables, enabling operators to take appropriate action and maintain discharge norms. Compared with traditional mathematical models, the ANN is a much easier modeling tool for dealing with the complex nature of WWTPs. In the present study, the ANN technique is applied to predict sequencing batch reactor (SBR) performance based on the effluent BOD5/COD ratio, using the required historical daily SBR data collected over two years of operation (2015-2016) from the Baghdad Mayoralty and the Al-Rustamiya WWTP office, Iraq. The prediction was obtained by the application of a feed-forward …
Milling is a common machining operation used in the manufacturing of complex surfaces. Machining-induced residual stresses (RS) have a great impact on the performance of machined components and on surface quality in face milling operations with different cutting parameters. The properties of engineering materials and structural components, specifically fatigue life, deformation, impact resistance, corrosion resistance, and brittle fracture, can all be significantly influenced by residual stresses. Accordingly, controlling the distribution of residual stresses is important to protect the workpiece and avoid failure. Most previous works inspected the material properties, tool parameters, or cutting parameters, but …
The time spent drilling ahead is usually a significant portion of total well cost. Drilling is an expensive operation, including the cost of the equipment and material used during rock penetration, plus the crew's effort to finish the well without serious problems. Knowing the rate of penetration helps in estimating the cost and optimizing drilling expenditure. Ten wells in the Nasiriya oil field were selected based on data availability. Dynamic elastic properties of the Mishrif formation in the selected wells were determined using Interactive Petrophysics (IP V3.5) software, based on the LAS files and log records provided. The average rate of penetration and average dynamic elastic properties …
In this paper we used frequentist and Bayesian approaches to the linear regression model to predict future observations of unemployment rates in Iraq. Parameters are estimated using the ordinary least squares method in the frequentist approach, and using the Markov Chain Monte Carlo (MCMC) method in the Bayesian approach. Calculations are done using the R program. The analysis showed that the linear regression model under the Bayesian approach is better and can be used as an alternative to the frequentist approach. Two criteria, the root mean square error (RMSE) and the median absolute deviation (MAD), were used to compare the performance of the estimates. The results obtained showed that unemployment rates will continue to increase over the next two decades …
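The two comparison criteria named in the abstract can be sketched directly. The values below are illustrative placeholders, not the Iraqi unemployment data or the paper's fitted predictions.

```python
# Hedged sketch: the RMSE and MAD criteria used to compare the frequentist
# and Bayesian fits, applied to hypothetical predictions.
import numpy as np

def rmse(y, y_hat):
    """Root mean square error between actual and predicted values."""
    y, y_hat = np.asarray(y, float), np.asarray(y_hat, float)
    return float(np.sqrt(np.mean((y - y_hat) ** 2)))

def mad(y, y_hat):
    """Median absolute deviation of the prediction errors."""
    y, y_hat = np.asarray(y, float), np.asarray(y_hat, float)
    return float(np.median(np.abs(y - y_hat)))

actual      = [13.8, 14.1, 14.5, 15.0]   # illustrative rates only
frequentist = [13.5, 14.3, 14.2, 15.4]   # hypothetical OLS predictions
bayesian    = [13.7, 14.0, 14.6, 15.1]   # hypothetical MCMC predictions
```

The approach with the smaller RMSE and MAD on held-out observations is preferred, which is how the paper ranks the Bayesian fit above the frequentist one.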
Optical Character Recognition (OCR) is the process of converting an image of text into a machine-readable text format, and the classification of Arabic manuscripts is part of this field. In recent years, the processing of Arabic image databases by deep learning architectures has developed remarkably. However, this remains insufficient given the enormous wealth of Arabic manuscripts. In this research, a deep learning architecture is used to address the classification of handwritten Arabic letters. The method is based on a convolutional neural network (CNN) architecture acting as both feature extractor and classifier. Considering the nature of the dataset images (binary images), the contours of the alphabet …
The differential cross sections for rhodium and tantalum have been calculated using the Cross Section Calculations (CSC) program over the energy range 1 keV-1 MeV. The calculations are based on programming the Klein-Nishina and Rayleigh equations, together with the atomic form factors and coherent scattering functions, in Fortran 90. The program proved very fast, gave accurate results, and offers the possibility of applying the model to obtain the total coefficient for any element or compound.
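The Klein-Nishina part of such a calculation can be sketched in a few lines. This is not the original Fortran 90 program, only the standard textbook formula for the unpolarized differential cross section per electron, with the usual physical constants.

```python
# Hedged sketch of the Klein-Nishina differential cross section dσ/dΩ
# (cm²/sr per electron) for photon energy E and scattering angle θ.
import math

R_E = 2.8179403262e-13   # classical electron radius, cm
MEC2_KEV = 511.0         # electron rest energy, keV

def klein_nishina(energy_kev, theta):
    """dσ/dΩ = (r_e²/2) (k'/k)² (k'/k + k/k' − sin²θ)."""
    eps = energy_kev / MEC2_KEV
    ratio = 1.0 / (1.0 + eps * (1.0 - math.cos(theta)))   # k'/k
    return 0.5 * R_E**2 * ratio**2 * (ratio + 1.0 / ratio
                                      - math.sin(theta)**2)
```

At θ = 0 the expression reduces to r_e² for any energy, and at low energies it approaches the Thomson cross section, which makes both limits easy sanity checks for a program of this kind.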
In this paper, some Bayes estimators of the reliability function of the Gompertz distribution have been derived based on a generalized weighted loss function. To gain a better understanding of the behaviour of the Bayesian estimators, a non-informative prior as well as an informative prior represented by the exponential distribution are considered. Monte Carlo simulation has been employed to compare the performance of the different estimates of the reliability function of the Gompertz distribution based on integrated mean squared errors. It was found that the Bayes estimators with exponential prior information under the generalized weighted loss function were generally better than the estimators based on …
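The quantity being estimated, the Gompertz reliability (survival) function, can be sketched as follows. The shape/rate parametrisation below is one common convention and is an assumption, not taken from the paper; the Monte Carlo check simply verifies the formula by inverse-transform sampling, in the spirit of the simulation study.

```python
# Hedged sketch: Gompertz reliability R(t) and a small Monte Carlo check.
import math
import random

def gompertz_reliability(t, c, lam):
    """R(t) = exp(-(lam/c) * (exp(c*t) - 1)), t >= 0 (assumed convention)."""
    return math.exp(-(lam / c) * math.expm1(c * t))

random.seed(1)
c, lam, t0 = 0.5, 0.2, 1.0

# Inverse CDF of the Gompertz: t = (1/c) ln(1 - (c/lam) ln(1 - u)).
draws = [(1 / c) * math.log(1 - (c / lam) * math.log(1 - random.random()))
         for _ in range(20000)]
empirical = sum(d > t0 for d in draws) / len(draws)   # fraction surviving t0
```

The empirical survival fraction should agree with `R(t0)` to within Monte Carlo error, which is the same kind of check that underlies comparing estimators by integrated mean squared error.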
Transmission lines are frequently subjected to faults, so it is advantageous to locate these faults as quickly as possible. This study uses an Artificial Neural Network technique to locate a fault as soon as it occurs on the Doukan-Erbil 132 kV double transmission line network. CYME 7.1 programming/Simulink was used to model the suggested network. A multilayer perceptron feed-forward artificial neural network with a backpropagation learning algorithm is used for the intelligent locator's training, testing, assessment, and validation. Voltages and currents were applied as inputs during the neural network's training, and the pre-fault and post-fault values determined the scaled values. The neural network's p…
Self-driving automobiles are prominent in science and technology and affect social and economic development. Deep learning (DL) is the most common area of study in artificial intelligence (AI). In recent years, deep learning-based solutions have been presented in the field of self-driving cars and have achieved outstanding results. Different studies have investigated a variety of significant technologies for autonomous vehicles, including car navigation systems, path planning, environmental perception, and car control. End-to-end learning control directly converts sensory data into control commands in autonomous driving. This research aims to identify the most accurate pre-trained Deep Neural Network (DNN) for predicting the steering …
The influx of data in bioinformatics is primarily in the form of DNA, RNA, and protein sequences, which places a significant burden on scientists and computers. Some genomics studies depend on clustering techniques to group similarly expressed genes into one cluster. Clustering is a type of unsupervised learning that can be used to divide unlabelled data into clusters; the k-means and fuzzy c-means (FCM) algorithms are examples of algorithms that can be used for this purpose. Clustering is thus a common approach that divides an input space into several homogeneous zones and can be achieved using a variety of algorithms. This study used three models to cluster a brain tumor dataset. The first model uses FCM, which …
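A minimal sketch of the k-means step named above. The synthetic two-blob data stands in for the brain-tumour features, which are not reproduced here, and the choice of two clusters is illustrative only.

```python
# Hedged sketch: k-means clustering on synthetic data standing in for the
# brain-tumour dataset mentioned in the abstract.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Two well-separated blobs: each group of 50 points should land in one zone.
X = np.vstack([rng.normal(0.0, 0.3, size=(50, 2)),   # blob A
               rng.normal(3.0, 0.3, size=(50, 2))])  # blob B

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
labels = km.labels_
```

FCM differs only in assigning each point a soft membership degree in every cluster rather than a single hard label, which is why the abstract treats the two as alternative algorithms for the same partitioning task.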