In data mining, classification is a form of data analysis used to extract models that describe important data classes. Two well-known classification algorithms are the Backpropagation Neural Network (BNN) and Naïve Bayes (NB). This paper investigates the performance of these two classification methods on the Car Evaluation dataset. A model was built for each algorithm and the results were compared. Our experimental results indicate that the BNN classifier yields higher accuracy than the NB classifier but is less efficient, because it is time-consuming to train and difficult to analyze owing to its black-box nature.
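A minimal sketch of such a comparison, assuming the UCI Car Evaluation file `car.data` is available locally and using scikit-learn's MLPClassifier and CategoricalNB as stand-ins for the paper's BNN and NB models (the column names and split below are illustrative, not the paper's exact setup):

```python
# Compare a backpropagation neural network with a Naive Bayes classifier
# on the (assumed locally downloaded) UCI Car Evaluation dataset.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import OrdinalEncoder
from sklearn.neural_network import MLPClassifier
from sklearn.naive_bayes import CategoricalNB
from sklearn.metrics import accuracy_score

cols = ["buying", "maint", "doors", "persons", "lug_boot", "safety", "class"]
df = pd.read_csv("car.data", names=cols)            # hypothetical local path

X = OrdinalEncoder().fit_transform(df[cols[:-1]])   # encode categorical features
y = df["class"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

bnn = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
nb = CategoricalNB()

for name, clf in [("BNN", bnn), ("NB", nb)]:
    clf.fit(X_tr, y_tr)
    print(name, "accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```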
In this review paper, several research studies are surveyed to help future researchers identify available techniques in the field of infectious disease modeling on complex networks. Infectious disease modeling is becoming increasingly important because microbes and viruses threaten people's lives and societies in many respects. Properly representing and analyzing spreading processes has long been a focus of research in many domains, including mathematical biology, physics, computer science, engineering, economics, and the social sciences. This survey first presents a brief overview of previous literature, together with graphs and equations that clarify modeling on complex networks, and the detection of societies.
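As a concrete illustration of the kind of spreading process such surveys cover, here is a minimal sketch of a discrete-time SIR epidemic simulated on a random complex network; the graph model, infection rate beta, and recovery rate gamma are illustrative assumptions, not values taken from the surveyed papers:

```python
# Discrete-time SIR spreading on a scale-free network (illustrative parameters).
import random
import networkx as nx

def simulate_sir(G, beta=0.05, gamma=0.1, initial_infected=1, steps=100, seed=0):
    rng = random.Random(seed)
    state = {n: "S" for n in G}                       # everyone susceptible
    for n in rng.sample(list(G.nodes), initial_infected):
        state[n] = "I"                                # seed infections
    history = []
    for _ in range(steps):
        new_state = dict(state)
        for n in G:
            if state[n] == "I":
                # each infected node may infect its susceptible neighbours
                for m in G.neighbors(n):
                    if state[m] == "S" and rng.random() < beta:
                        new_state[m] = "I"
                if rng.random() < gamma:              # and may recover
                    new_state[n] = "R"
        state = new_state
        history.append(sum(1 for s in state.values() if s == "I"))
    return history

G = nx.barabasi_albert_graph(1000, 3, seed=0)          # assumed test network
print(simulate_sir(G)[:10])                            # infected count per step
```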
The reaction of LAs-Cl8: [2,2-(1-(3,4-bis(carboxylicdichloromethoxy)-5-oxo-2,5-dihydrofuran-2-yl)ethane-1,2-diyl)bis(2,2-dichloroacetic acid)] with sodium azide in ethanol with a few drops of distilled water has been investigated. The new product L-AZ: (3Z,5Z,8Z)-2-azido-8-[azido(3Z,5Z)-2-azido-2,6-bis(azidocarbonyl)-8,9-dihydro-2H-1,7-dioxa-3,4,5-triazonine-9-yl]methyl]-9-[(1-azido-1-hydroxy)methyl]-2H-1,7-dioxa-3,4,5-triazonine-2,6-dicarbonylazide was isolated and characterized by elemental analysis (C, H, N), 1H-NMR, mass spectrometry, and Fourier-transform infrared spectroscopy (FT-IR). The reaction of L-AZ with M+n: [VO(II), Cr(III), Mn(II), Co(II), Ni(II), Cu(II), Zn(II), Cd(II), and Hg(II)] has been investigated.
Osteoarthritis (OA) is recognized as a major public health problem. It is one of the major causes of reduced function and diminished quality of life worldwide. Osteoarthritis is a very common disorder affecting the joint cartilage. As there is no cure for osteoarthritis, treatments currently focus on the management of symptoms. Pain relief, improved joint function, and joint stability are the main goals of therapy. Muscle weakness and muscle atrophy contribute to the disease process, so rehabilitation and physiotherapy are often prescribed with the intention of alleviating pain and increasing mobility. Medical therapy provides modest benefits in pain reduction and functional improvement; however, non-steroidal anti-inflammatory drugs
In this paper, two main stages for image classification are presented. The training stage consists of collecting images of interest and applying the Bag of Visual Words (BOVW) model to them (feature extraction and description using SIFT, followed by vocabulary generation), while the testing stage classifies a new unlabeled image by applying nearest-neighbor classification to its feature descriptor. The supervised bag of visual words gives good results, shown clearly in the experimental part, where unlabeled images are classified correctly even though only a small number of images are used in the training process.
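A minimal sketch of this BOVW pipeline, assuming OpenCV with SIFT support and scikit-learn; the image paths, vocabulary size k, and helper names are illustrative assumptions rather than the paper's configuration:

```python
# BOVW: SIFT descriptors -> k-means vocabulary -> histogram -> nearest neighbour.
import cv2
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import KNeighborsClassifier

sift = cv2.SIFT_create()

def sift_descriptors(path):
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    _, desc = sift.detectAndCompute(img, None)
    return desc if desc is not None else np.empty((0, 128), np.float32)

def bovw_histogram(desc, kmeans, k):
    words = kmeans.predict(desc)                     # map descriptors to visual words
    return np.bincount(words, minlength=k).astype(float)

# --- training stage (hypothetical image paths and labels) ---
train_paths, train_labels = ["cat1.jpg", "dog1.jpg"], ["cat", "dog"]
train_desc = [sift_descriptors(p) for p in train_paths]

k = 50                                               # assumed vocabulary size
kmeans = KMeans(n_clusters=k, random_state=0).fit(np.vstack(train_desc))
X_train = [bovw_histogram(d, kmeans, k) for d in train_desc]
clf = KNeighborsClassifier(n_neighbors=1).fit(X_train, train_labels)

# --- testing stage ---
test_hist = bovw_histogram(sift_descriptors("unknown.jpg"), kmeans, k)
print(clf.predict([test_hist]))
```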
This research aims to choose the appropriate probability distribution for the reliability analysis of an item, using data collected on the operating and stoppage times of the case study.
The appropriate probability distribution is the one for which the data lie on, or close to, the fitted line of the probability plot, and the data are then tested for goodness of fit.
Minitab 17 software was used for this purpose after arranging the collected data and entering it into the program.
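As a rough Python analogue of this distribution-selection step (the study itself used Minitab 17), the sketch below fits several candidate distributions to time-between-failure data and compares goodness of fit with the Kolmogorov-Smirnov test; the sample values are illustrative, not the case-study measurements:

```python
# Fit candidate lifetime distributions and compare goodness of fit.
import numpy as np
from scipy import stats

times = np.array([120.0, 85.0, 200.0, 150.0, 95.0, 310.0, 60.0, 175.0])  # hours (assumed)

candidates = {
    "exponential": stats.expon,
    "weibull": stats.weibull_min,
    "lognormal": stats.lognorm,
}

for name, dist in candidates.items():
    params = dist.fit(times)                          # maximum-likelihood fit
    ks_stat, p_value = stats.kstest(times, dist.cdf, args=params)
    print(f"{name:11s}  KS statistic = {ks_stat:.3f}  p-value = {p_value:.3f}")

# The distribution with the smallest KS statistic (largest p-value) is the
# closest match, mirroring the probability-plot comparison done in Minitab.
```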
In this paper we present the theoretical foundation of forward error analysis of numerical algorithms under (i) approximations in "built-in" functions, (ii) rounding errors in arithmetic floating-point operations, and (iii) perturbations of data. The error analysis is based on the linearization method. The fundamental tools of forward error analysis are systems of linear absolute and relative a priori and a posteriori error equations and the associated condition numbers, constituting optimal bounds on the possible cumulative round-off errors. The condition numbers enable simple, general, quantitative definitions of numerical stability. The theoretical results have been applied to Gaussian elimination and have proved to be a very effective means of both a priori and a posteriori error analysis.
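A minimal numerical illustration of the forward-error idea for Gaussian elimination, assuming a small test matrix (not taken from the paper): the condition number bounds how perturbations and round-off in the data are amplified in the computed solution.

```python
# A posteriori forward-error bound via the condition number.
import numpy as np

A = np.array([[10.0, 7.0, 8.0], [7.0, 5.0, 6.0], [8.0, 6.0, 10.0]])
b = np.array([25.0, 18.0, 24.0])

x = np.linalg.solve(A, b)                 # LU-based Gaussian elimination
residual = b - A @ x
cond = np.linalg.cond(A, 2)               # condition number kappa_2(A)

# Bound: ||x_true - x|| / ||x|| <= kappa(A) * ||r|| / (||A|| * ||x||)
rel_forward_bound = cond * np.linalg.norm(residual) / (
    np.linalg.norm(A, 2) * np.linalg.norm(x))

print("condition number:", cond)
print("relative forward-error bound:", rel_forward_bound)
```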
The estimation of the regular regression model requires several assumptions to be satisfied, such as linearity. One problem occurs when the regression curve is partitioned into two (or more) parts that are then joined by threshold point(s). This situation is regarded as a violation of the linearity of regression. Therefore, the multiphase regression model has received increasing attention as an alternative approach that describes the changing behavior of the phenomenon through threshold point estimation. The maximum likelihood estimator (MLE) has been used for both model and threshold point estimation. However, MLE is not resistant to violations such as the existence of outliers or a heavy-tailed error distribution. The main goal of t
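A minimal sketch of two-phase (threshold) regression estimation: for each candidate threshold, a line is fitted on each side and the threshold with the smallest total squared error is kept, in the spirit of a profile-likelihood search. The synthetic data and the plain least-squares fit are illustrative; the paper's robust alternative to MLE is not reproduced here.

```python
# Grid-search estimation of the threshold point in a two-phase regression.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 200)
true_threshold = 6.0
y = np.where(x < true_threshold, 1.0 + 0.5 * x, 4.0 - 1.2 * (x - true_threshold))
y += rng.normal(scale=0.3, size=x.size)

def sse_for_threshold(x, y, c):
    total = 0.0
    for mask in (x < c, x >= c):
        if mask.sum() < 2:
            return np.inf                        # need at least two points per phase
        coeffs = np.polyfit(x[mask], y[mask], 1)  # straight-line fit in this phase
        total += np.sum((y[mask] - np.polyval(coeffs, x[mask])) ** 2)
    return total

candidates = np.linspace(1, 9, 161)
best = min(candidates, key=lambda c: sse_for_threshold(x, y, c))
print("estimated threshold point:", best)         # should be near 6.0
```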
One of the costliest problems facing hydrocarbon production from unconsolidated sandstone reservoirs is the production of sand once hydrocarbon production starts. A sanding-onset prediction model is very important for deciding on future sand control, including whether or when sand control should be used. This research developed an easy-to-use computer program to determine where sanding begins in the drainage area. The model is based on estimating the critical pressure drawdown at which sand production begins. The outcomes have been plotted as a function of free sand production against the critical flow rates for reservoir pressure decline. The results show that the pressure drawdown required to
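A minimal sketch of a sanding-onset screening calculation of this kind, using a simplified elastic (Kirsch) hoop-stress criterion in which sanding is assumed to start when the effective tangential stress at the wellbore wall reaches the effective formation strength. This criterion and all input values are illustrative assumptions, not the model implemented in the paper's program.

```python
# Simplified critical-drawdown screening (illustrative criterion and data).
def critical_drawdown(sigma_H, sigma_h, pore_pressure, eff_strength, res_pressure):
    """Return (critical flowing pressure, critical drawdown), both in MPa."""
    # Maximum effective hoop stress at the wall: 3*sigma_H - sigma_h - p_wf - Pp.
    # Onset assumed when it equals the effective strength; solve for p_wf.
    p_wf_crit = 3.0 * sigma_H - sigma_h - pore_pressure - eff_strength
    return p_wf_crit, res_pressure - p_wf_crit

# Hypothetical reservoir data (MPa)
p_wf_crit, cdd = critical_drawdown(sigma_H=45.0, sigma_h=38.0,
                                   pore_pressure=25.0, eff_strength=60.0,
                                   res_pressure=30.0)
print(f"critical flowing pressure: {p_wf_crit:.1f} MPa")
print(f"critical drawdown:         {cdd:.1f} MPa")
```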
Steganography is a technique for hiding a secret message within a different multimedia carrier so that the secret message cannot be detected. The goals of steganography techniques include improvements in imperceptibility, information-hiding capacity, security, and robustness. Despite the numerous secure methodologies that have been introduced, there are ongoing attempts to develop these techniques further to make them more secure and robust. This paper introduces a color-image steganographic method based on a secret map, namely the 3-D cat map. The proposed method aims to embed data using a secure structure of chaotic steganography, ensuring better security. Rather than using the complete image for data hiding, the selection of
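A minimal sketch of chaotic position selection for LSB embedding: the paper uses a 3-D cat map as the secret map, whereas this sketch substitutes the simpler classical 2-D Arnold cat map to generate a secret sequence of pixel coordinates and hides one message bit in the blue-channel LSB of each selected pixel. The image size, key, and channel choice are assumptions, and a full implementation would also avoid revisiting positions for longer messages.

```python
# Chaotic (cat-map) selection of embedding positions plus LSB embedding.
import numpy as np

def cat_map_positions(n, x0, y0, count):
    """Iterate the 2-D Arnold cat map on an n x n grid from a secret start."""
    x, y = x0, y0
    positions = []
    for _ in range(count):
        x, y = (x + y) % n, (x + 2 * y) % n    # Arnold cat map step
        positions.append((x, y))
    return positions

def embed(image, message_bits, key=(7, 13)):
    """Hide message bits in blue-channel LSBs of chaotically chosen pixels."""
    stego = image.copy()
    n = min(image.shape[:2])                    # assume a roughly square cover
    for (x, y), bit in zip(cat_map_positions(n, *key, len(message_bits)),
                           message_bits):
        stego[x, y, 2] = (stego[x, y, 2] & 0xFE) | bit   # overwrite the LSB
    return stego

# Usage with a synthetic 64x64 RGB cover image and an 8-bit secret message
cover = np.random.default_rng(1).integers(0, 256, (64, 64, 3), dtype=np.uint8)
bits = [1, 0, 1, 1, 0, 0, 1, 0]
stego = embed(cover, bits)
print("channel values changed:", int(np.sum(cover != stego)))
```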
With the high usage of computers and networks at the present time, the number of security threats has increased. The study of intrusion detection systems (IDS) has received much attention throughout the computer science field. The main objective of this study is to examine the existing literature on various approaches to intrusion detection. This paper presents an overview of different intrusion detection systems and a detailed analysis of multiple techniques for these systems, including their advantages and disadvantages. These techniques include artificial neural networks, bio-inspired computing, evolutionary techniques, machine learning, and pattern recognition.