Secure data communication across networks is constantly threatened by intrusion and abuse. A network intrusion detection system (IDS) is a valuable tool for in-depth defense of computer networks. Most research and applications in the field of intrusion detection have been built by analysing datasets containing known attack types with batch machine-learning classification. The present study proposes an intrusion detection system based on data stream classification. Several data stream algorithms were applied to the CICIDS2017 dataset, which contains several new types of attacks. The results were evaluated to select the algorithm that best combines high accuracy with low computation time.
A new connected domination parameter, called the tadpole domination number of a graph, is introduced. The tadpole domination number is determined for some standard graphs, and some bounds for this number are obtained. Additionally, a new finite, simple, undirected, connected graph, named the weaver graph, is introduced. Tadpole domination is calculated for this graph and for other families of graphs.
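Since the tadpole domination number is a new variant whose precise definition is given in the paper itself, a hedged illustration of the underlying base concept may help: a brute-force computation of the classical domination number γ(G), i.e. the smallest set S such that every vertex is in S or adjacent to a member of S.

```python
# Brute-force computation of the classical domination number, shown
# only to make the base concept concrete; the tadpole domination
# number defined in the paper is a new variant, not reproduced here.
from itertools import combinations

def domination_number(vertices, edges):
    """Smallest |S| such that every vertex is in S or adjacent to S."""
    nbrs = {v: {v} for v in vertices}          # closed neighbourhoods
    for u, w in edges:
        nbrs[u].add(w)
        nbrs[w].add(u)
    for k in range(1, len(vertices) + 1):
        for S in combinations(vertices, k):
            covered = set().union(*(nbrs[v] for v in S))
            if covered == set(vertices):
                return k
    return len(vertices)

# Cycle C5: the domination number is ceil(5/3) = 2.
print(domination_number([0, 1, 2, 3, 4],
                        [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]))  # → 2
```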
Climatic parameters for the years 1985-2015 were collected from Baghdad meteorological station and applied to evaluate the climatic conditions of the Al-Yusufyiah area, south of Baghdad. The total annual rainfall is 119.65 mm, the total annual evaporation is 3201.7 mm, relative humidity is 43.62%, sunshine is 8.76 h/day, temperature is 23.28 °C, and wind speed is 3.06 m/sec. The climate of the study area is classified as arid according to Kettaneh and Gangopadhyaya (1974), Mather (1973), and Al-Kubaisi (2004). Mean monthly water surplus for the period 1985-2015 recorded in the study area was about 4.7 mm in November, 11.67 mm in December, 20.56 mm in January and (6
The aim of this article is to introduce a new definition of the domination number in graphs, called the hn-domination number. This paper presents some properties of the concepts of connected and independent hn-domination. Furthermore, some bounds of these parameters are determined; in particular, the effect on the hn-domination parameter when a graph is modified by deleting or adding a vertex, or by deleting an edge, is studied thoroughly.
The estimation of the regular regression model requires several assumptions to be satisfied, such as linearity. One problem arises when the regression curve is partitioned into two (or more) parts joined by threshold point(s). This situation is regarded as a violation of the linearity of regression. Therefore, the multiphase regression model has received increasing attention as an alternative approach that describes changes in the behavior of the phenomenon through threshold point estimation. The maximum likelihood estimator (MLE) has been used for both model and threshold-point estimation. However, MLE is not resistant to violations such as the presence of outliers or heavy-tailed error distributions. The main goal of t
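The two-phase idea above can be sketched numerically. This is a hedged least-squares illustration on synthetic data, not the paper's estimator: the threshold is found by grid search, fitting a separate line on each side; under Gaussian errors, minimizing the summed SSE coincides with the MLE mentioned in the abstract.

```python
# Two-phase regression sketch: grid-search the threshold tau that
# minimizes the combined SSE of a line fitted on each side.
# Synthetic data with a true kink at x = 6.0.
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0, 10, 200)
true_tau = 6.0
y = np.where(x < true_tau, 1.0 + 0.5 * x, 4.0 - 0.8 * (x - true_tau))
y += rng.normal(scale=0.3, size=x.size)

def sse_of_fit(xs, ys):
    """SSE of a straight-line least-squares fit to (xs, ys)."""
    if xs.size < 2:
        return np.inf
    coef = np.polyfit(xs, ys, 1)
    resid = ys - np.polyval(coef, xs)
    return float(resid @ resid)

best_tau, best_sse = None, np.inf
for tau in x[5:-5]:                  # keep a few points on each side
    left, right = x < tau, x >= tau
    sse = sse_of_fit(x[left], y[left]) + sse_of_fit(x[right], y[right])
    if sse < best_sse:
        best_tau, best_sse = tau, sse

print(f"estimated threshold ≈ {best_tau:.2f}")
```

Grid search is the simplest choice here; the abstract's point about robustness applies directly, since a single outlier can distort both line fits and shift the estimated threshold.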
One of the costliest problems facing hydrocarbon production in unconsolidated sandstone reservoirs is sand production once hydrocarbon production starts. A model predicting the onset of sanding is very important for deciding on future sand control, including whether and when sand control should be used. This research developed an easy-to-use computer program to determine where sanding begins in the drainage area. The model is based on estimating the critical pressure drop at which sand production begins. The outcomes have been drawn as a function of free sand production with the critical flow rates for reservoir pressure decline. The results show that the pressure drawdown required to
In this paper we present the theoretical foundation of forward error analysis of numerical algorithms under:
• approximations in "built-in" functions;
• rounding errors in arithmetic floating-point operations;
• perturbations of data.
The error analysis is based on a linearization method. The fundamental tools of forward error analysis are systems of linear absolute and relative a priori and a posteriori error equations and the associated condition numbers, constituting optimal bounds on possible cumulative round-off errors. The condition numbers enable simple, general, quantitative definitions of numerical stability. The theoretical results have been applied to Gaussian elimination and have proved to be a very effective means of both a priori
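The role the condition number plays in bounding forward error can be illustrated with a small numerical experiment (a generic linear-system example, not taken from the paper): for Ax = b, the relative error in x caused by a perturbation of b is bounded, to first order, by cond(A) times the relative perturbation of the data.

```python
# Forward error amplification by an ill-conditioned system Ax = b.
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 1.0001]])        # nearly singular, cond(A) ≈ 4e4
b = np.array([2.0, 2.0001])
x = np.linalg.solve(A, b)            # exact solution is [1, 1]

# Perturb b slightly and observe the amplified forward error.
db = np.array([0.0, 1e-5])
x_pert = np.linalg.solve(A, b + db)

cond = np.linalg.cond(A)
rel_err = np.linalg.norm(x_pert - x) / np.linalg.norm(x)
rel_pert = np.linalg.norm(db) / np.linalg.norm(b)
print(cond, rel_err, rel_err / rel_pert)
```

A data perturbation of about 3.5e-6 in relative terms produces a relative solution error of about 0.1, i.e. an amplification of roughly the condition number, which is exactly what the a priori bound predicts.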
Twitter's popularity has grown considerably in the last few years, influencing the social, political, and business aspects of life. People leave tweets on social media about an event and simultaneously inquire about other people's experiences and whether they had a positive/negative opinion of that event. Sentiment analysis can be used to obtain this categorization. Product reviews, events, and other topics from all users, comprising unstructured text comments, are gathered and categorized as positive, negative, or neutral using sentiment analysis. Such problems are called polarity classification. This study aims to use Twitter data about OK cuisine reviews obtained from the Amazon website and compare the effectiveness
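A minimal sketch of the polarity-classification task described above may be useful. This is not the study's pipeline: a TF-IDF + logistic regression classifier is one common baseline, and a few hand-written toy reviews stand in for the review data used in the study.

```python
# Toy polarity classification: unstructured review text in,
# positive/negative label out. Toy data, baseline model only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

reviews = [
    "the food was wonderful and the staff were friendly",
    "great taste, will definitely come back",
    "absolutely delicious meal",
    "terrible service and cold food",
    "the worst dinner I have ever had",
    "awful experience, very disappointing",
]
labels = ["positive", "positive", "positive",
          "negative", "negative", "negative"]

clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(reviews, labels)

preds = clf.predict(["wonderful friendly staff",
                     "cold food and awful service"])
print(list(preds))
```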
Many bridges worldwide have failed due to severe scouring. Therefore, the safety of an existing bridge (after construction) mainly depends on continuous monitoring of local scour at the substructure, while the bridge's safety before construction mainly depends on consideration of local scour estimation at the bridge substructure. Local scour at bridge piers is usually estimated using the available formulae, almost all of which were derived from laboratory data. It is essential to test the performance of proposed local scour formulae using field data. In this study, the performance of selected bridge scour estimation formulae was validated and sta
Due to the increase of information on the World Wide Web (WWW), the question of how to extract new and useful knowledge from log files has gained great interest among researchers in data mining and knowledge discovery.
Web mining, a subset of data mining, is divided into three particular areas: web content mining, web structure mining, and web usage mining. This paper is concerned with the server log file, which belongs to the third category (web usage mining). This file is analyzed according to the suggested algorithm to extract the behavior of the user. Knowing the behavior comes from knowing the complete path taken by a specific user.
Extracting these types of knowledge requires many of the KDD
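The "complete path" extraction described above can be sketched as follows. This assumes the server log is in the standard Common Log Format (the paper's actual log layout and algorithm may differ) and simply groups requested URLs per client IP, in order of appearance.

```python
# Hedged sketch: parse Common Log Format lines and reconstruct the
# ordered path of pages each client IP visited.
import re
from collections import defaultdict

LOG_RE = re.compile(
    r'(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+) [^"]*" (\d{3}) \S+')

sample_log = """\
10.0.0.1 - - [10/Oct/2023:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 2326
10.0.0.2 - - [10/Oct/2023:13:55:40 +0000] "GET /products HTTP/1.1" 200 512
10.0.0.1 - - [10/Oct/2023:13:56:02 +0000] "GET /about.html HTTP/1.1" 200 1024
10.0.0.1 - - [10/Oct/2023:13:57:11 +0000] "GET /contact.html HTTP/1.1" 200 880
"""

paths = defaultdict(list)
for line in sample_log.splitlines():
    m = LOG_RE.match(line)
    if m:
        ip, _timestamp, _method, url, _status = m.groups()
        paths[ip].append(url)

for ip, visited in paths.items():
    print(ip, "->", " -> ".join(visited))
```

In a real web usage mining pipeline this per-IP grouping would be followed by session splitting (e.g. on idle-time gaps) before the paths are mined for patterns.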
The goals of endodontic preparation are to shape and clean the root canal space and remove microorganisms, affected dentin, and pulp; the apical foramen and the canal curve should be protected from transportation during endodontic canal preparation. The aim of this study was to evaluate the curve straightening of curved root canals and apical transportation after preparation with four rotary systems. Forty mesial roots of lower first molars were used (mesiobuccal canals only); these roots were immersed in cold clear acrylic, and the roots were divided into four groups according to the rotary system used for canal preparation (ten roots per group): group I: ProTaper Next rotary system; group II: IRaCe Plus rotar