Malicious software (malware) performs malicious functions that compromise a computer system's security. Many methods have been developed to improve the security of computer system resources, among them firewalls, encryption, and Intrusion Detection Systems (IDS). An IDS can detect a new, previously unrecognized attack attempt and raise an early alarm to inform the system of the suspicious intrusion attempt. This paper proposes a hybrid IDS for intrusion detection, especially of malware, that considers both network packet and host features. The hybrid IDS is designed using Data Mining (DM) classification methods for their ability to detect new, previously unseen intrusions accurately and automatically. It combines anomaly and misuse detection techniques and employs two DM classifiers, the Interactive Dichotomizer 3 (ID3) classifier and the Naïve Bayesian (NB) classifier, to verify the validity of the proposed system in terms of accuracy rate. A proposed HybD dataset is used for training and testing the hybrid IDS. Feature selection is applied to identify the intrinsic features for the classification decision, using three different measures: the Association Rules (AR) method, the ReliefF measure, and the Gain Ratio (GR) measure. The NB classifier with the AR method gave the most accurate classification results (99%), with a false positive (FP) rate of 0% and a false negative (FN) rate of 1%.
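The classifier-plus-feature-selection pipeline described above can be sketched as follows. This is a minimal illustration on synthetic data, not the paper's HybD dataset or its AR/ReliefF/Gain Ratio selectors; mutual information is used here as a stand-in filter, and all data and thresholds are assumptions for demonstration.

```python
# Sketch: Naive Bayes classification after a simple feature filter.
# Synthetic data only; mutual information stands in for the paper's
# AR / ReliefF / Gain Ratio selection measures.
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)
# 200 samples, 10 "network/host" features; only the first two carry signal.
y = rng.integers(0, 2, 200)
X = rng.normal(size=(200, 10))
X[:, 0] += 3 * y
X[:, 1] -= 3 * y

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
selector = SelectKBest(mutual_info_classif, k=2).fit(X_tr, y_tr)
clf = GaussianNB().fit(selector.transform(X_tr), y_tr)
pred = clf.predict(selector.transform(X_te))

# Report accuracy plus the FP/FN rates the abstract quotes.
tn, fp, fn, tp = confusion_matrix(y_te, pred).ravel()
print(f"accuracy={(tp + tn) / len(y_te):.2f}",
      f"FP={fp / (fp + tn):.2f}", f"FN={fn / (fn + tp):.2f}")
```

On this synthetic data the two informative features are well separated, so the filter and classifier recover them easily; real intrusion features are far noisier.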
This paper examines the accuracy of behavior-based detection systems, in which Application Programming Interface (API) calls are analyzed and monitored. The work identifies the problems that affect the accuracy of such detection models. Through this analysis, 4,744 API calls were extracted. The new approach provides an accurate discriminator and can reveal malicious API use in PE malware with up to 83.2% accuracy. The results of this work were evaluated with Discriminant Analysis.
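A discriminant-analysis evaluation over API-call features can be sketched like this. The data below is synthetic and the feature layout (per-sample counts of selected API calls) is an assumption for illustration, not the paper's actual 4,744-call feature set.

```python
# Sketch: Linear Discriminant Analysis over API-call frequency vectors.
# Rows = PE samples, columns = counts of monitored API calls; the class
# distributions here are invented purely to show the evaluation step.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
benign = rng.poisson(1.0, size=(100, 5))    # benign: low call counts
malware = rng.poisson(4.0, size=(100, 5))   # malware: elevated call counts
X = np.vstack([benign, malware])
y = np.array([0] * 100 + [1] * 100)

lda = LinearDiscriminantAnalysis().fit(X, y)
print(f"training accuracy: {lda.score(X, y):.2f}")
```

LDA projects the count vectors onto the direction that best separates the two classes, which is why it suits this kind of frequency-based discrimination.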
Internet paths sharing the same congested link can be identified using several shared-congestion detection techniques. The detection technique proposed in this paper builds on a previous technique, delay correlation with wavelet denoising (DCW), replacing its denoising step with the Discrete Multiwavelet Transform (DMWT) in order to separate queuing delay caused by network congestion from delay caused by other sources of delay variation. The new detection technique converges faster (3 to 5 seconds less than the previous technique) while using approximately half as many probe packets, thereby reducing the load that probe packets place on the network.
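The core of the DCW-style test is to denoise the one-way-delay samples of two paths and then correlate them: a high correlation indicates a shared congested link. The sketch below uses a moving-average filter as a stand-in for the DMWT denoising step, and the delay traces are synthetic.

```python
# Sketch: shared-congestion detection via delay cross-correlation.
# A moving average replaces the paper's DMWT denoising; data is synthetic.
import numpy as np

def denoise(x, w=5):
    # Simple moving average as a placeholder for multiwavelet denoising.
    return np.convolve(x, np.ones(w) / w, mode="valid")

rng = np.random.default_rng(2)
congestion = np.abs(np.sin(np.linspace(0, 6, 500)))  # shared queuing delay
path_a = congestion + rng.normal(0, 0.2, 500)        # path-local jitter
path_b = congestion + rng.normal(0, 0.2, 500)

r = np.corrcoef(denoise(path_a), denoise(path_b))[0, 1]
print(f"cross-correlation after denoising: {r:.2f}")  # high r => shared link
```

With independent jitter removed, the common queuing-delay component dominates and the correlation approaches 1; two paths with no shared bottleneck would show a correlation near 0.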
In this paper, we use four classification methods to classify objects and compare among them: K-Nearest Neighbors (KNN), Stochastic Gradient Descent learning (SGD), Logistic Regression (LR), and Multi-Layer Perceptron (MLP). We use the MCOCO dataset for object classification and detection; the dataset images were randomly divided into training and testing sets at a ratio of 7:3. The randomly selected training and testing images are converted from color to gray level, enhanced using the histogram equalization method, and resized to 20 x 20. Principal Component Analysis (PCA) is used for feature extraction, and finally the four classification methods are applied.
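The preprocessing-and-classification pipeline above (grayscale, histogram equalization, 20 x 20 resize, PCA, then a classifier) can be sketched as follows. The images here are synthetic stand-ins, and only KNN of the four classifiers is shown; all sizes and parameters are illustrative assumptions.

```python
# Sketch of the pipeline: equalize -> resize to 20x20 -> PCA -> KNN.
# Synthetic grayscale images; class 1 carries a darker centre patch.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

def equalize(img):
    # Histogram equalization for an 8-bit grayscale image.
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf = (cdf - cdf.min()) * 255 / (cdf.max() - cdf.min())
    return cdf[img].astype(np.uint8)

def resize_nn(img, size=20):
    # Nearest-neighbour resize to size x size.
    r = np.arange(size) * img.shape[0] // size
    c = np.arange(size) * img.shape[1] // size
    return img[np.ix_(r, c)]

rng = np.random.default_rng(3)
imgs, labels = [], []
for k in (0, 1):
    for _ in range(60):
        img = rng.integers(0, 256, (40, 40)).astype(np.uint8)
        if k == 1:
            img[10:30, 10:30] //= 4  # darker centre patch marks class 1
        imgs.append(resize_nn(equalize(img)).ravel())
        labels.append(k)
X, y = np.array(imgs), np.array(labels)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)
pca = PCA(n_components=10).fit(X_tr)
knn = KNeighborsClassifier().fit(pca.transform(X_tr), y_tr)
print(f"KNN accuracy: {knn.score(pca.transform(X_te), y_te):.2f}")
```

Swapping `KNeighborsClassifier` for `SGDClassifier`, `LogisticRegression`, or `MLPClassifier` reproduces the same comparison structure for the other three methods.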
Business organizations have faced many challenges in recent times, the most important of which is information technology, because it is widespread and easy to use. Its use has led to an unprecedented increase in the amount of data that business organizations deal with. The amount of data available through the internet is a problem that many parties seek solutions for: why is it available there in such huge amounts, randomly? Many projections indicated that in 2017 the number of devices connected to the internet would be an estimated three times the population of the Earth, and in 2015 more than one and a half billion gigabytes of data were transferred every minute globally. Thus, the so-called data mining emerged as a
Text categorization refers to the process of grouping texts or documents into classes or categories according to their content. The text categorization process consists of three phases: preprocessing, feature extraction, and classification. In comparison to English, only a few studies have been done to categorize and classify the Arabic language. For a variety of applications, such as text classification and clustering, Arabic text representation is a difficult task because Arabic is noted for its richness, diversity, and complicated morphology. This paper presents a comprehensive analysis and comparison of research from the last five years based on the dataset, year, algorithms, and the accuracy achieved.
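The three phases named above can be illustrated with a minimal Arabic text-categorization sketch: the vectorizer handles tokenization (a simplistic stand-in for Arabic-specific preprocessing such as stemming), TF-IDF performs feature extraction, and Naive Bayes classifies. The tiny corpus and labels are invented for demonstration.

```python
# Sketch: three-phase text categorization on a toy Arabic corpus.
# Preprocessing here is only the vectorizer's tokenization; real Arabic
# pipelines add normalization and stemming.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

docs = [
    "مباراة كرة القدم اليوم",      # sports
    "الفريق فاز في المباراة",      # sports
    "انخفضت أسعار النفط",          # economy
    "ارتفاع أسعار الأسهم والنفط",  # economy
]
labels = ["sports", "sports", "economy", "economy"]

model = make_pipeline(TfidfVectorizer(), MultinomialNB()).fit(docs, labels)
print(model.predict(["نتيجة المباراة"]))
```

Because the query shares the word "المباراة" with a sports document, the classifier assigns it to the sports class; with so few documents this is only a structural illustration of the pipeline, not a measure of accuracy.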
A LandSat ETM+ satellite image was analyzed to detect the different depths of regions within the Tigris river, in order to identify the regions in Baghdad, Iraq, that need sediment removal. The scene consists of six bands (without the thermal band) and was captured in March 2001. The variation in depth is determined by applying the rationing technique to bands 3 and 5. The GIS 9.1 program was used to apply the rationing technique and determine the results.
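The rationing step is a per-pixel division of one band by another. The sketch below uses tiny synthetic reflectance arrays as stand-ins for the ETM+ bands; the premise (assumed here, consistent with water absorbing the mid-infrared band 5 much more strongly than the red band 3) is that the band 3 / band 5 ratio grows with water depth.

```python
# Sketch: band rationing on synthetic reflectance grids.
# band3 = red, band5 = mid-infrared; values are illustrative only.
import numpy as np

band3 = np.array([[60.0, 55.0],
                  [30.0, 28.0]])   # top row: shallow water
band5 = np.array([[20.0, 18.0],
                  [ 2.0,  1.5]])   # band 5 falls off quickly with depth

ratio = band3 / band5              # higher ratio => deeper water
print(ratio)
```

In a GIS package the same operation is a raster-calculator expression over the two band layers, producing a ratio image whose values are then thresholded into depth classes.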
Crime is a threat to any nation's security administration and jurisdiction. Crime analysis has therefore become increasingly important, because it assigns the time and place of crimes based on collected spatial and temporal data. However, older techniques, such as paperwork, investigative judges, and statistical analysis, are not efficient enough to accurately predict the time and location at which a crime will take place. When machine learning and data mining methods were deployed in crime analysis, prediction accuracy increased dramatically. In this study, various types of criminal analysis and prediction using several machine learning and data mining techniques, based o