With the proliferation of both Internet access and data traffic, recent breaches have brought into sharp focus the need for Network Intrusion Detection Systems (NIDS) to protect networks from increasingly complex cyberattacks. To differentiate between normal network processes and possible attacks, Intrusion Detection Systems (IDS) often employ pattern recognition and data mining techniques. Network and host system intrusions, attacks, and policy violations can be automatically detected and classified by an IDS. Using Python's Scikit-Learn, the results of this study show that Machine Learning (ML) techniques such as Decision Tree (DT), Naïve Bayes (NB), and K-Nearest Neighbor (KNN) can enhance the effectiveness of an IDS. Performance is measured by a variety of metrics, including accuracy, precision, recall, F1-score, and execution time. Applying feature selection approaches such as Analysis of Variance (ANOVA), Mutual Information (MI), and Chi-Square (Chi-2) reduced execution time, increased detection efficiency and accuracy, and boosted overall performance. All classifiers achieve their best performance, 99.99% accuracy with the shortest computation time of 0.0089 seconds, when ANOVA is used to select 10% of the features.
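A minimal sketch of the workflow described above, assuming scikit-learn's ANOVA-based SelectPercentile to keep 10% of the features before fitting DT, NB, and KNN classifiers; the dataset file, column names, and hyperparameters are placeholders, not the study's actual configuration.

```python
import time
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.feature_selection import SelectPercentile, f_classif
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

# Hypothetical NIDS dataset: numeric features plus a binary "label" column.
df = pd.read_csv("nids_dataset.csv")              # placeholder file name
X, y = df.drop(columns=["label"]), df["label"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

# ANOVA F-test feature selection, keeping the top 10% of features.
selector = SelectPercentile(score_func=f_classif, percentile=10).fit(X_train, y_train)
X_train_sel, X_test_sel = selector.transform(X_train), selector.transform(X_test)

classifiers = {
    "DT": DecisionTreeClassifier(random_state=42),
    "NB": GaussianNB(),
    "KNN": KNeighborsClassifier(n_neighbors=5),
}

for name, clf in classifiers.items():
    start = time.perf_counter()
    clf.fit(X_train_sel, y_train)
    pred = clf.predict(X_test_sel)
    elapsed = time.perf_counter() - start
    print(name,
          "acc=%.4f" % accuracy_score(y_test, pred),
          "prec=%.4f" % precision_score(y_test, pred, average="weighted"),
          "rec=%.4f" % recall_score(y_test, pred, average="weighted"),
          "f1=%.4f" % f1_score(y_test, pred, average="weighted"),
          "time=%.4fs" % elapsed)
```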
Extensive studies have examined the distribution of a stable electric field within a heterogeneous medium containing metal rings: the field rises with a positive slope toward the midpoint, where it reaches its maximum value, and then declines with an equal negative slope toward the other end of the concentrated distribution. Earlier empirical studies focused on the behaviour of the molecules during this passage.
The multi-focus image fusion method can fuse two or more differently focused images to generate a single image with a more accurate description of the scene. The purpose of image fusion is to generate one image by combining information from many source images of the same scene. In this paper, a multi-focus image fusion method is proposed using a hybrid pixel-level approach that operates in both the spatial and transform domains. The proposed method is applied to multi-focus source images in the YCbCr color space. In the first step, a two-level stationary wavelet transform is applied to the Y channel of the two source images. The fused Y channel is obtained by applying several fusion rules. The Cb and Cr channels of the source images are fused using principal component analysis (PCA).
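As a rough illustration of the chrominance step only, the sketch below fuses two channels with a PCA-derived weighting; the channel arrays are hypothetical placeholders, and the Y-channel wavelet fusion described above is not reproduced here.

```python
import numpy as np

def pca_fuse(ch1: np.ndarray, ch2: np.ndarray) -> np.ndarray:
    """Fuse two single-channel images (e.g. Cb or Cr) by weighting them with the
    principal eigenvector of their joint covariance matrix."""
    data = np.stack([ch1.ravel(), ch2.ravel()]).astype(np.float64)
    cov = np.cov(data)                           # 2x2 covariance of the two channels
    eigvals, eigvecs = np.linalg.eigh(cov)
    principal = np.abs(eigvecs[:, np.argmax(eigvals)])
    w1, w2 = principal / principal.sum()         # normalised fusion weights
    return w1 * ch1 + w2 * ch2

# Hypothetical Cb channels of two differently focused source images.
cb_a = np.random.rand(256, 256)
cb_b = np.random.rand(256, 256)
cb_fused = pca_fuse(cb_a, cb_b)
```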
The Twofish cipher is a powerful algorithm with a fairly complex structure involving extensive data splitting and swapping, yet it can be implemented easily. The keys of the Twofish algorithm are of variable length (128, 192, or 256 bits), and the key schedule is generated once and reused to encrypt all message blocks, however many there are, which reduces the confidentiality of the encryption. This article discusses a process for generating cipher keys for each block, a concept that is new and not found in common block cipher algorithms. It is based on continuously generating subkeys for all blocks, with each block encrypted according to its own key material. Geffe's generator is used to produce these subkeys so that each block is encrypted with a distinct key.
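For background, here is a brief sketch of the standard Geffe combining generator (three LFSRs whose output bits are combined as (x1 AND x2) XOR (NOT x1 AND x3)); the register lengths, taps, and seeds are illustrative assumptions, not the parameters used in the article.

```python
class LFSR:
    """Simple Fibonacci LFSR over GF(2); taps are the bit positions fed back."""
    def __init__(self, seed: int, taps: list[int], length: int):
        self.state, self.taps, self.length = seed, taps, length

    def step(self) -> int:
        out = self.state & 1
        fb = 0
        for t in self.taps:
            fb ^= (self.state >> t) & 1
        self.state = (self.state >> 1) | (fb << (self.length - 1))
        return out

def geffe_keystream(n_bits: int) -> list[int]:
    # Illustrative register sizes, taps, and seeds only.
    r1 = LFSR(seed=0b10110, taps=[0, 2], length=5)
    r2 = LFSR(seed=0b1100101, taps=[0, 1], length=7)
    r3 = LFSR(seed=0b111010011, taps=[0, 4], length=9)
    stream = []
    for _ in range(n_bits):
        x1, x2, x3 = r1.step(), r2.step(), r3.step()
        # Geffe combiner: x1 selects between x2 and x3.
        stream.append((x1 & x2) ^ ((x1 ^ 1) & x3))
    return stream

print(geffe_keystream(16))
```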
Abstract: The growing use of digital technologies across various sectors and daily activities has made handwriting recognition a popular research topic. Despite the continued relevance of handwriting, people still require the conversion of handwritten copies into digital versions that can be stored and shared digitally. Handwriting recognition involves a computer's ability to identify and understand legible handwritten input from various sources, including documents, photographs, and others. Handwriting recognition poses a complex challenge due to the diversity of handwriting styles among different individuals, especially in real-time applications. In this paper, an automatic handwriting recognition system was designed.
The aim of this paper is to discuss several high-performance training algorithms, which fall into two main categories. The first category uses heuristic techniques developed from an analysis of the performance of the standard gradient descent algorithm. The second category of fast algorithms uses standard numerical optimization techniques such as the quasi-Newton method. A further aim is to address the drawbacks of these training algorithms and to propose an efficient training algorithm for feed-forward neural networks (FFNN).
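As a hedged illustration of the two categories, the snippet below trains the same small feed-forward network with scikit-learn using a gradient-descent solver ("sgd") and a quasi-Newton solver ("lbfgs"); the toy dataset and network size are placeholders, not the paper's experiments.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Toy dataset standing in for whatever task the FFNN is trained on.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for solver in ("sgd", "lbfgs"):        # gradient descent vs. quasi-Newton
    net = MLPClassifier(hidden_layer_sizes=(32,), solver=solver,
                        max_iter=500, random_state=0)
    net.fit(X_tr, y_tr)
    print(solver, "test accuracy:", net.score(X_te, y_te))
```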
Abstract
For sparse system identification, recently proposed algorithms include the l0-norm Least Mean Square (l0-LMS), Zero-Attracting LMS (ZA-LMS), Reweighted Zero-Attracting LMS (RZA-LMS), and p-norm LMS (p-LMS) algorithms, which modify the cost function of the conventional LMS algorithm by adding a constraint on coefficient sparsity. Accordingly, the proposed algorithms are named -ZA-LMS,
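For context, the sketch below implements the classical zero-attracting LMS update (the standard LMS step plus a sign-based shrinkage term) on a synthetic sparse channel; the step size, shrinkage weight, and channel are arbitrary placeholders rather than values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic sparse system of length 16 with only a few non-zero taps.
h_true = np.zeros(16)
h_true[[2, 7, 11]] = [0.9, -0.5, 0.3]

mu, rho = 0.01, 1e-4          # step size and zero-attracting weight (illustrative)
w = np.zeros_like(h_true)

for _ in range(5000):
    x = rng.standard_normal(16)                       # input regressor
    d = h_true @ x + 0.01 * rng.standard_normal()     # noisy desired signal
    e = d - w @ x                                     # a-priori estimation error
    # ZA-LMS update: standard LMS term plus a sparsity-promoting shrinkage term.
    w += mu * e * x - rho * np.sign(w)

print("estimation MSE:", np.mean((w - h_true) ** 2))
```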
Document clustering is the process of organizing a particular electronic corpus of documents into subgroups with similar text features. A number of conventional algorithms have previously been applied to perform document clustering. Current endeavors seek to enhance clustering performance by employing evolutionary algorithms, and such work has become an emerging topic gaining more attention in recent years. The aim of this paper is to present an up-to-date and self-contained review fully devoted to document clustering via evolutionary algorithms. It first provides a comprehensive inspection of the document clustering model, revealing its various components and related concepts. It then presents and analyzes the principal research work
The purpose of this paper is to solve the unbalanced transportation problem with stochastic demand using heuristic algorithms to obtain the optimum solution, by minimizing the costs of transporting the gasoline product for the Oil Products Distribution Company of the Iraqi Ministry of Oil. The most important conclusion reached is that the results prove the random transportation problem with uncertain demand can be solved by a stochastic programming model. The most obvious finding to emerge from this work is that the genetic algorithm was able to address the unbalanced transportation problem, and that the approved model can be applied by the oil products distribution company in the Iraqi Ministry of Oil to minimize its transportation costs.
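As a loose illustration only, the sketch below applies a permutation-encoded genetic algorithm (greedy decoding, order crossover, swap mutation) to a small deterministic transportation instance; the cost matrix, supplies, demands, and GA settings are made up, and the stochastic-demand modelling described in the paper is not reproduced.

```python
import random
import numpy as np

# Hypothetical cost matrix, supplies, and demands (illustrative only; unbalanced).
cost = np.array([[4, 6, 9], [5, 3, 7], [8, 5, 4]], dtype=float)
supply = np.array([120, 80, 100], dtype=float)
demand = np.array([90, 110, 70], dtype=float)

def decode(perm):
    """Greedy decoding: visit cells in chromosome order, ship as much as possible."""
    s, d = supply.copy(), demand.copy()
    plan = np.zeros_like(cost)
    for k in perm:
        i, j = divmod(k, cost.shape[1])
        q = min(s[i], d[j])
        plan[i, j] = q
        s[i] -= q
        d[j] -= q
    return plan

def fitness(perm):
    return float((decode(perm) * cost).sum())

def crossover(p1, p2):
    """Order crossover (OX): keep a slice of p1, fill the rest in p2's order."""
    a, b = sorted(random.sample(range(len(p1)), 2))
    child = [-1] * len(p1)
    child[a:b] = p1[a:b]
    rest = [g for g in p2 if g not in child]
    k = 0
    for idx in range(len(child)):
        if child[idx] == -1:
            child[idx] = rest[k]
            k += 1
    return child

def mutate(perm, rate=0.1):
    if random.random() < rate:
        i, j = random.sample(range(len(perm)), 2)
        perm[i], perm[j] = perm[j], perm[i]
    return perm

cells = list(range(cost.size))
pop = [random.sample(cells, len(cells)) for _ in range(40)]
for _ in range(200):
    pop.sort(key=fitness)                 # lower total cost is better
    parents = pop[:20]
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(20)]
    pop = parents + children

best = min(pop, key=fitness)
print("minimum cost:", fitness(best))
print(decode(best))
```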
Today, with the increasing use of social media, many researchers have become interested in topic extraction from Twitter. Tweets are short, unstructured, and noisy, which makes finding topics in them a critical challenge. Topic modeling algorithms such as Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA) were originally designed to derive topics from large documents such as articles and books, and they are often less effective when applied to short-text content like tweets. Fortunately, Twitter has many features that capture the interaction between users; in particular, tweets carry rich user-generated hashtags as keywords. In this paper, we exploit the hashtag feature to improve the topics learned from tweets.
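A rough sketch of one common way to use hashtags for this purpose: pooling tweets that share a hashtag into pseudo-documents before fitting LDA with scikit-learn. The sample tweets, pooling rule, and model settings are illustrative assumptions, not necessarily the method evaluated in the paper.

```python
import re
from collections import defaultdict

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Hypothetical tweets; real input would come from the Twitter corpus.
tweets = [
    "new phone camera is amazing #tech #gadgets",
    "loving the latest android update #tech",
    "match tonight was incredible #football",
    "what a goal in the derby #football #sports",
]

# Pool tweets by hashtag: each hashtag becomes one longer pseudo-document,
# giving LDA more word co-occurrence signal than individual short tweets.
pools = defaultdict(list)
for tweet in tweets:
    for tag in re.findall(r"#(\w+)", tweet):
        pools[tag].append(tweet)
docs = [" ".join(texts) for texts in pools.values()]

vectorizer = CountVectorizer(stop_words="english")
counts = vectorizer.fit_transform(docs)
vocab = vectorizer.get_feature_names_out()

lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)
for k, topic in enumerate(lda.components_):
    top_words = [vocab[i] for i in topic.argsort()[-5:]]
    print("topic", k, ":", top_words)
```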
Genetic algorithms (GA) are a helpful instrument for planning and controlling the activities of a project. They are based on the principles of natural selection and survival of the fittest. GA have been used in different sectors of construction and building; however, this is rarely documented. This research aimed to examine the utilisation of genetic algorithms in construction project management. For this purpose, the research focused on the benefits and challenges of genetic algorithms and the extent to which genetic algorithms are utilised in construction project management. Results showed that GA provide the ability to generate near-optimal solutions, which can be adopted to reduce complexity in project management and resolve difficult problems.