With the proliferation of Internet access and data traffic, recent breaches have brought into sharp focus the need for Network Intrusion Detection Systems (NIDS) to protect networks from increasingly complex cyberattacks. To differentiate between normal network processes and possible attacks, Intrusion Detection Systems (IDS) often employ pattern recognition and data mining techniques. Network and host system intrusions, assaults, and policy violations can be automatically detected and classified by an IDS. Using Python Scikit-Learn, the results of this study show that Machine Learning (ML) techniques such as Decision Tree (DT), Naïve Bayes (NB), and K-Nearest Neighbor (KNN) can enhance the effectiveness of an IDS. Success is measured by a variety of metrics, including accuracy, precision, recall, F1-score, and execution time. Applying feature selection approaches such as Analysis of Variance (ANOVA), Mutual Information (MI), and Chi-Square (Ch-2) reduced execution time, increased detection efficiency and accuracy, and boosted overall performance. All classifiers achieved their best performance, 99.99% accuracy with the shortest computation time of 0.0089 seconds, when using ANOVA with 10% of the features.
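The ANOVA-based setup described above can be sketched in Scikit-Learn as follows. This is a minimal illustration, not the study's pipeline: the dataset is synthetic (the abstract does not name the NIDS dataset), and only the Decision Tree classifier is shown.

```python
# Sketch: keep the top 10% of features by ANOVA F-score, then train a
# Decision Tree, mirroring the feature-selection + classifier setup above.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectPercentile, f_classif
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# Synthetic stand-in for a network-traffic feature matrix (40 features).
X, y = make_classification(n_samples=1000, n_features=40,
                           n_informative=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# SelectPercentile(f_classif, percentile=10) retains the 10% of features
# with the highest ANOVA F-scores before the tree is fit.
model = make_pipeline(SelectPercentile(f_classif, percentile=10),
                      DecisionTreeClassifier(random_state=0))
model.fit(X_tr, y_tr)
acc = accuracy_score(y_te, model.predict(X_te))
```

The same pipeline shape applies to NB and KNN by swapping the final estimator.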
Soil compaction is one of the most harmful factors affecting soil structure, limiting plant growth and agricultural productivity. Assessing the degree of soil penetration resistance is crucial for finding solutions to the harmful consequences of compaction. Obtaining reliable values with a soil cone penetrometer, however, requires time- and labor-intensive measurements. Currently, satellite technologies, electronic measurement-control systems, and computer software make it possible to measure soil penetration resistance quickly and easily within the precision agriculture approach. The quantitative relationships between soil properties and the factors affecting their variability contribute to digital soil mapping. Digital soil maps use
One of the main difficulties facing a documentary archiving system for certified documents is verifying stamps, since a stamp may contain a complex background and be surrounded by unwanted data. Therefore, the main objective of this paper is to isolate the background and remove the noise that may surround the stamp. Our proposed method comprises four phases. First, we apply the k-means algorithm to cluster the stamp image into a number of clusters, which are then merged using the ISODATA algorithm. Second, we compute the mean and standard deviation of each remaining cluster to separate the background cluster from the stamp cluster. Third, a region-growing algorithm is applied to segment the image, choosing the connected regi
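The first two phases above can be sketched as follows. This is an illustrative assumption-laden version: the image is synthetic, the cluster count k=3 is arbitrary, the ISODATA merging step is omitted, and "brightest cluster = background" is one simple rule for the mean/standard-deviation separation the abstract describes, not necessarily the paper's criterion.

```python
# Sketch: k-means clustering of a grayscale "stamp" image, then per-cluster
# mean and standard deviation to flag the background cluster.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64)).astype(float)  # stand-in stamp image
pixels = img.reshape(-1, 1)

k = 3  # assumed number of intensity clusters
labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(pixels)

# Phase 2: mean and std per cluster; treat the brightest cluster as background.
stats = {c: (pixels[labels == c].mean(), pixels[labels == c].std())
         for c in range(k)}
background = max(stats, key=lambda c: stats[c][0])
foreground = img.reshape(-1)[labels != background]  # pixels kept for phase 3
```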
We propose a system to detect human faces in BMP color images using two methods, RGB and YCbCr, to determine which performs better; we also examine the effect of applying a low-pass filter, contrast, and brightness adjustments to the image. For face detection, we locate the forehead in the binary image by scanning from the middle of the image, finding a run of continuous white pixels after continuous black pixels, and measuring the maximum width of the white run by scanning left and right vertically (sampled w); if the new width is half the previous one, the scanning stops.
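One common way the YCbCr side of such a comparison is set up can be sketched as below. The conversion uses the standard ITU-R BT.601 Cb/Cr formulas; the skin-tone bounds (77 ≤ Cb ≤ 127, 133 ≤ Cr ≤ 173) are a frequently cited assumption, not thresholds taken from this paper.

```python
# Sketch: convert RGB pixels to YCbCr chroma and threshold Cb/Cr to build the
# binary skin mask that the forehead-scanning step would then operate on.
import numpy as np

def skin_mask_ycbcr(rgb):
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    # BT.601 chroma components (full range, offset 128).
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    # Assumed skin-tone chroma window.
    return (77 <= cb) & (cb <= 127) & (133 <= cr) & (cr <= 173)

skin = skin_mask_ycbcr(np.array([[[200, 120, 90]]]))   # a typical skin tone
not_skin = skin_mask_ycbcr(np.array([[[0, 0, 255]]]))  # pure blue
```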
This study was carried out for direct detection of S. typhi and some of its multidrug-resistance genes (tem, capt, gyrA and sul2), which encode resistance to ampicillin, chloramphenicol, ciprofloxacin, and co-trimoxazole, using the Polymerase Chain Reaction (PCR) technique. Blood samples were collected from 71 people suffering from typhoid fever symptoms, diagnosed by clinical examination, and from 25 controls. Investigation of the fliC gene, which encodes the flagellin protein, indicated that only 19 samples (26.76%) gave a positive result, while all controls were negative. Investigation of antibiotic drug resistance in the samples positive for the fliC gene showed multidrug resistance to all antibiotics at (94.7
The preparation of the phenanthridine derivative compound was achieved by adopting an efficient one-pot synthetic approach. The condensation of an ethanolic mixture of benzaldehyde, cyclohexanone, and ammonium acetate in a 2:1:1 mole ratio resulted in the formation of the title compound. Analytical and spectroscopic techniques were used to confirm the nature of the new compound. A three-step mechanism for the formation of the phenanthridine moiety has been suggested.
Symmetric cryptography forms the backbone of secure data communication and storage by relying on the strength and randomness of cryptographic keys. Strong, random keys increase complexity, enhance the overall robustness of cryptographic systems, and improve resistance to various attacks. The present work proposes a hybrid model based on the Latin square matrix (LSM) and subtractive random number generator (SRNG) algorithms for producing random keys. The hybrid model enhances the security of the cipher key against different attacks and increases the degree of diffusion. Different key lengths can also be generated by the algorithm without compromising security. It comprises two phases. The first phase generates a seed value that depends on producing a rand
In this paper, a new variable selection method is presented to select essential variables from large datasets. The new model is a modified version of the Elastic Net model. The modified Elastic Net variable selection model is summarized in an algorithm and applied to the Leukemia dataset, which has 3051 variables (genes) and 72 samples. In practice, working with a dataset of this size is difficult. The modified model is compared to some standard variable selection methods. The modified Elastic Net model achieves perfect classification and the best overall performance. All the calculations that have been done for this paper are in
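The baseline the paper modifies can be sketched as below: standard Elastic Net variable selection, where variables with non-zero fitted coefficients are retained. The data are a synthetic stand-in for the Leukemia matrix (fewer features for speed), and the penalty settings are illustrative assumptions, not the paper's.

```python
# Sketch: Elastic Net as a variable selector (standard scikit-learn estimator,
# not the paper's modified version) on 72 samples with many more features.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import ElasticNet

X, y = make_regression(n_samples=72, n_features=300,
                       n_informative=10, random_state=0)

# l1_ratio mixes L1 (sparsity) with L2 (grouping of correlated variables).
enet = ElasticNet(alpha=1.0, l1_ratio=0.5, max_iter=10000)
enet.fit(X, y)

selected = np.flatnonzero(enet.coef_)  # indices of the retained variables
```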
Administrative procedures in various organizations produce numerous crucial records and data. These records and data are also used in other processes, such as customer relationship management and accounting operations. It is incredibly challenging to use and extract valuable and meaningful information from these data and records because they are frequently enormous and continuously growing in size and complexity. Data mining is the act of sorting through large data sets to find patterns and relationships that might aid in the data analysis process of resolving business issues. Using data mining techniques, enterprises can forecast future trends and make better business decisions. The Apriori algorithm has bee
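The frequent-itemset step of the Apriori algorithm the abstract begins to describe can be sketched from scratch as below. The transactions and the support threshold are illustrative assumptions.

```python
# Sketch: Apriori frequent-itemset mining. Level k candidates are unions of
# frequent (k-1)-itemsets; any candidate below min_support is pruned.
from itertools import combinations  # (available for rule generation if needed)

def apriori(transactions, min_support):
    transactions = [frozenset(t) for t in transactions]
    n = len(transactions)

    def support(itemset):
        return sum(itemset <= t for t in transactions) / n

    # Frequent 1-itemsets.
    items = {i for t in transactions for i in t}
    frequent = [{frozenset([i]) for i in items
                 if support(frozenset([i])) >= min_support}]

    k = 2
    while frequent[-1]:
        prev = frequent[-1]
        candidates = {a | b for a in prev for b in prev if len(a | b) == k}
        frequent.append({c for c in candidates if support(c) >= min_support})
        k += 1

    return [s for level in frequent for s in level]

baskets = [["bread", "milk"], ["bread", "butter"],
           ["bread", "milk", "butter"], ["milk"]]
result = apriori(baskets, min_support=0.5)
```

With these baskets, the three single items plus {bread, milk} and {bread, butter} clear the 0.5 support threshold, while {milk, butter} (support 0.25) is pruned.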