In recent years, the world has witnessed rapid growth in attacks on the Internet, which has degraded network performance. The growth has been in both the quantity and the versatility of attacks. To cope with this, new detection techniques are required, especially those that use artificial intelligence, such as machine-learning-based intrusion detection and prevention systems. Many machine learning models are used for intrusion detection, each with its own pros and cons, and this is where this paper falls: a performance analysis of different machine learning models for intrusion detection systems based on supervised machine learning algorithms. Using the Python Scikit-Learn library, KNN, Support Vector Machine, Naïve Bayes, Decision Tree, Random Forest, Stochastic Gradient Descent, Gradient Boosting, and AdaBoost classifiers were designed. A performance analysis using confusion-matrix metrics was carried out and the classifiers were compared. As a case study, Information Gain, Pearson, and F-test feature selection techniques were used, and the results were compared with models that use all the features. One notable outcome is that the Random Forest classifier achieves the best performance, with an accuracy of 99.96% and an error margin of 0.038%, surpassing the other classifiers. By reducing the feature set by 80% and extracting parameters from the packet header rather than the payload, a substantial performance advantage is achieved, especially in online environments.
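As a rough illustration of the pipeline described above, the sketch below trains a scikit-learn Random Forest classifier with F-test feature selection and evaluates it with a confusion matrix. The synthetic dataset, the number of selected features, and the hyperparameters are placeholders for illustration, not the values used in the paper.

```python
# Hedged sketch: Random Forest with F-test feature selection, evaluated via a
# confusion matrix. Data and parameters are illustrative, not the paper's.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.metrics import accuracy_score, confusion_matrix
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline

# Placeholder for a labelled traffic dataset (rows = flows/packets, columns = features).
X, y = make_classification(n_samples=5000, n_features=40, n_informative=10,
                           random_state=0)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2,
                                                    stratify=y, random_state=0)

# Keep only the top-scoring 20% of features by F-test (~80% reduction, as in the case study).
model = Pipeline([
    ("select", SelectKBest(score_func=f_classif, k=8)),
    ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
])
model.fit(X_train, y_train)

y_pred = model.predict(X_test)
print("accuracy:", accuracy_score(y_test, y_pred))
print("confusion matrix:\n", confusion_matrix(y_test, y_pred))
```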
Nowadays, cloud computing has attracted the attention of large companies due to its high potential, flexibility, and profitability in providing multiple sources of hardware and software to serve connected users. Given the scale of modern data centers and the dynamic nature of their resource provisioning, effective scheduling techniques are needed to manage these resources while satisfying both the cloud providers' and the cloud users' goals. Task scheduling in cloud computing is an NP-hard problem that cannot be easily solved by classical optimization methods. Thus, both heuristic and meta-heuristic techniques have been utilized to provide optimal or near-optimal solutions within an acceptable time frame for such problems.
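To make the scheduling problem concrete, the sketch below implements a simple greedy heuristic that assigns each task to the virtual machine that becomes available earliest. It is only an illustrative baseline under assumed task lengths and VM speeds, not one of the heuristic or meta-heuristic methods studied in the paper.

```python
# Hedged sketch: a greedy earliest-available-VM heuristic for cloud task scheduling.
# Task lengths and VM speeds are made-up placeholders.
import heapq

def greedy_schedule(task_lengths, vm_speeds):
    """Assign each task (longest first) to the VM that becomes available earliest."""
    heap = [(0.0, vm) for vm in range(len(vm_speeds))]   # (current finish time, vm index)
    heapq.heapify(heap)
    assignment = []
    for task, length in sorted(enumerate(task_lengths), key=lambda t: -t[1]):
        finish, vm = heapq.heappop(heap)
        finish += length / vm_speeds[vm]                  # execution time on this VM
        assignment.append((task, vm, finish))
        heapq.heappush(heap, (finish, vm))
    makespan = max(f for _, _, f in assignment)
    return assignment, makespan

tasks = [400, 250, 900, 120, 700, 300]                    # task lengths (e.g., MI)
vms = [1000, 500, 750]                                    # VM speeds (e.g., MIPS)
plan, makespan = greedy_schedule(tasks, vms)
print("assignment:", plan)
print("makespan:", makespan)
```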
Lung cancer is the most common dangerous disease that, if treated late, can lead to death. It is more likely to be treated successfully if discovered at an early stage, before it worsens. Distinguishing the size, shape, and location of lymphatic nodes can identify the spread of the disease around these nodes. Thus, identifying lung cancer at an early stage is remarkably helpful for doctors. Lung cancer can be diagnosed successfully by expert doctors; however, limited experience may lead to misdiagnosis and cause medical issues for patients. In the line of computer-assisted systems, many methods and strategies can be used to predict the cancer malignancy level, which plays a significant role in providing precise abnormality detection.
The investigation of machine learning techniques for addressing missing well-log data has garnered considerable interest recently, especially as the oil and gas sector pursues novel approaches to improve data interpretation and reservoir characterization. However, for wells that have been in operation for several years, conventional measurement techniques frequently encounter challenges related to availability, including the lack of well-log data, cost considerations, and precision issues. This study's objective is to enhance reservoir characterization by automating well-log creation using machine-learning techniques. Among the methods are multi-resolution graph-based clustering and the similarity threshold method.
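One common way to automate well-log creation, shown in the hedged sketch below, is to regress a missing log curve from the curves that are available. The curve names (GR, RHOB, NPHI, DT), the synthetic data, and the Random Forest regressor are illustrative assumptions, not the clustering or similarity-threshold methods used in the study.

```python
# Hedged sketch: predicting a missing log curve (here DT, sonic) from available
# curves (GR, RHOB, NPHI). All data and curve names are illustrative placeholders.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
gr = rng.normal(75, 20, n)
rhob = rng.normal(2.4, 0.2, n)
nphi = rng.normal(0.2, 0.05, n)
dt = 200 - 40 * rhob + 100 * nphi + 0.1 * gr + rng.normal(0, 2, n)  # synthetic target curve

X = np.column_stack([gr, rhob, nphi])
X_train, X_test, y_train, y_test = train_test_split(X, dt, test_size=0.25, random_state=0)

reg = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)
print("R^2 on held-out depths:", r2_score(y_test, reg.predict(X_test)))
```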
Lung cancer is one of the leading causes of death worldwide and is considered one of the most lethal diseases. Early detection and diagnosis of lung cancer are essential, enabling effective therapy and better outcomes for patients. In recent years, deep learning algorithms have demonstrated crucial promise in medical imaging analysis, especially in lung cancer identification. This paper compares a number of deep-learning-based models using Computed Tomography image datasets with traditional Convolutional Neural Network and SqueezeNet models using X-ray data for the automated diagnosis of lung cancer.
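As an illustration of the kind of models being compared, the hedged sketch below instantiates a SqueezeNet classifier from torchvision alongside a small baseline CNN for two-class (cancer / no cancer) prediction. The input size, layer sizes, and class count are assumptions for illustration, not the paper's exact configurations.

```python
# Hedged sketch: SqueezeNet vs. a small baseline CNN for binary image classification.
# Input size, layer sizes, and class count are illustrative assumptions.
import torch
import torch.nn as nn
from torchvision.models import squeezenet1_1

class TinyCNN(nn.Module):
    """A minimal CNN baseline for two-class chest-image classification."""
    def __init__(self, num_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

squeeze = squeezenet1_1(num_classes=2)              # SqueezeNet head sized for 2 classes
baseline = TinyCNN()

dummy = torch.randn(4, 3, 224, 224)                 # stand-in for a batch of scans
print(squeeze(dummy).shape, baseline(dummy).shape)  # both -> torch.Size([4, 2])
```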
With the vast usage of network services, security has become an important issue for all network types. Various techniques have emerged to grant network security, among them the Network Intrusion Detection System (NIDS). Many existing NIDSs actively work against various intrusions, but there are still a number of performance issues, including high false alarm rates and numerous undetected attacks. To keep up with these attacks, some academic researchers have turned towards machine learning (ML) techniques to create software that automatically predicts intrusive and abnormal traffic; another approach is to utilize ML algorithms to enhance traditional NIDSs, which is a more feasible solution since they are widely deployed.
Churning of employees from organizations is a serious problem. Turnover, or churn, of employees within an organization needs to be addressed since it has a negative impact on the organization. Manual detection of employee churn is quite difficult, so machine learning (ML) algorithms have frequently been used for employee churn detection as well as for categorizing employees according to turnover. To date, only one study has investigated the categorization of employees using machine learning. A novel multi-criteria decision-making (MCDM) approach coupled with the DE-PARETO principle has been proposed to categorize employees; this is referred to as the SNEC scheme. An AHP-TOPSIS DE-PARETO PRINCIPLE model (AHPTOPDE) has been designed that uses a two-stage MCDM scheme.
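To make the MCDM component concrete, the hedged sketch below implements plain TOPSIS ranking over a small, made-up employee decision matrix. The criteria, weights, and scores are illustrative, and the sketch does not reproduce the AHP weighting or the DE-PARETO stage of the proposed AHPTOPDE model.

```python
# Hedged sketch: plain TOPSIS ranking of employees on made-up criteria.
# Criteria, weights, and scores are illustrative placeholders.
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives (rows) by closeness to the ideal solution."""
    m = matrix / np.linalg.norm(matrix, axis=0)          # vector-normalize each criterion
    v = m * weights                                      # weighted normalized matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)            # distance to ideal solution
    d_neg = np.linalg.norm(v - anti, axis=1)             # distance to anti-ideal solution
    return d_neg / (d_pos + d_neg)                       # closeness coefficient in [0, 1]

# Rows = employees; columns = e.g. performance, tenure, absenteeism (hypothetical criteria).
scores = np.array([[8.0, 5.0, 2.0],
                   [6.0, 7.0, 1.0],
                   [9.0, 2.0, 4.0]])
weights = np.array([0.5, 0.3, 0.2])
benefit = np.array([True, True, False])                  # absenteeism is a cost criterion
print("closeness coefficients:", topsis(scores, weights, benefit))
```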
In this paper, different channel coding and interleaving schemes in a DS/CDMA system over a multipath fading channel were used. Two types of serially concatenated coding were presented: the first is composed of a Reed-Solomon outer code, a convolutional inner code, and an interleaver between the outer and inner codes; the second consists of a convolutional outer code, an interleaver in the middle, and a differential code as the inner code. The bit error rate performance of the different schemes over the multipath fading channel was analyzed and compared. A RAKE receiver was used in the DS/CDMA receiver to combine the multipath components in order to enhance the signal-to-noise ratio at the receiver.
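As a very simplified, hedged illustration of why multipath combining helps, the sketch below estimates BPSK bit error rate over independent Rayleigh-faded paths with maximal-ratio combining, the diversity principle a RAKE receiver exploits. It is a toy Monte Carlo simulation, not the paper's concatenated coding or DS/CDMA setup.

```python
# Hedged sketch: BPSK BER over L independent Rayleigh paths with maximal-ratio
# combining, a toy stand-in for the diversity gain a RAKE receiver exploits.
import numpy as np

def ber_mrc(snr_db, n_paths, n_bits=200_000, seed=0):
    rng = np.random.default_rng(seed)
    snr = 10 ** (snr_db / 10)
    bits = rng.integers(0, 2, n_bits)
    symbols = 1 - 2 * bits                                   # BPSK: 0 -> +1, 1 -> -1
    # Complex Rayleigh fading per path, unit average power.
    h = (rng.normal(size=(n_paths, n_bits)) + 1j * rng.normal(size=(n_paths, n_bits))) / np.sqrt(2)
    noise = (rng.normal(size=(n_paths, n_bits)) + 1j * rng.normal(size=(n_paths, n_bits))) \
            * np.sqrt(1 / (2 * snr))
    r = h * symbols + noise                                  # received signal on each path
    combined = np.sum(np.conj(h) * r, axis=0).real           # maximal-ratio combining
    return np.mean((combined < 0) != (symbols < 0))          # bit error rate

for paths in (1, 2, 4):
    print(f"{paths} path(s): BER ≈ {ber_mrc(10, paths):.4f}")
```

With more combined paths, the estimated BER at the same per-path SNR drops, which mirrors the SNR enhancement the abstract attributes to the RAKE receiver.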
The current study presents a simulative study and evaluation of MANET mobility models over a UDP traffic pattern to determine the effects of this traffic pattern on mobility models in MANETs. The simulations are implemented in NS-2.35 and assessed according to various performance metrics (throughput, average end-to-end delay (AED), dropped packets, normalized routing load (NRL), and packet delivery fraction (PDF)) under various parameters such as different velocities, environment areas, numbers of nodes, traffic rates, traffic sources, pause times, and simulation times. The AODV (Ad hoc On-demand Distance Vector) routing protocol was exploited together with the RWP (Random Waypoint), GMM (Gauss-Markov Model), and RPGM (Reference Point Group Mobility) mobility models.
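For reference, the hedged sketch below shows how the reported metrics are typically computed from simple trace counts: PDF as received over sent data packets, NRL as routing packets per received data packet, and AED as the mean of per-packet delays. The counts and delays are placeholders rather than values from the NS-2.35 simulations.

```python
# Hedged sketch: computing PDF, NRL, and AED from summary trace counts.
# The counts and delays below are placeholders, not simulation output.
sent_data = 1000            # data packets sent by sources
recv_data = 940             # data packets received at destinations
routing_pkts = 310          # routing packets transmitted (e.g., AODV control traffic)
delays = [0.012, 0.015, 0.011, 0.020]   # per-packet end-to-end delays in seconds

pdf = 100.0 * recv_data / sent_data           # Packet Delivery Fraction (%)
nrl = routing_pkts / recv_data                # Normalized Routing Load
aed = sum(delays) / len(delays)               # Average End-to-End Delay (s)
drops = sent_data - recv_data                 # dropped packets

print(f"PDF={pdf:.1f}%  NRL={nrl:.3f}  AED={aed * 1000:.1f} ms  drops={drops}")
```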
This work aims to analyze a three-dimensional discrete-time biological system: a prey-predator model with a constant harvesting amount, where the stage structure lies in the predator species. The analysis is done by finding all possible equilibria and investigating their stability. In order to obtain an optimal harvesting strategy, we suppose that harvesting occurs at a non-constant rate. Finally, numerical simulations are given to confirm the outcome of the mathematical analysis.
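The sketch below iterates one hypothetical stage-structured discrete-time prey-predator map with a constant harvest term, purely to show what such a numerical simulation looks like. The equations, parameter values, and initial conditions are assumptions for illustration and are not the model analyzed in the paper.

```python
# Hedged sketch: iterating a hypothetical 3D discrete-time prey-predator map with
# constant prey harvesting and a juvenile/adult predator stage structure.
# Equations and parameters are illustrative assumptions, not the paper's model.
r, K, a, b, d, m1, m2, H = 1.2, 10.0, 0.3, 0.5, 0.4, 0.1, 0.2, 0.5

def step(x, y, z):
    """One time step: x = prey, y = juvenile predators, z = adult predators."""
    x_next = x + r * x * (1 - x / K) - a * x * z - H        # logistic prey with constant harvest H
    y_next = y + b * a * x * z - d * y - m1 * y             # juveniles born from predation
    z_next = z + d * y - m2 * z                             # juveniles maturing into adults
    return (max(x_next, 0.0), max(y_next, 0.0), max(z_next, 0.0))

state = (5.0, 1.0, 1.0)                                     # assumed initial densities
for t in range(50):
    state = step(*state)
print("state after 50 steps:", tuple(round(s, 3) for s in state))
```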