Classification of network traffic is an important topic for network management, traffic routing, discrimination of safe traffic, and better service delivery. Traffic analysis is the entire process of examining traffic data, from intercepting it to discovering patterns, relationships, misconfigurations, and anomalies in a network. Within this field, traffic classification is a sub-domain whose purpose is to sort network traffic into predefined classes, such as normal versus abnormal traffic or application type. Most Internet applications encrypt their traffic, and classifying encrypted traffic is not possible with traditional payload-based methods. Statistical and machine-intelligence methods, however, can find and model traffic patterns that can be categorized by their statistical characteristics; these methods determine the type of traffic while preserving user privacy. To classify encrypted traffic end to end, this paper proposes the XGBoost algorithm, tunes its hyperparameters with Bayesian optimization, and compares the proposed model with machine learning algorithms (Nearest Neighbor, Logistic Regression, Decision Trees, Naive Bayes, Multilayer Neural Networks). Traffic is classified along two dimensions: whether it is encrypted or not, and the target application. The results show that both binary and multi-class traffic can be classified with high accuracy; the proposed model achieves higher classification accuracy than the other models, and tuning the hyperparameters further increases its accuracy.
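The key point above is that encrypted flows can still be classified from payload-independent statistics. A minimal sketch of such feature extraction (illustrative only, not the paper's code; the feature names are hypothetical) shows the kind of input a tree ensemble such as XGBoost would consume:

```python
# Compute statistical flow features that need no access to the
# (encrypted) payload: only packet sizes and timestamps are used.
import statistics

def flow_features(packet_sizes, timestamps):
    """Summarize a flow by payload-independent statistics."""
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return {
        "pkt_count": len(packet_sizes),
        "mean_size": statistics.mean(packet_sizes),
        "std_size": statistics.pstdev(packet_sizes),
        "mean_iat": statistics.mean(gaps) if gaps else 0.0,  # inter-arrival time
        "duration": timestamps[-1] - timestamps[0],
    }

# A toy 4-packet flow: sizes in bytes, timestamps in seconds.
features = flow_features([60, 1500, 1500, 52], [0.0, 0.01, 0.02, 0.5])
```

One feature vector per flow, rather than per packet, is what lets a classifier label a flow as encrypted/unencrypted or by application type without decrypting anything.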
Diabetes is an increasingly common chronic disease, affecting millions of people worldwide. Its diagnosis, prediction, proper treatment, and management are essential. Machine learning-based prediction techniques for diabetes data analysis can help in the early detection of the disease and of its consequences, such as hypo/hyperglycemia. In this paper, we explore a diabetes dataset collected from the medical records of one thousand Iraqi patients. We apply three classifiers: the multilayer perceptron, k-nearest neighbors (KNN), and Random Forest. We conducted two experiments: the first used all 12 features of the dataset, where Random Forest outperformed the others with 98.8% accuracy. The second experiment used only five attributes
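Of the three classifiers named, KNN is simple enough to sketch from scratch. The following is a toy illustration only (the paper's 1000-patient dataset is not reproduced; the two features and labels are hypothetical):

```python
# Minimal k-nearest-neighbours classifier for tabular medical data.
import math
from collections import Counter

def knn_predict(train, labels, query, k=3):
    """Label the query by majority vote among its k closest training rows."""
    dists = sorted(
        (math.dist(row, query), lab) for row, lab in zip(train, labels)
    )
    votes = Counter(lab for _, lab in dists[:k])
    return votes.most_common(1)[0][0]

# Hypothetical two-feature rows, e.g. (HbA1c, BMI); 0 = healthy, 1 = diabetic.
X = [(5.0, 22.0), (5.4, 24.0), (9.1, 31.0), (8.7, 29.5)]
y = [0, 0, 1, 1]
pred = knn_predict(X, y, (8.9, 30.0), k=3)
```

In practice features should be normalized before distance computation, since raw clinical measurements live on very different scales.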
In the current generation of technology, a robust security system is required based on a biometric trait such as human gait, a biometric feature that allows humans to be recognized by their walking pattern. In this paper, a person is recognized by their gait style, captured from video previously recorded with a digital camera. The video is handled in more than one phase after being split into successive images (called frames), which pass through a pre-processing step before the classification procedure. The pre-processing steps encompass converting each image into a gray image, removing all undesirable components, ridding it of noise, and discover differen
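The first pre-processing step mentioned, grayscale conversion, can be sketched as follows (a hand-made two-pixel-per-row frame stands in for real video data; the ITU-R BT.601 luma weights are a standard choice, not necessarily the ones used in the paper):

```python
# Convert an RGB frame to a grey-level image with the BT.601 luma weights.
def to_gray(frame):
    """frame: 2-D list of (R, G, B) tuples -> 2-D list of grey values 0-255."""
    return [
        [round(0.299 * r + 0.587 * g + 0.114 * b) for r, g, b in row]
        for row in frame
    ]

frame = [[(255, 0, 0), (0, 255, 0)], [(0, 0, 255), (255, 255, 255)]]
gray = to_gray(frame)
```

Working in grayscale halves nothing semantically but reduces each pixel to a single intensity, which simplifies the later noise-removal and frame-differencing steps.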
Link failure refers to the failure of a connection between two nodes in an otherwise perfectly working simulation scenario at a particular instant. Transport-layer protocols form an important basis for setting up such a simulation, with the Transmission Control Protocol (TCP) and the User Datagram Protocol (UDP) being the primary ones. This research uses Network Simulator v2.35 (NS-2) to conduct different simulation experiments on link failure and to provide validation results. In this paper, TCP and UDP are compared on the throughput of packets delivered from one node to another, under the condition that the link fails for a certain interval of time while the simulation time remains the same for both protocols. Overall,
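The throughput comparison described above reduces to a simple computation: packets sent while the link is down are lost, and throughput is bytes delivered over the whole simulation time. A hedged sketch with illustrative numbers (not NS-2 output):

```python
# Throughput of a constant-rate sender across a link-failure interval.
def throughput_bps(send_times, pkt_size, fail_start, fail_end, sim_time):
    """Bits/second delivered; packets sent during the outage are dropped."""
    delivered = [t for t in send_times if not (fail_start <= t < fail_end)]
    return len(delivered) * pkt_size * 8 / sim_time

times = [i * 0.25 for i in range(40)]      # 40 packets over 10 s
bps = throughput_bps(times, pkt_size=1000,
                     fail_start=4.0, fail_end=6.0, sim_time=10.0)
```

This models UDP-like behavior (no retransmission); TCP would instead retransmit the lost segments after the link recovers, which is why the two protocols diverge in such experiments.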
Numerous regions in the city of Baghdad experience congestion and traffic problems. Owing to its religious and economic significance, Al-Kadhimiya (within the metropolitan area of Baghdad) was chosen as the study area. Data gathering was separated into two branches: a questionnaire method, used to estimate the traffic volumes on the chosen roads, and a field data collection method, which included video recording and manual counting of the volumes entering the selected signalized intersections. The analysis and evaluation of the seventeen urban roads, one highway, and three intersections was performed with the HCS-2000 software. The presented work outlines a system for assessing the level of service
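HCS-2000 implements the Highway Capacity Manual 2000 procedures, which grade a signalized intersection A through F by its average control delay per vehicle. A sketch of that final lookup step, using the delay thresholds as commonly tabulated in the HCM 2000 (stated here from memory, so treat the exact cutoffs as an assumption):

```python
# Map average control delay (seconds/vehicle) at a signalized
# intersection to a Level of Service grade, per HCM 2000 thresholds.
def los_signalized(delay_s):
    for grade, limit in [("A", 10), ("B", 20), ("C", 35), ("D", 55), ("E", 80)]:
        if delay_s <= limit:
            return grade
    return "F"

grade = los_signalized(42.0)
```

The heavy lifting in HCS-2000 is computing that delay from volumes, geometry, and signal timing; the grade itself is just this table lookup.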
The investigation of signature validation is crucial to the field of personal authentication. Biometrics-based systems have been developed to support information security features. A person's signature, an essential biometric trait of a human being, can be used to verify their identity. In this study, a mechanism for automatically verifying signatures is suggested. The offline properties of handwritten signatures are highlighted in this study, which aims to verify whether handwritten signatures are genuine or forged using computer-based machine learning techniques. The main goal of developing such systems is to verify people through the validity of their signatures. In this research, images of a group o
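Offline (static) verification works on an image of the finished signature rather than on pen dynamics. A deliberately naive sketch of the idea, with entirely hypothetical features and threshold (real systems use far richer descriptors and a trained classifier):

```python
# Toy offline-signature comparison on a binarized image
# (1 = ink pixel, 0 = background).
def signature_features(img):
    ink = sum(sum(row) for row in img)
    h, w = len(img), len(img[0])
    return {"density": ink / (h * w), "aspect": w / h}

def same_writer(f1, f2, tol=0.1):
    """Naive rule: accept when every feature differs by less than tol."""
    return all(abs(f1[k] - f2[k]) < tol for k in f1)

genuine = signature_features([[0, 1, 1, 0], [0, 1, 0, 0]])
probe   = signature_features([[0, 1, 1, 0], [0, 1, 1, 0]])
```

Skilled forgeries defeat such global features easily, which is why the study applies machine learning to learn discriminative patterns instead of fixed thresholds.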
NS-2 is a discrete-event network simulator that processes events per packet, sequentially in time, and is widely used in research. NS-2 ships with NAM (Network Animator), which produces a visual representation of the simulation, and it supports several simulation protocols. The network can be tested end to end; this test covers data transmission, delay, jitter, packet-loss ratio, and throughput. The performance analysis simulates a virtual network, tests transport-layer protocols simultaneously with variable data, and analyzes the simulation results produced by NS-2.
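The end-to-end metrics listed above can all be derived from per-packet send/receive records, which is essentially what an NS-2 trace file provides. A sketch over hypothetical records (a lost packet is recorded with a receive time of `None`; this is not the NS-2 trace format itself):

```python
# Delay, jitter, and loss from (send_time, recv_time) pairs.
def link_metrics(records):
    delays = [r - s for s, r in records if r is not None]
    jitter = [abs(b - a) for a, b in zip(delays, delays[1:])]
    return {
        "mean_delay": sum(delays) / len(delays),
        "mean_jitter": sum(jitter) / len(jitter) if jitter else 0.0,
        "loss_ratio": sum(1 for _, r in records if r is None) / len(records),
    }

m = link_metrics([(0.0, 0.05), (1.0, 1.07), (2.0, None), (3.0, 3.06)])
```

Throughput comes from the same records by summing delivered bytes over the simulation time, as in the link-failure experiments above.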
The huge number of documents on the Internet has led to a pressing need for text classification (TC), which is used to organize these text documents. In this research paper, a new model based on the Extreme Learning Machine (ELM) is presented. The proposed model consists of several phases: preprocessing, feature extraction, Multiple Linear Regression (MLR), and the ELM itself. The basic idea of the proposed model is to calculate feature weights using MLR; these weights, together with the extracted features, are fed as input to the ELM, producing a weighted Extreme Learning Machine (WELM). The results showed that the proposed WELM is highly competitive compared to the plain ELM.
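The MLR-then-ELM pipeline can be sketched end to end: least-squares regression coefficients act as per-feature weights, and the re-scaled features feed a basic ELM (random hidden layer, analytically solved output weights). All details below are assumptions for illustration, not the paper's implementation, and the data is synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((40, 3))                          # toy documents x features
y = (X @ [0.8, 0.1, 0.1] > 0.5).astype(float)    # toy binary labels

# 1) MLR phase: least-squares coefficients used as per-feature weights.
w, *_ = np.linalg.lstsq(X, y, rcond=None)
Xw = X * np.abs(w)                               # weighted features (WELM input)

# 2) ELM phase: fixed random hidden layer, output weights by least squares.
W_in = rng.standard_normal((3, 20))
H = np.tanh(Xw @ W_in)                           # hidden-layer activations
beta, *_ = np.linalg.lstsq(H, y, rcond=None)
pred = (H @ beta > 0.5).astype(float)
accuracy = (pred == y).mean()
```

Because only `beta` is learned (in closed form), training is fast; the MLR step biases the random projection toward the features that matter for the labels.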
Owing to the huge variety of 5G services, network slicing is a promising mechanism for dividing physical network resources into multiple logical network slices according to the requirements of each user. A highly accurate and fast traffic classification algorithm is required to ensure better Quality of Service (QoS) and effective network slicing. Fine-grained resource allocation can be realized by Software Defined Networking (SDN) with centralized control of network resources. However, the relevant research has concentrated on deep learning systems whose enormous computation and storage demands on the SDN controller limit the speed and accuracy of the traffic classification mechanism. To fill thi
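Once a flow is classified, the slicing step itself is a mapping from traffic class to the logical slice whose QoS targets fit it. A hypothetical sketch (slice names follow the standard 5G categories; the class names and the mapping are illustrative, not from the paper):

```python
# Map a classified traffic type to a 5G network slice.
SLICE_OF = {
    "video": "eMBB",      # enhanced Mobile Broadband: bandwidth-hungry traffic
    "sensor": "mMTC",     # massive Machine-Type Communication: low-rate IoT
    "control": "URLLC",   # Ultra-Reliable Low-Latency Communication
}

def assign_slice(flow_class, default="best-effort"):
    return SLICE_OF.get(flow_class, default)

slice_name = assign_slice("control")
```

The speed constraint discussed above applies to the classifier producing `flow_class`; the mapping itself is constant-time, so the classifier dominates the controller's per-flow cost.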
An intelligent software defined network (ISDN) based on an intelligent controller can manage and control the network in a remarkable way. In this article, a methodology is proposed to estimate the packet flow at the sensing plane of a software defined network Internet of Things (SDN-IoT) using a partial recurrent spike neural network (PRSNN) congestion controller, which predicts the packet flow one step ahead and thus reduces the congestion that may occur. That is, the proposed model (spike ISDN-IoT) is enhanced with a congestion controller, which works as a proactive controller in the proposed model. In addition, we propose another intelligent clustering controller based on an artificial neural network, which operates as a reactive co
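The proactive idea, predicting the next interval's packet flow so the controller can act before congestion occurs, can be illustrated with a toy stand-in (the paper's PRSNN is far richer than this): an exponentially weighted recurrent state tracks recent packet counts and serves as the one-step-ahead forecast.

```python
# One-step-ahead flow forecast via a leaky recurrent state.
def predict_next(flow_counts, alpha=0.5):
    """alpha weights the newest observation against the carried state."""
    state = flow_counts[0]
    for x in flow_counts[1:]:
        state = alpha * x + (1 - alpha) * state   # recurrent update
    return state

forecast = predict_next([10, 12, 16, 20])         # packets per interval
congested = forecast > 18                         # hypothetical threshold
```

A proactive controller would throttle or reroute when `congested` is predicted true, rather than reacting after queues have already built up, which is the reactive role the clustering controller plays.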