Traffic classification refers to the task of categorizing traffic flows into application-aware classes such as chat, streaming, VoIP, etc. Most network traffic identification systems are feature-based; the features may be static signatures, port numbers, statistical characteristics, and so on. Although current data-flow classification methods are effective, they still lack inventive approaches that meet vital requirements such as real-time traffic classification, low power consumption, low Central Processing Unit (CPU) utilization, etc. Our novel Fast Deep Packet Header Inspection (FDPHI) traffic classification proposal employs a 1-Dimensional Convolutional Neural Network (1D-CNN) to automatically learn more representative characteristics of traffic flow types, considering only the positions of selected bits from the packet header. The proposal is a learning approach based on deep packet inspection that integrates the feature extraction and classification phases into one system. The results show that the FDPHI works very well for feature learning. It also delivers strong traffic classification results in terms of energy consumption (70% less power, and around 48% less CPU utilization) and processing time (310% for IPv4 and 595% for IPv6).
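As an illustrative sketch only (the framework, layer sizes, bit count, and class count below are assumptions, not the paper's actual FDPHI configuration), a 1D-CNN that consumes a fixed-length vector of selected packet-header bits and produces application-class logits could look like this:

```python
# Minimal sketch of a header-bit 1D-CNN classifier in PyTorch.
# Input: a fixed-length vector of selected header bits (0/1 values);
# the network learns features and classifies in a single pass.
import torch
import torch.nn as nn

class HeaderBitCNN(nn.Module):
    def __init__(self, n_bits: int = 96, n_classes: int = 6):  # illustrative sizes
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=5, padding=2),  # learn local bit patterns
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),                      # pool to one value per channel
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, bits: torch.Tensor) -> torch.Tensor:
        # bits: (batch, n_bits) of 0/1 floats -> add a channel dimension for Conv1d
        x = self.features(bits.unsqueeze(1))
        return self.classifier(x.squeeze(-1))

# Example: classify a batch of 4 header-bit vectors
logits = HeaderBitCNN()(torch.randint(0, 2, (4, 96)).float())
print(logits.shape)  # torch.Size([4, 6])
```

Because the input is only a small set of header bits rather than full payloads, the convolution and pooling stages stay small, which is consistent with the low CPU and energy figures the abstract reports.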
Due to the huge variety of 5G services, network slicing is a promising mechanism for dividing physical network resources into multiple logical network slices according to the requirements of each user. A highly accurate and fast traffic classification algorithm is required to ensure better Quality of Service (QoS) and effective network slicing. Fine-grained resource allocation can be realized by Software Defined Networking (SDN) with centralized control of network resources. However, the relevant research activities have concentrated on deep learning systems whose enormous computation and storage demands on the SDN controller limit the speed and accuracy of the traffic classification mechanism. To fill thi
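Purely as an illustration of the slicing idea described above, a controller-side lookup that maps a classified traffic type to a logical slice and its QoS profile could be sketched as follows; the class names, slice IDs, and QoS figures are hypothetical placeholders, not values from the paper:

```python
# Hypothetical mapping from predicted traffic class to a slice/QoS profile.
SLICE_PROFILES = {
    "voip":      {"slice_id": 1, "min_mbps": 0.5, "max_latency_ms": 20},
    "streaming": {"slice_id": 2, "min_mbps": 5.0, "max_latency_ms": 100},
    "chat":      {"slice_id": 3, "min_mbps": 0.1, "max_latency_ms": 200},
}

def assign_slice(predicted_class: str) -> dict:
    """Return the slice/QoS profile for a classified flow (default: best effort)."""
    return SLICE_PROFILES.get(
        predicted_class,
        {"slice_id": 0, "min_mbps": 0.0, "max_latency_ms": 500},
    )

print(assign_slice("voip"))  # {'slice_id': 1, 'min_mbps': 0.5, 'max_latency_ms': 20}
```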
This paper describes the use of a microcomputer as a laboratory instrument system. The system measures three weather variables: temperature, wind speed, and wind direction. The instrument is a type of data acquisition system; this paper deals with the design and implementation of a data acquisition system based on a personal computer (Pentium) using the Industry Standard Architecture (ISA) bus. The design of this system involves mainly a hardware implementation, together with the software programs used for testing, measurement, and control. The system can display the required information that is transferred from the external field and processed by the system. A Visual Basic language with Microsoft Foundation Cl
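A rough sketch of the software side of such a system is shown below; the 8-bit ADC resolution, reference voltage, and sensor calibration constants are assumptions for illustration, and the original implementation used Visual Basic rather than Python:

```python
# Convert raw ADC counts read over the ISA-bus acquisition card into
# physical units. Calibration constants here are hypothetical examples.
def counts_to_temperature(counts: int, v_ref: float = 5.0, bits: int = 8) -> float:
    """ADC counts -> degrees Celsius, assuming a 10 mV/degree temperature sensor."""
    voltage = counts * v_ref / (2 ** bits - 1)
    return voltage / 0.010

def counts_to_wind_speed(counts: int, v_ref: float = 5.0, bits: int = 8) -> float:
    """ADC counts -> m/s, assuming a linear anemometer mapping 0-5 V to 0-30 m/s."""
    voltage = counts * v_ref / (2 ** bits - 1)
    return voltage / v_ref * 30.0

# Example reading of a mid-scale ADC value
print(counts_to_temperature(128), counts_to_wind_speed(128))
```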
Tracking the scientific developments and achievements made, for example, since the Second World War up to the present moment leaves each of us in absolute amazement. Humanity invented the computer, discovered the genetic material (DNA), mapped the human genome, travelled to the moon, penetrated outer space with satellites, approached distant planets, and produced jet aircraft, microprocessors, and lasers, in addition to enabling a person to create layers of material that are unimaginably thin. It has also become possible for a person to "dig lines that do not exceed 20 billion meters of thickness." Humans have also been able to assemble things atom by atom and build an efficient and high-precision con
Deep Learning Techniques For Skull Stripping of Brain MR Images
Lung cancer is one of the leading causes of death worldwide and is considered among the most lethal diseases. Early detection and diagnosis of lung cancer are essential, enabling effective therapy and better outcomes for patients. In recent years, deep learning algorithms have shown crucial promise for medical image analysis, especially for lung cancer identification. This paper compares a number of different deep learning-based models using Computed Tomography image datasets against traditional Convolutional Neural Network and SqueezeNet models using X-ray data for the automated diagnosis of lung cancer. Although the simple details p
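As a hedged sketch of the kind of comparison setup described above (not the authors' exact models), a small baseline CNN and a SqueezeNet backbone can both be adapted to a two-class lung-cancer output; the image size, channel handling, and class count below are illustrative assumptions:

```python
# Baseline CNN vs. SqueezeNet, both adapted to a two-class output.
import torch
import torch.nn as nn
from torchvision import models

def simple_cnn(n_classes: int = 2) -> nn.Module:
    """Small baseline CNN for grayscale chest images resized to 224x224."""
    return nn.Sequential(
        nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        nn.Flatten(), nn.Linear(32, n_classes),
    )

def squeezenet_classifier(n_classes: int = 2) -> nn.Module:
    """SqueezeNet with its final 1x1 conv replaced for the target classes."""
    net = models.squeezenet1_1(weights=None)
    net.classifier[1] = nn.Conv2d(512, n_classes, kernel_size=1)
    net.num_classes = n_classes
    return net

x_gray = torch.randn(2, 1, 224, 224)   # e.g. an X-ray batch
x_rgb = x_gray.repeat(1, 3, 1, 1)      # SqueezeNet expects 3-channel input
print(simple_cnn()(x_gray).shape, squeezenet_classifier()(x_rgb).shape)
```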
Convolutional neural networks (CNNs) are among the most widely used neural networks in various applications, including deep learning. In recent years, the continuing extension of CNNs into increasingly complicated domains has made their training more difficult, so researchers have adopted optimized hybrid algorithms to address this problem. In this work, a novel chaotic black hole algorithm-based approach was created for training CNNs to optimize their performance by avoiding entrapment in local minima. The logistic chaotic map was used to initialize the population instead of the uniform distribution. The proposed training algorithm was developed based on a specific benchmark problem for optical character recognition.
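A minimal sketch of the two ideas highlighted above, logistic-map population initialization and the black hole update with event-horizon re-seeding, is given below; the fitness function, dimensionality, and parameters are illustrative stand-ins rather than the paper's benchmark setup:

```python
# Candidate weight vectors ("stars") are initialized with the logistic
# chaotic map instead of a uniform draw, pulled toward the best candidate
# (the "black hole"), and re-seeded if they cross the event horizon.
import numpy as np

def logistic_map_population(n_stars: int, dim: int, r: float = 4.0, x0: float = 0.7) -> np.ndarray:
    vals, x = [], x0
    for _ in range(n_stars * dim):
        x = r * x * (1.0 - x)          # logistic chaotic map in (0, 1)
        vals.append(x)
    return np.array(vals).reshape(n_stars, dim)

def black_hole_optimize(fitness, n_stars=20, dim=10, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    stars = logistic_map_population(n_stars, dim)
    for _ in range(iters):
        fit = np.array([fitness(s) for s in stars])
        bh = stars[np.argmin(fit)].copy()                    # best candidate = black hole
        stars += rng.random((n_stars, 1)) * (bh - stars)     # pull stars toward the black hole
        radius = fit.min() / (fit.sum() + 1e-12)             # event-horizon radius
        absorbed = np.linalg.norm(stars - bh, axis=1) < radius
        stars[absorbed] = rng.random((absorbed.sum(), dim))  # re-seed absorbed stars
    return bh

# Toy fitness: sphere function standing in for a CNN training loss
print(black_hole_optimize(lambda w: float(np.sum(w ** 2))))
```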