Traffic classification refers to the task of categorizing traffic flows into application-aware classes such as chat, streaming, VoIP, etc. Most network traffic identification systems are feature-based; the features may be static signatures, port numbers, statistical characteristics, and so on. Although current flow classification methods are effective, they still lack inventive approaches to meet vital requirements such as real-time traffic classification, low power consumption, low Central Processing Unit (CPU) utilization, etc. Our novel Fast Deep Packet Header Inspection (FDPHI) traffic classification proposal employs a 1-Dimensional Convolutional Neural Network (1D-CNN) to automatically learn representative characteristics of traffic flow types, considering only the positions of selected bits from the packet header. The proposal is a learning approach based on deep packet inspection which integrates both the feature extraction and classification phases into one system. The results show that the FDPHI performs very well for feature learning. It also presents adequate traffic classification results in terms of energy consumption (around 70% less power, around 48% less CPU utilization) and processing time (310% for IPv4 and 595% for IPv6).
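As a hedged illustration of the core operation the FDPHI abstract describes, the sketch below runs a valid-mode 1-D convolution over a vector of selected packet-header bits. The 64-bit input pattern, the kernel values, and the `conv1d` helper are hypothetical stand-ins, not the paper's actual architecture or learned filters:

```python
import numpy as np

def conv1d(x, kernel, stride=1):
    """Valid-mode 1-D cross-correlation over a feature vector."""
    k = len(kernel)
    n = (len(x) - k) // stride + 1
    return np.array([np.dot(x[i * stride : i * stride + k], kernel) for i in range(n)])

# Hypothetical input: 64 selected header bits (pattern repeated for the demo)
header_bits = np.array([1, 0, 1, 1, 0, 0, 1, 0] * 8, dtype=float)
kernel = np.array([0.5, -1.0, 0.5])   # one illustrative filter, values made up
features = conv1d(header_bits, kernel)
```

In a full 1D-CNN these feature maps would pass through nonlinearities, pooling, and a classifier head; this sketch shows only the convolution that extracts local bit-position patterns.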
A Laced Reinforced Concrete (LRC) structural element comprises continuously inclined shear reinforcement in the form of lacing that connects the longitudinal reinforcement on both faces of the element. This study conducted a theoretical investigation of LRC deep beams to predict their behavior after exposure to fire and high temperatures. Four simply supported reinforced concrete beams of 1500 mm length, 200 mm width, and 240 mm depth were considered. The specimens were identical in terms of compressive strength (40 MPa) and steel reinforcement details, and the same laced steel reinforcement ratio of 0.0035 was used. Three specimens were burned for variable durations and at steady-state temperatures (one
The need for robotic systems has become urgent in various fields, especially in video surveillance and live broadcasting. The main goal of this work is to design and implement a rover robotic monitoring system based on a Raspberry Pi 4 Model B to control the overall system and display live video using a webcam (USB camera), as well as the You Only Look Once version 5 (YOLOv5) algorithm to detect, recognize, and display objects in real time. This deep learning algorithm is highly accurate and fast, and is implemented with Python, OpenCV, and PyTorch code and the Common Objects in Context (COCO) 2020 dataset. This robot can move in all directions and in different places, especially in
In the last two decades, networks have changed in response to rapidly changing requirements. Current Data Center Networks have a large number of hosts (tens of thousands) with special bandwidth needs, as cloud networking and multimedia content computing have increased. Conventional Data Center Networks (DCNs) are strained by the increased number of users and bandwidth requirements, which in turn impose many implementation limitations. Current networking devices, with their control and forwarding planes coupled, result in network architectures that are not suitable for dynamic computing and storage needs. Software Defined Networking (SDN) is introduced to change this notion of traditional networks by decoupling control and
The Internet provides vital communications between millions of individuals. It is also increasingly utilized as a commerce tool; thus, security is of high importance for securing communications and protecting vital information. Cryptography algorithms are essential in the field of security. Brute-force attacks are the major attacks on the Data Encryption Standard, which is the main reason an improved structure of the Data Encryption Standard algorithm is needed. This paper proposes a new, improved structure for the Data Encryption Standard to make it secure and immune to attacks. The improved structure was accomplished using the standard Data Encryption Standard with a new way of two key gene
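The paper's exact key construction is truncated above. As a generic arithmetic illustration only (not the paper's scheme), the sketch compares the single 56-bit DES key space, searchable in a few years at a billion keys per second, with a hypothetical space built from two independent 56-bit keys:

```python
# Key-space sizes that motivate strengthening single-key DES
des_keys = 2 ** 56        # single 56-bit DES key
two_key_keys = 2 ** 112   # hypothetical: two independent 56-bit keys

# Exhaustive search time at one billion keys per second, in years
seconds_per_year = 60 * 60 * 24 * 365
des_years = des_keys / (1e9 * seconds_per_year)
```

At that rate a 56-bit space falls in roughly two to three years, while a 112-bit space is astronomically out of reach, which is why enlarging the effective key space is the standard response to brute-force attacks.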
Merging biometrics with cryptography has become more familiar, and a great scientific field has emerged for researchers. Biometrics adds a distinctive property to security systems, because biometric features are unique and individual for every person. In this study, a new method is presented for ciphering data based on fingerprint features. This is done by addressing the plaintext message, based on the positions of minutiae extracted from a fingerprint, into a generated random text file, regardless of the size of the data. The proposed method can be explained in three scenarios. In the first scenario, the message was placed inside the random text directly at the positions of the minutiae; in the second scenario, the message was encrypted with a chosen word before ciphering
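The first scenario, placing message characters at minutiae-derived positions inside a random cover text, can be sketched as below. The position list, cover alphabet, and helper names are illustrative assumptions, not the paper's implementation:

```python
import random

ALPHABET = "abcdefghijklmnopqrstuvwxyz"

def embed(message, positions, cover_len, seed=0):
    """Place each message character at a minutia-derived position in a random cover text."""
    rng = random.Random(seed)
    cover = [rng.choice(ALPHABET) for _ in range(cover_len)]
    for ch, pos in zip(message, positions):
        cover[pos] = ch
    return "".join(cover)

def extract(stego, positions, length):
    """Read the message back from the same positions."""
    return "".join(stego[p] for p in positions[:length])

positions = [5, 20, 41, 60, 88]   # hypothetical minutiae positions
stego = embed("HELLO", positions, 100)
```

Only a party who can reproduce the same minutiae positions from the fingerprint can recover the message, which is the property the abstract relies on.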
The importance of this research lies in shedding light on the concept of techno-strategy for information management, one of the vital and important topics reflecting the response to change in all areas of life, as this necessitates updating and changing it in order to achieve strategic goals and enhance technological advantage. The research problem examined the role of the information technology system (ITS) in enhancing risk management in general directorates of sports and school activity from the viewpoint of their department heads. The research addressed the relationship of information techno-strategy to risk management and the ratios of the contribution of information techno-strategy to risk management from the viewpoint of heads o
Compressional-wave (Vp) data are useful for reservoir exploration, drilling operations, stimulation, hydraulic fracturing, and development plans for a specific reservoir. Due to the different nature and behavior of the influencing parameters, considerable nonlinearity exists for Vp modeling purposes. In this study, a statistical relationship between compressional wave velocity and petrophysical parameters was developed from wireline log data for the Jeribe formation in the Fauqi oil field, southeast Iraq, using single and multiple linear regressions. The model concentrated on predicting compressional wave velocity from petrophysical parameters and any pair of shear wave velocity, porosity, density, and
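A minimal sketch of the multiple-linear-regression step, fitting Vp against shear velocity, porosity, and density by ordinary least squares. The sample values are synthetic toy numbers for illustration only, not the Jeribe formation log data:

```python
import numpy as np

# Synthetic toy samples: columns are shear velocity (km/s), porosity (frac), density (g/cc)
X = np.array([
    [2.1, 0.12, 2.45],
    [2.3, 0.10, 2.50],
    [2.0, 0.15, 2.40],
    [2.4, 0.09, 2.55],
    [2.2, 0.11, 2.48],
])
vp = np.array([3.9, 4.2, 3.7, 4.4, 4.0])   # toy Vp targets (km/s)

A = np.column_stack([np.ones(len(X)), X])  # prepend an intercept column
coef, *_ = np.linalg.lstsq(A, vp, rcond=None)  # [intercept, b_vs, b_phi, b_rho]
pred = A @ coef
```

Single-variable regressions follow the same pattern with one predictor column; the multiple regression simply stacks all candidate logs into `X` and solves for one coefficient per parameter.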
Background: Deep vein thrombosis is a multicausal disease and one of the most common venous disorders, but only one quarter of patients who have signs and symptoms of a clot in the vein actually have thrombosis and need treatment. The disease can be difficult to diagnose. Venous ultrasound in combination with clinical findings is accurate for venous thromboembolism, but it is costly because of the large number of patients with suspicious signs and symptoms. Venography is still the gold standard for venous thromboembolism, but it is invasive. The D-dimer test is increasingly seen as a valuable tool for ruling out venous thromboembolism and sparing low-risk patients further workup. Objectives: this study was designed to assess the role of D-dimer to confirm diag
Many authors have investigated the problem of the early visibility of the new crescent moon after conjunction and proposed many criteria addressing this issue in the literature. This article presents a proposed criterion for early crescent moon sighting based on the performance of a deep-learned pattern-recognizing artificial neural network (ANN). Moon-sighting datasets were collected from various sources and used to train the ANN. The new criterion relied on the crescent width and the arc of vision from the edge of the crescent's bright limb. The result of the criterion was a control value indicating the moon's visibility condition, which separated the datasets into four regions: invisible, telescope only, probably visible, and certai
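A hedged sketch of how a small feedforward ANN could map the two inputs the abstract names (crescent width and arc of vision) to a scalar control value and then threshold it into visibility regions. The weights are random (untrained), and the thresholds and region labels are illustrative assumptions, not the trained network or criterion values from the article:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def control_value(width, arc_of_vision, W1, b1, w2, b2):
    """One-hidden-layer forward pass mapping (width, ARCV) to a score in (0, 1)."""
    x = np.array([width, arc_of_vision])
    h = np.tanh(W1 @ x + b1)
    return sigmoid(w2 @ h + b2)

def region(v):
    # Illustrative thresholds for the four visibility regions
    if v < 0.25:
        return "invisible"
    if v < 0.50:
        return "telescope only"
    if v < 0.75:
        return "probably visible"
    return "certainly visible"

rng = np.random.default_rng(42)        # untrained random weights, for shape only
W1, b1 = rng.normal(size=(4, 2)), rng.normal(size=4)
w2, b2 = rng.normal(size=4), 0.0
v = control_value(0.5, 8.0, W1, b1, w2, b2)
```

In the article's setting the network would be trained on the collected sighting records so that the control value's level sets separate the four regions; here the forward pass only shows the data flow.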