Traffic classification is the task of categorizing traffic flows into application-aware classes such as chat, streaming, VoIP, etc. Most network traffic identification systems are feature-based; these features may be static signatures, port numbers, statistical characteristics, and so on. Although current data-flow classification methods are effective, they still lack inventive approaches that meet vital requirements such as real-time traffic classification, low power consumption, low Central Processing Unit (CPU) utilization, etc. Our novel Fast Deep Packet Header Inspection (FDPHI) traffic classification proposal employs a 1-Dimensional Convolutional Neural Network (1D-CNN) to automatically learn representative characteristics of traffic flow types, considering only the positions of selected bits from the packet header. The proposal is a learning approach based on deep packet inspection that integrates the feature extraction and classification phases into one system. The results show that the FDPHI works very well for feature learning. It also presents adequate traffic classification results in terms of energy consumption (around 70% less power, around 48% less CPU utilization) and processing time (improvements of 310% for IPv4 and 595% for IPv6).
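To make the feature-extraction idea concrete, the following is a minimal sketch of a 1-D convolution sliding over a vector of selected packet-header bits; the bit values, kernel, and function names are our own illustration, not the FDPHI implementation.

```python
import numpy as np

def conv1d(bits, kernel, stride=1):
    """Valid-mode 1-D convolution (cross-correlation) over a bit vector.

    Each output is the dot product of the kernel with one window of bits,
    which is the core operation a 1D-CNN layer learns to apply.
    """
    bits = np.asarray(bits, dtype=float)
    kernel = np.asarray(kernel, dtype=float)
    n = (len(bits) - len(kernel)) // stride + 1
    return np.array([np.dot(bits[i * stride:i * stride + len(kernel)], kernel)
                     for i in range(n)])

# Illustrative input: 16 selected header bits and a width-4 kernel that
# responds to 1->0 transitions in the bit pattern.
header_bits = [0, 0, 1, 1, 1, 1, 0, 0, 0, 1, 0, 1, 1, 0, 1, 0]
kernel = [1, 1, -1, -1]
features = conv1d(header_bits, kernel)
print(features)
```

In a full 1D-CNN the kernel weights are learned from labeled flows rather than fixed by hand, and several such feature maps feed a classification head, so extraction and classification train jointly as the abstract describes.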
Introduction: Methadone hydrochloride (MDN) is an effective pharmacological substitution treatment for opioid dependence, adopted in different countries as methadone maintenance treatment (MMT) programmes. However, MDN can exacerbate the addiction problem if it is abused and injected intravenously, and the frequent visits to MMT centres can reduce patient compliance. The overall aim of this study is to develop a novel extended-release capsule of MDN using the sol-gel silica (SGS) technique, which has the potential to counteract medication-tampering techniques and the associated health risks, and to reduce frequent visits to MMT centres. Methods: For MDN recrystallisation, a closed container method (CCM) and a hot-stage method (HSM) were conducted.
Interface evaluation has been the subject of extensive study and research in human-computer interaction (HCI). It is a crucial tool for promoting the idea that user engagement with computers should resemble casual conversations and interactions between individuals, according to specialists in the field. Researchers in the HCI field initially focused on making various computer interfaces more usable, thus improving the user experience. This study's objectives were to evaluate and enhance the user interface of the University of Baghdad's implementation of an online academic management system, using effectiveness, time-based efficiency, and satisfaction rates measured through a task-based questionnaire process. We made a variety of interfaces f
XML is being incorporated into the foundation of e-business data applications. This paper addresses the problem of the freeform information stored in any organization, and how XML, using this new approach, can make search operations efficient and less time-consuming. The paper introduces a new solution and methodology developed to capture and manage such unstructured freeform information (multi-information), based on XML schema technologies, neural network concepts, and an object-oriented relational database, in order to provide a practical solution for efficiently managing a multi-freeform information system.
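As a small illustration of why schema-constrained XML helps retrieval of freeform records, the sketch below stores two records under tags of our own choosing (the element names are not from the paper) and queries them by a structured field instead of scanning raw text.

```python
import xml.etree.ElementTree as ET

# Hypothetical freeform organizational records captured as tagged XML.
# Element names (<records>, <record>, <topic>, <body>) are illustrative only.
doc = """<records>
  <record id="1"><topic>budget</topic><body>Q3 budget draft notes</body></record>
  <record id="2"><topic>hiring</topic><body>Interview summary notes</body></record>
</records>"""

root = ET.fromstring(doc)

# Structured tags make retrieval a direct lookup on <topic> rather than a
# full-text scan of every stored document.
hits = [r.get("id") for r in root.findall("record")
        if r.findtext("topic") == "hiring"]
print(hits)
```

In practice an XML schema would constrain which tags and attributes are valid, so heterogeneous freeform inputs are normalized into one searchable structure before being stored in the relational database.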
Realizing and understanding semantic segmentation is a demanding task not only in computer vision but also in earth-science research. Semantic segmentation decomposes compound architectures into individual elements: the most common objects in civil outdoor or indoor scenes must be classified and then enriched with semantic information for each object. It is a method for automatically labeling and clustering point clouds. Classifying three-dimensional natural scenes requires a point cloud dataset as the input data representation, and working with 3D data raises many challenges, such as the small number, limited resolution, and limited accuracy of three-dimensional datasets. Deep learning is now the po
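As a toy illustration of grouping 3-D points into clusters, the sketch below runs a tiny hand-rolled k-means on a synthetic two-cluster point cloud; it is our own stand-in for the kind of point grouping that precedes semantic labeling, not the pipeline described in the abstract.

```python
import numpy as np

# Synthetic point cloud: two well-separated 3-D clusters of 20 points each.
rng = np.random.default_rng(0)
cloud = np.vstack([rng.normal(c, 0.1, (20, 3)) for c in ([0, 0, 0], [5, 5, 5])])

def kmeans(points, k=2, iters=10):
    """Minimal k-means: assign each point to its nearest center, then
    recompute centers as cluster means. Initialized from the first and
    last points so each synthetic cluster seeds one center."""
    centers = points[[0, len(points) - 1]].copy()
    for _ in range(iters):
        dists = ((points[:, None] - centers[None]) ** 2).sum(-1)
        labels = np.argmin(dists, axis=1)
        for j in range(k):
            centers[j] = points[labels == j].mean(axis=0)
    return labels

labels = kmeans(cloud)
print(sorted(set(labels.tolist())))  # two cluster ids
```

Real point-cloud segmentation replaces this geometric grouping with learned per-point features, but the output has the same shape: one class label per 3-D point.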
This investigation presents an experimental and analytical study on the behavior of reinforced concrete deep beams before and after repair. The original beams were first loaded under two-point loading up to failure, then repaired with epoxy resin and tested again. Three of the test beams contain shear reinforcement, and the other two beams have none. The main variable among these beams was the percentage of longitudinal steel reinforcement (0, 0.707, 1.061, and 1.414%). The main objective of this research is to investigate the possibility of restoring the full load-carrying capacity of reinforced concrete deep beams, with and without shear reinforcement, using epoxy resin as the repair material. All be
Twitter data analysis is an emerging field of research that uses data collected from Twitter to address many issues, such as disaster response, sentiment analysis, and demographic studies. The success of data analysis relies on collecting accurate data that are representative of the studied group or phenomenon. Various Twitter analysis applications rely on collecting the locations of the users sending the tweets, but this information is not always available. There have been several attempts at estimating location-based aspects of a tweet; however, there is a lack of work investigating the data collection methods themselves that focus on location. In this paper, we investigate two methods for obtaining location-based data.
Intelligent buildings offer various incentives for energy saving, yet can suffer highly inefficient energy use caused by non-stationary building environments. In the presence of such dynamic excitation, with higher levels of nonlinearity and a coupling effect between temperature and humidity, the HVAC system transitions between underdamped and overdamped indoor conditions. This promotes highly inefficient energy use and fluctuating indoor thermal comfort. To address these concerns, this study develops a novel framework based on deep clustering of Lagrangian trajectories for multi-task learning (DCLTML), adding a pre-cooling coil to the air handling unit (AHU) to alleviate the coupling issue. The proposed DCLTML exhibits great overall control and is