Traffic classification refers to the task of categorizing traffic flows into application-aware classes such as chat, streaming, VoIP, etc. Most network traffic identification systems are feature-based; these features may be static signatures, port numbers, statistical characteristics, and so on. Although current data-flow classification methods are effective, they still lack inventive approaches that meet vital requirements such as real-time traffic classification, low power consumption, low Central Processing Unit (CPU) utilization, etc. Our novel Fast Deep Packet Header Inspection (FDPHI) traffic classification proposal employs a 1-Dimensional Convolutional Neural Network (1D-CNN) to automatically learn representative characteristics of traffic flow types, considering only the positions of selected bits from the packet header. The proposal is a learning approach based on deep packet inspection that integrates both the feature extraction and classification phases into one system. The results show that the FDPHI works very well for feature learning. It also presents strong traffic classification results in terms of energy consumption (70% less power, CPU utilization around 48% less) and processing time (310% for IPv4 and 595% for IPv6).
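The core idea, selecting fixed bit positions from the packet header and feeding them to a 1D convolution, can be sketched as follows. This is an illustrative reconstruction, not the paper's implementation: the bit positions, kernel weights, and header bytes are made-up examples, and a real FDPHI model would learn its kernel weights during training.

```python
# Illustrative sketch (not the paper's implementation): extract fixed
# bit positions from a packet header and apply a single 1D convolution
# over them. All positions, kernel weights, and header bytes here are
# hypothetical examples.

def selected_bits(header: bytes, positions):
    """Return the bits at the given positions (position 0 = MSB of byte 0)."""
    return [(header[p // 8] >> (7 - p % 8)) & 1 for p in positions]

def conv1d(x, kernel, stride=1):
    """Valid-mode 1D convolution (no padding, single channel)."""
    k = len(kernel)
    return [sum(x[i + j] * kernel[j] for j in range(k))
            for i in range(0, len(x) - k + 1, stride)]

header = bytes([0x45, 0x00, 0x00, 0x3C])        # first bytes of an IPv4 header
bits = selected_bits(header, [0, 1, 2, 3, 4, 5, 6, 7])
features = conv1d(bits, [1, -1, 1])             # one hand-picked kernel
print(bits)      # [0, 1, 0, 0, 0, 1, 0, 1]
print(features)  # [-1, 1, 0, 1, -1, 2]
```

In a trained 1D-CNN these feature maps would pass through further convolution and pooling layers before a softmax over the traffic classes; the point here is only that the input is a bit vector taken directly from header positions, with no hand-crafted statistical features.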
Software Defined Networking (SDN) with centralized control provides a global view and achieves efficient management of network resources. However, centralized controllers have several limitations related to scalability and performance, especially with the exponential growth of 5G communication. This paper proposes a novel traffic scheduling algorithm to avoid congestion in the control plane. The Packet-In messages received from different 5G devices are classified into two classes, critical and non-critical 5G communication, by adopting a Dual-Spike Neural Network (DSNN) classifier and implementing it on a Virtualized Network Function (VNF). Dual spikes identify each class to increase the reliability of the classification.
The present study investigates deep eutectic solvents (DESs) as potential media for enzymatic hydrolysis. A series of ternary ammonium- and phosphonium-based DESs were prepared at different molar ratios by mixing with aqueous glycerol (85%). The physicochemical properties, including surface tension, conductivity, density, and viscosity, were measured over a temperature range of 298.15 K – 363.15 K. The eutectic points were highly influenced by the variation of temperature. The eutectic points of choline chloride:glycerol:water (ratio 1:2.55:2.28) and methyltriphenylphosphonium bromide:glycerol:water (ratio 1:4.25:3.75) are 213.4 K and 255.8 K, respectively. The stability of the lipase enzyme isolated from porcine pancreas (PPL) a
Spatial data analysis is performed in order to remove skewness, a measure of the asymmetry of the probability distribution, and to improve the normality (a key statistical concept based on the bell-shaped normal distribution) of properties such as porosity, permeability, and saturation, which can be visualized using histograms. Three steps of spatial analysis are involved here: exploratory data analysis, variogram analysis, and finally distributing the properties using geostatistical algorithms. The Mishrif Formation (unit MB1) in the Nasiriya Oil Field was chosen to analyze and model the data for the first eight wells. The field is an anticline structure with northwest- south
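The exploratory-data-analysis step above typically checks skewness before and after a normalizing transform. A minimal sketch, assuming hypothetical permeability values (in mD) and a simple log transform, neither of which is taken from the study:

```python
# Hypothetical sketch: sample skewness of a right-skewed property before
# and after a log transform, the kind of normality check applied to
# porosity or permeability in exploratory data analysis. The data values
# are invented for illustration.
import math

def skewness(xs):
    """Population sample skewness: third central moment over m2**1.5."""
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n
    m3 = sum((x - mean) ** 3 for x in xs) / n
    return m3 / m2 ** 1.5

perm = [1.2, 0.8, 3.5, 0.5, 12.0, 2.1, 0.9, 7.4]   # hypothetical permeability, mD
raw = skewness(perm)                                # strongly right-skewed
logged = skewness([math.log(x) for x in perm])      # much closer to symmetric
print(raw, logged)
```

A histogram of the raw values would show the long right tail typical of permeability; after the log transform the skewness drops toward zero, which is what makes the subsequent variogram and geostatistical steps better behaved.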
In recent years, the performance of Spatial Data Infrastructures for governments and companies is a task that has gained ample attention. Different categories of geospatial data, such as digital maps, coordinates, web maps, aerial and satellite images, etc., are required to realize the geospatial data components of Spatial Data Infrastructures. In general, two distinct types of geospatial data sources exist over the Internet: formal and informal data sources. Despite the growth of informal geospatial data sources, the integration between different free sources is not being achieved effectively. The adoption of this task can be considered the main advantage of this research. This article addresses the research question of how the
In many oil-recovery systems, relative permeabilities (kr) are essential flow factors that affect fluid dispersion and output from petroleum resources. Traditionally, obtaining these crucial reservoir properties requires taking rock samples from the reservoir and performing suitable laboratory studies. Although kr is a function of fluid saturation, it is now well established that pore shape and distribution, absolute permeability, wettability, interfacial tension (IFT), and saturation history all influence kr values. These rock/fluid characteristics vary greatly from one reservoir region to the next, and it would be impossible to make kr measurements in all of them. The unsteady-state approach was used to calculate the relat
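The dependence of kr on saturation described above is often summarized with the Corey correlations, a standard textbook model rather than necessarily the one used in this study; the endpoint saturations, endpoint permeabilities, and exponents below are all hypothetical placeholders.

```python
# Illustrative Corey-correlation sketch (a standard model, not taken
# from the study) showing how water and oil relative permeabilities
# vary with water saturation. All parameter values are hypothetical.

def corey_kr(sw, swc=0.2, sor=0.2, krw_max=0.4, kro_max=1.0, nw=2.0, no=2.0):
    """Corey water/oil relative permeabilities at water saturation sw.

    swc: connate water saturation; sor: residual oil saturation;
    nw, no: Corey exponents for the water and oil phases.
    """
    swn = (sw - swc) / (1 - swc - sor)      # normalized (movable) saturation
    swn = min(max(swn, 0.0), 1.0)           # clamp to the movable range
    krw = krw_max * swn ** nw
    kro = kro_max * (1 - swn) ** no
    return krw, kro

krw, kro = corey_kr(0.5)                    # mid-range saturation
print(krw, kro)                             # 0.1 0.25
```

At connate water saturation (sw = swc) the model returns krw = 0 and kro at its endpoint value, matching the physical expectation that only oil flows; the exponents control the curvature that saturation history and wettability impose on measured curves.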
The research compared two methods for estimating the four parameters of the compound exponential Weibull–Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (uncontaminated) data, while the second assumed data contamination. Simulation experiments were conducted for different sample sizes, different initial parameter values, and different levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution for both natural and contaminated data.
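The Downhill Simplex (Nelder–Mead) algorithm minimizes the negative log-likelihood directly, without derivatives. A minimal sketch, using a plain one-parameter exponential distribution instead of the four-parameter compound distribution from the study, with invented data and a simplified one-dimensional simplex:

```python
# Hedged sketch of maximum-likelihood estimation via the Downhill
# Simplex (Nelder-Mead) idea, reduced to one dimension where the
# simplex is a pair of points. The exponential model and the data
# are illustrative stand-ins for the study's compound distribution.
import math

def neg_log_lik(lam, data):
    """Negative log-likelihood of Exp(lam); invalid rates get +inf."""
    if lam <= 0:
        return math.inf
    return -sum(math.log(lam) - lam * x for x in data)

def downhill_simplex_1d(f, a, b, iters=200):
    """Two-point simplex: reflect, expand, or contract the worse vertex."""
    for _ in range(iters):
        if f(a) > f(b):
            a, b = b, a                  # keep a as the better vertex
        r = a + (a - b)                  # reflect worse vertex through better
        if f(r) < f(a):
            e = a + 2 * (a - b)          # try expanding further
            b = e if f(e) < f(r) else r
        elif f(r) < f(b):
            b = r                        # accept the reflection
        else:
            b = (a + b) / 2              # contract toward the better vertex
    return a if f(a) <= f(b) else b

data = [0.4, 1.1, 0.7, 2.3, 0.5, 1.6, 0.9, 1.5]      # invented sample
lam_hat = downhill_simplex_1d(lambda l: neg_log_lik(l, data), 0.1, 5.0)
closed_form = 1 / (sum(data) / len(data))            # exponential MLE = 1/mean
print(lam_hat, closed_form)
```

For the exponential model the maximum likelihood estimate has the closed form 1/mean, so the simplex result can be checked against it; for the four-parameter compound distribution no closed form exists, which is exactly why a derivative-free search like Downhill Simplex is attractive there.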
Objective: The study aimed to identify adolescents' fast-food and snack consumption and to find out the relationship between fast food, snacks, and adolescents' demographic data (gender and Body Mass Index). Methodology: A descriptive study was conducted on the impact of fast foods and snacks upon adolescents' Body Mass Index in secondary schools in Baghdad city, from the 20th of April 2013 to the end of October 2014. A non-probability (purposive) sample of 1254 adolescents was chosen from secondary schools of both the Al-Karkh and Al-Russafa sectors. Data were collected through a specially constructed questionnaire comprising 12 multiple-choice items. The validity of the questionnaire was determined thr
Ondansetron hydrochloride (ONH) is a very bitter, potent antiemetic drug used for the treatment and/or prophylaxis of emesis induced by chemotherapy, radiotherapy, or surgery. The objective of this study was to formulate and evaluate taste-masked fast-dissolving tablets (FDTs) of ONH to increase patient compliance.
ONH taste-masked granules were prepared by the solid dispersion technique using Eudragit E100 polymer as an inert carrier. Solvent evaporation and fusion (melting) methods were used for this preparation.
Complete taste masking, with zero drug release in phosphate buffer pH 6.8, was obtained from granules prepared by the solvent evaporation method using a drug:polymer ratio of 1:2, from which four formulas pas