Distributed Denial of Service (DDoS) attacks on Web-based services have grown in both number and sophistication with the rise of advanced wireless technology and modern computing paradigms. Detecting these attacks in the sea of communication packets is therefore essential. Early DDoS attacks were directed mainly at the network and transport layers; in recent years, attackers have shifted their strategies toward the application layer. Application-layer attacks can be more harmful and stealthier because the attack traffic cannot easily be told apart from normal traffic flows. Distributed attacks are hard to counter because they consume real computing resources as well as network bandwidth, and they can also be mounted from Internet-connected smart devices that have been infected and recruited into botnets. In this work, Deep Learning (DL) techniques, namely the Convolutional Neural Network (CNN) and variants of the Recurrent Neural Network (RNN) such as the Long Short-Term Memory (LSTM), Bidirectional LSTM, Stacked LSTM, and the Gated Recurrent Unit (GRU), are used to detect DDoS attacks. The Portmap.csv file from the most recent DDoS dataset, CICDDoS2019, has been used to evaluate the DL approaches. The data is cleaned before being given to the DL models, and the pre-processed dataset is used to train and test them. The paper shows how the DL approach works with multiple models and compares their performance.
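The abstract does not publish its exact pre-processing pipeline; the sketch below only illustrates the typical cleaning and min-max scaling applied to flow records before they are fed to a DL model. The column layout and all values are invented for illustration.

```python
# Hedged sketch: typical cleaning + scaling of flow records before
# training a DL classifier. Columns and values are hypothetical.

def clean_rows(rows):
    """Drop records containing missing or non-finite feature values."""
    cleaned = []
    for row in rows:
        # v == v is False for NaN; the abs() check rejects +/-inf
        if all(isinstance(v, (int, float)) and v == v and abs(v) != float("inf")
               for v in row[:-1]):
            cleaned.append(row)
    return cleaned

def min_max_scale(rows):
    """Scale each feature column to [0, 1]; the label column is kept as-is."""
    n_feat = len(rows[0]) - 1
    lo = [min(r[i] for r in rows) for i in range(n_feat)]
    hi = [max(r[i] for r in rows) for i in range(n_feat)]
    scaled = []
    for r in rows:
        feats = [(r[i] - lo[i]) / (hi[i] - lo[i]) if hi[i] > lo[i] else 0.0
                 for i in range(n_feat)]
        scaled.append(feats + [r[-1]])
    return scaled

# Toy flow records: [duration, packet_count, byte_rate, label]
raw = [
    [1.2, 10, 900.0, 0],
    [0.5, 200, 45000.0, 1],      # flood-like record
    [float("nan"), 3, 10.0, 0],  # malformed record, dropped by cleaning
    [2.0, 15, 1200.0, 0],
]
data = min_max_scale(clean_rows(raw))
```

After this step each feature lies in [0, 1], which is the usual input range expected by CNN/LSTM layers.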
This paper describes a new finishing process in which newly developed magnetic abrasives are used to effectively finish brass plate, a material that is very difficult to polish by conventional machining processes. The Taguchi experimental design method was adopted to evaluate the effect of the process parameters on the improvement of surface roughness and hardness achieved by magnetic abrasive polishing. The process parameters are: the current applied to the inductor, the working gap between the workpiece and the inductor, the rotational speed, and the volume of powder. An analysis of variance (ANOVA) was carried out using statistical software to identify the optimal conditions for better surface roughness and hardness. Regression models based on statistical m
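The paper's ANOVA was run in statistical software on its own roughness and hardness measurements, which are not reproduced here; the following pure-Python sketch, using invented surface-roughness readings at three hypothetical current levels, only illustrates the computation behind the F statistic used to rank process parameters.

```python
# Hedged sketch: one-way ANOVA F statistic on invented Ra readings.

def one_way_anova_f(groups):
    """Return the F statistic for a one-way ANOVA over lists of readings."""
    k = len(groups)                      # number of parameter levels
    n = sum(len(g) for g in groups)      # total observations
    grand = sum(sum(g) for g in groups) / n
    # Between-level and within-level sums of squares
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    ms_between = ss_between / (k - 1)
    ms_within = ss_within / (n - k)
    return ms_between / ms_within

# Invented Ra (um) readings at three hypothetical applied-current levels
ra = [[0.42, 0.45, 0.43], [0.35, 0.33, 0.36], [0.28, 0.27, 0.30]]
f_stat = one_way_anova_f(ra)
```

A large F for a parameter, relative to the critical value at the chosen significance level, indicates that the parameter has a significant effect on the response.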
A database is a collection of data organized and distributed in a way that allows the client to access the stored data easily and conveniently. In the era of big data, however, traditional data-analytics methods may not be able to manage and process such large amounts of data. To develop an efficient way of handling big data, this work studies the use of the Map-Reduce technique to handle big data distributed on the cloud. The approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r
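The study ran Map-Reduce on a Hadoop cluster over EEG records; the single-process Python sketch below only illustrates the map / shuffle / reduce pattern itself. The record format and the per-channel mean job are invented for illustration.

```python
# Hedged sketch: the map / shuffle / reduce pattern on toy EEG records.
from collections import defaultdict

def map_phase(records):
    """Emit (channel, reading) key-value pairs from EEG records."""
    for channel, reading in records:
        yield channel, reading

def shuffle(pairs):
    """Group intermediate values by key, as Hadoop does between phases."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Reduce each channel's readings to their mean."""
    return {ch: sum(vals) / len(vals) for ch, vals in grouped.items()}

eeg = [("Fp1", 1.0), ("Fp1", 3.0), ("Cz", 2.0), ("Cz", 4.0)]
means = reduce_phase(shuffle(map_phase(eeg)))
```

On a real cluster, Hadoop distributes the map and reduce tasks across nodes and performs the shuffle over the network, which is where the response-time reduction reported above comes from.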
Much attention has been paid to the use of robot arms in various applications, so optimal path finding plays a significant role in upgrading and guiding arm movement. The essential function of path planning is to create a path that satisfies the aims of motion, including avoiding collisions with obstacles, reducing the time interval, decreasing the path travel cost, and satisfying the kinematic constraints. In this paper, the free Cartesian-space map of a 2-DOF arm is constructed to obtain the joint variables at each point without collision. The D* algorithm and the Euclidean distance are applied to obtain the exact and estimated distances to the goal, respectively. The modified Particle Swarm Optimization al
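The abstract's modified PSO plans 2-DOF arm paths in a collision-free map, which is not reproduced here; the generic PSO sketch below, minimizing the squared Euclidean distance to a goal point, only illustrates the underlying optimizer. All constants (swarm size, inertia, acceleration coefficients) are illustrative.

```python
# Hedged sketch: a plain (unmodified) PSO on a toy 2-D objective.
import random

def euclidean_sq(p, goal):
    """Squared Euclidean distance, used here as the fitness to minimize."""
    return (p[0] - goal[0]) ** 2 + (p[1] - goal[1]) ** 2

def pso(goal, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    random.seed(0)  # reproducible run for this sketch
    pos = [[random.uniform(-10, 10), random.uniform(-10, 10)]
           for _ in range(n_particles)]
    vel = [[0.0, 0.0] for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=lambda p: euclidean_sq(p, goal))[:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(2):
                r1, r2 = random.random(), random.random()
                # inertia + cognitive pull (pbest) + social pull (gbest)
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if euclidean_sq(pos[i], goal) < euclidean_sq(pbest[i], goal):
                pbest[i] = pos[i][:]
                if euclidean_sq(pbest[i], goal) < euclidean_sq(gbest, goal):
                    gbest = pbest[i][:]
    return gbest

best = pso((3.0, 4.0))  # swarm converges near the goal point
```

In a path-planning setting, the fitness function would instead score a whole candidate path (length, smoothness, collision penalty) rather than a single point.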
Variable selection is an essential and necessary task in statistical modeling. Several studies have tried to develop and standardize the process of variable selection, but it is difficult to do so. The first question a researcher needs to ask is: what are the most significant variables that should be used to describe a given dataset's response? In this paper, a new method for variable selection using Gibbs sampler techniques has been developed. First, the model is defined, and the posterior distributions for all the parameters are derived. The new variable selection method is tested using four simulation datasets and is compared with some existing techniques: Ordinary Least Squares (OLS), Least Absolute Shrinkage
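The paper derives full conditional posteriors for its variable-selection model, and those are not reproduced here; the toy Gibbs sampler below, for a bivariate normal with correlation rho, only illustrates the alternating conditional-sampling scheme on which such methods rest.

```python
# Hedged sketch: Gibbs sampling for a bivariate normal, not the paper's
# variable-selection model.
import random

def gibbs_bivariate_normal(rho, n_samples=5000, burn_in=500, seed=1):
    """Sample (x, y) ~ N(0, [[1, rho], [rho, 1]]) by alternating draws
    from the two full conditional distributions."""
    rng = random.Random(seed)
    sd = (1 - rho ** 2) ** 0.5   # conditional standard deviation
    x, y = 0.0, 0.0
    samples = []
    for i in range(n_samples + burn_in):
        x = rng.gauss(rho * y, sd)   # x | y ~ N(rho*y, 1 - rho^2)
        y = rng.gauss(rho * x, sd)   # y | x ~ N(rho*x, 1 - rho^2)
        if i >= burn_in:             # discard the burn-in portion
            samples.append((x, y))
    return samples

draws = gibbs_bivariate_normal(rho=0.8)
```

In the variable-selection setting, each sweep would also draw inclusion indicators for the candidate predictors from their full conditionals, and posterior inclusion frequencies would rank the variables.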
Breast cancer is one of the most important malignant diseases worldwide. Its incidence is increasing around the world, and it remains the leading cause of cancer mortality: approximately 1.3 million new cases were diagnosed worldwide last year. Risk factors for breast cancer, including obesity, early menarche, alcohol and smoking, environmental contamination, and reduced or late birth rates, are becoming more prevalent. In Iraq, breast cancer ranks first among the types of cancer diagnosed in women. This study was conducted on one hundred twenty women with breast cancer, who were evaluated and investigated for the possible role of these risk factors in the development of breast cancer in females. T
The problem of slow learning among primary-school pupils is not a local or private one. It is not tied to a certain society rather than others, nor to a particular culture; it is an international problem of a global nature and one of the well-recognized issues in the field of education. It is also regarded as one of the old difficulties to which people have long given attention, discovered through observing human behaviour and attempting to explain and predict it.
Through the two researchers' frequent visits to primary schools that include special classes for slow-learning pupils, in addition to the fact that one of the researchers has a child with a slow-learning issue, t
The research seeks to identify the comprehensive electronic banking system and the role of the auditor in light of the customer's application of electronic systems that depend on the Internet to provide its services. A proposed audit program has been prepared in accordance with international auditing controls and standards, based on a study of the customer's environment and an analysis of external and internal risks in light of financial and non-financial indicators. The research reached a set of conclusions, most notably the increasing dependence of banks on the comprehensive banking system for its ability to provide new and diverse banking services. The researcher suggested several recommendations, the most important of whi
In this paper, we investigate the impact of fear on a food-chain mathematical model with prey refuge and harvesting. The prey species reproduces according to the law of logistic growth. The model combines a Holling type-II functional response for the prey-first predator interaction with a Lotka-Volterra response for the first predator-second predator interaction. Conditions that guarantee the existence of the equilibrium points have been examined, and uniqueness and boundedness of the solution of the system have been established. The local and global dynamical behaviors are discussed and analyzed. Finally, numerical simulations confirm the theoretical results obtained and display the effect of varying each parameter.
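The abstract does not state the model equations, so the system below is only one common way the named ingredients (logistic prey growth, a fear factor, refuge, Holling type-II and Lotka-Volterra responses, and harvesting) are combined in the literature; the paper's exact equations and parameter choices may differ. Here $x$, $y$, $z$ are the prey, first predator, and second predator densities, $k$ scales fear, $m \in [0,1)$ is the refuge fraction, and $h_1$, $h_2$ are harvesting rates.

```latex
\begin{aligned}
\frac{dx}{dt} &= \frac{r\,x\left(1 - \dfrac{x}{K}\right)}{1 + k\,y}
                 - \frac{a(1-m)\,x\,y}{1 + b(1-m)\,x} - h_{1}\,x, \\[4pt]
\frac{dy}{dt} &= \frac{e\,a(1-m)\,x\,y}{1 + b(1-m)\,x}
                 - c\,y\,z - d_{1}\,y - h_{2}\,y, \\[4pt]
\frac{dz}{dt} &= e_{1}\,c\,y\,z - d_{2}\,z,
\end{aligned}
```

where the factor $1/(1 + k y)$ models the fear-induced reduction in prey reproduction, the Holling type-II term couples prey and first predator, and the bilinear $c\,y\,z$ term is the Lotka-Volterra coupling between the two predators.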
The constructivist learning model is one of the models of constructivist theory in learning. It generally emphasizes the active role of the learner during learning, in addition to intellectual and practical participation in various activities to help students gain the skills of analyzing artistic works. The current research aims to determine the effectiveness of the constructivist learning model in the acquisition of the skills of (technical work analysis) by students of the Institute of Fine Arts. To achieve the goal, the researcher formulated the following hypothesis: there are no statistically significant differences between the average scores of the experimental group students in the skill test for analyzing artworks befor
The present research was conducted to reduce the sulfur content of Iraqi heavy naphtha by adsorption using different metal oxides over Y-zeolite. The Y-zeolite was synthesized by a sol-gel technique; its average size was 92.39 nm, surface area 558 m2/g, and pore volume 0.231 cm3/g. The metals nickel, zinc, and copper were dispersed by an impregnation method to prepare Ni/HY, Zn/HY, Cu/HY, and Ni + Zn/HY catalysts for desulfurization. The adsorptive desulfurization was carried out in batch mode at different operating conditions, such as mixing time (10, 15, 30, 60, and 600 min) and catalyst dosage (0.2, 0.4, 0.6, 0.8, 1, and 1.2 g). Most of the sulfur compounds were removed at 10 min for all catalyst ty