Distributed Denial of Service (DDoS) attacks on web-based services have grown in both number and sophistication with the rise of advanced wireless technology and modern computing paradigms, making their detection among vast volumes of communication packets essential. Early DDoS attacks were directed mainly at the network and transport layers, but in recent years attackers have shifted their strategies toward the application layer, where attacks can be more harmful and stealthier because malicious traffic is difficult to distinguish from normal flows. Distributed attacks are hard to counter because they exhaust real computing resources as well as network bandwidth, and they can also be launched from Internet-connected smart devices that have been infected and recruited into botnets. Deep Learning (DL) techniques, including the Convolutional Neural Network (CNN) and variants of the Recurrent Neural Network (RNN) such as Long Short-Term Memory (LSTM), Bidirectional LSTM, Stacked LSTM, and the Gated Recurrent Unit (GRU), have been applied to detect DDoS attacks. The Portmap.csv file from the recent CICDDoS2019 DDoS dataset is used to evaluate these DL approaches. The data is cleaned before being fed to the models, and the preprocessed dataset is used to train and test the DL approaches. The paper shows how the DL approach works across multiple models and compares their performance.
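As an illustration of the kind of pipeline this abstract describes, the following minimal sketch trains one of the mentioned architectures (a stacked LSTM) on a preprocessed tabular flow dataset. The file name, the "Label" column, the 80/20 split, and the layer sizes are assumptions for illustration, not details taken from the paper.

```python
# Minimal sketch of a stacked-LSTM DDoS detector on tabular flow features.
# Assumes a cleaned CSV (e.g. a preprocessed Portmap.csv) with numeric feature
# columns and a binary "Label" column; names and sizes are illustrative.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense, Dropout

df = pd.read_csv("portmap_preprocessed.csv")            # hypothetical file name
X = df.drop(columns=["Label"]).values.astype("float32")
y = df["Label"].values.astype("float32")                # 0 = benign, 1 = DDoS

X = StandardScaler().fit_transform(X)
X = X.reshape((X.shape[0], 1, X.shape[1]))              # (samples, timesteps=1, features)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = Sequential([
    LSTM(64, return_sequences=True, input_shape=(1, X.shape[2])),  # stacked LSTM layer 1
    LSTM(32),                                                       # stacked LSTM layer 2
    Dropout(0.2),
    Dense(1, activation="sigmoid"),                                 # attack / benign output
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X_train, y_train, epochs=10, batch_size=256, validation_split=0.1)
print(model.evaluate(X_test, y_test))
```

The same scaffolding can be reused to compare the other models mentioned (CNN, Bidirectional LSTM, GRU) by swapping the layer stack while keeping the preprocessing and evaluation fixed.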
Objectives: This study aimed to evaluate and compare the effect of plasma treatment versus conventional treatment on the micro-shear bond strength (μSBS), surface roughness, and wettability of three different CAD/CAM materials. Materials and methods: Sixty cylindrical specimens (5 mm diameter × 3 mm height) were prepared from three different CAD/CAM materials: Group A: zirconia, Group B: lithium disilicate, and Group C: resin nano-ceramic. Each group was subdivided into two subgroups according to the surface treatment used. Subgroup I: conventional treatment, in which zirconia was sandblasted with Al2O3, while lithium disilicate and resin nano-ceramic were etched with hydrofluoric acid. Subgroup II: plasma treatment, in which the surface of each material was treated with plasma.
The aim of this research is to demonstrate the nature of the interactive relationship between the dimensions of the requirements of economic intelligence, represented by administrative and regulatory requirements, human requirements, and technical requirements, and the strategic success of banks, represented by customer satisfaction, customer confidence, quality of service, and growth, in three private Iraqi banks (Middle East Iraqi Investment, Al Ahli Iraqi, and Gulf Commercial). A questionnaire was adopted as the tool for collecting data and information from a sample of 85 respondents (managing director, deputy managing director, department director, section director, assistant section manager, division officer, and unit officer).
The subject of the Internet of Things is very important, especially at present, which is why it has attracted the attention of researchers and scientists due to its importance in human life: through it, a person can do many things easily, accurately, and in an organized manner. The research addresses important topics, the most important of which are the concept of the Internet of Things, the history of its emergence and development, the reasons for interest in it and its importance, and its most prominent advantages and characteristics. The research sheds light on the structure of the Internet of Things, its structural components, and its most important elements. The research also deals with the most important search engines on the Internet.
Forecasting is one of the important topics in the analysis of time series, and its importance in the economic field has emerged as a means of supporting economic growth. Accurate forecasting of time series is therefore one of the key challenges in making the best decisions. The aim of the research is to propose hybrid models to predict daily crude oil prices. The hybrid model integrates a linear component, represented by Box-Jenkins models, with a non-linear component, represented by artificial intelligence methods, namely the artificial neural network (ANN) and the support vector regression (SVR) algorithm, and the performance of the proposed hybrid models in the prediction of daily crude oil prices is then assessed.
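A common way to build such a linear/non-linear hybrid is to fit a Box-Jenkins (ARIMA) model first and then model its residuals with a non-linear learner. The sketch below follows that pattern with an SVR on lagged residuals; the file name, column name, ARIMA order, lag count, and SVR hyperparameters are illustrative assumptions, not values reported in the paper.

```python
# Minimal sketch of a hybrid forecaster: ARIMA (linear part) + SVR on
# lagged ARIMA residuals (non-linear part). Values are illustrative only.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from sklearn.svm import SVR

prices = pd.read_csv("crude_oil_daily.csv")["price"]     # hypothetical file/column

# 1) Linear component: fit ARIMA and take its in-sample residuals.
arima = ARIMA(prices, order=(1, 1, 1)).fit()
residuals = arima.resid

# 2) Non-linear component: SVR trained to predict the residual from its lags.
n_lags = 3
X = np.column_stack([residuals.shift(i) for i in range(1, n_lags + 1)])[n_lags:]
y = residuals.values[n_lags:]
svr = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X, y)

# 3) Hybrid one-step forecast = ARIMA forecast + predicted residual correction.
linear_part = arima.forecast(steps=1).iloc[0]
last_resid = residuals.values[-n_lags:][::-1].reshape(1, -1)  # lag 1, lag 2, lag 3
nonlinear_part = svr.predict(last_resid)[0]
print("hybrid forecast:", linear_part + nonlinear_part)
```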
The study aims to demonstrate the importance of instructional methods in teaching Arabic as a second language, that is, teaching Arabic to non-native speakers. The study is in line with the tremendous development in the field of knowledge, especially in technology and communication, and the emergence of many electronic media in education in general and in language teaching in particular. It employs images in teaching vocabulary and presents the experience of the Arabic Language Institute for Non-Native Speakers at King Abdul-Aziz University. The study follows the descriptive approach to address the problem represented by the lack of interest in instructional methods when teaching Arabic as a second language.
Compressing speech reduces data storage requirements and thus the time needed to transmit digitized speech over long-haul links such as the Internet. To obtain the best performance in speech compression, wavelet transforms require filters that combine a number of desirable properties, such as orthogonality and symmetry. The MCT basis functions are derived from the GHM basis functions using 2D linear convolution. The fast computation algorithm introduced here adds desirable features to the current transform. We further assess the performance of the MCT in a speech compression application. This paper discusses the effect of using the DWT and the MCT (in one and two dimensions) on speech compression and compares the performance of the DWT and the MCT in terms of compression.
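The general idea behind transform-based speech compression is to decompose the signal, discard small coefficients, and reconstruct from the remainder. The sketch below shows that idea with a standard DWT from PyWavelets; it does not reproduce the paper's MCT/GHM multiwavelet construction, and the wavelet, decomposition level, and threshold are illustrative assumptions.

```python
# Minimal sketch of wavelet-based speech compression by coefficient thresholding.
# Uses an ordinary DWT (PyWavelets); the MCT itself is not implemented here.
import numpy as np
import pywt
from scipy.io import wavfile

rate, speech = wavfile.read("speech.wav")            # hypothetical mono input file
speech = speech.astype(np.float64)

# Decompose, keep only the largest coefficients, reconstruct.
coeffs = pywt.wavedec(speech, "db4", level=4)
flat = np.concatenate([np.abs(c) for c in coeffs])
thresh = np.percentile(flat, 90)                     # discard roughly 90% of coefficients
compressed = [pywt.threshold(c, thresh, mode="hard") for c in coeffs]
reconstructed = pywt.waverec(compressed, "db4")[: len(speech)]

kept = sum(int(np.count_nonzero(c)) for c in compressed)
total = sum(c.size for c in coeffs)
snr = 10 * np.log10(np.sum(speech**2) / np.sum((speech - reconstructed) ** 2))
print(f"retained coefficients: {kept}/{total}, SNR: {snr:.1f} dB")
```

The fraction of retained coefficients stands in for the compression ratio, and the SNR of the reconstruction gives a simple fidelity measure for comparing transforms under the same threshold.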
The penalized least squares method is a popular approach for dealing with high-dimensional data, where the number of explanatory variables is larger than the sample size. Its attractive properties are high prediction accuracy and the ability to perform estimation and variable selection at once. The penalized least squares method yields a sparse model, that is, a model with few variables that can be interpreted easily. However, penalized least squares is not robust: it is very sensitive to the presence of outlying observations. To deal with this problem, a robust loss function can be used to obtain a robust penalized least squares method and, in turn, a robust penalized estimator.
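One concrete instance of the idea of replacing the squared-error loss with a robust loss while keeping the penalty is Huber loss combined with an L1 penalty. The sketch below contrasts a standard Lasso fit with such a robust penalized fit on simulated high-dimensional data containing gross outliers; the simulated data and hyperparameters are illustrative only and are not taken from the paper.

```python
# Minimal sketch: classical penalized least squares (Lasso) versus a robust
# penalized fit (Huber loss + L1 penalty) on data with outliers.
import numpy as np
from sklearn.linear_model import Lasso, SGDRegressor

rng = np.random.default_rng(0)
n, p = 100, 200                                    # high-dimensional: p > n
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = [3.0, -2.0, 1.5, 2.5, -1.0]             # sparse true coefficients
y = X @ beta + rng.standard_normal(n)
y[:10] += 25.0                                     # inject gross outliers

lasso = Lasso(alpha=0.1).fit(X, y)                 # squared loss + L1 penalty
robust = SGDRegressor(loss="huber", penalty="l1", alpha=0.01,
                      epsilon=1.35, max_iter=5000, tol=1e-6).fit(X, y)

print("lasso nonzero coefficients: ", np.count_nonzero(lasso.coef_))
print("robust nonzero coefficients:", np.count_nonzero(robust.coef_))
print("lasso error on true support: ", np.abs(lasso.coef_[:5] - beta[:5]).round(2))
print("robust error on true support:", np.abs(robust.coef_[:5] - beta[:5]).round(2))
```

With the outliers present, the squared-error fit is pulled toward the contaminated observations, while the Huber loss bounds their influence, which is exactly the motivation for the robust penalized estimator described above.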
