Software-defined networks (SDN) have a centralized control architecture that makes them a tempting target for cyber attackers. One of the major threats is the distributed denial of service (DDoS) attack, which aims to exhaust network resources and make services unavailable to legitimate users. DDoS attack detection based on machine learning algorithms is among the most widely used techniques in SDN security. In this paper, four machine learning techniques (Random Forest, K-Nearest Neighbors, Naive Bayes, and Logistic Regression) were tested for detecting DDoS attacks, and a mitigation technique was applied to eliminate the attack's effect on the SDN. RF and KNN were selected because of their high accuracy. Three types of network topology were generated to observe the effectiveness of the proposed algorithms on different network architectures. The results reveal that RF performs better than KNN in the single topology, while the two perform comparably in the other topologies.
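The comparison the abstract describes can be sketched as follows. This is an illustrative reconstruction, not the authors' pipeline: the flow features are synthetic stand-ins generated with `make_classification`, and the hyperparameters are common defaults rather than the paper's settings.

```python
# Hedged sketch: comparing the four classifiers named in the abstract on
# synthetic "flow feature" data; the dataset is a placeholder, not the
# authors' SDN traffic traces.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Stand-in for labelled SDN flow statistics (benign vs. DDoS)
X, y = make_classification(n_samples=2000, n_features=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "RF": RandomForestClassifier(n_estimators=100, random_state=0),
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "NB": GaussianNB(),
    "LR": LogisticRegression(max_iter=1000),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(name, round(accuracy_score(y_te, model.predict(X_te)), 3))
```

On real SDN data the features would be per-flow statistics (packet counts, byte rates, flow duration) collected from the controller; model selection would then follow the accuracy comparison above.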
In today's digital era, the importance of securing information has reached critical levels. Steganography is one of the methods used for this purpose, hiding sensitive data within other files. This study introduces an approach that uses a chaotic dynamic system as a random key generator, governing both the selection of hiding locations within an image and the amount of data concealed at each location. This randomized procedure considerably improves the security of the steganographic scheme. A 3D dynamic system with nine parameters influencing its behavior was carefully chosen, and suitable interval values were determined for each parameter to guarantee the system's chaotic behavior. An analysis of the chaotic performance is given using the …
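The idea of a chaos-driven hiding key can be sketched as below. The paper's 3D nine-parameter system is not reproduced here; a logistic map in its chaotic regime stands in purely to illustrate how a chaotic trajectory can drive both the pixel selection and the per-location payload size. All names and constants are illustrative assumptions.

```python
# Hedged sketch: a logistic map (r = 3.99, chaotic regime) as a stand-in
# chaotic key generator for steganography; it yields hiding locations and
# a per-location bit count, as the abstract describes conceptually.
def logistic_map(x, r=3.99):
    return r * x * (1.0 - x)

def chaotic_key_stream(seed, width, height, n):
    """Yield (x, y, bits) triples: a hiding location and a payload size."""
    state = seed
    out = []
    for _ in range(n):
        state = logistic_map(state)
        px = int(state * width) % width
        state = logistic_map(state)
        py = int(state * height) % height
        state = logistic_map(state)
        bits = 1 + int(state * 3)    # conceal 1-3 LSBs at this pixel
        out.append((px, py, bits))
    return out

stream = chaotic_key_stream(seed=0.31415, width=512, height=512, n=5)
```

Because the trajectory is fully determined by the seed and parameters, sender and receiver sharing that key can regenerate the same sequence of locations and payload sizes, which is what makes the scheme usable for extraction.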
This research deals with building a probabilistic linear programming model representing production operations in the Middle Refinery Company (Dura, Semawa, Najaif), where the demand for each product (gasoline, kerosene, gas oil, fuel oil) is a random variable following a certain probability distribution. The distributions were tested using the statistical program EasyFit and were found to be the Cauchy, Erlang, Pareto, Normal, and Generalized Extreme Value distributions.
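The distribution-fitting step can be sketched as follows. The paper used EasyFit; here `scipy.stats` stands in to show how the five candidate families can be fitted to demand data and ranked by a Kolmogorov-Smirnov statistic. The demand sample is synthetic, not the refinery data.

```python
# Hedged sketch: fit the five candidate distributions the abstract names
# and rank them by the KS goodness-of-fit statistic (smaller is better).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
demand = rng.normal(loc=100.0, scale=15.0, size=500)   # stand-in demand data

candidates = {
    "norm": stats.norm,
    "cauchy": stats.cauchy,
    "pareto": stats.pareto,
    "erlang": stats.erlang,
    "genextreme": stats.genextreme,
}
results = {}
for name, dist in candidates.items():
    params = dist.fit(demand)
    ks = stats.kstest(demand, name, args=params)
    results[name] = ks.statistic

best = min(results, key=results.get)   # family with the best KS fit
```

In the model itself, the fitted distribution for each product's demand would then feed the probabilistic (chance-constrained) linear program.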
This study delves into advanced cooling techniques by examining the performance of a two-stage parallel-flow indirect evaporative cooling system enhanced with aspen pads in the challenging climate of Baghdad. The objective was to deliver supply air with dry-bulb temperatures below the ambient wet-bulb temperature (24.95 °C on average), under an average ambient dry-bulb temperature of 43 °C and an average relative humidity of 23%, aiming for unparalleled cooling efficiency. The experiment was carried out in the urban environment of Baghdad, characterized by high-temperature conditions. The investigation focused on the potential of the two-stage parallel-flow setup, combined with the cooling capability of aspen pads, to surpass the …
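The target described above is usually quantified by the wet-bulb effectiveness, eps = (T_db_in − T_db_out) / (T_db_in − T_wb_in); values above 1 mean the supply air is colder than the ambient wet-bulb temperature. The ambient values below come from the abstract, but the supply temperature is an assumed figure for illustration only, not a measured result.

```python
# Hedged sketch: wet-bulb effectiveness of an indirect evaporative cooler.
# Ambient conditions are from the abstract; t_db_out is an assumption.
t_db_in = 43.0     # ambient dry-bulb, deg C (from the abstract)
t_wb_in = 24.95    # ambient wet-bulb, deg C (from the abstract)
t_db_out = 23.0    # assumed supply dry-bulb temperature, deg C

effectiveness = (t_db_in - t_db_out) / (t_db_in - t_wb_in)
sub_wet_bulb = t_db_out < t_wb_in      # True when the stated goal is met
```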
Massive multiple-input multiple-output (massive-MIMO) is considered the key technology for meeting the huge data-rate demands of future wireless communication networks. However, for massive-MIMO systems to realize their maximum potential gain, sufficiently accurate downlink (DL) channel state information (CSI), obtained with low enough overhead to fit within the short coherence time (CT), is required. This article therefore aims to overcome the technical challenge of DL CSI estimation in a frequency-division-duplex (FDD) massive-MIMO system with short CT, considering five different physical correlation models. To this end, the statistical structure of the massive-MIMO channel, captured by the physical correlation, is exploited to find sufficiently …
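One widely used physical correlation model can be sketched as below. The article's five models are not reproduced here; the exponential model, R[i, j] = r^|i − j|, stands in to show how spatial correlation shapes a massive-MIMO channel realization. All parameter values are illustrative assumptions.

```python
# Hedged sketch: generate a spatially correlated massive-MIMO channel
# h = R^{1/2} g under the exponential correlation model (one common
# physical model; not necessarily among the article's five).
import numpy as np

def exponential_correlation(n_antennas, r):
    idx = np.arange(n_antennas)
    return r ** np.abs(idx[:, None] - idx[None, :])

rng = np.random.default_rng(0)
M = 64                                  # base-station antennas (assumed)
R = exponential_correlation(M, r=0.7)   # correlation magnitude r = 0.7

# Correlated channel: h = L g with L L^H = R and g ~ CN(0, I)
L = np.linalg.cholesky(R)
g = (rng.standard_normal(M) + 1j * rng.standard_normal(M)) / np.sqrt(2)
h = L @ g
```

Estimators that exploit this structure need far fewer pilots than unstructured least squares, which is what makes DL CSI acquisition feasible within a short coherence time.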
The power generation of solar photovoltaic (PV) technology is being implemented in every nation worldwide owing to its environmentally clean characteristics, and PV technology is growing significantly in present applications and the usage of PV power systems. Despite the strength of PV arrays in power systems, the arrays remain susceptible to certain faults. An effective supply requires economic returns, the safety of equipment and people, and precise tools for fault identification, diagnosis, and interruption. Meanwhile, unidentified arc faults lead to serious fire hazards in commercial, residential, and utility-scale PV systems. To ensure the secure and dependable distribution of electricity, the detection of such …
The ECG is an important tool for the primary diagnosis of heart diseases, showing the electrophysiology of the heart. In our method, a single maternal abdominal ECG signal is taken as the input, and the maternal P-QRS-T complexes of the original signal are averaged, repeated, and taken as a reference signal. LMS and RLS adaptive filter algorithms are then applied. The results show that the fetal ECGs were successfully detected: the accuracy on the Daisy database was up to 84% for LMS and 88% for RLS, while on PhysioNet it was up to 98% and 96% for LMS and RLS, respectively.
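The LMS half of the scheme can be sketched as follows. The signals below are synthetic sinusoids, not the Daisy or PhysioNet recordings, and the step size and tap count are illustrative choices: the reference carries the "maternal" component, and the filter's error output carries whatever the reference cannot predict, here standing in for the fetal residual.

```python
# Hedged sketch: a basic LMS adaptive noise canceller, as in the abstract's
# LMS branch; synthetic signals stand in for the real abdominal ECG.
import numpy as np

def lms_filter(d, x, n_taps=8, mu=0.01):
    """Adapt w so w . x_window tracks d; return the error signal e,
    i.e. the part of d not predictable from the reference x."""
    w = np.zeros(n_taps)
    e = np.zeros(len(d))
    for n in range(n_taps, len(d)):
        window = x[n - n_taps:n][::-1]
        e[n] = d[n] - w @ window
        w = w + 2 * mu * e[n] * window
    return e

t = np.arange(2000) / 500.0
maternal = np.sin(2 * np.pi * 1.2 * t)        # maternal reference (stand-in)
fetal = 0.2 * np.sin(2 * np.pi * 2.3 * t)     # weaker "fetal" component
abdominal = maternal + fetal                  # composite abdominal signal
residual = lms_filter(abdominal, maternal)    # maternal part cancelled
```

RLS follows the same cancellation structure but replaces the gradient update with a recursive least-squares gain, trading higher per-sample cost for faster convergence.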
The aim of the present study was to distinguish between healthy children and those with epilepsy using electroencephalography (EEG). Two biomarkers, the Hurst exponent (H) and Tsallis entropy (TE), were used to investigate the background EEG activity of 10 healthy children and 10 children with epilepsy. EEG artifacts were removed using a Savitzky-Golay (SG) filter. As hypothesized, there were significant changes in the irregularity and complexity of epileptic EEG compared with healthy control subjects under a t-test (p < 0.05). The increases in complexity observed in the H and TE results of epileptic subjects make them suggested EEG biomarkers associated with epilepsy and a reliable tool for the detection and identification of this …
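The TE biomarker can be sketched as follows, using the standard Tsallis form S_q = (1 − Σ p_i^q) / (q − 1) over an amplitude histogram. The entropic index q and the bin count are illustrative choices, not the paper's settings, and the "EEG epoch" is random noise rather than a real recording.

```python
# Hedged sketch: Tsallis entropy of a signal's amplitude distribution,
# one of the two biomarkers the study uses (q and bins are assumptions).
import numpy as np

def tsallis_entropy(signal, q=2.0, bins=32):
    hist, _ = np.histogram(signal, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    if q == 1.0:                          # Shannon limit as q -> 1
        return float(-(p * np.log(p)).sum())
    return float((1.0 - (p ** q).sum()) / (q - 1.0))

rng = np.random.default_rng(0)
noise = rng.standard_normal(4096)         # stand-in for one EEG epoch
te = tsallis_entropy(noise)
```

A constant signal concentrates all probability in one bin and yields zero entropy, while a broadly spread amplitude distribution drives S_2 toward its maximum of 1 − 1/bins, which is the direction of the complexity increase reported for the epileptic group.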
Regression analysis is the foundation stone of statistical knowledge, and it mostly depends on the ordinary least squares method. As is well known, this method requires several conditions in order to operate accurately, and its results can be unreliable when they fail; indeed, the absence of certain conditions can make completing the analysis impossible. Among those conditions is the multicollinearity problem, which we detect among the independent variables using the Farrar-Glauber test. In addition, the linearity requirement on the data was not satisfied, and the absence of this last condition led to resorting to the …
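The detection step can be sketched as follows, using the Farrar-Glauber chi-square statistic, chi2 = −(n − 1 − (2p + 5)/6) · ln det(R), where R is the correlation matrix of the p regressors; a large value signals multicollinearity. The data below are synthetic, with one regressor deliberately made nearly collinear with another.

```python
# Hedged sketch: Farrar-Glauber chi-square statistic on synthetic data
# (the paper's variables and sample are not reproduced here).
import numpy as np

def farrar_glauber_chi2(X):
    n, p = X.shape
    R = np.corrcoef(X, rowvar=False)
    _, logdet = np.linalg.slogdet(R)       # log det(R), det(R) in (0, 1]
    return -(n - 1 - (2 * p + 5) / 6.0) * logdet

rng = np.random.default_rng(0)
x1 = rng.standard_normal(200)
x2 = x1 + 0.05 * rng.standard_normal(200)  # nearly collinear with x1
x3 = rng.standard_normal(200)
X = np.column_stack([x1, x2, x3])
chi2 = farrar_glauber_chi2(X)              # large => multicollinearity
```

The statistic is compared against a chi-square critical value with p(p − 1)/2 degrees of freedom; near-collinear columns drive det(R) toward zero and the statistic far above that threshold.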
The recent emergence of sophisticated large language models (LLMs) such as GPT-4, Bard, and Bing has revolutionized the domain of scientific inquiry, particularly in the realm of large pre-trained vision-language models. This pivotal transformation is opening new frontiers in various fields, including image processing and digital media verification. At the heart of this evolution, our research focuses on the rapidly growing area of image authenticity verification, a field gaining immense relevance in the digital era. The study is specifically geared towards the emerging challenge of distinguishing authentic images from deepfakes, a task that has become critically important in a world increasingly reliant on digital …