Software-defined networking (SDN) presents novel security and privacy risks, including distributed denial-of-service (DDoS) attacks. In response to these threats, machine learning (ML) and deep learning (DL) have emerged as effective approaches for quickly identifying and mitigating anomalies. To this end, this research employs several classification methods, including support vector machines (SVMs), K-nearest neighbors (KNN), decision trees (DTs), multilayer perceptrons (MLPs), and convolutional neural networks (CNNs), and compares their performance. The CNN exhibits the highest training accuracy at 97.808% yet the lowest prediction accuracy at 90.08%, whereas the SVM demonstrates the highest prediction accuracy of 95.5%. An SVM-based DDoS detection model therefore shows superior performance. This comparative analysis offers valuable insight into the development of efficient, accurate, and low-complexity techniques for detecting DDoS attacks in SDN environments.
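The gap the abstract reports between training and prediction accuracy can be illustrated with a minimal, self-contained sketch: a pure-Python nearest-neighbor classifier on synthetic two-class data. Everything here (the blobs, the 1-NN/5-NN comparison) is illustrative and is not the paper's SDN dataset or its CNN/SVM models.

```python
# Illustrative sketch: high training accuracy need not mean the best
# prediction accuracy. 1-NN memorises the training set; a smoother k=5
# vote can generalise better. Synthetic data, not the paper's dataset.
import random

random.seed(0)

def make_point(label):
    # Two Gaussian blobs: label 0 near (0, 0), label 1 near (3, 3).
    c = 0.0 if label == 0 else 3.0
    return ([random.gauss(c, 1.0), random.gauss(c, 1.0)], label)

train = [make_point(i % 2) for i in range(100)]
test  = [make_point(i % 2) for i in range(100)]

def knn_predict(x, data, k):
    # Sort training points by squared distance, vote among the k nearest.
    near = sorted(data, key=lambda p: (p[0][0]-x[0])**2 + (p[0][1]-x[1])**2)
    votes = [lbl for _, lbl in near[:k]]
    return max(set(votes), key=votes.count)

def accuracy(data, k):
    return sum(knn_predict(x, train, k) == y for x, y in data) / len(data)

# 1-NN scores 100% on its own training data (each point is its own
# nearest neighbour) -- the same train/test gap the abstract reports.
print("k=1 train:", accuracy(train, 1), "test:", accuracy(test, 1))
print("k=5 train:", accuracy(train, 5), "test:", accuracy(test, 5))
```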
Background: Atherosclerosis is well known to be related to age and to certain cardiovascular diseases. Aging is one cause of deterioration in arterial function, which can lead to loss of compliance and plaque accumulation; this effect is amplified by the presence of diseases such as hypertension and diabetes. Aim: To investigate the age-related reduction of blood supply to the brain in patients with diabetes and hypertension, and the role of the resistive index in the diagnosis of reduced blood flow. Method: Patients with both diabetes and hypertension were classified according to age to identify the progression of the disease and the factors influencing carotid artery blood flow, using ultrasound and standard Doppler techniques
We present the exponentiated expanded power function (EEPF) distribution with four parameters. This distribution is constructed by the exponentiated expansion method introduced by Gupta to expand the exponential distribution, which adds a new shape parameter to the cumulative distribution function of a baseline distribution and thereby yields a new distribution; the method is characterized by producing distributions that belong to the exponential family. We also derive the survival and failure (hazard) rate functions of this distribution, along with some of its mathematical properties, and then apply the maximum likelihood (ML) method and the developed least squares (LSD) method
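The exponentiation construction the text describes can be sketched in a few lines: given a baseline CDF F, the exponentiated family has CDF G(x) = F(x)^a, which adds the shape parameter a, and the survival and hazard functions follow directly. The power-function baseline below (F(x) = (x/b)^p on [0, b]) and all parameter values are illustrative assumptions, not the paper's four-parameter EEPF.

```python
# Sketch of the exponentiated-G construction: G(x) = F(x)**a adds a shape
# parameter a to a baseline CDF F; survival S = 1 - G and hazard h = g/S
# follow. Power-function baseline and constants are illustrative only.
def power_cdf(x, b=1.0, p=2.0):
    return (x / b) ** p

def power_pdf(x, b=1.0, p=2.0):
    return p * x ** (p - 1) / b ** p

def exp_cdf(x, a=3.0):
    return power_cdf(x) ** a                     # exponentiated CDF

def exp_pdf(x, a=3.0):
    # Density by the chain rule: g(x) = a F(x)**(a-1) f(x).
    return a * power_cdf(x) ** (a - 1) * power_pdf(x)

def survival(x, a=3.0):
    return 1 - exp_cdf(x, a)

def hazard(x, a=3.0):
    return exp_pdf(x, a) / survival(x, a)

print(exp_cdf(1.0))            # the CDF reaches 1 at the upper endpoint
print(survival(0.5))           # = 1 - (0.25)**3 = 0.984375
```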
In this research, a comparison is made between the robust (M) estimators for the cubic smoothing splines technique, used to avoid the problem of non-normality or contamination in the errors, and the traditional estimation method for cubic smoothing splines, using two comparison criteria (MADE, WASE) across different sample sizes and contamination levels. The aim is to estimate the time-varying coefficient functions for balanced longitudinal data, which consist of observations obtained from (n) independent subjects, each measured repeatedly at a set of specific time points (m), since the repeated measurements within subjects are almost correlated
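The robustness idea behind the (M) estimators compared above can be shown with a minimal sketch: M-estimation of a location parameter with Huber weights via iteratively reweighted least squares (IRLS). The smoothing-spline machinery itself is omitted, and the data, the tuning constant c = 1.345, and the iteration count are illustrative assumptions.

```python
# Minimal IRLS sketch of Huber M-estimation, illustrating why robust (M)
# estimators resist contaminated errors. The spline part is omitted;
# all constants here are illustrative, not the paper's settings.
def huber_weight(r, c=1.345):
    # Full weight for small residuals, downweighted for outliers.
    a = abs(r)
    return 1.0 if a <= c else c / a

def m_estimate(xs, iters=50):
    mu = sum(xs) / len(xs)                 # start from the ordinary mean
    for _ in range(iters):
        w = [huber_weight(x - mu) for x in xs]
        mu = sum(wi * xi for wi, xi in zip(w, xs)) / sum(w)
    return mu

data = [10.1, 9.9, 10.0, 10.2, 9.8, 50.0]   # one gross outlier
print("mean   :", sum(data) / len(data))     # dragged toward the outlier
print("Huber M:", m_estimate(data))          # stays near 10
```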
The objective of this research paper is two-fold. The first aim is a precise reading of the theoretical underpinnings of each of two strategic approaches: the "market approach" of M. Porter and the alternative resource-based view (RBV), arguing that the two approaches are complementary. Second, we discuss the possibility of combining the two competitive strategies of cost leadership and differentiation. Finally, we propose a consensual approach that we call "dual domination".
The Weibull distribution is a member of the Generalized Extreme Value (GEV) family of distributions, and it plays a crucial role in modeling extreme events in various fields, such as hydrology, finance, and the environmental sciences. Bayesian methods play a strong, decisive role in estimating the parameters of the GEV distribution due to their ability to incorporate prior knowledge and to handle small sample sizes effectively. In this research, we compare several shrinkage Bayesian estimation methods based on the squared error and the linear exponential (LINEX) loss functions; they were adopted and compared by the Monte Carlo simulation method. The performance of these methods is assessed based on their accuracy and computational efficiency in estimati
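The two loss functions compared above lead to different Bayes estimators: under squared-error loss the Bayes estimate is the posterior mean, while under the LINEX loss L(d, θ) = exp(a(d−θ)) − a(d−θ) − 1 it is d* = −(1/a) ln E[exp(−aθ)]. A hedged Monte Carlo sketch for a Normal(μ, σ²) posterior, where the LINEX estimate is known in closed form to be μ − aσ²/2, follows; the posterior and all constants are illustrative, not the paper's GEV setting.

```python
# Bayes estimators under squared-error (posterior mean) and LINEX loss
# (d* = -(1/a) ln E[exp(-a*theta)]), checked by Monte Carlo against the
# closed form mu - a*sigma^2/2 for a Normal posterior. Illustrative only.
import math, random

random.seed(1)
mu, sigma, a = 2.0, 0.5, 1.0
theta = [random.gauss(mu, sigma) for _ in range(200_000)]  # posterior draws

sel_est = sum(theta) / len(theta)                          # posterior mean
linex_est = -math.log(sum(math.exp(-a * t) for t in theta) / len(theta)) / a

print("SEL estimate  :", round(sel_est, 3))    # close to mu = 2.0
print("LINEX estimate:", round(linex_est, 3))  # close to mu - a*sigma^2/2 = 1.875
```

Note how the LINEX estimate is shrunk below the posterior mean for a > 0, penalising over-estimation more heavily, which is exactly why the choice of loss function matters in the comparison above.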
Introduction: Dental fear is defined as the patient's specific reaction towards stress related to dental treatment in which the stimulus is unknown
Radiation therapy plays an important role in improving breast cancer outcomes. To obtain an appropriate estimate of the number of radiation doses given to the patient after tumor removal, some methods of nonparametric regression were compared. The kernel method was used with the Nadaraya-Watson estimator to find the estimated regression function for smoothing the data, based on the smoothing parameter h selected according to the normal scale method (NSM), the least squares cross-validation method (LSCV), and the golden rate method (GRM). These methods were compared by simulation for samples of three sizes; the NSM proved to be the best according to the average Mean Squared Error criterion, and the LSCV proved to be the best according to the average Mean Absolute
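The Nadaraya-Watson estimator mentioned above is short enough to sketch directly: m̂(x) = Σ K_h(x − x_i) y_i / Σ K_h(x − x_i) with a kernel K and bandwidth h. The Gaussian kernel, the noise-free data, and the fixed bandwidth below are illustrative assumptions; the paper's actual subject is choosing h by NSM, LSCV, or GRM, which is not reproduced here.

```python
# Minimal Nadaraya-Watson kernel regression: a locally weighted average
# of the responses, with weights from a Gaussian kernel. The data and
# the fixed bandwidth h are illustrative, not the paper's simulation.
import math

def gauss_kernel(u):
    return math.exp(-0.5 * u * u) / math.sqrt(2 * math.pi)

def nadaraya_watson(x, xs, ys, h):
    w = [gauss_kernel((x - xi) / h) for xi in xs]
    return sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)

# Noise-free y = x^2 on a grid; with a small bandwidth the estimate
# tracks the true curve closely (a large h would oversmooth it).
xs = [i / 10 for i in range(21)]           # 0.0 .. 2.0
ys = [x * x for x in xs]
print(nadaraya_watson(1.0, xs, ys, h=0.1))  # close to the true value 1.0
```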
Drones have become a focus of researchers' attention because they enter into many details of daily life. The Tri-copter was chosen because it combines the stability of the quadcopter with quick manoeuvrability. In this paper, the nonlinear Tri-copter model is fully derived and three controllers are applied: Proportional-Integral-Derivative (PID), Fractional-Order PID (FOPID), and Nonlinear PID (NLPID). The controllers' parameters were tuned using the Grey Wolf Optimization (GWO) algorithm, and the results obtained were compared. The improvement rate for the Tri-copter model of the nonlinear controller (NLPID) compared with
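The classical PID law that the FOPID and NLPID variants extend can be sketched in a few lines: u = Kp·e + Ki·∫e dt + Kd·de/dt, here driving a toy first-order plant to a setpoint. The gains, time step, and plant model are illustrative assumptions, not the Tri-copter dynamics or the GWO-tuned parameters.

```python
# Discrete PID control of a toy first-order plant dy/dt = -y + u.
# Gains and plant are illustrative; the paper's Tri-copter model,
# FOPID/NLPID variants, and GWO tuning are not reproduced here.
def simulate_pid(kp, ki, kd, setpoint=1.0, dt=0.01, steps=2000):
    y, integral, prev_err = 0.0, 0.0, setpoint
    for _ in range(steps):
        err = setpoint - y
        integral += err * dt                 # accumulate the I term
        deriv = (err - prev_err) / dt        # finite-difference D term
        u = kp * err + ki * integral + kd * deriv
        prev_err = err
        y += dt * (-y + u)                   # Euler step of the plant
    return y

# Integral action removes the steady-state error: the output settles
# at the setpoint after the 20 s simulated here.
print(simulate_pid(kp=4.0, ki=2.0, kd=0.1))
```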
In this research, the semiparametric Bayesian method is compared with the classical method for estimating the reliability function of three systems: the k-out-of-n system, the series system, and the parallel system. Each system consists of three components: the first is parametric, with failure times following an exponential distribution, whereas the second and third are nonparametric, with reliability estimates depending on the kernel method, using two approaches to estimate the bandwidth parameter h, and on the Kaplan-Meier method. To indicate the better method for estimating the system reliability function, it has be
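Once the component reliabilities at a given time have been estimated, they combine into the three system structures named above in a standard way, which the following sketch shows for independent components. The component values are illustrative; estimating them (exponential MLE, kernel, Kaplan-Meier) is the paper's actual subject and is not reproduced.

```python
# How independent component reliabilities combine for the three system
# structures in the abstract. The numbers are illustrative placeholders.
from itertools import combinations

def series(rs):
    # All components must work: product of reliabilities.
    p = 1.0
    for r in rs:
        p *= r
    return p

def parallel(rs):
    # At least one works: 1 minus the product of failure probabilities.
    q = 1.0
    for r in rs:
        q *= (1 - r)
    return 1 - q

def k_out_of_n(rs, k):
    # P(at least k of n independent components work), by enumeration.
    n, total = len(rs), 0.0
    for m in range(k, n + 1):
        for alive in combinations(range(n), m):
            p = 1.0
            for i in range(n):
                p *= rs[i] if i in alive else (1 - rs[i])
            total += p
    return total

rs = [0.9, 0.8, 0.7]                     # illustrative component values
print("series  :", series(rs))           # ≈ 0.504
print("parallel:", parallel(rs))         # ≈ 0.994
print("2-of-3  :", k_out_of_n(rs, 2))    # ≈ 0.902
```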
The use of parametric models and their associated estimation methods requires that many initial conditions be met for those models to represent the society under study appropriately, prompting researchers to look for more flexible models, which are represented by nonparametric models.
In this study, the most important and most widely used estimators of the nonlinear regression function were investigated, namely the Nadaraya-Watson and local polynomial regression estimators, which are among the types of non-linear
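The two estimators named above are closely related: local polynomial regression fits a kernel-weighted least-squares polynomial at each point, and Nadaraya-Watson is its degree-0 special case. A minimal local linear (degree-1) sketch follows; the kernel, data, and bandwidth are illustrative assumptions, not the study's setup.

```python
# Local linear regression: at each x, fit a weighted least-squares line
# y ~ b0 + b1*(x_i - x) with kernel weights; the intercept b0 is the
# estimate. Nadaraya-Watson is the degree-0 case. Illustrative data only.
import math

def kern(u):
    return math.exp(-0.5 * u * u)

def local_linear(x, xs, ys, h):
    w  = [kern((xi - x) / h) for xi in xs]
    sw = sum(w)
    s1 = sum(wi * (xi - x) for wi, xi in zip(w, xs))
    s2 = sum(wi * (xi - x) ** 2 for wi, xi in zip(w, xs))
    t0 = sum(wi * yi for wi, yi in zip(w, ys))
    t1 = sum(wi * (xi - x) * yi for wi, xi, yi in zip(w, xs, ys))
    det = sw * s2 - s1 * s1
    return (s2 * t0 - s1 * t1) / det       # solve the 2x2 normal equations

# On exactly linear data, local linear regression recovers the line
# exactly (here y = 2x + 1, so the estimate at x = 1 is 3).
xs = [i / 10 for i in range(21)]
ys = [2 * x + 1 for x in xs]
print(local_linear(1.0, xs, ys, h=0.3))
```

This exactness on linear trends is a known advantage of local linear over Nadaraya-Watson, which is biased near boundaries and on sloped regions.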