Botnet detection is a challenging problem in numerous fields such as cybersecurity, law, finance, healthcare, and others. A botnet is a group of compromised Internet-connected devices controlled by cybercriminals to launch coordinated attacks and carry out various malicious activities. Because botnets evolve continuously against the countermeasures deployed by both network- and host-based detection techniques, conventional techniques fail to provide sufficient protection against botnet threats. Machine learning approaches have therefore been developed to detect and classify botnets for cybersecurity. This article presents a novel dragonfly algorithm (DFA) with multi-class support vector machine (SVM) enabled botnet detection for information security. For effective recognition of botnets, the proposed model first performs data pre-processing. The SVM model is then used to identify and classify the botnets present in the network. The DFA is employed to optimally tune the SVM parameters, consequently yielding enhanced outcomes. The presented model is thus able to accomplish improved botnet detection performance. A wide-ranging experimental analysis was performed and the results were inspected from several aspects. The experimental results indicate the efficiency of the proposed model over existing methods.
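The DFA-SVM coupling described in this abstract can be illustrated with a minimal sketch. This is an assumption-laden simplification, not the paper's implementation: a small swarm of candidate (C, gamma) positions is scored by cross-validated accuracy and nudged toward the best position and away from the worst, loosely mimicking the dragonfly algorithm's food/enemy terms. The synthetic data, swarm size, and step coefficients are all illustrative choices.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Synthetic multi-class stand-in for pre-processed botnet traffic features.
X, y = make_classification(n_samples=300, n_features=10, n_informative=6,
                           n_classes=3, n_clusters_per_class=1, random_state=0)

def fitness(pos):
    """Cross-validated accuracy of an SVM whose (C, gamma) are 10**pos."""
    C, gamma = 10.0 ** pos
    return cross_val_score(SVC(C=C, gamma=gamma), X, y, cv=3).mean()

# Swarm of candidate (log10 C, log10 gamma) positions.
swarm = rng.uniform(-3, 3, size=(8, 2))
best_pos, best_fit = None, -1.0
for _ in range(10):
    fits = np.array([fitness(p) for p in swarm])
    i_best, i_worst = fits.argmax(), fits.argmin()
    if fits[i_best] > best_fit:
        best_fit, best_pos = fits[i_best], swarm[i_best].copy()
    # Dragonfly-style step: attraction to the best ("food") position,
    # repulsion from the worst ("enemy"), plus a small random walk.
    swarm += (0.5 * (best_pos - swarm)
              - 0.1 * (swarm[i_worst] - swarm)
              + 0.05 * rng.normal(size=swarm.shape))
    swarm = np.clip(swarm, -3, 3)

print(f"best CV accuracy {best_fit:.3f} at "
      f"C=10^{best_pos[0]:.2f}, gamma=10^{best_pos[1]:.2f}")
```

In practice the fitness function would score held-out botnet/benign traffic rather than synthetic data, and the full DFA adds separation, alignment, and cohesion terms omitted here for brevity.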
In this paper, a new brain tumour detection method is presented in which normal MRI slices are separated from abnormal ones. Three main phases are deployed: extraction of the cerebral tissue, detection of abnormal blocks together with a fine-tuning mechanism, and finally detection of abnormal slices according to the detected abnormal blocks. The progress made by the suggested method is assessed and verified through experimental tests. In terms of qualitative assessment, the performance of the proposed method is found to be satisfactory and may contribute to the development of reliable MRI brain tumour diagnosis and treatment.
The aim of the present study was to distinguish between healthy children and those with epilepsy using electroencephalography (EEG). Two biomarkers, the Hurst exponent (H) and Tsallis entropy (TE), were used to investigate the background EEG activity of 10 healthy children and 10 children with epilepsy. EEG artifacts were removed using a Savitzky-Golay (SG) filter. As hypothesized, a t-test showed significant changes in the irregularity and complexity of epileptic EEG compared with healthy control subjects (p < 0.05). The increased complexity observed in the H and TE results of epileptic subjects makes them suggested EEG biomarkers associated with epilepsy and a reliable tool for the detection and identification of this di
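The analysis pipeline sketched in this abstract (SG filtering, then H and TE extraction) can be illustrated as follows. This is a hedged sketch under stated assumptions: the signal is synthetic noise standing in for an EEG epoch, H is estimated with a basic rescaled-range (R/S) method, and TE uses a histogram probability estimate with an illustrative entropic index q = 2; the study's actual estimators and parameters are not specified here.

```python
import numpy as np
from scipy.signal import savgol_filter

def hurst_rs(x):
    """Rescaled-range (R/S) estimate of the Hurst exponent."""
    ns, rs = [8, 16, 32, 64, 128], []
    for n in ns:
        chunks = x[: len(x) // n * n].reshape(-1, n)
        dev = chunks - chunks.mean(axis=1, keepdims=True)
        z = dev.cumsum(axis=1)                       # cumulative deviations
        R = z.max(axis=1) - z.min(axis=1)            # range per chunk
        S = chunks.std(axis=1)                       # std per chunk
        rs.append((R / S).mean())
    slope, _ = np.polyfit(np.log(ns), np.log(rs), 1)  # H = log-log slope
    return slope

def tsallis_entropy(x, q=2.0, bins=32):
    """Tsallis entropy (1 - sum p^q)/(q - 1) from a histogram estimate."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

rng = np.random.default_rng(1)
eeg = rng.normal(size=1024)           # white-noise stand-in for an EEG epoch
eeg = savgol_filter(eeg, 31, 3)       # SG artifact smoothing, as in the study
H, TE = hurst_rs(eeg), tsallis_entropy(eeg)
print(f"H = {H:.3f}, TE = {TE:.3f}")
```

On real recordings the group comparison would then apply the t-test to the per-subject H and TE values, as described above.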
Spraying pesticides is one of the most common procedures used to control pests. However, excessive use of these chemicals adversely affects the surrounding environment, including the soil, plants, animals, and the operators themselves. Therefore, researchers have been encouraged to...
Recently, the spread of fake news and misinformation across most fields has resonated widely in societies. Combating this phenomenon and detecting misleading information manually is tedious, time-consuming, and impractical. It is therefore necessary to rely on artificial intelligence to solve this problem. As such, this study aims to use deep learning techniques to detect Arabic fake news based on an Arabic dataset called AraNews. This dataset contains news articles covering multiple fields such as politics, economy, culture, sports, and others. A hybrid deep neural network has been proposed to improve accuracy. This network focuses on the properties of both the Text-Convolution Neural
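The text-convolution component mentioned in this abstract can be sketched in miniature. This is a hedged, framework-free illustration of the standard Text-CNN feature extractor (convolution filters of several widths slid over token embeddings, followed by global max-pooling), not the paper's architecture; the vocabulary size, filter counts, and random weights are placeholders.

```python
import numpy as np

rng = np.random.default_rng(3)
vocab, embed_dim, seq_len = 500, 16, 40
tokens = rng.integers(0, vocab, size=seq_len)   # toy tokenized news headline
E = rng.normal(size=(vocab, embed_dim))         # untrained embedding table
x = E[tokens]                                   # (seq_len, embed_dim)

def conv_maxpool(x, filters, k):
    """Slide width-k filters over the token sequence, ReLU, then
    global-max-pool each feature map into one scalar."""
    windows = np.stack([x[i:i + k].ravel() for i in range(len(x) - k + 1)])
    fmap = np.maximum(windows @ filters, 0.0)   # ReLU feature maps
    return fmap.max(axis=0)                     # global max-pooling

W3 = rng.normal(size=(3 * embed_dim, 8))        # 8 filters of width 3
W5 = rng.normal(size=(5 * embed_dim, 8))        # 8 filters of width 5
features = np.concatenate([conv_maxpool(x, W3, 3), conv_maxpool(x, W5, 5)])
print(features.shape)   # fixed-size vector fed to a dense classifier head
```

In a hybrid network like the one proposed, such pooled features would typically be concatenated with the output of a recurrent branch before the final real/fake classification layer.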
The recent emergence of sophisticated Large Language Models (LLMs) such as GPT-4, Bard, and Bing has revolutionized the domain of scientific inquiry, particularly in the realm of large pre-trained vision-language models. This pivotal transformation is driving new frontiers in various fields, including image processing and digital media verification. At the heart of this evolution, our research focuses on the rapidly growing area of image authenticity verification, a field gaining immense relevance in the digital era. The study is specifically geared towards addressing the emerging challenge of distinguishing between authentic images and deep fakes, a task that has become critically important in a world increasingly reliant on digital med
CNC machines are widely used in production fields since they produce identical parts in minimum time, at higher speed, and with minimal error. A control system is designed, implemented, and tested to control the operation of a laboratory CNC milling machine with three axes, each moved by an attached stepper motor. The control system includes two parts, hardware and software. The hardware part uses a PC (acting as the controller) connected to the CNC machine through its parallel port via a designed interface circuit. The software part includes the algorithms needed to control the CNC. The sample to be machined is drawn using drawing software such as AUTOCAD or 3D MAX and is saved in a we
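The core of the stepper-control software described above is converting a drawn line segment into per-axis step pulses. The abstract does not give the algorithm, so the following is a hedged sketch using Bresenham-style interpolation, a common choice for two-axis stepper coordination; axis names and the pulse encoding are assumptions.

```python
def line_steps(dx, dy):
    """Bresenham-style interpolation: emit per-tick (x, y) step pulses so two
    stepper-driven axes trace an approximate straight line to (dx, dy).
    Each pulse component is -1, 0, or +1 (one motor step in that direction)."""
    sx = 1 if dx >= 0 else -1
    sy = 1 if dy >= 0 else -1
    dx, dy = abs(dx), abs(dy)
    x = y = 0
    err = dx - dy
    pulses = []
    while (x, y) != (dx, dy):
        e2 = 2 * err
        step = [0, 0]
        if e2 > -dy and x < dx:      # error favours an x-axis step
            err -= dy; x += 1; step[0] = sx
        if e2 < dx and y < dy:       # error favours a y-axis step
            err += dx; y += 1; step[1] = sy
        pulses.append(tuple(step))
    return pulses

print(line_steps(5, 2))   # pulse train for a shallow diagonal move
```

In the described system each emitted pulse would be written to the PC's parallel port, where the interface circuit translates it into step/direction signals for the axis motors.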
The Generalized Additive Model (GAM) has been considered a multivariate smoother that appeared recently in nonparametric regression analysis. Thus, this research is devoted to studying the mixed situation, i.e. phenomena whose behaviour changes from linear (with known functional form), represented in the parametric part of the model, to nonlinear (with unknown functional form, here a smoothing spline), represented in its nonparametric part. Furthermore, we propose a robust semiparametric GAM estimator, which is compared with two other existing techniques.
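The mixed parametric/nonparametric situation described above corresponds to a partially linear model y = beta*x + f(t) + error. As a hedged sketch (not the paper's robust estimator), one simple backfitting scheme alternates a smoothing-spline fit of f with an ordinary least-squares update of beta; the data, smoothing level, and iteration count below are illustrative.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(4)
n = 300
x = rng.normal(size=n)                    # parametric (linear) covariate
t = np.sort(rng.uniform(0, 1, n))         # nonparametric covariate
y = 2.0 * x + np.sin(2 * np.pi * t) + rng.normal(scale=0.3, size=n)

# Backfitting for the partially linear model y = beta*x + f(t) + e.
beta = np.sum(x * y) / np.sum(x * x)      # OLS start (f(t) treated as noise)
for _ in range(5):
    # Smoothing-spline estimate of f from the partial residual y - beta*x;
    # s targets the expected residual sum of squares at noise sd 0.3.
    f = UnivariateSpline(t, y - beta * x, s=n * 0.3 ** 2)
    # OLS update of beta from the other partial residual y - f(t).
    beta = np.sum(x * (y - f(t))) / np.sum(x * x)

print(f"estimated beta = {beta:.3f} (true slope is 2.0)")
```

A robust variant would replace the OLS update with an M-type estimator so that outliers influence neither beta nor the spline; that refinement is omitted here.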
The Dagum Regression Model, introduced to address limitations in traditional econometric models, provides enhanced flexibility for analyzing data characterized by heavy tails and asymmetry, which is common in income and wealth distributions. This paper develops and applies the Dagum model, demonstrating its advantages over other distributions such as the Log-Normal and Gamma distributions. The model's parameters are estimated using Maximum Likelihood Estimation (MLE) and the Method of Moments (MoM). A simulation study evaluates both methods' performance across various sample sizes, showing that MoM tends to offer more robust and precise estimates, particularly in small samples. These findings provide valuable insights into the ana
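The MLE fitting described above can be sketched using SciPy, relying on the fact that the Dagum distribution coincides with the Burr Type III family (SciPy's `burr`) once a scale parameter is included. This is a hedged illustration on simulated data, not the paper's estimation study, and the Method of Moments comparison is omitted; the parameter values are arbitrary.

```python
import numpy as np
from scipy import stats

# Dagum(a, p, scale b) == scipy Burr Type III with c=a, d=p and scale=b.
a_true, p_true, b_true = 3.0, 0.7, 2.0
sample = stats.burr.rvs(c=a_true, d=p_true, scale=b_true,
                        size=2000, random_state=2)

# MLE via SciPy's generic fit; location fixed at 0 for a pure scale family.
a_hat, p_hat, loc, b_hat = stats.burr.fit(sample, floc=0)
print(f"MLE: a={a_hat:.2f}, p={p_hat:.2f}, b={b_hat:.2f}")
```

A Method-of-Moments fit would instead equate the sample moments to the Dagum theoretical moments (Beta-function expressions in a, b, p) and solve the resulting system, which the simulation study above compares against MLE across sample sizes.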