In this paper, the botnet detection problem is defined as a feature selection problem, and a genetic algorithm (GA) is used to search for the most significant combination of features within the entire feature space. Furthermore, a Decision Tree (DT) classifier is used as the objective function that guides the proposed GA toward the combination of features that correctly classifies activities as normal traffic or botnet attacks. Two datasets, UNSW-NB15 and the Canadian Institute for Cybersecurity Intrusion Detection System 2017 (CICIDS2017), are used for evaluation. The results reveal that the proposed DT-aware GA can effectively find the relevant features within the whole feature set, and it thus achieves efficient botnet detection in terms of F-score, precision, detection rate, and number of relevant features when compared with DT alone.
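As a rough illustration of the wrapper idea described above (not the paper's exact operators or parameter settings), the sketch below evolves binary feature masks and scores each mask by the F-score of a Decision Tree on a hold-out split; the arrays X and y, the GA hyper-parameters, and the assumption of binary 0/1 labels are placeholders.

```python
# Minimal sketch of a DT-aware GA feature-selection wrapper: a binary chromosome
# marks selected features, and a Decision Tree's F-score on a validation split
# serves as the fitness. Assumes binary 0/1 class labels.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

rng = np.random.default_rng(0)

def fitness(mask, X_tr, X_val, y_tr, y_val):
    if not mask.any():                              # empty feature subsets are invalid
        return 0.0
    clf = DecisionTreeClassifier(random_state=0)
    clf.fit(X_tr[:, mask], y_tr)
    return f1_score(y_val, clf.predict(X_val[:, mask]))

def ga_select(X, y, pop_size=20, generations=30, p_mut=0.05):
    X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)
    n = X.shape[1]
    pop = rng.random((pop_size, n)) < 0.5           # random binary chromosomes
    for _ in range(generations):
        scores = np.array([fitness(ind, X_tr, X_val, y_tr, y_val) for ind in pop])
        parents = pop[np.argsort(scores)[::-1][: pop_size // 2]]   # truncation selection
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, n)                # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            child ^= rng.random(n) < p_mut          # bit-flip mutation
            children.append(child)
        pop = np.vstack([parents] + children)
    scores = np.array([fitness(ind, X_tr, X_val, y_tr, y_val) for ind in pop])
    return pop[scores.argmax()]                     # best feature mask found
```

In this form the selected subset is returned as a boolean mask, so a final DT can be retrained on `X[:, mask]` and compared against a DT trained on all features.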
An automatic text summarization system mimics how humans summarize by picking the most significant sentences in a source text. However, the complexities of the Arabic language make it challenging to obtain information quickly and effectively. The main disadvantage of traditional approaches is that they are strictly constrained (especially for Arabic) by the accuracy of sentence feature functions, weighting schemes, and similarity calculations. Meta-heuristic search approaches, on the other hand, have a feature tha…
Background: Klebsiella pneumoniae is considered part of the normal flora of the skin and intestine. It can damage the human lungs, and the danger of this bacterium is related to exposure to hospital surroundings. Materials and methods: Klebsiella pneumoniae was identified by morphological and biochemical tests and then confirmed with the VITEK 2 system. Antibiotic resistance was determined by the Kirby-Bauer method, genotyping of IMP-1 in the isolates was carried out by PCR, and biofilm formation was assessed by the microtiter plate method. Results: The present study included 50 clinical specimens (blood 40%, urine 30%, sputum 20%, wound infection 10%); 10 isolates were identified as K. pneumoniae…
In this research, the focus was placed on estimating the parameters of the hypoexponential distribution using the maximum likelihood method and a genetic algorithm. More than one criterion, including the mean squared error (MSE), was adopted to compare the estimators through simulation.
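Only the likelihood side of that comparison is sketched below, for the two-phase case (the GA-based estimator and the study's full criteria are not reproduced): samples are simulated as the sum of two exponentials, the negative log-likelihood of the hypoexponential density is minimized numerically, and the MSE against the simulated rates is reported. The rates and sample size are illustrative assumptions.

```python
# Hedged sketch: numerical MLE for a two-phase hypoexponential distribution
# (sum of two exponentials with distinct rates), evaluated by MSE under simulation.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
true_l1, true_l2 = 0.5, 2.0
data = rng.exponential(1 / true_l1, 5000) + rng.exponential(1 / true_l2, 5000)

def neg_log_lik(params, t):
    l1, l2 = params
    if l1 <= 0 or l2 <= 0 or np.isclose(l1, l2):
        return np.inf                               # rates must be positive and distinct
    # f(t) = l1*l2/(l2-l1) * (exp(-l1*t) - exp(-l2*t))
    pdf = (l1 * l2 / (l2 - l1)) * (np.exp(-l1 * t) - np.exp(-l2 * t))
    return -np.sum(np.log(np.maximum(pdf, 1e-300)))

res = minimize(neg_log_lik, x0=[0.3, 3.0], args=(data,), method="Nelder-Mead")
l1_hat, l2_hat = sorted(res.x)
mse = ((l1_hat - true_l1) ** 2 + (l2_hat - true_l2) ** 2) / 2
print(f"MLE rates: {l1_hat:.3f}, {l2_hat:.3f}  MSE: {mse:.4f}")
```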
Until recently, researchers have utilized and applied various techniques for intrusion detection systems (IDS), including DNA encoding and clustering, which are widely used for this purpose. The two other major detection techniques are anomaly detection and misuse detection: anomaly detection is based on user behavior, while misuse detection is based on the signatures of known attacks. However, both techniques have drawbacks, such as a high false alarm rate. Therefore, a hybrid IDS takes advantage of combining the strengths of both techniques to overcome their limitations. In this paper, a hybrid IDS is proposed based on DNA encoding and clustering. The proposed DNA encoding is based on the UNSW-NB15…
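The paper's exact encoding key is not given in this excerpt; the sketch below only illustrates the general DNA-encoding idea under assumed choices: each numeric feature is discretized into equal-width bins and each bin index is written as a fixed-length nucleotide string, so a whole record becomes one DNA-like sequence that clustering or signature matching can operate on.

```python
# Hedged sketch of DNA encoding for network records (assumed 2-bit-per-nucleotide
# base-4 key and equal-width discretization; not the paper's actual scheme).
import numpy as np

NUCLEOTIDES = "ACGT"

def encode_value(bin_index, width=4):
    """Write a bin index in base 4 using A/C/G/T, padded to a fixed width."""
    symbols = []
    for _ in range(width):
        symbols.append(NUCLEOTIDES[bin_index % 4])
        bin_index //= 4
    return "".join(reversed(symbols))

def encode_record(record, mins, maxs, bins=256):
    """Turn one numeric feature vector into a single DNA-like string."""
    scaled = (record - mins) / np.maximum(maxs - mins, 1e-12)
    indices = np.clip((scaled * bins).astype(int), 0, bins - 1)
    return "".join(encode_value(int(i)) for i in indices)

X = np.array([[0.1, 250.0, 3.0], [0.9, 10.0, 7.0]])   # toy records, 3 features each
mins, maxs = X.min(axis=0), X.max(axis=0)
print([encode_record(row, mins, maxs) for row in X])
```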
Community detection is an important and interesting topic for better understanding and analyzing complex network structures. Detecting hidden partitions in complex networks is proven to be an NP-hard problem that may not be accurately resolved using traditional methods, so it is modeled in the literature as an optimization problem and solved with evolutionary computation methods. In recent years, many researchers have directed their efforts toward community structure detection by developing different algorithms and making use of single-objective optimization methods. In this study, we continue that line of research by improving the Particle Swarm Optimization (PSO) algorithm using a…
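A minimal single-objective baseline of the kind being improved might look like the sketch below (an assumed continuous relaxation, not the study's own operators): each particle holds one real value per node, positions are decoded into k community labels, and Newman's modularity is the fitness being maximized.

```python
# Hedged PSO sketch for community detection via modularity maximization.
import numpy as np
import networkx as nx
from networkx.algorithms.community import modularity as nx_modularity

rng = np.random.default_rng(2)

def decode(position, k):
    return (np.floor(np.abs(position)) % k).astype(int)    # node -> community label

def fitness(G, labels):
    nodes = list(G.nodes())
    groups = [{nodes[i] for i in range(len(nodes)) if labels[i] == c}
              for c in range(labels.max() + 1)]
    groups = [g for g in groups if g]                       # drop empty communities
    return nx_modularity(G, groups)

def pso_communities(G, k=4, particles=30, iters=100, w=0.7, c1=1.5, c2=1.5):
    n = G.number_of_nodes()
    pos = rng.uniform(0, k, (particles, n))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_fit = np.array([fitness(G, decode(p, k)) for p in pos])
    gbest = pbest[pbest_fit.argmax()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, particles, n))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        fit = np.array([fitness(G, decode(p, k)) for p in pos])
        improved = fit > pbest_fit
        pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
        gbest = pbest[pbest_fit.argmax()].copy()
    return decode(gbest, k), pbest_fit.max()

labels, Q = pso_communities(nx.karate_club_graph())
print("modularity:", round(Q, 3))
```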
The aim of this study is to develop a novel framework for managing risks in smart supply chains by enhancing business continuity and resilience against potential disruptions. This research addresses the growing uncertainty in supply chain environments, driven both by natural phenomena, such as pandemics and earthquakes, and by human-induced events, including wars, political upheavals, and societal transformations. Recognizing that traditional risk management approaches are insufficient in such dynamic contexts, the study proposes an adaptive framework that integrates proactive and remedial measures for effective risk mitigation. A fuzzy risk matrix is employed to assess and analyze uncertainties, facilitating the identification of disruptions…
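The framework's own scales and rule base are not given in this excerpt; as a minimal sketch of how a fuzzy risk matrix can grade a disruption scenario, the example below assumes triangular memberships on a 0-10 likelihood/impact scale and a simple max-min rule base.

```python
# Hedged fuzzy risk matrix sketch (assumed terms, scales, and rules).
def tri(x, a, b, c):
    """Triangular membership function peaking at b over [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Linguistic terms on a 0-10 scale (tiny offsets keep the end points inside the support).
TERMS = {"low": (-1e-9, 0, 5), "medium": (2, 5, 8), "high": (5, 10, 10 + 1e-9)}

# Rule base: (likelihood term, impact term) -> risk level.
RULES = {
    ("low", "low"): "acceptable",     ("low", "medium"): "acceptable",
    ("low", "high"): "moderate",      ("medium", "low"): "acceptable",
    ("medium", "medium"): "moderate", ("medium", "high"): "critical",
    ("high", "low"): "moderate",      ("high", "medium"): "critical",
    ("high", "high"): "critical",
}

def assess(likelihood, impact):
    """Fire every rule with min(membership) strength and keep the strongest level."""
    strength = {}
    for (lt, it), level in RULES.items():
        s = min(tri(likelihood, *TERMS[lt]), tri(impact, *TERMS[it]))
        strength[level] = max(strength.get(level, 0.0), s)
    return max(strength, key=strength.get), strength

print(assess(likelihood=7.5, impact=6.0))   # e.g. an assumed supply-disruption scenario
```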
Pathology reports are necessary for specialists to make an appropriate diagnosis of diseases in general and blood diseases in particular; specialists therefore examine blood cells and other blood details. Thus, to diagnose a disease, specialists must analyze the factors of the patient's blood together with the medical history. Generally, doctors have tended to use intelligent agents to help them with complete blood count (CBC) analysis. However, these agents need analytical tools to extract the CBC parameters employed in predicting the development of life-threatening bacteremia and to offer prognostic data. Therefore, this paper proposes an enhancement to the Rabin–Karp algorithm and then mixes it with a fuzzy ratio to make the algorithm suitable…
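The sketch below only illustrates the combination described above, not the paper's actual enhancement: a standard Rabin–Karp rolling-hash scan walks the report, and a fuzzy similarity ratio (difflib here, as an assumed stand-in for the paper's fuzzy ratio) accepts near matches such as misspelled parameter names.

```python
# Hedged sketch: Rabin-Karp scan plus a fuzzy ratio for near-match CBC parameter lookup.
from difflib import SequenceMatcher

BASE, MOD = 256, 1_000_000_007

def rabin_karp_fuzzy(text, pattern, threshold=0.8):
    """Return start indices of windows matching `pattern` exactly or by fuzzy ratio."""
    text, pattern = text.lower(), pattern.lower()
    m, n = len(pattern), len(text)
    if m == 0 or n < m:
        return []
    p_hash = w_hash = 0
    high = pow(BASE, m - 1, MOD)
    for i in range(m):                                   # initial hashes
        p_hash = (p_hash * BASE + ord(pattern[i])) % MOD
        w_hash = (w_hash * BASE + ord(text[i])) % MOD
    hits = []
    for i in range(n - m + 1):
        window = text[i:i + m]
        # Exact hash match or a near match by fuzzy ratio both count as hits.
        if w_hash == p_hash or SequenceMatcher(None, window, pattern).ratio() >= threshold:
            hits.append(i)
        if i < n - m:                                    # roll the hash one character forward
            w_hash = ((w_hash - ord(text[i]) * high) * BASE + ord(text[i + m])) % MOD
    return hits

report = "WBC 11.2  Hemglobin 9.8 g/dL  Platelets 450"   # note the misspelled parameter
print(rabin_karp_fuzzy(report, "hemoglobin"))
```

In this naive form the fuzzy ratio is evaluated on every window, so the rolling hash only short-circuits exact matches; a practical variant would pre-filter candidate windows (for example by shared leading characters or character histograms) before applying the ratio test.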