Botnet detection is a challenging problem in numerous fields, including cybersecurity, law, finance, and healthcare. A botnet is a group of compromised Internet-connected devices controlled by cybercriminals to launch coordinated attacks and carry out various malicious activities. Because botnets evolve continuously against the countermeasures deployed by both network- and host-based detection techniques, conventional techniques fail to provide sufficient protection against botnet threats. Machine learning approaches have therefore been developed to detect and classify botnets for cybersecurity. This article presents a novel dragonfly algorithm (DFA) with a multi-class support vector machine (SVM) for botnet detection in information security. For effective recognition of botnets, the proposed model begins with data pre-processing. The SVM model is then used to identify and classify the botnets present in the network. The DFA is used to optimally tune the SVM parameters, resulting in enhanced outcomes. The presented model is able to achieve improved botnet detection performance. A wide-ranging experimental analysis is performed and the results are inspected from several aspects. The experimental results indicate the efficiency of our model over existing methods.
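The abstract above does not give the DFA update equations, so the following is only a minimal sketch of the tuning loop's shape: a randomised search over the SVM parameters (C, gamma) stands in for the dragonfly algorithm, and the `fitness` function is a hypothetical stub standing in for cross-validated SVM accuracy. All names and bounds here are illustrative assumptions, not the authors' implementation.

```python
import random

def fitness(C, gamma):
    # Hypothetical surrogate for cross-validated SVM accuracy.
    # It peaks near C=10, gamma=0.1 (values chosen only for illustration);
    # a real system would train and score an RBF-kernel SVM here.
    return 1.0 / (1.0 + (C - 10.0) ** 2 / 100.0 + (gamma - 0.1) ** 2 * 100.0)

def tune_svm_params(iterations=200, seed=42):
    """Randomised search over (C, gamma) as a simplified stand-in for DFA."""
    rng = random.Random(seed)
    best = None
    for _ in range(iterations):
        C = rng.uniform(0.1, 100.0)      # SVM penalty parameter
        gamma = rng.uniform(0.001, 1.0)  # RBF kernel width
        score = fitness(C, gamma)
        if best is None or score > best[0]:
            best = (score, C, gamma)
    return best

score, C, gamma = tune_svm_params()
```

The swarm-based DFA would replace the independent random draws with position updates guided by the best solutions found so far, but the outer loop (propose parameters, evaluate fitness, keep the best) is the same.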
This research aims to examine the effectiveness of a teaching strategy based on the cognitive model of Daniel in developing achievement and motivation to learn school mathematics among third intermediate grade students studying "Systems of Linear Equations". The research was conducted in the first semester (1439/1440 AH) at Saeed Ibn Almosaieb Intermediate School in Arar, Saudi Arabia. A quasi-experimental design was used. In addition, a pre- and post-achievement test (20 questions) and a pre- and post-scale of motivation to learn school mathematics (25 items) were applied to two groups: a control group (31 students) and an experimental group (29 students). The resear
With the escalation of cybercriminal activities, the demand for forensic investigations into these crimes has grown significantly. However, the concept of systematic pre-preparation for potential forensic examinations during the software design phase, known as forensic readiness, has only recently gained attention. Against the backdrop of surging urban crime rates, this study aims to conduct a rigorous and precise analysis and forecast of crime rates in Los Angeles, employing advanced Artificial Intelligence (AI) technologies. This research amalgamates diverse datasets encompassing crime history, various socio-economic indicators, and geographical locations to attain a comprehensive understanding of how crimes manifest within the city. Lev
Data compression offers an attractive approach to reducing communication costs by using available bandwidth effectively, so it makes sense to pursue research on algorithms that use the available network bandwidth most effectively. It is also important to consider security, since the data being transmitted is vulnerable to attacks. The basic aim of this work is to develop a module that combines compression and encryption on the same set of data, performing the two operations simultaneously. This is achieved by embedding encryption into compression algorithms, since cryptographic ciphers and entropy coders bear a certain resemblance in the sense of secrecy. First, in the secure compression module, the given text is p
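The combined compress-and-encrypt idea can be sketched as follows. This is not the paper's module (the abstract is truncated before its details): it is a minimal compress-then-encrypt pipeline in which a toy XOR keystream cipher, derived here via iterated SHA-256 purely for illustration, stands in for a real cipher. Compression must come first, because encrypted data has no redundancy left for an entropy coder to exploit.

```python
import hashlib
import zlib

def keystream(key: bytes, n: int) -> bytes:
    """Derive n pseudo-random bytes from key via iterated SHA-256.
    Toy construction for illustration only; not a secure cipher."""
    out, block = b"", key
    while len(out) < n:
        block = hashlib.sha256(block).digest()
        out += block
    return out[:n]

def compress_encrypt(data: bytes, key: bytes) -> bytes:
    packed = zlib.compress(data)                     # compress first
    ks = keystream(key, len(packed))
    return bytes(a ^ b for a, b in zip(packed, ks))  # then encrypt

def decrypt_decompress(blob: bytes, key: bytes) -> bytes:
    ks = keystream(key, len(blob))
    return zlib.decompress(bytes(a ^ b for a, b in zip(blob, ks)))
```

A redundant message round-trips correctly and the ciphertext stays smaller than the plaintext, which is the bandwidth benefit the paragraph describes.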
In the current Windows version (Vista), as in all previous versions, it is possible to create a user account without setting a password. For a personal PC this might carry little risk, although it is not recommended, even by Microsoft itself. For business computers, however, it is necessary to restrict access to the machines, starting with defining a different password for every user account. For earlier versions of Windows, many resources can be found giving advice on how to construct passwords for user accounts. To some extent they contain remarks concerning the suitability of their solutions for Windows Vista, but all of these resources are not very precise about what kind of passwords the user must use. To assess the protection of pa
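The kind of password-construction advice discussed above usually reduces to a complexity policy. As a sketch only, the checker below enforces one common policy (minimum length plus four character classes); these rules are an illustrative assumption, not the exact Windows Vista account policy.

```python
import string

def password_issues(pw: str, min_len: int = 8) -> list:
    """Return a list of policy violations; an empty list means the
    password satisfies this illustrative complexity policy."""
    issues = []
    if len(pw) < min_len:
        issues.append("shorter than %d characters" % min_len)
    if not any(c.islower() for c in pw):
        issues.append("no lowercase letter")
    if not any(c.isupper() for c in pw):
        issues.append("no uppercase letter")
    if not any(c.isdigit() for c in pw):
        issues.append("no digit")
    if not any(c in string.punctuation for c in pw):
        issues.append("no punctuation symbol")
    return issues
```

For example, `password_issues("Password!")` reports only the missing digit, while a short all-lowercase string fails on several counts at once.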
Data concealment has emerged as an area of deep and wide research interest, which endeavours to hide data covertly and stealthily, avoiding detection by embedding the secret data into inconspicuous-looking cover media. The cover media may be images or videos used to conceal the messages while still retaining their visual quality. Over the past ten years, there have been numerous studies on various image steganography methods, emphasising payload and image quality. Nevertheless, a trade-off exists between these two indicators, and mediating a more favourable reconciliation between them is a daunting and problematic task. Additionally, the current
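The payload-versus-quality trade-off can be made concrete with the classic least-significant-bit (LSB) scheme, which the abstract's cited literature builds on; the sketch below (not any specific paper's method) treats the cover as a flat byte buffer of pixel values. Each cover byte changes by at most 1, preserving visual quality, but capacity is capped at 1 payload bit per cover byte.

```python
def embed(cover: bytes, payload: bytes) -> bytes:
    """Hide payload in the least significant bit of each cover byte."""
    bits = [(byte >> i) & 1 for byte in payload for i in range(8)]
    if len(bits) > len(cover):
        raise ValueError("payload too large for cover")
    stego = bytearray(cover)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | bit  # overwrite only the lowest bit
    return bytes(stego)

def extract(stego: bytes, n_bytes: int) -> bytes:
    """Recover n_bytes of payload from the stego buffer's LSBs."""
    bits = [b & 1 for b in stego[: n_bytes * 8]]
    return bytes(
        sum(bits[k * 8 + i] << i for i in range(8)) for k in range(n_bytes)
    )
```

Raising payload beyond one bit per byte (e.g. using the two lowest bits) doubles capacity but doubles the per-pixel distortion, which is exactly the trade-off the paragraph describes.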
Mammography is at present one of the available methods for the early detection of masses or abnormalities related to breast cancer. The most common abnormalities that may indicate breast cancer are masses and calcifications. The challenge lies in early and accurate detection to counter the development of breast cancer, which affects more and more women throughout the world. Breast cancer is diagnosed at advanced stages with the help of digital mammogram images. Calcifications appear in a mammogram as fine, granular clusters, which are often difficult to identify in a raw mammogram. The incidence of breast cancer in women has increased significantly in recent years.
This paper proposes a computer-aided diagnostic system for the extracti
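The abstract is truncated before the system's details, so as a hedged illustration only, one classic first step of such computer-aided detection pipelines is to flag unusually bright pixels (candidate calcification clusters) with a percentile threshold. The function and the tiny synthetic "image" below are purely illustrative assumptions, not the paper's method.

```python
def percentile_threshold(image, pct=95.0):
    """Return a binary mask marking pixels brighter than the pct-th
    percentile of the image's intensity values."""
    flat = sorted(v for row in image for v in row)
    idx = min(len(flat) - 1, int(len(flat) * pct / 100.0))
    cutoff = flat[idx]
    return [[1 if v > cutoff else 0 for v in row] for row in image]
```

On a mostly dark image with two bright pixels, only those two pixels survive the 95th-percentile cutoff; a real pipeline would follow this with clustering and shape analysis of the flagged regions.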
The meniscus has a crucial function in human anatomy, and Magnetic Resonance Imaging (M.R.I.) plays an essential role in meniscus assessment. It is difficult to identify cartilage lesions using typical image processing approaches because M.R.I. data is so diverse. An M.R.I. sequence comprises numerous images, and the region of attributes we are searching for may differ from one image in the series to the next. Feature extraction therefore becomes more complicated, and traditional image processing in particular becomes very complex. In traditional image processing, a human tells the computer what should be there, whereas a deep learning (D.L.) algorithm automatically extracts the features of what is already there. The surface changes become valuable when
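The "human tells the computer what should be there" contrast can be illustrated with hand-engineered feature extraction of exactly the kind D.L. replaces: a fixed Sobel kernel, chosen by a human to respond to vertical edges, slid over a tiny synthetic "image" (implemented as cross-correlation, as in most D.L. frameworks). A deep network would instead learn such kernels from the M.R.I. data itself; nothing here is from the paper's pipeline.

```python
# Human-chosen vertical-edge detector (Sobel kernel).
SOBEL_X = [[-1, 0, 1],
           [-2, 0, 2],
           [-1, 0, 1]]

def convolve2d(image, kernel):
    """Valid-mode (no padding) 2-D cross-correlation of image with kernel."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for r in range(len(image) - kh + 1):
        row = []
        for c in range(len(image[0]) - kw + 1):
            row.append(sum(
                image[r + i][c + j] * kernel[i][j]
                for i in range(kh) for j in range(kw)
            ))
        out.append(row)
    return out
```

The filter fires strongly where dark tissue meets bright tissue and stays silent in flat regions; the fragility of such fixed filters across a diverse M.R.I. series is the complexity the paragraph describes.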