A botnet is a network of compromised machines used for malicious activity, such as disrupting service traffic on a server or network, and it can cause great harm. In recent years, botnets have become one of the most persistent and constantly evolving threats. An intrusion detection system (IDS) is one type of solution used to detect network anomalies, and it plays an increasing role in computer security and information systems. An IDS monitors events in a computer to decide whether an intrusion has occurred, and its output informs strategic security decisions. The current paper …
In recent years, the world has witnessed a rapid growth in attacks on the internet, which has degraded network performance. The growth has been in both the quantity and the versatility of attacks. Coping with this requires new detection techniques, especially ones that use artificial intelligence, such as machine-learning-based intrusion detection and prevention systems. Many machine learning models are used for intrusion detection, each with its own pros and cons, and this is where this paper falls in: a performance analysis of different machine learning models for intrusion detection systems based on supervised machine learning algorithms. Using the Python Scikit-Learn library, KNN, Support Vector Machine …
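A minimal sketch of the comparison workflow this abstract describes, using the two Scikit-Learn classifiers it names before the text cuts off; the paper's actual IDS dataset is not specified here, so synthetic data stands in for labeled traffic features.

    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.svm import SVC
    from sklearn.metrics import accuracy_score

    # Synthetic placeholder for labeled traffic records (0 = normal, 1 = intrusion);
    # the paper's actual dataset is not named in the truncated abstract.
    X, y = make_classification(n_samples=2000, n_features=20, random_state=42)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                        random_state=42)

    models = {"KNN": KNeighborsClassifier(n_neighbors=5), "SVM": SVC(kernel="rbf")}
    for name, model in models.items():
        model.fit(X_train, y_train)
        print(name, accuracy_score(y_test, model.predict(X_test)))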
Text categorization refers to the process of grouping texts or documents into classes or categories according to their content. The text categorization process consists of three phases: preprocessing, feature extraction, and classification. In comparison to English, only a few studies have been done to categorize and classify the Arabic language. For a variety of applications, such as text classification and clustering, Arabic text representation is a difficult task because Arabic is noted for its richness, diversity, and complicated morphology. This paper presents a comprehensive analysis and comparison of research from the last five years based on the dataset, year, algorithms, and the accuracy they achieved …
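As an illustration of the three phases the survey enumerates, here is a minimal Scikit-Learn sketch of an Arabic categorization pipeline; the normalization steps, the tiny two-document corpus, and the Naive Bayes classifier are illustrative assumptions, not drawn from any surveyed paper.

    import re
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.pipeline import make_pipeline

    def normalize_arabic(text):
        # Phase 1, preprocessing: strip diacritics (tashkeel) and unify alef forms.
        text = re.sub(r"[\u064B-\u0652]", "", text)
        return re.sub(r"[\u0622\u0623\u0625]", "\u0627", text)

    docs = ["خبر عن مباراة كرة القدم", "مقال عن الاقتصاد والأسواق"]  # hypothetical corpus
    labels = ["sports", "economy"]

    # Phase 2, feature extraction (TF-IDF), and phase 3, classification (Naive Bayes).
    clf = make_pipeline(TfidfVectorizer(preprocessor=normalize_arabic), MultinomialNB())
    clf.fit(docs, labels)
    print(clf.predict(["نتائج مباراة اليوم"]))  # expected: ['sports']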
The healthcare sector has traditionally been an early adopter of technological progress, gaining significant advantages, particularly from machine learning applications such as disease prediction. One of the most important diseases is stroke. Early detection of a brain stroke is exceptionally critical to saving human lives. A brain stroke occurs when blood flow to the brain is disturbed or reduced, causing brain cells to die and resulting in impairment or death. Furthermore, the World Health Organization (WHO) classifies brain stroke as the world's second-deadliest disease, and it remains a major concern for the healthcare sector. Controlling the risk of a brain stroke is important for the survival …
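A minimal sketch of stroke prediction framed as binary classification, as the abstract describes; the clinical feature names, the toy data, and the random-forest choice are all hypothetical stand-ins rather than the paper's setup.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    # Hypothetical columns: age, hypertension (0/1), heart disease (0/1),
    # average glucose level, BMI.
    X = np.array([[67, 1, 1, 228.7, 36.6],
                  [49, 0, 0,  85.2, 24.1],
                  [80, 1, 0, 105.9, 32.5],
                  [35, 0, 0,  90.4, 22.0]])
    y = np.array([1, 0, 1, 0])  # 1 = stroke occurred, 0 = no stroke

    model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
    print(model.predict_proba([[60, 1, 0, 140.0, 30.0]])[0, 1])  # estimated risk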
Natural gas and oil are among the mainstays of the global economy. However, many issues surround the pipelines that transport these resources, including aging infrastructure, environmental impacts, and vulnerability to sabotage. Such issues can result in leaks, which require significant effort to detect and locate. The objective of this project is to develop and implement a method for detecting oil spills caused by leaking pipelines using aerial images captured by a drone equipped with a Raspberry Pi 4. Using the Message Queuing Telemetry Transport Internet of Things (MQTT IoT) protocol, the acquired images and the global positioning system (GPS) coordinates of their acquisition are …
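A minimal sketch of the transmission step using the paho-mqtt client; the broker host, topic names, file paths, and coordinates are hypothetical, since the abstract is cut off before specifying them.

    import json
    import paho.mqtt.client as mqtt

    BROKER = "broker.example.com"  # hypothetical broker host

    # paho-mqtt 1.x style; version 2.x also requires a CallbackAPIVersion argument.
    client = mqtt.Client()
    client.connect(BROKER, 1883)

    # Publish the captured frame as raw JPEG bytes (hypothetical file path/topic).
    with open("frame_0001.jpg", "rb") as f:
        client.publish("pipeline/images", f.read())

    # Publish the GPS fix of the acquisition point as JSON (hypothetical topic).
    gps_fix = {"image": "frame_0001.jpg", "lat": 33.3152, "lon": 44.3661}
    client.publish("pipeline/gps", json.dumps(gps_fix))
    client.disconnect()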
The proliferation of editing programs based on artificial intelligence techniques has contributed to the emergence of deepfake technology. Deepfakes fabricate and falsify facts by making a person appear to do actions or say words that they never did or said, so developing an algorithm for deepfake detection is very important to discriminate real from fake media. Convolutional neural networks (CNNs) are among the most complex classifiers, but choosing the nature of the data fed to these networks is extremely important. For this reason, we capture the fine texture details of the input frames using 16 Gabor filters in different directions and then feed them to a binary CNN classifier instead of using the raw red-green-blue (RGB) …
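A sketch of the described Gabor preprocessing: a bank of 16 filters at evenly spaced orientations applied to a grayscale frame, with the stacked responses (rather than RGB channels) intended as CNN input. The kernel size and frequency parameters are illustrative assumptions; the paper's exact settings are not given here.

    import cv2
    import numpy as np

    frame = cv2.imread("frame.jpg", cv2.IMREAD_GRAYSCALE)  # hypothetical input frame

    responses = []
    for i in range(16):
        theta = i * np.pi / 16  # orientation of the i-th filter
        kernel = cv2.getGaborKernel(ksize=(31, 31), sigma=4.0, theta=theta,
                                    lambd=10.0, gamma=0.5, psi=0)
        responses.append(cv2.filter2D(frame, cv2.CV_32F, kernel))

    # Stack the 16 response maps; this tensor, not the RGB frame, feeds the CNN.
    features = np.stack(responses, axis=-1)
    print(features.shape)  # (H, W, 16)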
Social networking has come to dominate the world by providing a platform for information dissemination. People usually share information without knowing its truthfulness. Nowadays, social networks are used to gain influence in many fields, such as elections and advertising. It is not surprising that social media has become a weapon for manipulating sentiment by spreading disinformation. Propaganda is a systematic and deliberate attempt to influence people for political or religious gain. In this research paper, efforts were made to classify propagandist text from non-propagandist text using supervised machine learning algorithms. Data was collected from news sources from July 2018 to August 2018. After annotation …
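A minimal sketch of the supervised setup described: TF-IDF features with a logistic-regression classifier over a hypothetical four-sentence corpus; the paper's annotated news data and its exact choice of algorithms are not reproduced here.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    texts = ["Our glorious leader alone can save the nation from its enemies.",
             "The finance ministry published quarterly budget figures today.",
             "They are traitors, and everyone who supports them betrays us.",
             "Local elections will be held on the first Sunday of October."]
    labels = [1, 0, 1, 0]  # 1 = propagandist, 0 = non-propagandist

    clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
    clf.fit(texts, labels)
    print(clf.predict(["The ministry released the figures this morning."]))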
Due to the lack of vehicle-to-infrastructure (V2I) communication in existing transportation systems, traffic light detection and recognition are essential for advanced driver assistance systems (ADAS) and road infrastructure surveys. Additionally, autonomous vehicles have the potential to change urban transportation by making it safe, economical, sustainable, and congestion-free, among other improvements. Because of their limitations, traditional traffic light detection and recognition algorithms, which take a lot of time and effort to develop, cannot recognize traffic lights as effectively as deep-learning-based techniques. The main aim of this research is to propose a traffic light detection and recognition model based on …
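The abstract is cut off before naming the proposed model, but as an illustration of deep-learning-based detection, a COCO-pretrained detector from torchvision can localize traffic lights out of the box (recognizing the light's state would be a further classification step). This sketch assumes a recent torchvision and is not the paper's method.

    import torch
    from torchvision.io import read_image
    from torchvision.models.detection import (fasterrcnn_resnet50_fpn,
                                              FasterRCNN_ResNet50_FPN_Weights)

    weights = FasterRCNN_ResNet50_FPN_Weights.DEFAULT
    model = fasterrcnn_resnet50_fpn(weights=weights).eval()

    img = read_image("street.jpg")  # hypothetical input; uint8 CHW tensor
    with torch.no_grad():
        out = model([weights.transforms()(img)])[0]

    TRAFFIC_LIGHT = 10  # COCO category id for "traffic light"
    for box, label, score in zip(out["boxes"], out["labels"], out["scores"]):
        if label == TRAFFIC_LIGHT and score > 0.5:
            print(box.tolist(), float(score))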
Today, artificial intelligence has become one of the most important sciences for creating intelligent computer programs that simulate the human mind. The goal of artificial intelligence in the medical field is to assist doctors and healthcare workers in diagnosing diseases and providing clinical treatment, reducing the rate of medical error and saving lives. The main and most widely used technologies are expert systems, machine learning, and big data. This article provides a brief overview of these three techniques to help readers understand them and their importance.