With the proliferation of both Internet access and data traffic, recent breaches have brought into sharp focus the need for Network Intrusion Detection Systems (NIDS) to protect networks from increasingly complex cyberattacks. To differentiate between normal network processes and possible attacks, Intrusion Detection Systems (IDS) often employ pattern recognition and data mining techniques, automatically detecting and classifying network and host intrusions, attacks, and policy violations. Using Python Scikit-Learn, the results of this study show that Machine Learning (ML) techniques such as Decision Tree (DT), Naïve Bayes (NB), and K-Nearest Neighbor (KNN) can enhance the effectiveness of an IDS. Performance is measured by a variety of metrics, including accuracy, precision, recall, F1-score, and execution time. Applying feature selection approaches such as Analysis of Variance (ANOVA), Mutual Information (MI), and Chi-Square (Ch-2) reduced execution time, increased detection efficiency and accuracy, and boosted overall performance. All classifiers achieved their greatest performance, 99.99% accuracy with the shortest computation time of 0.0089 seconds, when using ANOVA with 10% of the features.
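The pipeline described above, ANOVA-based selection of 10% of the features feeding the three classifiers, can be sketched in Scikit-Learn as follows. This is a minimal sketch on a synthetic stand-in dataset (the excerpt does not name the study's dataset), so the printed accuracies will not match the reported 99.99%.

```python
# Sketch: ANOVA F-test feature selection (top 10%) + DT, NB, KNN classifiers.
# The dataset is synthetic; it stands in for the study's NIDS traffic data.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectPercentile, f_classif
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=2000, n_features=40, n_informative=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# ANOVA F-test keeps only the highest-scoring 10% of the features
selector = SelectPercentile(f_classif, percentile=10).fit(X_tr, y_tr)
X_tr_s, X_te_s = selector.transform(X_tr), selector.transform(X_te)

for clf in (DecisionTreeClassifier(random_state=0), GaussianNB(), KNeighborsClassifier()):
    acc = accuracy_score(y_te, clf.fit(X_tr_s, y_tr).predict(X_te_s))
    print(type(clf).__name__, round(acc, 3))
```

Reducing the feature count this way shrinks both training and prediction time, which is what drives the shorter execution times reported in the abstract.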
Data mining plays an important role in healthcare for discovering hidden relationships in big datasets, especially in breast cancer diagnostics, breast cancer being a leading cause of death worldwide. In this paper, two algorithms, decision tree and K-Nearest Neighbour, are applied to diagnose breast cancer grade in order to reduce its risk to patients. In the decision tree with feature selection, the Gini index gives an accuracy of 87.83%, while entropy gives an accuracy of 86.77%. In both cases, Age appeared as the most effective parameter, particularly when Age < 49.5, whereas Ki67 appeared as the second most effective parameter. Furthermore, K-Nearest Neighbor is based on the minimu
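The Gini-versus-entropy comparison above can be sketched with Scikit-Learn's decision tree. Note the assumptions: the bundled Wisconsin breast cancer dataset stands in for the paper's grading dataset (which used Age and Ki67), so the accuracies printed here are illustrative, not the 87.83%/86.77% reported.

```python
# Sketch: comparing the Gini and entropy split criteria of a decision tree.
# Uses sklearn's bundled breast-cancer data as a stand-in for the paper's dataset.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

scores = {}
for criterion in ("gini", "entropy"):
    tree = DecisionTreeClassifier(criterion=criterion, random_state=0).fit(X_tr, y_tr)
    scores[criterion] = tree.score(X_te, y_te)  # held-out accuracy
    print(criterion, round(scores[criterion], 3))
```

The root split of the fitted tree (`tree.tree_.feature[0]`, `tree.tree_.threshold[0]`) is how a threshold such as Age < 49.5 would be read off in the paper's setting.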
Regression analysis is a foundation stone of statistical knowledge, and it mostly depends on the ordinary least squares (OLS) method. As is well known, this method requires several conditions to hold in order to operate accurately, and its results can be unreliable when they do not; moreover, the absence of certain conditions makes it impossible to complete the analysis at all. Among those conditions is freedom from the multicollinearity problem, and we detect that problem among the independent variables using the Farrar–Glauber test. In addition, because the linearity requirement on the data was not met and the last condition was absent, it has been necessary to resort to the
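The first stage of the Farrar–Glauber test is a chi-square test on the determinant of the correlation matrix of the regressors. A minimal sketch, assuming synthetic data with one deliberately collinear column (the study's own dataset is not shown in this excerpt):

```python
# Sketch of the chi-square stage of the Farrar-Glauber multicollinearity test:
# chi2 = -[n - 1 - (2p + 5)/6] * ln|R|, with p(p-1)/2 degrees of freedom.
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(0)
n, p = 200, 4
X = rng.normal(size=(n, p))
X[:, 3] = X[:, 0] + 0.05 * rng.normal(size=n)  # near-duplicate column -> collinearity

R = np.corrcoef(X, rowvar=False)               # correlation matrix of the regressors
stat = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
dof = p * (p - 1) / 2
p_value = chi2.sf(stat, dof)
print(f"chi2 = {stat:.2f}, p = {p_value:.4f}")  # small p -> multicollinearity present
```

When the regressors are orthogonal, |R| is near 1 and the statistic is near zero; collinearity drives |R| toward 0 and the statistic up, so a small p-value signals the problem the abstract describes.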
The recent emergence of sophisticated Large Language Models (LLMs) such as GPT-4, Bard, and Bing has revolutionized the domain of scientific inquiry, particularly in the realm of large pre-trained vision-language models. This pivotal transformation is driving new frontiers in various fields, including image processing and digital media verification. At the heart of this evolution, our research focuses on the rapidly growing area of image authenticity verification, a field gaining immense relevance in the digital era. The study specifically addresses the emerging challenge of distinguishing between authentic images and deep fakes, a task that has become critically important in a world increasingly reliant on digital med
ECG is an important tool for the primary diagnosis of heart diseases, as it shows the electrophysiology of the heart. In our method, a single maternal abdominal ECG signal is taken as the input signal, and the maternal P-QRS-T complexes of the original signal are averaged, repeated, and taken as a reference signal. LMS and RLS adaptive filter algorithms are then applied. The results showed that the fetal ECGs were successfully detected. Accuracy on the Daisy database was up to 84% for LMS and 88% for RLS, while on PhysioNet it was up to 98% and 96% for LMS and RLS, respectively.
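The LMS stage of this scheme can be sketched in a few lines of NumPy: the filter adapts so that the reference (maternal) signal predicts the primary input, and the residual is the fetal component. The signals below are synthetic sinusoids standing in for real abdominal ECG, and the tap count and step size are illustrative choices, not the paper's settings.

```python
# Sketch: LMS adaptive cancellation of a reference signal from a mixture,
# analogous to removing the maternal ECG so the fetal component remains.
import numpy as np

def lms_filter(d, x, n_taps=8, mu=0.01):
    """Adapt FIR weights so the reference x predicts the primary input d.
    Returns the error e = d - y, i.e. the residual after cancellation."""
    w = np.zeros(n_taps)
    e = np.zeros(len(d))
    for n in range(n_taps, len(d)):
        x_win = x[n - n_taps:n][::-1]   # most recent reference samples first
        y = w @ x_win                   # filter output (estimate of maternal part)
        e[n] = d[n] - y                 # residual ("fetal" component)
        w += 2 * mu * e[n] * x_win      # LMS weight update
    return e

t = np.arange(2000) / 500.0                       # 4 s at 500 Hz (illustrative)
maternal = np.sin(2 * np.pi * 1.2 * t)            # stand-in maternal ECG (~72 bpm)
fetal = 0.2 * np.sin(2 * np.pi * 2.5 * t)         # weaker "fetal" component (~150 bpm)
e = lms_filter(maternal + fetal, maternal)
rms = float(np.sqrt(np.mean(e[500:] ** 2)))       # residual after convergence
print("residual RMS:", round(rms, 3))
```

After convergence the residual energy is close to that of the weaker component alone, which is the sense in which the fetal ECG is "detected". RLS follows the same error-feedback structure but replaces the gradient step with a recursive least-squares gain.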
The aim of the present study was to distinguish between healthy children and those with epilepsy by electroencephalography (EEG). Two biomarkers, the Hurst exponent (H) and Tsallis entropy (TE), were used to investigate the background EEG activity of 10 healthy children and 10 children with epilepsy. EEG artifacts were removed using a Savitzky-Golay (SG) filter. As hypothesized, there were significant changes in irregularity and complexity in the epileptic EEG compared with healthy control subjects under a t-test (p < 0.05). The increased complexity observed in the H and TE results of epileptic subjects makes them suggested EEG biomarkers associated with epilepsy and a reliable tool for detection and identification of this di
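The two named processing steps, SG smoothing and the group t-test, can be sketched with SciPy. The signal and the per-subject "biomarker" values below are synthetic stand-ins; window length, polynomial order, and group means are assumptions for illustration, not the study's parameters.

```python
# Sketch: Savitzky-Golay artifact smoothing followed by a two-sample t-test
# comparing a biomarker (e.g. H or TE per subject) between two groups.
import numpy as np
from scipy.signal import savgol_filter
from scipy.stats import ttest_ind

rng = np.random.default_rng(1)
noisy = np.sin(np.linspace(0, 4 * np.pi, 400)) + 0.3 * rng.normal(size=400)
smooth = savgol_filter(noisy, window_length=31, polyorder=3)  # SG filtering step

# Stand-in biomarker values for 10 healthy and 10 epileptic subjects
healthy = rng.normal(0.6, 0.05, size=10)
epileptic = rng.normal(0.8, 0.05, size=10)
t_stat, p_value = ttest_ind(healthy, epileptic)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # p < 0.05 -> significant difference
```

The p < 0.05 criterion in the abstract corresponds to exactly this kind of two-sample comparison of the biomarker values between groups.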
In this study, the Earth's surface in the Razzaza Lake area was studied over 25 years using remote sensing methods. Images from the Landsat 5 (TM) and Landsat 8 (OLI) satellites were used to identify the components of the land cover. The study covered the years 1995-2021 at five-year intervals, as this region is uninhabited and the land cover therefore changes slowly. The land cover was divided into three main classes and seven subclasses and classified using the maximum likelihood classifier with the help of training sets collected to represent the classes making up the land cover. Changes in the land cover were studied with 1995 taken as the reference year. It was found that there was a significant reduction in the water mass
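The maximum likelihood classifier named above fits one Gaussian per class to the training pixels and assigns each pixel to the class with the highest likelihood. A minimal sketch, assuming three synthetic spectral bands and made-up class means in place of the actual Landsat training sets:

```python
# Sketch: per-pixel Gaussian maximum likelihood classification of land cover.
# Band values and class statistics are synthetic stand-ins for Landsat data.
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(2)
# Training pixels (100 per class) in 3 spectral bands, one cluster per class
means = {"water": [0.1, 0.2, 0.05], "soil": [0.5, 0.4, 0.45], "veg": [0.2, 0.5, 0.3]}
train = {c: rng.normal(m, 0.03, size=(100, 3)) for c, m in means.items()}

# Fit one multivariate Gaussian per class from its training pixels
models = {c: multivariate_normal(X.mean(axis=0), np.cov(X, rowvar=False))
          for c, X in train.items()}

def classify(pixel):
    """Assign the pixel to the class with the highest log-likelihood."""
    return max(models, key=lambda c: models[c].logpdf(pixel))

print(classify([0.11, 0.21, 0.06]))
```

Running `classify` over every pixel of each epoch's image, then differencing against the 1995 map, is the change-detection procedure the abstract describes.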
The usual methods of distance determination in astronomy, parallax and spectroscopic methods together with expansion methods, are seldom applicable to nebulae. In this work, the distances to individual nebulae from the Earth are calculated and discussed. The accuracy of each distance is tested using the Aladin sky atlas, and nebula properties derived from these distances are compared with statistical distance determinations. The results showed that angular expansion may occur in a part of a nebula that is moving at a velocity different from the observed velocity. Also, the results of the comparison of our spectroscopic distances with the trig
Samuel Beckett’s Happy Days (1961) clearly portrays a lack of communication among the characters of the play, which reflects the condition of modern man. This failure of communication led Beckett to use many pauses and silences in all of his plays in place of words. To express the bewilderment of modern man during the 20th century, Beckett adopts a no-language strategy in his dramatic works. After World War II, people were without hope, religion, food, jobs, homes, or even countries. Beckett gave them a voice. He built a dramatic language out of everyday things, in which silence was part of the syntax, functioning as a poetic repetition. Language is no longer important to the modern man; instead, he us
The paper deals with the language of Russian folklore. Folklore is a unique sphere of existence of the language and the most vivid expression of the national mentality; the folklore word embodies the perception and evaluation of the surrounding world. “What did the word in general mean for the life of the people? The word was equated ... with life itself. The word generated and explained life; it was ... the keeper of memory and the guarantee of the infinity of the future.” The folklore text is studied by literary critics, ethnographers, historians, culturologists, and art historians. In the twentieth century a new science emerged, linguo-folkloristics, whose goals and objectives were formulated by A.T. Khrolenko only in the seven