Intrusion detection systems detect attacks inside computers and networks, and they must do so quickly and with a high detection rate. Many proposed methods achieve a high detection rate, either by improving a single algorithm or by hybridizing it with another; however, they suffer from long processing times, especially after such improvements and when handling large volumes of traffic data. Conversely, earlier research that applied DNA-sequence approaches to intrusion detection achieved fast processing times but very low detection rates. In addition, feature selection reduces computation and complexity and therefore speeds up the system. This work proposes a new feature-selection method based on DNA encoding and on the positions of DNA keys. The system has three phases: a pre-processing phase, which extracts the keys and their positions; a training phase, whose main goal is to select features based on the key positions obtained from pre-processing; and a testing phase, which classifies network traffic records as either normal or attack using the selected features. Performance is measured by detection rate, false alarm rate, accuracy, and time, where time includes both encoding time and matching time. All results are reported for two and three keys and are evaluated on two datasets, KDD Cup 99 and NSL-KDD. On all corrected KDD Cup records (311,029 records), the detection rate, false alarm rate, accuracy, encoding time, and matching time are 96.97%, 33.67%, 91%, 325 s, and 13 s with two keys, and 92.74%, 7.41%, 92.71%, 325 s, and 20 s with three keys. On all NSL-KDD records (22,544 records), the corresponding results are 89.34%, 28.94%, 81.46%, 20 s, and 1 s with two keys, and 82.93%, 11.40%, 85.37%, 20 s, and 1 s with three keys. The proposed system is compared with previous systems on encoding time and matching time, and the results show that it detects attacks faster than the earlier approaches.
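The abstract outlines the three-phase pipeline but not its implementation, so the following is only a minimal illustrative sketch of position-based DNA-key matching: the nucleotide mapping, the encode_record helper, and the KEYS table are hypothetical stand-ins, not the paper's actual encoding or keys.

```python
# Hypothetical sketch: classify a traffic record by DNA encoding and
# position-based key matching. Mapping, keys, and positions are invented
# placeholders, not values from the paper.

NUCLEOTIDES = "ACGT"

def encode_record(values):
    """Encode integer feature values as a DNA-like string by mapping
    each 8-bit value, two bits at a time, to a nucleotide."""
    dna = []
    for v in values:
        for shift in (6, 4, 2, 0):          # 8-bit value -> 4 nucleotides
            dna.append(NUCLEOTIDES[(v >> shift) & 0b11])
    return "".join(dna)

# Hypothetical keys from the pre-processing phase, each paired with the
# positions at which it was observed in attack records.
KEYS = [
    ("ACGT", [0, 12]),
    ("TTAC", [8]),
]

def classify(record_values):
    """Label a record 'attack' if any key matches at one of its
    recorded positions, otherwise 'normal'."""
    dna = encode_record(record_values)
    for key, positions in KEYS:
        if any(dna[p:p + len(key)] == key for p in positions):
            return "attack"
    return "normal"

print(classify([27, 200, 3, 40]))  # -> 'attack' (27 encodes to 'ACGT', matching at position 0)
```

Because only the key positions are checked rather than the full record, the matching cost stays small even for wide feature vectors, which is consistent with the speed-up the abstract reports.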
The recent emergence of sophisticated Large Language Models (LLMs) such as GPT-4, Bard, and Bing has transformed scientific inquiry, particularly in the realm of large pre-trained vision-language models. This transformation is opening new frontiers in various fields, including image processing and digital media verification. At the heart of this evolution, our research focuses on the rapidly growing area of image authenticity verification, a field gaining immense relevance in the digital era. The study specifically addresses the emerging challenge of distinguishing authentic images from deep fakes, a task that has become critically important in a world increasingly reliant on digital media.
Today, in the digital realm, images constitute a massive share of social media content but suffer from two problems, size and transmission cost, for which compression is the natural solution. Pixel-based techniques are modern, spatially optimized modeling techniques with deterministic and probabilistic bases that rely on a mean, an index, and a residual. This paper introduces adaptive pixel-based coding techniques for the probabilistic part of a lossy scheme by incorporating the MMSA of the C321 base, while the deterministic part is utilized losslessly. The tested results achieved higher size-reduction performance than traditional pixel-based techniques and standard JPEG, by about 40% and 50%, respectively.
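The abstract does not define MMSA or the C321 base, so the following is only a generic sketch of the mean/index/residual decomposition that pixel-based coding builds on; the block size, quantization step, and function names are illustrative assumptions, not the paper's scheme.

```python
import numpy as np

def pixel_based_code(image, block=4, qstep=8):
    """Generic mean/index/residual decomposition of an image.
    The mean and index (deterministic part) would be stored losslessly;
    the residuals (probabilistic part) are quantized, making the scheme lossy.
    Block size and quantization step are illustrative choices."""
    h, w = image.shape
    means, indices, residuals = [], [], []
    for i in range(0, h, block):
        for j in range(0, w, block):
            tile = image[i:i + block, j:j + block].astype(np.int16)
            m = int(tile.mean())                 # deterministic: block mean
            means.append(m)
            indices.append((i, j))               # deterministic: block index
            q = np.round((tile - m) / qstep)     # probabilistic: quantized residual
            residuals.append(q.astype(np.int8))
    return means, indices, residuals

def pixel_based_decode(means, indices, residuals, shape, block=4, qstep=8):
    """Reconstruct the image from the coded parts."""
    out = np.zeros(shape, dtype=np.int16)
    for m, (i, j), q in zip(means, indices, residuals):
        out[i:i + block, j:j + block] = m + q.astype(np.int16) * qstep
    return np.clip(out, 0, 255).astype(np.uint8)

img = np.random.randint(0, 256, (8, 8), dtype=np.uint8)   # stand-in image
parts = pixel_based_code(img)
rec = pixel_based_decode(*parts, shape=img.shape)
print("max reconstruction error:", int(np.abs(img.astype(int) - rec.astype(int)).max()))
```

The reconstruction error is bounded by half the quantization step, which is where such a scheme trades fidelity for size in its probabilistic part.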
This study investigates the feasibility of a mobile robot navigating and localizing itself in unknown environments, followed by the creation of maps of the navigated environments for future use. First, a real mobile robot, the TurtleBot3 Burger, was used to perform the simultaneous localization and mapping (SLAM) technique in a complex environment with 12 obstacles of different sizes, based on the Rviz library, which is built on the Robot Operating System (ROS) running on Linux. The robot can be controlled, and this process performed remotely, by using an Amazon Elastic Compute Cloud (Amazon EC2) instance service. The map was then uploaded to the Amazon Simple Storage Service (Amazon S3) cloud, which provides a database of the generated maps for future use.
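As a concrete illustration of the map-upload step, here is a minimal Python sketch using boto3. It assumes the map was saved in ROS's standard map_server format (a .pgm occupancy image plus a .yaml metadata file); the bucket name and file paths are placeholders, and AWS credentials are assumed to be configured on the EC2 instance.

```python
# Minimal sketch: upload a ROS-generated map to Amazon S3 with boto3.
# Bucket name and paths are placeholders; assumes AWS credentials are
# already available (e.g. via an IAM role on the EC2 instance).
import boto3

s3 = boto3.client("s3")
BUCKET = "my-slam-maps"          # hypothetical bucket name

# map_server's map_saver conventionally writes a .pgm occupancy image
# and a .yaml metadata file describing resolution and origin.
for path in ("/home/ubuntu/maps/map.pgm", "/home/ubuntu/maps/map.yaml"):
    key = "turtlebot3/" + path.rsplit("/", 1)[-1]
    s3.upload_file(path, BUCKET, key)
    print(f"uploaded {path} to s3://{BUCKET}/{key}")
```

Storing both files keeps the map reusable, since the .yaml metadata is needed to interpret the .pgm image when the map is loaded again.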
1. Chapter I (the methodological framework), which includes: the research problem, the importance of the research and the need for it, the goals of the research, the temporal and spatial boundaries, and the definition of terms, conceptually and procedurally.
2. Chapter II (the theoretical framework), which consists of the following sections:
• The first section: the concept of reference and experimentation in the theater.
• The second section: the academic director and experimentation in Iraq. After the introduction, this section has two paragraphs: in the first, the researcher discusses the Iraqi academic theater and experimentation, and in the second, the Iraqi academic director and experimentation.
3. Chapter III (procedures) -
Background: For many decades, the ECG was the workhorse of non-invasive cardiac testing, and today, although other techniques provide more detail about the structural anomalies in congenital heart diseases, the ECG is likely to remain part of the clinical evaluation of patients with such diseases because it is inexpensive, easy to perform, and in certain situations may be both sensitive and specific.
Objective: This study was carried out to identify the pattern of ECG findings in patients with TOF.
Methods: This is a retrospective study of 200 patients with TOF referred to the Ibn Al-Bitar cardiac center from April 1993 to May 1999. The diagnosis of TOF was established by echocardiographic, catheterization, and angiographic studies.
Objective: A descriptive design, using the methodological approach, is carried out throughout the present study, from April 1st, 2012 to May 20th, 2013, to construct a standardized tool for the features of the school physical environment.
Methodology: An instrument of (141) items is constructed for the purpose of the study. A purposive sample of (44) schools, (22) public and (22) private, is selected. Content validity of the instrument is determined through the use of a panel of (11) experts who are specialists in Community Health Nursing and Community Medicine. Internal consistency reliability, using the split-half technique, is established through the computation of a Cronbach alpha correlation coefficient of (0.93) for the internal scale. Data
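For readers unfamiliar with the reliability statistic reported above, here is a minimal Python sketch of the standard Cronbach's alpha formula, alpha = k/(k-1) * (1 - sum of item variances / variance of total scores); the example data are invented, not the study's.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Invented toy data: 5 respondents rating 4 items.
toy = [[3, 4, 3, 4],
       [2, 2, 3, 2],
       [4, 4, 4, 5],
       [3, 3, 2, 3],
       [5, 4, 5, 5]]
print(round(cronbach_alpha(toy), 2))  # 0.93 for this invented data
```

Values near the study's reported 0.93 are conventionally read as high internal consistency.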
This work presents a five-period chaotic system called the Duffing system and studies the effect of changing the initial conditions and the system parameters d, g, and w on the behavior of the chaotic system. The work provides a complete analysis of system properties such as the time series, the attractors, and the Fast Fourier Transform (FFT) spectrum. The system shows periodic behavior when the initial conditions xi and yi equal 0.8 and 0, respectively; it becomes quasi-chaotic when the initial conditions xi and yi both equal 0 and the system parameters d, g, and w equal 0.02, 8, and 0.09. Finally, the system exhibits hyperchaotic behavior at the first two conditions, 0 and 0, and the bandwidth of the chaotic
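The abstract names the parameters d, g, and w but not the exact equation, so the following sketch assumes the common forced Duffing form x'' + d*x' - x + x**3 = g*cos(w*t); the integrator settings and FFT step are illustrative, not the paper's.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Assumed forced Duffing form: x'' + d*x' - x + x**3 = g*cos(w*t).
# Parameter values follow the abstract's quasi-chaotic case:
d, g, w = 0.02, 8.0, 0.09
xi, yi = 0.0, 0.0          # initial conditions from the abstract

def duffing(t, state):
    x, y = state                       # y = dx/dt
    return [y, -d * y + x - x**3 + g * np.cos(w * t)]

t = np.linspace(0, 500, 20000)
sol = solve_ivp(duffing, (t[0], t[-1]), [xi, yi], t_eval=t, rtol=1e-8)

# FFT spectrum of the time series, as in the paper's analysis.
x = sol.y[0]
dt = t[1] - t[0]
freqs = np.fft.rfftfreq(x.size, dt)
spectrum = np.abs(np.fft.rfft(x - x.mean()))
print("dominant frequency:", freqs[spectrum.argmax()])
```

A periodic regime shows a few sharp spectral lines, while quasi-chaotic and chaotic regimes spread energy across a broad band, which is what the bandwidth measurement in the abstract refers to.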