With the widespread use of computers and networks today, the number of security threats has increased, and the study of intrusion detection systems (IDS) has received much attention throughout the computer science field. The main objective of this study is to examine the existing literature on various approaches to intrusion detection. This paper presents an overview of different intrusion detection systems and a detailed analysis of several techniques used in these systems, including their advantages and disadvantages. These techniques include artificial neural networks, bio-inspired computing, evolutionary techniques, machine learning, and pattern recognition.
This study aimed to determine the obesity level of a sample of the population in Baghdad using bio-electrical impedance analysis (BIA) and to compare it with anthropometric measurements such as body mass index (BMI), waist circumference (WC), and waist-to-hip ratio (WHR). Statistical analysis of linear correlation coefficients for the obesity indicators showed that BIA, with a correlation of 0.92, was the most significant and reliable measure of obesity.
Results of the BIA method for the 20-29 year age group showed that 44.4% of females had a healthy body composition, while 37.8% of males suffered from increased body fat. Results for the 30-39 year age group showed that 32.6% of females had a healthy body composition and 42% of males were obese. In the case of the age group 40-4
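The anthropometric indicators compared against BIA in this study are simple ratios. The sketch below computes BMI and WHR and classifies BMI against the common WHO cut-offs; the input values are made up for illustration.

```python
# Illustrative sketch of the anthropometric obesity indicators used in the
# study (BMI, WHR); classification thresholds follow common WHO cut-offs.

def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index = weight / height^2."""
    return weight_kg / height_m ** 2

def whr(waist_cm: float, hip_cm: float) -> float:
    """Waist-to-hip ratio."""
    return waist_cm / hip_cm

def bmi_category(value: float) -> str:
    if value < 18.5:
        return "underweight"
    if value < 25.0:
        return "healthy"
    if value < 30.0:
        return "overweight"
    return "obese"

print(round(bmi(80, 1.75), 1))      # 26.1
print(bmi_category(bmi(80, 1.75)))  # overweight
```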
Efficient management of treated sewage effluents protects the environment and allows reuse for municipal, industrial, agricultural, and recreational purposes as compensation for water shortages, providing a second source of water. This study was conducted to investigate the overall performance of the Al-Rustamiya sewage treatment plant (STP), Baghdad, Iraq, and to evaluate its effluent quality by determining the effluent quality index (EQI). The assessment used daily records of the major influent and effluent sewage parameters obtained from the municipal sewage plant laboratory from January 2011 to December 2018. The results showed that the treated sewage effluent quality from the STP was within the Iraqi quality standards (IQS) for disposal and t
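An effluent quality index is typically a weighted aggregate of measured parameters against their standard limits. The sketch below is an assumed, minimal form of such an index; the parameters, weights, and IQS limits are illustrative placeholders, not the values used in the study.

```python
# A minimal sketch of a weighted effluent quality index (EQI).  The
# parameters, weights, and Iraqi standard (IQS) limits below are assumed
# for illustration only.

IQS_LIMITS = {"BOD": 40.0, "COD": 100.0, "TSS": 60.0}   # mg/L, assumed
WEIGHTS    = {"BOD": 0.4,  "COD": 0.35,  "TSS": 0.25}   # assumed, sum to 1

def eqi(effluent: dict) -> float:
    """Weighted mean of (measured / limit) ratios; <= 1 suggests compliance."""
    return sum(WEIGHTS[p] * effluent[p] / IQS_LIMITS[p] for p in WEIGHTS)

sample = {"BOD": 30.0, "COD": 80.0, "TSS": 45.0}  # made-up daily record
print(round(eqi(sample), 3))
```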
Transmission lines are generally subject to faults, so it is advantageous to locate these faults as quickly as possible. This study uses an artificial neural network technique to locate a fault as soon as it occurs on the Doukan-Erbil 132 kV double transmission line network. The suggested network was modeled and simulated using CYME 7.1 and Simulink. A multilayer perceptron feed-forward artificial neural network with a back-propagation learning algorithm is used for the intelligent locator's training, testing, assessment, and validation. Voltages and currents were applied as inputs during the neural network's training, and the pre-fault and post-fault values determined the scaled values. The neural network's p
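One common way to form the scaled inputs described above is to divide post-fault voltage and current magnitudes by their pre-fault values, so the network sees dimensionless change ratios. This is an assumed illustration, not necessarily the paper's exact scaling; the phase values are made up.

```python
# Assumed sketch of input scaling for the fault locator: per-phase ratio
# of post-fault to pre-fault magnitude (dimensionless features).

def scale_features(pre_fault, post_fault):
    """Per-phase ratio of post-fault to pre-fault magnitude."""
    return [post / pre for pre, post in zip(pre_fault, post_fault)]

# Three-phase voltage magnitudes (kV) before and during a fault (made up).
v_pre  = [132.0, 132.0, 132.0]
v_post = [65.0, 131.0, 130.5]

ratios = scale_features(v_pre, v_post)
print(ratios)   # phase A drops sharply, indicating a fault on that phase
```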
Cloud computing (CC) is a fast-growing technology that offers computing, networking, and storage services that can be accessed and used over the internet. Cloud services save users money because they are pay-per-use, and they save time because they are on-demand and elastic, a unique aspect of cloud computing. However, several security issues must be addressed before users store data in the cloud. Because the user has no direct control over data outsourced to the cloud, particularly personal and sensitive data (health, finance, military, etc.), and does not know where the data is stored, the user must ensure that the cloud stores and maintains the outsourced data appropriately. The study's primary goals are to mak
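One standard way for a user to verify that outsourced data is stored intact, offered here as an illustration rather than the study's specific scheme, is to keep a local cryptographic digest before upload and compare it with a digest of the data later retrieved from the cloud.

```python
# Illustrative integrity check for outsourced data (not the study's
# scheme): the user keeps a SHA-256 digest locally and compares it with
# the digest of the data retrieved from the cloud.
import hashlib

def digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

original = b"sensitive health record"
local_digest = digest(original)           # kept by the user before upload

retrieved = b"sensitive health record"    # downloaded from the cloud later
print(digest(retrieved) == local_digest)  # True only if data is unmodified
```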
The evolution of the Internet of Things (IoT) has connected billions of heterogeneous physical devices to improve the quality of human life by collecting data from their environments. However, the resulting huge volumes of data require large storage and high computational capability, and cloud computing can be used to store such big data. The data of IoT devices is transferred using two protocols: Message Queuing Telemetry Transport (MQTT) and Hypertext Transfer Protocol (HTTP). This paper aims to build a high-performance and more reliable system through efficient use of resources. Thus, load balancing in cloud computing is used to dynamically distribute the workload across nodes to avoid overloading any individual r
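Dynamic workload distribution of the kind described above can be sketched with a least-connections policy: each incoming request goes to the node currently handling the fewest requests. This is a generic illustration, not the paper's specific algorithm; node names are made up.

```python
# Minimal least-connections load balancer sketch (illustrative only):
# each request is routed to the node with the fewest active requests.
import heapq

class LeastConnectionsBalancer:
    def __init__(self, nodes):
        # heap of (active_request_count, node_name)
        self.heap = [(0, n) for n in nodes]
        heapq.heapify(self.heap)

    def dispatch(self):
        """Route the next request to the least-loaded node."""
        count, node = heapq.heappop(self.heap)
        heapq.heappush(self.heap, (count + 1, node))
        return node

lb = LeastConnectionsBalancer(["node-a", "node-b", "node-c"])
print([lb.dispatch() for _ in range(6)])   # load spreads evenly
```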
A database is an arrangement of data that is organized and distributed in a way that allows the client to access the stored data in a simple and convenient manner. However, in the era of big data, traditional methods of data analytics may not be able to manage and process such large amounts of data. In order to develop an efficient way of handling big data, this work studies the use of the MapReduce technique to handle big data distributed on the cloud. The approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r
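The MapReduce pattern can be sketched in miniature: a map step emits key-value pairs from the records, a shuffle groups them by key, and a reduce step aggregates each group. The toy pipeline below averages made-up EEG samples per channel; it only illustrates the pattern, not the Hadoop deployment used in the study.

```python
# Toy map-reduce pipeline (illustrative only), in the spirit of the
# Hadoop approach: map emits (channel, value) pairs from EEG records,
# reduce averages the values per channel.
from collections import defaultdict

records = [("C3", 4.1), ("C4", 3.9), ("C3", 4.5), ("C4", 4.3)]  # made up

def map_phase(recs):
    for channel, microvolts in recs:
        yield channel, microvolts

def reduce_phase(pairs):
    groups = defaultdict(list)
    for key, value in pairs:          # shuffle: group values by key
        groups[key].append(value)
    return {k: sum(v) / len(v) for k, v in groups.items()}

print(reduce_phase(map_phase(records)))
```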
Objective: To assess the impact of social networks on the mental health of adolescents in the city of Diwaniyah.
Methodology: A descriptive cross-sectional study was conducted on adolescents in preparatory schools in the Al-Diwaniyah city center for the period from June 26, 2015 through October 20, 2015. Six of the 32 schools (20% of the total) were selected using probability sampling (240 random samples): the names of all schools were written on pieces of paper and placed in bags, and six schools were then drawn at random, three boys' schools (2 preparatory and 1 secondary) and three girls' schools (2 preparatory and 1 secondary). The student sample was then chosen from grad
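The lottery-style selection described above is equivalent to simple random sampling without replacement, which can be sketched as follows; school names are placeholders and the seed is fixed only to make the illustration reproducible.

```python
# Simple random sampling without replacement, equivalent to drawing
# school names from a bag (names and seed are illustrative).
import random

random.seed(42)   # fixed seed for reproducibility of the sketch

schools = [f"school-{i}" for i in range(1, 33)]   # 32 schools
selected = random.sample(schools, k=6)            # 6 schools = ~20%
print(selected)
```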
Metaheuristics in the swarm intelligence (SI) class have proven to be efficient and have become popular methods for solving different optimization problems. Based on their usage of memory, metaheuristics can be classified into algorithms with memory and algorithms without memory (memory-less). The absence of memory in some metaheuristics leads to the loss of information gained in previous iterations: the metaheuristic tends to drift away from promising areas of the solution search space, which leads to non-optimal solutions. This paper aims to review memory usage and its effect on the performance of the main SI-based metaheuristics. The investigation covers SI metaheuristics, memory usage and memory-less metaheuristics, and memory char
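Particle swarm optimization (PSO) is a canonical example of a memory-based SI metaheuristic: each particle remembers its personal best position (pbest) and the swarm remembers a global best (gbest), both of which steer the search back toward promising regions. The compact sketch below, with made-up parameters and a toy objective, shows those two memory structures.

```python
# Compact PSO sketch illustrating the memory structures discussed above:
# personal bests (pbest) and the global best (gbest).  Objective and
# parameter values are illustrative.
import random

random.seed(0)

def sphere(x):                         # toy objective: minimise sum x_i^2
    return sum(v * v for v in x)

def pso(dim=2, particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(particles)]
    vel = [[0.0] * dim for _ in range(particles)]
    pbest = [p[:] for p in pos]                    # memory: personal bests
    gbest = min(pbest, key=sphere)[:]              # memory: global best
    for _ in range(iters):
        for i in range(particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if sphere(pos[i]) < sphere(pbest[i]):  # update the memories
                pbest[i] = pos[i][:]
                if sphere(pbest[i]) < sphere(gbest):
                    gbest = pbest[i][:]
    return gbest

best = pso()
print(sphere(best))   # close to the optimum at 0 after convergence
```

Dropping the pbest/gbest updates would make the algorithm memory-less, and the swarm would lose exactly the information this survey argues is needed to stay near promising regions.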
In this paper, we propose two new predictor-corrector methods for solving Kepler's equation in the hyperbolic case using a quadrature formula, which plays an important and significant role in the evaluation of the integrals. The two procedures solve the hyperbolic orbit equation in two or three iterations in a very efficient manner, to an accuracy that proves to be always better than 10⁻¹⁵. The solution was examined over a grid of eccentricities and hyperbolic mean anomalies, using first-guess values of the hyperbolic eccentric anomaly.
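The hyperbolic Kepler equation is M = e·sinh(H) − H for eccentricity e > 1, hyperbolic mean anomaly M, and hyperbolic eccentric anomaly H. The sketch below solves it with a plain Newton iteration, not the paper's quadrature-based predictor-corrector, simply to show the target accuracy level; the orbit values are made up.

```python
# The hyperbolic Kepler equation: M = e*sinh(H) - H.  Solved here with a
# standard Newton iteration (NOT the paper's quadrature-based
# predictor-corrector) to illustrate the ~1e-15 accuracy target.
import math

def solve_hyperbolic_kepler(M, e, tol=1e-15, max_iter=50):
    H = math.asinh(M / e)                    # simple first guess
    for _ in range(max_iter):
        f  = e * math.sinh(H) - H - M        # residual
        fp = e * math.cosh(H) - 1.0          # derivative df/dH
        step = f / fp
        H -= step
        if abs(step) < tol:
            break
    return H

H = solve_hyperbolic_kepler(M=5.0, e=1.5)    # made-up orbit values
print(abs(1.5 * math.sinh(H) - H - 5.0))     # residual near machine precision
```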
Researchers have shown increased interest in recent years in determining the optimum sample size needed to obtain sufficient accuracy of estimation and high-precision parameters when evaluating a large number of tests in the field of diagnosis at the same time. In this research, two methods were used to determine the optimum sample size for estimating the parameters of high-dimensional data: the Bennett inequality method and the regression method. The nonlinear logistic regression model is then estimated from the sample size given by each method for high-dimensional data using artificial intelligence, namely an artificial neural network (ANN), as it gives a high-precision estimate commensurate with the dat
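The Bennett-inequality route to a sample size can be sketched as follows. For variables bounded by b with variance σ², a standard two-sided form of Bennett's inequality is P(|mean − μ| ≥ ε) ≤ 2·exp(−n·(σ²/b²)·h(bε/σ²)) with h(u) = (1+u)ln(1+u) − u; solving the bound for n at confidence 1 − δ gives the sample size. The numerical inputs below are assumed for illustration and are not the study's values.

```python
# Sample size from a standard two-sided Bennett bound (illustrative):
#   P(|mean - mu| >= eps) <= 2*exp(-n*(sigma^2/b^2)*h(b*eps/sigma^2)),
# where h(u) = (1+u)*ln(1+u) - u.  Setting the bound equal to delta and
# solving for n gives the required sample size.
import math

def h(u):
    return (1 + u) * math.log(1 + u) - u

def bennett_sample_size(eps, delta, sigma2, b):
    return math.ceil(math.log(2 / delta) / ((sigma2 / b ** 2) * h(b * eps / sigma2)))

# Assumed inputs: tolerance 0.05, 95% confidence, variance 0.1, bound 1.
print(bennett_sample_size(eps=0.05, delta=0.05, sigma2=0.1, b=1.0))
```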