Anomaly detection is still a difficult task. To address this problem, we propose to strengthen the DBSCAN algorithm by converting all data into a concept frame graph (CFG). As is well known, DBSCAN groups data points of the same kind into clusters, while points that fall outside any cluster are treated as noise or anomalies. In this way, DBSCAN can detect abnormal points that lie beyond a certain threshold distance from the clusters (extreme values). However, anomalies are not only the cases that are unusual or far from a specific group; there is also a type of data that does not occur repeatedly but is considered abnormal with respect to the known groups. The analysis showed that the improved DBSCAN algorithm can detect this type of anomaly as well. Thus, our approach is effective in finding abnormalities.
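For reference, the following is a minimal sketch of the baseline behaviour described above, where standard DBSCAN flags points outside any dense cluster as noise/anomalies. It uses scikit-learn and does not reproduce the CFG-based improvement; the data, eps, and min_samples values are illustrative assumptions.

```python
# Minimal sketch of baseline DBSCAN-based anomaly detection (not the CFG-enhanced
# variant proposed in the paper). eps and min_samples are illustrative assumptions.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)
dense_cluster = rng.normal(loc=0.0, scale=0.5, size=(100, 2))   # "normal" behaviour
far_outliers = np.array([[8.0, 8.0], [-7.0, 9.0]])              # points far from any cluster
data = np.vstack([dense_cluster, far_outliers])

labels = DBSCAN(eps=0.8, min_samples=5).fit_predict(data)

# DBSCAN labels points that belong to no dense region with -1, which is exactly
# the "noise = anomaly" interpretation used as the baseline above.
anomalies = data[labels == -1]
print(f"{len(anomalies)} points flagged as anomalies")
```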
An automatic text summarization system mimics how humans summarize by picking the most significant sentences in a source text. However, the complexities of the Arabic language make it challenging to obtain information quickly and effectively. The main disadvantage of the traditional approaches is that they are strictly constrained (especially for Arabic) by the accuracy of the sentence feature functions, weighting schemes, and similarity calculations. On the other hand, meta-heuristic search approaches have a feature tha
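To make the "sentence feature functions and weighting schemes" of traditional extractive approaches concrete, here is a minimal, language-agnostic sketch that scores sentences by term frequency and keeps the top-k. It is a generic illustration of the traditional pipeline, not the meta-heuristic method this abstract goes on to describe, and the scoring function is an assumption.

```python
# Minimal sketch of a traditional extractive summarizer: score each sentence by the
# average corpus frequency of its words and keep the top-k sentences. Illustrative
# only; not the proposed meta-heuristic approach.
import re
from collections import Counter

def summarize(text: str, k: int = 2) -> list[str]:
    sentences = [s.strip() for s in re.split(r"[.!?]\s+", text) if s.strip()]
    freq = Counter(re.findall(r"\w+", text.lower()))   # simple term-frequency weights

    def score(sentence: str) -> float:
        tokens = re.findall(r"\w+", sentence.lower())
        return sum(freq[t] for t in tokens) / max(len(tokens), 1)

    ranked = sorted(sentences, key=score, reverse=True)[:k]
    return [s for s in sentences if s in ranked]       # keep original sentence order

print(summarize("Text summarization picks key sentences. Arabic adds morphological "
                "complexity. Feature weights decide which sentences survive.", k=2))
```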
Although text document image authentication is difficult, due to the binary nature of such images and the clear separation between background and foreground, it is in growing demand for many applications. Most previous research in this field depends on inserting a watermark into the document; the drawback of these techniques lies in the fact that changing pixel values in a binary document can introduce irregularities that are very visually noticeable. In this paper, a new method is proposed for object-based text document authentication, in which I propose a different approach: a text document is signed by shifting individual words slightly left or right from their original positions to make the center of gravity of each line fall in with the m
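The key quantity this kind of word-shift signing manipulates is the horizontal center of gravity of a text line in the binary image. Below is a minimal sketch of how that quantity could be computed and how shifting one word nudges it; the toy image layout and one-pixel shift are assumptions, not the paper's exact signing procedure.

```python
# Sketch of the quantity a word-shift scheme manipulates: the horizontal centre of
# gravity of the foreground (text) pixels in one binarized line image. Shifting a
# word left/right changes this value slightly. Toy data and shift are illustrative.
import numpy as np

def line_center_of_gravity(line_img: np.ndarray) -> float:
    """line_img: 2-D array with 1 = text pixel, 0 = background."""
    _, xs = np.nonzero(line_img)
    return xs.mean()                      # mean column index of all text pixels

line = np.zeros((10, 100), dtype=np.uint8)
line[3:7, 10:25] = 1                      # "word" 1
line[3:7, 40:60] = 1                      # "word" 2
print(line_center_of_gravity(line))       # original centre of gravity

shifted = np.zeros_like(line)
shifted[3:7, 10:25] = 1
shifted[3:7, 41:61] = 1                   # word 2 shifted right by one pixel
print(line_center_of_gravity(shifted))    # slightly larger value encodes the shift
```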
Crime is a threat to any nation's security administration and jurisdiction. Crime analysis therefore becomes increasingly important, because it assigns the time and place of crimes based on the collected spatial and temporal data. However, older techniques, such as paperwork, investigative judges, and statistical analysis, are not efficient enough to predict the exact time and location at which a crime will take place. When machine learning and data mining methods were deployed in crime analysis, prediction accuracy increased dramatically. In this study, various types of criminal analysis and prediction using several machine learning and data mining techniques, based o
With the development of communication technologies for mobile devices and electronic communications, the world has moved toward e-government, e-commerce, and e-banking. It has become necessary to protect these activities from intrusion or misuse, so it is important to design powerful and efficient systems for this purpose. In this paper, several variants of the immune-inspired negative selection algorithm have been used: negative selection with real-valued detectors, negative selection with fixed-radius detectors, and negative selection with variable-sized detectors, for misuse-type network intrusion detection, where the algorithm generates a set of detectors to distinguish the self samples. Practica
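A minimal sketch of the real-valued, fixed-radius negative selection idea referred to above is shown below: random candidate detectors are discarded if they fall within a radius of any self (normal) sample, and the surviving detectors flag non-self traffic. The feature dimensionality, radii, and detector count are illustrative assumptions, not the paper's configuration.

```python
# Minimal sketch of real-valued negative selection with fixed-radius detectors.
# Detectors that match any self (normal) sample are discarded; the survivors are
# used to flag non-self (possibly intrusive) samples. All values are illustrative.
import numpy as np

rng = np.random.default_rng(1)
self_samples = rng.uniform(0.4, 0.6, size=(200, 4))   # stand-in for normal traffic features

def generate_detectors(n_detectors: int, self_radius: float) -> np.ndarray:
    detectors = []
    while len(detectors) < n_detectors:
        candidate = rng.uniform(0.0, 1.0, size=4)
        # Keep the candidate only if it does NOT match any self sample.
        if np.linalg.norm(self_samples - candidate, axis=1).min() > self_radius:
            detectors.append(candidate)
    return np.array(detectors)

def is_anomalous(sample: np.ndarray, detectors: np.ndarray, detect_radius: float) -> bool:
    # A sample matched by any detector is classified as non-self (anomalous).
    return bool(np.linalg.norm(detectors - sample, axis=1).min() <= detect_radius)

detectors = generate_detectors(n_detectors=50, self_radius=0.15)
print(is_anomalous(np.array([0.5, 0.5, 0.5, 0.5]), detectors, 0.15))  # inside self region
print(is_anomalous(np.array([0.9, 0.1, 0.9, 0.1]), detectors, 0.15))  # far from self region
```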
Cybersecurity refers to the actions used by people and companies to protect themselves and their information from cyber threats. Different security methods have been proposed for detecting abnormal network behavior, but some effective attacks are still a major concern in the computer community. Many security gaps, such as denial of service, spam, phishing, and other types of attacks, are reported daily, and attack numbers are growing. Intrusion detection is a security protection method used to automatically detect and report any abnormal traffic that may affect network security, such as internal attacks, external attacks, and maloperations. This paper proposed an anomaly intrusion detection system method based on a
The current study obtained 75 isolates of Pseudomonas aeruginosa collected from different cases: 28 isolates from otitis media, 23 isolates from burn infections, 10 isolates from wound infections, 8 isolates from urinary tract infections, and 6 isolates from blood, during the period from 1/9/2014 to 1/11/2014.
The results revealed that the tox A gene was present in 54 isolates (72%) of Pseudomonas aeruginosa. Gel electrophoresis showed that the molecular weight of the tox A gene was 352 bp. The results show that 17 isolates (60.71%) from otitis media have the tox A gene, 1
Data-centric techniques, such as data aggregation via a modified algorithm based on fuzzy clustering with a Voronoi diagram, called the modified Voronoi Fuzzy Clustering Algorithm (VFCA), are presented in this paper. In the modified algorithm, the sensed area is divided into a number of Voronoi cells by applying a Voronoi diagram, and these cells are clustered by the fuzzy C-means method (FCM) to reduce the transmission distance. Then an appropriate cluster head (CH) is elected for each cluster. Three parameters are used for this election process: the energy, the distance between the CH and its neighboring sensors, and the packet loss value. Furthermore, data aggregation is employed in each CH to reduce the amount of data transmission, which le
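For reference, a compact NumPy sketch of the standard fuzzy C-means step used for clustering is shown below. It implements plain FCM only; the Voronoi partitioning, CH election, and energy/packet-loss weighting of the modified VFCA are not reproduced, and the fuzzifier m, cluster count c, and toy data are assumptions.

```python
# Compact sketch of standard fuzzy C-means (FCM), the clustering step referred to
# above. Voronoi partitioning and cluster-head election are not included.
import numpy as np

def fcm(points: np.ndarray, c: int, m: float = 2.0, iters: int = 100, seed: int = 0):
    rng = np.random.default_rng(seed)
    u = rng.random((len(points), c))
    u /= u.sum(axis=1, keepdims=True)                 # memberships sum to 1 per point
    for _ in range(iters):
        um = u ** m
        centers = (um.T @ points) / um.sum(axis=0)[:, None]
        dist = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2) + 1e-10
        ratio = dist[:, :, None] / dist[:, None, :]   # ratio[k, i, j] = d_ki / d_kj
        # Standard FCM update: u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1))
        u = 1.0 / (ratio ** (2.0 / (m - 1.0))).sum(axis=2)
    return centers, u

# Toy sensor coordinates clustered into 3 fuzzy groups.
sensors = np.random.default_rng(2).random((60, 2)) * 100
centers, memberships = fcm(sensors, c=3)
print(centers)
```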
Regarding computer system security, intrusion detection systems are fundamental components for discriminating attacks at an early stage. They monitor and analyze network traffic, looking for abnormal behaviors or attack signatures in order to detect intrusions early. However, many challenges arise in developing a flexible and efficient network intrusion detection system (NIDS) for unforeseen attacks with a high detection rate. In this paper, a deep neural network (DNN) approach was proposed for an anomaly detection NIDS. Dropout is the regularization technique used with the DNN model to reduce overfitting. The experiments were applied to the NSL-KDD dataset. A SoftMax output layer has been used with a cross-entropy loss funct
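A minimal Keras sketch of the kind of DNN-with-dropout classifier described above is given below. The layer sizes, dropout rate, 41-feature input (typical of NSL-KDD), and binary normal/attack labels are illustrative assumptions, not the paper's exact architecture.

```python
# Minimal sketch of a DNN with dropout regularization for an anomaly-based NIDS,
# in the spirit of the approach described above. Layer sizes, dropout rate, the
# 41-feature input and binary labels are illustrative assumptions.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(41,)),                 # NSL-KDD-style feature vector
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dropout(0.5),                # dropout to reduce overfitting
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(2, activation="softmax"),  # softmax output: normal vs. attack
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",  # cross-entropy loss
              metrics=["accuracy"])
model.summary()
# Training would then look like: model.fit(x_train, y_train, validation_data=(x_val, y_val))
```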
Steganography can be defined as the art and science of hiding information in data that can be read by a computer. In steganography, the stego cover and the original cannot be distinguished, whether by eye or by computer, when examining statistical samples. This paper presents a new method to hide text within text characters. The systematic method uses the structure of an invisible character to hide and extract secret texts. The creation of the secret message comprises four main stages: using the letters of the original message, selecting a suitable cover text, dividing the cover text into blocks, hiding the secret text using the invisible character, and comparing the cover text and the stego object. This study uses an invisible character (white space
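A minimal sketch of the general idea of hiding a secret with invisible characters is shown below. It encodes the secret's bits as zero-width Unicode characters inserted into the cover text, which is one common interpretation of the "invisible character" approach and not necessarily the block-based scheme of the paper.

```python
# Minimal sketch of text-in-text hiding with invisible characters: each bit of the
# secret is encoded as a zero-width space (0) or zero-width non-joiner (1) inserted
# after the first word of the cover text. Generic illustration, not the paper's scheme.
ZERO, ONE = "\u200b", "\u200c"   # zero-width space, zero-width non-joiner

def hide(cover: str, secret: str) -> str:
    bits = "".join(f"{ord(ch):08b}" for ch in secret)
    payload = "".join(ONE if b == "1" else ZERO for b in bits)
    first_space = cover.index(" ")
    return cover[:first_space] + payload + cover[first_space:]

def reveal(stego: str) -> str:
    bits = "".join("1" if ch == ONE else "0" for ch in stego if ch in (ZERO, ONE))
    return "".join(chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits), 8))

stego = hide("This looks like an ordinary sentence.", "key")
print(stego == "This looks like an ordinary sentence.")  # False, yet visually identical
print(reveal(stego))                                      # "key"
```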
A microbial study was conducted on a number of flour samples (30 samples) used in bakery ovens in various areas of the city of Baghdad, applying the conventional laboratory methods for microbial tests and comparing them with a modern technique using the BacTrac 3400 impedance analyser supplied by the Austrian company SY-LAB. The results of the two approaches (the conventional method and the BacTrac device test) showed that the total counts of aerobic bacteria, coliform bacteria, Staphylococcus spp., Bacillus cereus, and yeasts and molds were mostly within the permissible limits of the Iraqi standard for grain and its products, with the samples free of Salmonella spp., and that screening by the BacTrac device shorten