Anomaly Based Intrusion Detection System Using Hierarchical Classification and Clustering Techniques

With the rapid development of computer and network technologies, the security of information on the Internet is increasingly compromised, and many threats may affect the integrity of such information. Many researchers have focused their work on providing solutions to these threats. Machine learning and data mining are widely used in anomaly-detection schemes to decide whether or not a malicious activity is taking place on a network. In this paper, a hierarchical classification scheme for an anomaly-based intrusion detection system is proposed. Two levels of feature selection and classification are used. In the first level, a global feature vector is selected for detecting the basic attack categories (DoS, U2R, R2L, and Probe). In the second level, four local feature vectors are selected to determine the sub-class of each attack type. Features are evaluated to measure their ability to discriminate among classes. The K-Means clustering algorithm is then used to cluster each class into two clusters. Sequential floating forward selection (SFFS) and an artificial neural network (ANN) are used on a hierarchical basis to select the relevant features and classify the query behavior into the proper intrusion type. Experimental evaluation on NSL-KDD, a filtered version of the original KDD99 dataset, has shown that the proposed IDS can achieve good performance in terms of intrusion detection and recognition.
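
A rough sketch of the two-level structure described above, assuming NSL-KDD is already loaded as numeric arrays; scikit-learn's forward SequentialFeatureSelector stands in for SFFS and an MLP for the ANN, so this shows the shape of the pipeline rather than the paper's exact method:

import numpy as np
from sklearn.cluster import KMeans
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def refine_with_kmeans(X, y, k=2):
    """Split every class into k clusters, mirroring the per-class K-Means step."""
    refined = np.array(y, dtype=object)
    for cls in np.unique(y):
        idx = np.where(y == cls)[0]
        sub = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X[idx])
        refined[idx] = [f"{cls}_{s}" for s in sub]
    return refined

def level_classifier(n_features=10):
    """Feature selection followed by an ANN; used once globally, then once per attack type."""
    ann = MLPClassifier(hidden_layer_sizes=(32,), max_iter=300, random_state=0)
    return make_pipeline(
        StandardScaler(),
        SequentialFeatureSelector(ann, n_features_to_select=n_features, direction="forward"),
        MLPClassifier(hidden_layer_sizes=(32,), max_iter=300, random_state=0),
    )

# Level 1: classify records into Normal / DoS / U2R / R2L / Probe.
# Level 2: for each attack category, a second level_classifier() trained only on that
# category's records predicts the refined sub-class produced by refine_with_kmeans().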

Publication Date
Wed Jan 01 2020
Journal Name
Advances In Science, Technology And Engineering Systems Journal
Bayes Classification and Entropy Discretization of Large Datasets using Multi-Resolution Data Aggregation

Big data analysis has important applications in many areas, such as sensor networks and connected healthcare. The high volume and velocity of big data bring many challenges to data analysis. One possible solution is to summarize the data and provide a manageable data structure that holds a scalable summarization of the data for efficient and effective analysis. This research extends our previous work on developing an effective technique to create, organize, access, and maintain a summarization of big data, and develops algorithms for Bayes classification and entropy discretization of large data sets using the multi-resolution data summarization structure. Bayes classification and data discretization play essential roles in many learning algorithms such a…
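
A minimal sketch of the two building blocks named above, in their generic form (not the paper's multi-resolution variant): an entropy-based cut point for one numeric attribute, followed by a naive Bayes fit on the discretized values. The toy data are invented for illustration only:

import numpy as np
from sklearn.naive_bayes import CategoricalNB

def entropy(labels):
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def best_entropy_cut(values, labels):
    """Return the cut point that minimizes the weighted class entropy of the two bins."""
    order = np.argsort(values)
    values, labels = values[order], labels[order]
    best_cut, best_h = None, np.inf
    for i in range(1, len(values)):
        if values[i] == values[i - 1]:
            continue
        cut = (values[i] + values[i - 1]) / 2
        left, right = labels[:i], labels[i:]
        h = (len(left) * entropy(left) + len(right) * entropy(right)) / len(labels)
        if h < best_h:
            best_cut, best_h = cut, h
    return best_cut

# Toy usage: discretize one attribute into two bins, then fit naive Bayes on the bins.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 100), rng.normal(3, 1, 100)])
y = np.array([0] * 100 + [1] * 100)
cut = best_entropy_cut(x, y)
x_binned = (x > cut).astype(int).reshape(-1, 1)
model = CategoricalNB().fit(x_binned, y)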

Publication Date
Sat Jun 06 2020
Journal Name
Journal Of The College Of Education For Women
Image classification with Deep Convolutional Neural Network Using Tensorflow and Transfer of Learning

Deep learning algorithms have recently achieved a lot of success, especially in the field of computer vision. This research describes the classification method applied to a dataset containing multiple types of images (Synthetic Aperture Radar (SAR) images and non-SAR images). For this classification, transfer learning was used, followed by fine-tuning, with architectures pre-trained on the well-known ImageNet database. The VGG16 model was used as a feature extractor, and a new classifier was trained on the extracted features. The input dataset consists of five classes: the SAR image class (houses) and the non-SAR image classes (cats, dogs, horses, and humans). The Conv…
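
A minimal transfer-learning sketch in the spirit of the described setup: VGG16 pre-trained on ImageNet as a frozen feature extractor with a new 5-class head in TensorFlow/Keras. The image size, head layout, and hyperparameters are assumptions, not values reported by the paper:

import tensorflow as tf

base = tf.keras.applications.VGG16(weights="imagenet", include_top=False,
                                   input_shape=(224, 224, 3))
base.trainable = False  # feature extraction; top blocks can be unfrozen later to fine-tune

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(5, activation="softmax"),  # SAR houses + 4 non-SAR classes
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=10)  # datasets assumed to exist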

Publication Date
Fri Sep 27 2024
Journal Name
Journal Of Applied Mathematics And Computational Mechanics
Fruit classification by assessing slice hardness based on RGB imaging. Case study: apple slices

Correct grading of apple slices can help ensure quality and improve the marketability of the final product, which can impact the overall development of the post-harvest apple slice industry. The study employs the convolutional neural network (CNN) architectures ResNet-18 and DenseNet-201 and classical machine learning (ML) classifiers, namely wide neural networks (WNN), naïve Bayes (NB), and two kernels of support vector machines (SVM), to classify apple slices into different hardness classes based on their RGB values. Our results showed that DenseNet-201 features classified by the SVM with a cubic kernel had the highest accuracy and lowest standard deviation (SD) among all the methods we tested, at 89.51% ± 1.66%. This…
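
A minimal sketch of the best-performing combination reported above: DenseNet-201 deep features fed to a cubic-kernel SVM. Data loading, image size, and the training protocol are assumptions and are not taken from the paper:

import tensorflow as tf
from sklearn.svm import SVC

extractor = tf.keras.applications.DenseNet201(weights="imagenet", include_top=False,
                                              pooling="avg")  # global-average deep features

def deep_features(images):
    """images: float array of shape (n, 224, 224, 3) with values in [0, 255]."""
    x = tf.keras.applications.densenet.preprocess_input(images)
    return extractor.predict(x, verbose=0)

# X_train_img, y_train, X_test_img, y_test are assumed to hold RGB apple-slice
# images and their hardness-class labels.
# svm = SVC(kernel="poly", degree=3)                 # "cubic" SVM kernel
# svm.fit(deep_features(X_train_img), y_train)
# accuracy = svm.score(deep_features(X_test_img), y_test)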

Publication Date
Sat Dec 01 2012
Journal Name
Journal Of Engineering
Database for Baghdad Soil Using GIS Techniques

…teen sites in Baghdad are investigated. The sites are divided into two groups, one in Karkh and the other in Rusafa. The underground conditions are assessed by drilling vertical holes, called exploratory borings, into the ground, obtaining disturbed and undisturbed soil samples, and testing these samples in a laboratory (the civil engineering laboratory, University of Baghdad). For the disturbed samples, the tests involved grain-size analysis (followed by soil classification), Atterberg limits, and chemical tests (organic content, sulphate content, gypsum content, and chloride content). For the undisturbed samples, testing involved the consolidation test, from which the following parameters can be obtained: initial void ratio e0, compression index Cc, swel…
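
As a reminder of how the consolidation parameters named above are obtained from the laboratory curve, the following uses the standard soil-mechanics definition of the compression index (a generic illustration with invented numbers, not results from this study):

import math

def compression_index(e1, e2, sigma1, sigma2):
    """Cc = (e1 - e2) / log10(sigma2 / sigma1) for two points on the virgin-compression
    part of the e-log(effective stress) curve."""
    return (e1 - e2) / math.log10(sigma2 / sigma1)

# Hypothetical example: void ratio drops from 0.80 to 0.72 as the effective stress
# rises from 100 kPa to 200 kPa.
Cc = compression_index(0.80, 0.72, 100.0, 200.0)  # about 0.27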

Publication Date
Wed Mar 10 2021
Journal Name
Baghdad Science Journal
Detecting Textual Propaganda Using Machine Learning Techniques

Social networking has come to dominate the whole world by providing a platform for information dissemination. People usually share information without knowing its truthfulness. Nowadays, social networks are used to gain influence in many fields, such as elections and advertising. It is not surprising that social media has become a weapon for manipulating sentiments by spreading disinformation. Propaganda is one of the systematic and deliberate attempts to influence people for political or religious gain. In this research paper, efforts were made to classify propagandist text versus non-propagandist text using supervised machine learning algorithms. Data were collected from news sources from July 2018 to August 2018. After annota…
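
A minimal sketch of the kind of supervised text-classification pipeline the abstract describes, using TF-IDF features and a linear classifier; the actual features and algorithms used in the paper are not reproduced here, and the example sentences are invented:

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = ["our glorious leader will crush the traitors",     # propagandist (toy example)
         "the council approved the budget on Tuesday"]      # non-propagandist (toy example)
labels = [1, 0]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(texts, labels)
print(clf.predict(["the committee met to discuss the report"]))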

Publication Date
Wed Nov 01 2017
Journal Name
Journal Of Economics And Administrative Sciences
Analytical Study Compared Between Poisson and Poisson Hierarchical Model and Applied in Healthy Field

Through this research, we have tried to evaluate health programs and their effectiveness in improving the health situation through a study of the state of health institutions in Baghdad, in order to identify the main factors behind the increase in maternal mortality, using two regression models: the Poisson regression model and the hierarchical Poisson regression model. The indicator (deaths) was studied through a comparison between the estimation methods of the models used. The maximum likelihood method was used to estimate the Poisson regression model, whereas the full maximum likelihood method was used for the hierarchical Poisson regression model.
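
A minimal sketch of fitting a Poisson regression to count data (such as maternal deaths) by maximum likelihood with statsmodels; the hierarchical variant would add group-level effects and is not shown. The covariates and data below are invented for illustration only:

import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "age": rng.integers(18, 45, 200),
    "visits": rng.integers(0, 10, 200),   # antenatal visits (hypothetical covariate)
})
mu = np.exp(0.02 * df["age"] - 0.15 * df["visits"])
df["deaths"] = rng.poisson(mu)

X = sm.add_constant(df[["age", "visits"]])
poisson_fit = sm.GLM(df["deaths"], X, family=sm.families.Poisson()).fit()
print(poisson_fit.summary())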

Publication Date
Mon Jan 01 2024
Journal Name
Fifth International Conference On Applied Sciences: Icas2023
Facial deepfake performance evaluation based on three detection tools: MTCNN, Dlib, and MediaPipe

Publication Date
Wed Dec 18 2019
Journal Name
Baghdad Science Journal
Eye Detection using Helmholtz Principle

Eye detection is used in many applications, such as pattern recognition, biometrics, and surveillance systems. In this paper, a new method is presented to detect and extract the overall shape of one eye from an image, based on two principles: Helmholtz and Gestalt. According to the Helmholtz principle of perception, any observed geometric shape is perceptually "meaningful" if its number of occurrences in an image with a random distribution is very small. To achieve this goal, the Gestalt principle states that humans see things either by grouping similar elements or by recognizing patterns. In general, according to the Gestalt principle, humans see things through genera…
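
A minimal a-contrario sketch of the Helmholtz principle in its generic form (not the paper's eye-detection pipeline): a configuration of k co-occurring elementary events out of n is "meaningful" when its expected number of occurrences under a random background model is very small. The numbers below are hypothetical:

from math import comb

def number_of_false_alarms(n_tests, n, k, p):
    """NFA = n_tests * P(at least k of n independent events of probability p occur)."""
    tail = sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))
    return n_tests * tail

# Hypothetical numbers: 10,000 candidate regions, 20 edge points per region,
# each agreeing with the candidate shape with probability 1/16 under the random model.
nfa = number_of_false_alarms(10_000, 20, 12, 1 / 16)
print(nfa)   # far below 1, so the grouping is perceptually meaningful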

Publication Date
Mon Jun 19 2023
Journal Name
Journal Of Engineering
Design and Implementation of ICT-Based Recycle-Rewarding System for Green Environment

This paper proposes a collaborative system called the Recycle Rewarding System (RRS) and focuses on the use of information and communication technology (ICT) as a tool to promote greening. The idea behind RRS is to encourage recycling collectors by paying them for the points they earn. In doing so, both industries and individuals reap the economic benefits of such a system. Finally, and more importantly, the system aims to achieve a green environment for the Earth. This paper discusses the design and implementation of the RRS, covering the architectural design, selection of components, and implementation issues. Five modules are used to construct the system, namely: database, data entry, points collecting and recording, points reward…
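
A minimal sketch of what the points-collecting-and-recording idea could look like: a ledger keyed by collector that accumulates points per deposited recyclable. The point values, material names, and reward rule are assumptions, not the paper's design:

from collections import defaultdict

POINTS_PER_KG = {"plastic": 2, "paper": 1, "aluminium": 5}   # hypothetical rates

class RecycleRewardLedger:
    def __init__(self):
        self.points = defaultdict(int)          # collector id -> accumulated points

    def record_deposit(self, collector_id, material, kg):
        self.points[collector_id] += POINTS_PER_KG[material] * kg

    def redeem(self, collector_id):
        """Pay out and reset the collector's balance."""
        earned, self.points[collector_id] = self.points[collector_id], 0
        return earned

ledger = RecycleRewardLedger()
ledger.record_deposit("C-001", "plastic", 3)
ledger.record_deposit("C-001", "aluminium", 1)
print(ledger.redeem("C-001"))   # 11 points under the hypothetical rates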

Publication Date
Mon Apr 01 2019
Journal Name
2019 International Conference On Automation, Computational And Technology Management (icactm)
Multi-Resolution Hierarchical Structure for Efficient Data Aggregation and Mining of Big Data

Big data analysis is essential for modern applications in areas such as healthcare, assistive technology, intelligent transportation, and environment and climate monitoring. Traditional algorithms in data mining and machine learning do not scale well with data size. Mining and learning from big data require time- and memory-efficient techniques, albeit at the cost of a possible loss in accuracy. We have developed a data aggregation structure to summarize data with a large number of instances and data generated from multiple sources. Data are aggregated at multiple resolutions, and the resolution provides a trade-off between efficiency and accuracy. The structure is built once, updated incrementally, and serves as a common data input for multiple mining an…
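
A minimal sketch of the multi-resolution aggregation idea in general form: keep count, sum, and sum of squares per bin at several resolutions, update incrementally as instances arrive, and answer queries from whichever level is accurate enough. The one-dimensional setting and bin widths are simplifications, not the paper's structure:

from collections import defaultdict

class MultiResolutionAggregate:
    def __init__(self, widths=(1.0, 4.0, 16.0)):          # fine -> coarse resolutions
        self.widths = widths
        self.levels = [defaultdict(lambda: [0, 0.0, 0.0]) for _ in widths]

    def add(self, x):
        """Incremental update: O(number of levels) per new instance."""
        for width, bins in zip(self.widths, self.levels):
            stats = bins[int(x // width)]
            stats[0] += 1          # count
            stats[1] += x          # sum
            stats[2] += x * x      # sum of squares (enables mean/variance per bin)

    def mean(self, level):
        bins = self.levels[level]
        n = sum(s[0] for s in bins.values())
        return sum(s[1] for s in bins.values()) / n if n else float("nan")

agg = MultiResolutionAggregate()
for value in (0.5, 1.7, 3.2, 9.9, 12.4):
    agg.add(value)
print(agg.mean(level=2))   # same global mean at every level; coarser levels use fewer bins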
