With the freedom offered by the Deep Web, people have the opportunity to express themselves freely and discreetly; sadly, this is also one of the reasons why people carry out illicit activities there. In this work, a novel dataset of active Dark Web domains, known as crawler-DB, is presented. To build crawler-DB, the Onion Routing network (Tor) was sampled, and a web crawler capable of following links was built. The link addresses gathered by the crawler are then classified automatically into five classes. The algorithm built in this study demonstrated good performance, achieving an accuracy of 85%. A popular text representation method was combined with two different supervised classifiers over the proposed crawler-DB to facilitate the categorization of Tor hidden services. The results of the experiments conducted in this study show that the Term Frequency-Inverse Document Frequency (TF-IDF) word representation with a linear support vector classifier achieves 91% five-fold cross-validation accuracy when classifying a subset of illegal activities from crawler-DB, while the accuracy of Naïve Bayes was 80.6%. The good performance of the linear SVC could support tools that help the authorities detect these activities. Moreover, the outcomes are expected to be significant in both practical and theoretical respects, and they may pave the way for further research.
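The TF-IDF plus linear-SVC pipeline with five-fold cross-validation described above can be sketched as follows; the toy corpus and the two class labels are illustrative placeholders, not data or results from crawler-DB:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Placeholder documents standing in for crawled page text (two hypothetical
# activity classes, five documents each so stratified 5-fold CV is possible).
docs = [
    "buy vendor escrow market listing",
    "vendor shipping market bitcoin listing",
    "market escrow bitcoin vendor price",
    "listing price market vendor escrow",
    "bitcoin market listing shipping price",
    "forum thread reply discussion post",
    "post reply moderator thread forum",
    "discussion forum post thread user",
    "reply user forum discussion thread",
    "thread post forum user moderator",
]
labels = ["market"] * 5 + ["forum"] * 5

# TF-IDF word representation feeding a linear support vector classifier,
# scored with 5-fold cross-validation as in the abstract's setup.
clf = make_pipeline(TfidfVectorizer(), LinearSVC())
scores = cross_val_score(clf, docs, labels, cv=5)
print(scores.mean())
```

With a classifier target, `cross_val_score` stratifies the folds, so each fold keeps the class balance of the toy corpus.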
The objective of this work is to design and implement a cryptography system that enables the sender to send a message through any channel (even an insecure one) and the receiver to decrypt the received message, without allowing any intruder to break the system and extract the secret information. In this work, we implement an interaction between a feedforward neural network and a stream cipher, so the secret message is encrypted by an unsupervised neural network method in addition to the first encryption process, which is performed by the stream cipher method. The security of any cipher system depends on the security of the related keys (used by the encryption and decryption processes) and their corresponding le…
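The stream-cipher stage of such a system can be illustrated with a minimal keystream-XOR sketch; the random keystream below is a placeholder for a keyed generator, and the paper's neural-network encryption stage is not reproduced here:

```python
import secrets

def xor_stream(data: bytes, keystream: bytes) -> bytes:
    # Core stream-cipher operation: XOR each message byte with a keystream byte.
    return bytes(d ^ k for d, k in zip(data, keystream))

message = b"secret message"
# Placeholder keystream; a real system derives this from a keyed generator,
# and this work adds a second, neural-network encryption layer on top.
keystream = secrets.token_bytes(len(message))

ciphertext = xor_stream(message, keystream)
recovered = xor_stream(ciphertext, keystream)  # XOR is its own inverse
```

Because XOR is self-inverse, applying the same keystream twice recovers the plaintext, which is why keystream secrecy carries the whole security burden.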
Image retrieval is used to search for images in an image database. In this paper, content-based image retrieval (CBIR) using four feature extraction techniques has been achieved. The four techniques are the colored histogram features technique, the properties features technique, the gray-level co-occurrence matrix (GLCM) statistical features technique, and a hybrid technique. The features are extracted from the database images and the query (test) images in order to compute the similarity measure. Similarity-based matching is very important in CBIR, so three types of similarity measure are used: normalized Mahalanobis distance, Euclidean distance, and Manhattan distance. A comparison between them has been implemented. From the results, it is conclud…
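The three similarity measures named above can be sketched directly; the feature vectors are toy placeholders for the extracted image features:

```python
import numpy as np

def euclidean(q, f):
    # Straight-line (L2) distance between feature vectors.
    return float(np.linalg.norm(np.asarray(q) - np.asarray(f)))

def manhattan(q, f):
    # City-block (L1) distance: sum of absolute per-feature differences.
    return float(np.abs(np.asarray(q) - np.asarray(f)).sum())

def mahalanobis(q, f, cov_inv):
    # Normalized distance that weights features by the inverse covariance;
    # with cov_inv = identity it reduces to the Euclidean distance.
    d = np.asarray(q) - np.asarray(f)
    return float(np.sqrt(d @ cov_inv @ d))

# Toy 3-element vectors standing in for extracted image features.
query = np.array([0.2, 0.5, 0.1])
db_img = np.array([0.3, 0.4, 0.3])
print(euclidean(query, db_img), manhattan(query, db_img))
```

In a CBIR loop, the query vector is compared against every database vector and the smallest distance wins; the choice of measure changes which matches rank highest.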
In cognitive radio systems, spectrum sensing faces a major challenge: the need for a sensing method with high detection capability and reduced complexity. In this paper, a low-cost hybrid spectrum sensing method with optimized detection performance, based on energy and cyclostationary detectors, is proposed. The method is designed such that at high signal-to-noise ratio (SNR) values, the energy detector is used alone to perform the detection. At low SNR values, a cyclostationary detector with reduced complexity may be employed to support accurate detection. The complexity reduction is done in two ways: by reducing the number of sensing samples used in the autocorrelation process in the time domain, and by using the Slid…
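The hybrid switch between the two detectors can be sketched as below; the thresholds, the SNR switch point, and the single-lag autocorrelation test are illustrative simplifications, not the paper's tuned design:

```python
import numpy as np

def energy_detect(samples, threshold):
    # Energy detector: compare mean signal power against a decision threshold.
    return float(np.mean(np.abs(samples) ** 2)) > threshold

def cyclo_detect(samples, lag, threshold):
    # Reduced-complexity stand-in for the cyclostationary detector:
    # time-domain autocorrelation evaluated at a single lag.
    r = np.abs(np.mean(samples[:-lag] * samples[lag:]))
    return r > threshold

def hybrid_sense(samples, snr_db, lag, energy_thr=2.0,
                 cyclo_thr=1.0, snr_switch=0.0):
    # High SNR: the cheap energy detector alone suffices.
    # Low SNR: fall back to the cyclostationary test for accuracy.
    if snr_db >= snr_switch:
        return energy_detect(samples, energy_thr)
    return cyclo_detect(samples, lag, cyclo_thr)

rng = np.random.default_rng(0)
n = np.arange(1000)
noise = rng.normal(0.0, 1.0, 1000)                      # channel vacant
signal = noise + 3.0 * np.sin(2 * np.pi * 0.1 * n)      # period-10 tone present
```

The lag is chosen to match the signal's period, which is where a cyclostationary feature shows up while white noise averages toward zero.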
This paper proposes a new methodology for improving network security by introducing an optimised hybrid intrusion detection system (IDS) framework as a middle layer between the end devices. It addresses the difficulty of updating databases to uncover new threats, which plagues firewalls and detection systems, in addition to big data challenges. The proposed framework introduces a supervised network IDS based on the deep learning technique of convolutional neural networks (CNN), using the UNSW-NB15 dataset. It implements recursive feature elimination (RFE) with extreme gradient boosting (XGB) to reduce resource and time consumption. Additionally, it reduces bias toward…
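The RFE feature-reduction step can be sketched as follows; the synthetic data stands in for UNSW-NB15 flow features, and scikit-learn's `GradientBoostingClassifier` is used here as a stand-in booster (the `xgboost` scikit-learn wrapper would slot into `RFE` the same way):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.feature_selection import RFE

# Synthetic placeholder for UNSW-NB15 features; the real dataset is not
# bundled here.
X, y = make_classification(n_samples=300, n_features=20, n_informative=6,
                           random_state=0)

# RFE repeatedly fits the booster and drops the least important features
# (4 per round here) until 8 remain; the reduced matrix then feeds the CNN.
selector = RFE(GradientBoostingClassifier(random_state=0),
               n_features_to_select=8, step=4)
selector.fit(X, y)
X_reduced = selector.transform(X)
print(X_reduced.shape)
```

Cutting the feature count before the CNN stage is what delivers the resource and time savings the abstract claims.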
The aim of this paper is to design a suitable artificial neural network (ANN) as an alternative, accurate tool to evaluate the concentration of copper in contaminated soils. First, sixteen (4x4) soil samples were harvested from a phytoremediated contaminated site located in Baghdad city, Iraq. Second, a series of measurements was performed on the soil samples. Third, an ANN was designed, its performance was evaluated using a test data set, and it was then applied to estimate the concentration of copper. The performance of the ANN technique was compared with traditional laboratory inspection using the training and test data sets. The results of this study show that an ANN trained on experimental measurements can be successfully applied to the rapid est…
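A minimal regression-ANN sketch of this setup, assuming synthetic stand-ins throughout: the 16 samples, the four "measured soil properties", and the linear concentration rule below are all illustrative, not the site's real measurements:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Placeholder for the 16 (4x4 grid) soil samples with 4 measured properties.
rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, (16, 4))
# Illustrative "copper concentration" target, not real soil chemistry.
y = X @ np.array([2.0, -1.0, 0.5, 3.0]) + 10.0

# Small feedforward ANN; the lbfgs solver suits very small data sets.
model = MLPRegressor(hidden_layer_sizes=(8,), solver="lbfgs",
                     max_iter=2000, random_state=0)
model.fit(X, y)
pred = model.predict(X)
```

In practice the 16 samples would be split into training and test sets, as the abstract describes, rather than fit and evaluated on the same data.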
... Show MoreModern ciphers are one of the more difficult to break cipher systems because these ciphers high security, high speed, non - propagation error and difficulty in breaking it. One of the most important weaknesses of stream cipher is a matching or correlation between the output key-stream and the output of shift registers.
This work considers new investigation methods for the cryptanalysis of stream ciphers using a ciphertext-only attack, relying on Particle Swarm Optimization (PSO) for automatic key extraction. It also introduces a PSO-based cryptanalysis system with a suggested enhancement of PSO's performance using Simulated Annealing (SA). Additionally, it presents a comparison of the cryptanal…
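A minimal PSO sketch of the search loop involved; the quadratic fitness and the hidden target vector below are hypothetical stand-ins for the paper's ciphertext-only statistic over candidate keys:

```python
import numpy as np

def pso(fitness, dim, n_particles=20, iters=100, seed=0):
    # Minimal particle swarm: each particle's velocity is pulled toward its
    # own best position and the swarm-wide best position.
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-5.0, 5.0, (n_particles, dim))
    vel = np.zeros((n_particles, dim))
    pbest = pos.copy()
    pbest_val = np.array([fitness(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([fitness(p) for p in pos])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = pos[better], vals[better]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, float(pbest_val.min())

# Hypothetical fitness: squared distance to a hidden "key" vector. A real
# ciphertext-only attack would instead score candidate keystreams against
# statistical properties of the intercepted ciphertext.
target = np.array([1.0, -2.0, 3.0])
best, best_val = pso(lambda k: float(np.sum((k - target) ** 2)), dim=3)
```

The SA enhancement mentioned above would occasionally accept worse candidate positions to escape local optima, a step this bare sketch omits.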