With the freedom offered by the Deep Web, people have the opportunity to express themselves freely and discreetly, and sadly, this is one of the reasons why illicit activities are carried out there. In this work, a novel dataset of active Dark Web domains, known as crawler-DB, is presented. To build crawler-DB, the Onion Routing network (Tor) was sampled, and a web crawler capable of following links was built. The link addresses gathered by the crawler are then classified automatically into five classes. The algorithm built in this study demonstrated good performance, achieving an accuracy of 85%. A popular text representation method was combined with two different supervised classifiers over the proposed crawler-DB to facilitate the categorization of Tor hidden services. The experimental results show that the Term Frequency-Inverse Document Frequency (TF-IDF) word representation with a linear support vector classifier achieves a 5-fold cross-validation accuracy of 91% when classifying a subset of illegal activities from crawler-DB, while the Naïve Bayes classifier reaches 80.6%. The good performance of the linear SVC could support tools that help the authorities detect these activities. Moreover, the outcomes are expected to be significant in both practical and theoretical aspects, and they may pave the way for further research.
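As a rough, minimal sketch of the classification pipeline the abstract describes (TF-IDF features with a linear SVC and a Naïve Bayes baseline, scored by 5-fold cross-validation), the scikit-learn snippet below illustrates the idea; the toy documents and labels are placeholders for the crawler-DB subset, not the actual data or code from the study.

```python
# Minimal sketch: TF-IDF + linear SVC vs. Naive Bayes with 5-fold cross-validation.
# The documents and labels below are placeholders, not crawler-DB data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

docs = ["counterfeit passports for sale", "stolen card dumps fresh batch",
        "weapons shipped discreetly", "hire a hacking service", "drug market mirror",
        "open source privacy tools", "book club reading list", "secure email tutorial",
        "community forum rules", "photography portfolio hosting"]
labels = ["illegal"] * 5 + ["legal"] * 5

for name, clf in [("LinearSVC", LinearSVC()), ("NaiveBayes", MultinomialNB())]:
    pipeline = make_pipeline(TfidfVectorizer(), clf)
    scores = cross_val_score(pipeline, docs, labels, cv=5, scoring="accuracy")
    print(name, "mean accuracy:", scores.mean())
```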
In this work, a hybrid scheme for speaker verification of Arabic speech is presented. The scheme is hybrid in that it combines traditional digital signal processing with a neural network: a Kohonen neural network is used as the recognizer for speaker verification after spectral features are extracted from the acoustic signal using the Fast Fourier Transform (FFT) algorithm. The system was implemented on a Pentium-compatible processor at 1000 MHz running MS-DOS 6.2.
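As a loose illustration of that pipeline (FFT-based spectral features feeding a Kohonen self-organizing map), the NumPy sketch below shows the two stages; the frame length, map size, learning parameters, and synthetic signal are assumptions made for illustration, not the values or data used in the original system.

```python
# Rough sketch: FFT spectral features + a tiny Kohonen self-organizing map.
# Frame length, map size, and learning parameters are illustrative only.
import numpy as np

def spectral_features(signal, frame_len=256, hop=128):
    """Split the signal into frames and return log-magnitude FFT spectra."""
    frames = [signal[i:i + frame_len]
              for i in range(0, len(signal) - frame_len + 1, hop)]
    spectra = np.abs(np.fft.rfft(np.array(frames), axis=1))
    return np.log1p(spectra)

def train_som(data, rows=8, cols=8, epochs=20, lr=0.5, radius=2.0):
    """Train a minimal Kohonen map on the feature vectors."""
    rng = np.random.default_rng(0)
    weights = rng.normal(size=(rows, cols, data.shape[1]))
    grid = np.dstack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"))
    for epoch in range(epochs):
        for x in data:
            # Best-matching unit = node whose weight vector is closest to x.
            dists = np.linalg.norm(weights - x, axis=2)
            bmu = np.unravel_index(np.argmin(dists), dists.shape)
            # Gaussian neighbourhood around the BMU; learning rate decays per epoch.
            d_grid = np.linalg.norm(grid - np.array(bmu), axis=2)
            h = np.exp(-(d_grid ** 2) / (2 * radius ** 2))[:, :, None]
            weights += lr * (1 - epoch / epochs) * h * (x - weights)
    return weights

# Random stand-in for an acoustic signal (a real system would load recorded audio).
signal = np.random.default_rng(1).normal(size=16000)
features = spectral_features(signal)
som_weights = train_som(features)
```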
Next-generation networks, such as 5G and 6G, require high capacity, low latency, and high dependability. According to experts, one of the most important features of 5G and 6G networks is network slicing. To enhance the Quality of Service (QoS), network operators can now run multiple slice instances on the same infrastructure thanks to configurable per-slice QoS. Each slice may be allocated a different amount of virtualized network resources, such as link bandwidth, buffer size, and computing functions. Because network resources are limited, the virtual resources of the slices must be carefully coordinated to meet the different QoS requirements of users and services. These networks may be modified ...
Wellbore instability is one of the major issues observed throughout the drilling operation. Various wellbore instability issues may occur during drilling operations, including tight holes, borehole collapse, stuck pipe, and shale caving. Rock failure criteria are important in geomechanical analysis since they predict shear and tensile failures. A suitable failure criterion must match the actual rock failure, which a caliper log can detect, in order to estimate the optimal mud weight. Caliper logs are unavailable for certain wells due to a lack of data, which makes it difficult to validate the performance of each failure criterion. This paper proposes an approach for predicting the breakout zones in the Nasiriyah oil field using an artificial neural network. ...
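A minimal sketch of the kind of supervised model described (a neural network mapping per-depth geomechanical inputs to breakout / no-breakout labels) is shown below; the feature set and synthetic data are hypothetical stand-ins, not the Nasiriyah field logs or the authors' network design.

```python
# Sketch of a breakout-zone classifier; the input features and data are
# hypothetical stand-ins, not Nasiriyah field logs.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Hypothetical per-depth inputs, e.g. depth, mud weight, pore pressure, and
# minimum/maximum horizontal stress; label 1 = breakout, 0 = in-gauge hole.
X = rng.normal(size=(500, 5))
y = (X[:, 1] + 0.5 * X[:, 3] + rng.normal(scale=0.3, size=500) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=2000,
                                    random_state=0))
model.fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))
```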
It is well known that the initialization of architectural parameters has a great impact on the whole learning process, so knowing the mathematical properties of a dataset helps give a neural network architecture better expressivity and capacity. In this paper, five random samples of the Volve field dataset were taken. A training set was then specified, and the persistent homology of the dataset was calculated to show the impact of data complexity on the selection of the multilayer perceptron regressor (MLPR) architecture. The proposed method provides a well-rounded strategy for computing data complexity. Our method is a compound algorithm composed of the t-SNE method, the alpha-complexity algorithm, and a persistence barcode ...
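A rough sketch of the data-complexity computation outlined above is given below: scikit-learn's t-SNE embeds the samples and the ripser package computes persistence diagrams. Note that ripser builds a Vietoris-Rips complex, used here only as a stand-in for the alpha-complex step, and the data are synthetic rather than Volve-field samples.

```python
# Sketch: embed samples with t-SNE, then compute persistence diagrams as a
# proxy for data complexity. Synthetic data; ripser (Vietoris-Rips) stands in
# for the alpha-complex / persistence-barcode step described in the abstract.
import numpy as np
from sklearn.manifold import TSNE
from ripser import ripser

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 10))          # stand-in for a sampled training set

embedding = TSNE(n_components=2, random_state=0).fit_transform(X)
diagrams = ripser(embedding, maxdim=1)["dgms"]

# A simple complexity summary: the number of 1-dimensional features (loops)
# whose persistence (death - birth) exceeds a threshold.
h1 = diagrams[1]
persistence = h1[:, 1] - h1[:, 0]
print("significant loops:", int(np.sum(persistence > 0.5)))
```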
Image Fusion Using a Convolutional Neural Network
Keywords provide the reader with a summary of the contents of a document and play a significant role in information retrieval systems, especially in search engine optimization and bibliographic databases. Furthermore, keywords help classify a document into its related topic. Keyword extraction has traditionally been manual, depending on the content of the document or article and the judgment of its author. Manual extraction of keywords is costly, consumes effort and time, and is error-prone. In this research, an automatic Arabic keyword extraction model based on deep learning algorithms is proposed. The model consists of three main steps: preprocessing, feature extraction, and classification to classify the document ...
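To make the three-step pipeline concrete, the toy sketch below tokenizes a document, computes simple per-candidate features, and classifies each candidate as keyword or not. A small scikit-learn MLP and hand-made features stand in for the deep learning model; the document, features, and labels are invented for illustration only.

```python
# Toy sketch of the three-step pipeline (preprocessing, feature extraction,
# classification). A small MLP and simple features stand in for the deep model;
# the document and labels are placeholders.
import numpy as np
from sklearn.neural_network import MLPClassifier

doc = "neural networks enable automatic keyword extraction from arabic text"
tokens = doc.split()                      # preprocessing (toy tokenizer)

def features(token, position, doc_tokens):
    """Per-candidate features: relative position, length, in-document frequency."""
    return [position / len(doc_tokens), len(token), doc_tokens.count(token)]

X = np.array([features(t, i, tokens) for i, t in enumerate(tokens)])
y = np.array([1, 1, 0, 1, 1, 1, 0, 0, 0])   # placeholder labels: 1 = keyword

clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
clf.fit(X, y)
print([t for t, k in zip(tokens, clf.predict(X)) if k == 1])
```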
In this research, an Artificial Neural Network (ANN) technique was applied to study the filtration process in water treatment. Eight models were developed and tested using data from a pilot filtration plant working under different process design criteria: influent turbidity, bed depth, grain size, filtration rate, and running time (length of the filtration run), with effluent turbidity and head losses recorded. The ANN models were constructed for the prediction of different performance criteria in the filtration process: effluent turbidity, head losses, and running time. The results indicate that it is quite possible to use artificial neural networks to predict effluent turbidity, head losses, and running time in the filtration process ...
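A minimal sketch of such a prediction model is shown below, with an MLP regressor mapping the five design criteria to effluent turbidity and head loss; the data are synthetic placeholders rather than the pilot-plant measurements, and the network size is an arbitrary choice.

```python
# Sketch of an ANN predicting filtration performance from process design criteria.
# Feature order (influent turbidity, bed depth, grain size, filtration rate,
# running time) follows the abstract; the data here are synthetic placeholders.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 5))            # the five design criteria
# Targets: effluent turbidity and head loss (synthetic stand-ins).
Y = np.column_stack([
    0.3 * X[:, 0] - 0.1 * X[:, 1] + rng.normal(scale=0.02, size=200),
    0.5 * X[:, 3] + 0.2 * X[:, 4] + rng.normal(scale=0.02, size=200)])

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000,
                                   random_state=0))
model.fit(X, Y)
print("R^2 on training data:", model.score(X, Y))
```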
Ad hoc networks are a generation of truly wireless networks that can be constructed easily without any operator. Several protocols exist for managing these networks, and Quality of Service (QoS) is one of the most important measures of their effectiveness. In this work, the QoS performance of MANETs is evaluated by comparing the AODV, DSR, OLSR, and TORA routing protocols using the OPNET Modeler, followed by an extensive set of performance experiments for these protocols with a wide variety of settings. The results show that the best protocol depends on the QoS under two types of applications (+ve and –ve QoS in the FIS evaluation), and the QoS of a protocol varies from one protocol ...
Feature selection represents one of the critical processes in machine learning (ML). The fundamental aim of feature selection is to maintain performance accuracy while reducing the dimensionality of the feature space. Different approaches have been created for classifying datasets. Swarm techniques have produced better outcomes in a range of optimization problems, and hybrid algorithms have recently received a lot of attention for solving optimization problems. As a result, this study provides a thorough assessment of the literature on feature selection using hybrid swarm algorithms developed over time (2018-2021). Lastly, when compared with current feature selection procedures ...
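As a concrete (if simplified) illustration of the wrapper-based swarm feature selection this survey covers, the sketch below runs a plain binary particle swarm optimizer, not a hybrid one, over a public scikit-learn dataset; the swarm parameters and the k-NN evaluator are illustrative choices.

```python
# Sketch: wrapper-based feature selection with a plain binary PSO (not a hybrid
# algorithm). Fitness = cross-validated k-NN accuracy on the selected features.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)
rng = np.random.default_rng(0)
n_particles, n_features, iterations = 8, X.shape[1], 10

def fitness(mask):
    """Accuracy of k-NN using only the features where mask == 1."""
    selected = mask.astype(bool)
    if not selected.any():
        return 0.0
    return cross_val_score(KNeighborsClassifier(), X[:, selected], y, cv=3).mean()

positions = (rng.random((n_particles, n_features)) > 0.5).astype(float)
velocities = rng.normal(scale=0.1, size=(n_particles, n_features))
pbest = positions.copy()
pbest_fit = np.array([fitness(p) for p in positions])
gbest = pbest[np.argmax(pbest_fit)].copy()

for _ in range(iterations):
    r1, r2 = rng.random(velocities.shape), rng.random(velocities.shape)
    velocities = (0.7 * velocities + 1.5 * r1 * (pbest - positions)
                  + 1.5 * r2 * (gbest - positions))
    # Sigmoid transfer function turns real-valued velocities into bit flips.
    positions = (rng.random(velocities.shape) < 1.0 / (1.0 + np.exp(-velocities))).astype(float)
    fits = np.array([fitness(p) for p in positions])
    improved = fits > pbest_fit
    pbest[improved], pbest_fit[improved] = positions[improved], fits[improved]
    gbest = pbest[np.argmax(pbest_fit)].copy()

print("selected features:", int(gbest.sum()), "best accuracy:", round(pbest_fit.max(), 3))
```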