With the freedom offered by the Deep Web, people can express themselves freely and discreetly, and sadly, this is one of the reasons illicit activities are carried out there. In this work, a novel dataset of active Dark Web domains, called crawler-DB, is presented. To build crawler-DB, the Onion Routing network (Tor) was sampled, and a web crawler capable of following links was built. The link addresses gathered by the crawler are then classified automatically into five classes. The algorithm built in this study performed well, achieving an accuracy of 85%. A popular text representation method was combined with two different supervised classifiers on the proposed crawler-DB to facilitate the categorization of Tor hidden services. The results of the experiments conducted in this study show that the Term Frequency-Inverse Document Frequency (TF-IDF) word representation with a linear support vector classifier achieves 91% 5-fold cross-validation accuracy when classifying a subset of illegal activities from crawler-DB, while Naïve Bayes reaches 80.6%. The strong performance of the linear SVC could support tools that help the authorities detect these activities. Moreover, the outcomes are expected to be significant in both practical and theoretical respects, and they may pave the way for further research.
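The TF-IDF plus linear-SVC pipeline with 5-fold cross-validation described above can be sketched as follows. The tiny two-class corpus here is synthetic stand-in data; crawler-DB itself is not reproduced.

```python
# Illustrative sketch (not the authors' code): TF-IDF features fed to a
# linear SVC, scored with 5-fold cross-validation as in the abstract.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Synthetic stand-in documents for two hypothetical activity classes.
texts = ["buy bitcoin market listing"] * 10 + ["forum chat discussion board"] * 10
labels = ["marketplace"] * 10 + ["forum"] * 10

clf = make_pipeline(TfidfVectorizer(), LinearSVC())
scores = cross_val_score(clf, texts, labels, cv=5, scoring="accuracy")
print(f"mean 5-fold accuracy: {scores.mean():.2f}")
```

On this trivially separable toy corpus the pipeline scores perfectly; on real crawled pages the 91% figure reported above is the relevant benchmark.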
Several correlations have been proposed for bubble point pressure; however, these correlations cannot predict bubble point pressure accurately over a wide range of operating conditions. This study presents an Artificial Neural Network (ANN) model for predicting bubble point pressure, especially for oil fields in Iraq. The most influential parameters were used as the input layer to the network: reservoir temperature, oil gravity, solution gas-oil ratio, and gas relative density. The model was developed using 104 real data points collected from Iraqi reservoirs. The data were divided into two groups: the first was used to train the ANN model, and the second was used to test the model and evaluate its accuracy and trend stability.
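A minimal sketch of the described setup, four reservoir inputs mapped to bubble point pressure by a small neural network, is given below. The data are randomly generated stand-ins (the study used 104 real Iraqi data points), and the target formula is a hypothetical smooth function, not a published correlation.

```python
# Sketch only: a small ANN regressor on four synthetic reservoir inputs.
# Input columns mirror the abstract: temperature, oil gravity (API),
# solution gas-oil ratio, gas relative density. Target is invented.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.uniform([60, 20, 100, 0.6], [120, 40, 1000, 1.0], size=(104, 4))
# Hypothetical smooth target standing in for measured bubble point pressure.
y = 10 * X[:, 0] - 30 * X[:, 1] + 5 * X[:, 2] + 500 * X[:, 3]

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(10,), solver="lbfgs",
                 max_iter=5000, random_state=0),
)
model.fit(X_train, y_train)
print(f"test R^2: {model.score(X_test, y_test):.2f}")
```

Splitting into a training and a held-out test group, as the abstract describes, is what allows the trend stability of the fitted network to be checked.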
Cutting forces are important factors in determining machine serviceability and product quality. Factors such as cutting speed, feed, depth of cut, and tool nose radius affect surface roughness and cutting forces in turning operations. An artificial neural network model was used to predict cutting forces from inputs including cutting speed (m/min), feed rate (mm/rev), depth of cut (mm), and workpiece hardness (MPa). The outputs of the ANN model are the machined cutting force components; the network showed that all components of the cutting force, the tangential force FT (N), feed force FA (N), and radial force FR (N), are in excellent agreement with the experimental data. Twenty-five samp
The railway network is one of the largest infrastructure projects. Therefore, analyzing and developing such projects should be done with appropriate tools, i.e. GIS tools, because traditional methods consume resources, time, and money, and their results may not be accurate. In this research, the train stations in all of Iraq’s provinces were studied and analyzed using network analysis, which is one of the most powerful techniques within GIS. A free trial copy of ArcGIS® 10.2 software was used in this research in order to achieve the aim of this study. The analysis of current train stations was done based on the road network, because people use roads to reach those train stations. The data layers for this st
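Outside a GIS package, the core of the station accessibility analysis reduces to shortest-path search over a weighted road graph. The sketch below uses an invented toy graph and Dijkstra's algorithm to find the nearest station from a residential node; it illustrates the idea, not the ArcGIS workflow.

```python
# Toy network analysis: nearest train station by shortest travel cost on a
# small invented road graph (edge weights in minutes, all hypothetical).
import heapq

road_graph = {
    "home": {"junction_a": 4, "junction_b": 7},
    "junction_a": {"station_central": 6, "junction_b": 2},
    "junction_b": {"station_east": 3},
    "station_central": {},
    "station_east": {},
}

def nearest_goal(graph, start, goals):
    """Dijkstra from start; return (cost, node) of the first goal reached."""
    dist = {start: 0}
    heap = [(0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node in goals:
            return d, node
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        for nbr, w in graph[node].items():
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                heapq.heappush(heap, (nd, nbr))
    return float("inf"), None

cost, station = nearest_goal(road_graph, "home",
                             {"station_central", "station_east"})
print(station, cost)  # station_east 9
```

A GIS network analyst runs the same computation over real road geometry at scale, with service areas derived from these travel costs.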
An application of neural network techniques was introduced to model the point efficiency of sieve trays, based on a data bank of around 331 data points collected from the open literature. Two models were proposed using the back-propagation algorithm. The first model network takes as inputs: volumetric liquid flow rate (QL), F factor for gas (FS), liquid density (ρL), gas density (ρg), liquid viscosity (μL), gas viscosity (μg), hole diameter (dH), weir height (hw), pressure (P), and surface tension between the liquid and gas phases (σ). In the second network, there are six dimensionless groups: flow factor (F), Reynolds number for liquid (ReL), Reynolds number for gas through the hole (Reg), and the ratio of weir height to hole diameter
In the modern era, which requires the use of networks to transmit data across distances, the transport and storage of such data must be secure, so protection methods are developed to ensure data security. New schemes have been proposed that merge cryptographic principles with other systems to enhance information security. Chaotic maps are one of the interesting systems merged with cryptography for better encryption performance, and biometrics is considered an effective element in many access security systems. In this paper, two systems, fingerprint biometrics and the chaotic logistic map, are combined to encrypt a text message and produce a strong cipher that can withstand many types of attacks. The histogram analysis o
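The chaotic half of such a scheme can be sketched as a logistic-map keystream XOR-ed with the message bytes. In the paper the map's key material is additionally derived from fingerprint features; here the seed x0 and parameter r are fixed stand-ins for that biometric input.

```python
# Minimal sketch of logistic-map stream encryption (not the paper's exact
# scheme): iterate x -> r*x*(1-x), quantize to key bytes, XOR with the text.
def logistic_keystream(x0, r, n):
    """Generate n key bytes from the logistic map x -> r*x*(1-x)."""
    x, stream = x0, []
    for _ in range(n):
        x = r * x * (1 - x)
        stream.append(int(x * 256) % 256)
    return bytes(stream)

def xor_cipher(data, x0=0.3141, r=3.99):
    # x0 and r are hypothetical placeholders for fingerprint-derived keys.
    key = logistic_keystream(x0, r, len(data))
    return bytes(b ^ k for b, k in zip(data, key))

cipher = xor_cipher(b"secret message")
plain = xor_cipher(cipher)  # XOR with the same keystream is its own inverse
print(plain)  # b'secret message'
```

Because the map is deterministic for a given (x0, r), both parties regenerate the identical keystream, while a tiny change in the seed yields a completely different cipher, the sensitivity property that motivates chaotic encryption.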
Nanomaterials have excellent potential for improving the rheological and tribological properties of lubricating oil. In this study, oleic acid was used to surface-modify nanoparticles to enhance the dispersion and stability of the nanofluid. The surface modification was conducted on inorganic nanoparticles (NPs) of TiO₂ and CuO with an oleic acid (OA) surfactant, where oleic acid can render the TiO₂-CuO surface hydrophobic. Fourier transform infrared spectroscopy (FTIR) and scanning electron microscopy (SEM) were used to characterize the surface modification of the NPs. The main objective of this study was to investigate the influence of adding modified TiO₂-CuO NPs at a 1:1 weight ratio on thermal-physical propertie
Feature selection (FS) is a series of processes used to decide which relevant features/attributes to include and which irrelevant features to exclude for predictive modeling. It is a crucial task that helps machine learning classifiers reduce error rates, computation time, and overfitting, and improve classification accuracy. It has demonstrated its efficacy in myriad domains, ranging from text classification (TC) and text mining to image recognition. While there are many traditional FS methods, recent research efforts have been devoted to applying metaheuristic algorithms as FS techniques for the TC task. However, there are few literature reviews concerning TC. Therefore, a comprehensive overview was systematicall
With the widespread use of the internet, especially the web of social media, an unusual quantity of information has appeared that spans a number of fields such as psychology, entertainment, sociology, business, news, politics, and other cultural aspects of nations. Data mining methodologies applied to social media can produce an engaging picture of human behaviour and interaction. This paper demonstrates the application and precision of sentiment analysis using a traditional feedforward network and two recurrent neural networks (the gated recurrent unit (GRU) and long short-term memory (LSTM)) to find the differences between them. In order to test the system’s performance, a set of tests was applied to two public datasets. The firs
Euphorbia lateriflora is a popular traditional medicinal plant whose leaves are used in Africa, especially Nigeria, to treat wounds and many diseases. This study investigated the preliminary phytochemical constituents, secondary metabolites by the High-Performance Liquid Chromatography (HPLC) technique, and antimicrobial potentials (Minimum Inhibitory Concentration (MIC), Minimum Fungicidal Concentration (MFC), and disc diffusion assay) of various concentrations (100 mg/mL, 50 mg/mL, and 25 mg/mL) of the solvent (ethyl acetate and n-hexane) extracts of E. lateriflora against Staphylococcus aureus, Escherichia coli, and Candida albicans. The phytochemical screening revealed that
This article presents a polynomial-based image compression scheme that uses the YUV color model to represent color content and two-dimensional first-order polynomial coding with variable block size according to the correlation between neighboring pixels. The residual part of the polynomial for all bands is split into two parts: a most important (big) part and least important (small) parts. Due to the significant subjective importance of the big group, lossless compression (based on run-length spatial coding) is used to represent it. Furthermore, a lossy compression scheme is utilized to approximately represent the small group; it is based on an error-limited adaptive coding system and uses transform codin