With the freedom offered by the Deep Web, people have the opportunity to express themselves freely and discreetly, and sadly, this is one of the reasons why people carry out illicit activities there. In this work, a novel dataset of active Dark Web domains, known as crawler-DB, is presented. To build crawler-DB, the Onion Routing network (Tor) was sampled, and a web crawler capable of following links was built. The link addresses gathered by the crawler are then classified automatically into five classes. The algorithm built in this study demonstrated good performance, achieving an accuracy of 85%. A popular text representation method was combined with two different supervised classifiers on the proposed crawler-DB to facilitate the categorization of Tor hidden services. The experiments conducted in this study show that the Term Frequency-Inverse Document Frequency (TF-IDF) word representation with a linear support vector classifier achieves 91% five-fold cross-validation accuracy when classifying a subset of illegal activities from crawler-DB, while the accuracy of Naïve Bayes was 80.6%. The good performance of the linear SVC could support tools that help the authorities detect these activities. Moreover, the outcomes are expected to be significant in both practical and theoretical respects and may pave the way for further research.
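A minimal sketch of the TF-IDF plus classifier comparison described above, using scikit-learn; the tiny text and label lists are illustrative placeholders for the crawler-DB corpus (which has five classes, not the two shown here), and the accuracy printed is not the study's result.

```python
# Illustrative TF-IDF + linear SVC vs. Naive Bayes comparison with 5-fold CV.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

# Placeholder page texts and labels standing in for the crawled corpus.
texts = [
    "escrow marketplace vendor bitcoin listing", "vendor shop crypto payments",
    "market listing goods bitcoin", "anonymous market escrow shop",
    "darknet market vendor reviews",
    "forum board discussion thread reply", "community forum post topic",
    "discussion board members thread", "forum privacy discussion post",
    "onion forum user replies",
]
labels = ["market"] * 5 + ["forum"] * 5

svc = make_pipeline(TfidfVectorizer(), LinearSVC())
nb = make_pipeline(TfidfVectorizer(), MultinomialNB())

# Five-fold cross-validation accuracy, the metric reported in the abstract.
print("LinearSVC :", cross_val_score(svc, texts, labels, cv=5).mean())
print("NaiveBayes:", cross_val_score(nb, texts, labels, cv=5).mean())
```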
Selective recovery of atropine from Datura innoxia seeds was studied. Applying pertraction in a rotating film contactor (RFC), the alkaloid was successfully recovered from native aqueous extracts obtained from the plant seeds. Decane was used as the liquid membrane and sulfuric acid as the stripping agent. Pertraction from the native liquid extracts also provided good atropine refinement, since most of the compounds co-extracted from the plant species remained in the feed or membrane solution. Solid–liquid extraction of atropine from Datura innoxia seeds was coupled with RF-pertraction in order to simultaneously purify the extract obtained from the plant. Applying the integrated process proposed in this study, a product containing 92.6% atropine was…
In this work, results from an optical technique (the laser speckle technique) for measuring surface roughness are presented, using statistical properties of the speckle pattern from the point of view of computer image texture analysis. Four calibration relationships were used to cover a wide range of measurement with the same laser speckle technique. The first is based on the intensity contrast of the speckle, the second on analysis of the speckle binary image, the third on the size of the speckle pattern spot, and the last on characterization of the energy feature of the gray-level co-occurrence matrices of the speckle pattern. With these calibration relationships, the surface roughness of an object surface can be evaluated within the…
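Two of the speckle descriptors named above can be illustrated with a short sketch: assuming a speckle image saved as speckle.png, the snippet below computes the intensity contrast and the GLCM energy feature with scikit-image (the GLCM distance, angle, and level settings are assumptions, not values from the study).

```python
# Hedged sketch: speckle intensity contrast and GLCM energy of a speckle image.
import numpy as np
from skimage import io
from skimage.feature import graycomatrix, graycoprops

speckle = io.imread("speckle.png", as_gray=True)      # placeholder file name
speckle_u8 = (speckle * 255).astype(np.uint8)

# Speckle (intensity) contrast: standard deviation over mean of intensity.
contrast = speckle.std() / speckle.mean()

# Energy feature of the gray-level co-occurrence matrix (assumed parameters).
glcm = graycomatrix(speckle_u8, distances=[1], angles=[0], levels=256,
                    symmetric=True, normed=True)
energy = graycoprops(glcm, "energy")[0, 0]

print(f"speckle contrast = {contrast:.3f}, GLCM energy = {energy:.4f}")
```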
One of the troublesome duties in chemical industrial units is determining the instantaneous drop size distribution created between two immiscible liquids within such units. In this work, a complete system for measuring instantaneous droplet size was constructed. It consists of a laser detection system (1 mW He-Ne laser), a drop generation system (turbine mixer unit), and a microphotography system. Two immiscible liquids, water and kerosene, were mixed at different low volume fractions (0.0025 and 0.02) of kerosene (as the dispersed phase) in water (as the continuous phase). The experiments were carried out at different rotational speeds (1180-2090 rpm) of the turbine mixer. The Sauter mean diameter of the drops was determined by…
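For reference, the Sauter mean diameter d32 can be computed from a set of measured droplet diameters as the ratio of the summed cubes to the summed squares; the sketch below uses illustrative diameter values, not data from this work.

```python
# Minimal Sauter mean diameter (d32) calculation from measured drop diameters.
import numpy as np

# Illustrative droplet diameters in micrometres (placeholder measurements).
diameters_um = np.array([45.0, 60.0, 52.0, 71.0, 38.0])

# d32 = sum(d_i^3) / sum(d_i^2): diameter of a sphere with the same
# volume-to-surface-area ratio as the whole drop population.
d32 = (diameters_um ** 3).sum() / (diameters_um ** 2).sum()
print(f"Sauter mean diameter: {d32:.1f} um")
```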
Drought is one of the natural hazards that may harm human life and property under different weather and environmental conditions. This study used remote sensing data to monitor agricultural and meteorological drought in Babel Governorate. Drought maps were drawn using Landsat 8 images based on the Normalized Difference Vegetation Index (NDVI) for 2015, 2018, and 2021. Meteorological drought was assessed using the standardized precipitation index (SPI-12) for the same years. The results showed that SPI-12 indicated near-normal drought conditions in 2015 and 2018, whereas the SPI values were lower in 2021. Two drought categories were identified: moderate drought and severe drought. The NDVI results showed…
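A minimal sketch of the NDVI computation underlying the drought maps, assuming the Landsat 8 red (band 4) and near-infrared (band 5) rasters are available as GeoTIFF files; the file names are placeholders.

```python
# NDVI = (NIR - Red) / (NIR + Red) from Landsat 8 band 4 (red) and band 5 (NIR).
import numpy as np
import rasterio

with rasterio.open("LC08_B4_red.tif") as red_src, rasterio.open("LC08_B5_nir.tif") as nir_src:
    red = red_src.read(1).astype("float64")
    nir = nir_src.read(1).astype("float64")

# Guard against division by zero where both bands are empty.
ndvi = np.where((nir + red) == 0, 0.0, (nir - red) / (nir + red))
print("NDVI range:", ndvi.min(), ndvi.max())
```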
Uncompressed digital images require a very large storage capacity and, as a consequence, a large communication bandwidth for data transmission over the network. Image compression techniques not only minimize the image storage space but also preserve image quality. This paper presents an image compression technique that uses a distinct image coding scheme based on the wavelet transform, combining effective types of compression algorithms for further compression. EZW and SPIHT are significant compression techniques available for lossy image compression. EZW coding is a worthwhile, simple, and efficient algorithm. SPIHT is a powerful technique utilized for image…
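As a hedged illustration of the first step shared by EZW and SPIHT coders, the sketch below performs a multilevel 2-D discrete wavelet transform with PyWavelets; the wavelet family and decomposition depth are assumptions, not the paper's settings.

```python
# Multilevel 2-D DWT of a sample image; EZW/SPIHT then encode the resulting
# coefficients by significance, from the coarsest subband downwards.
import numpy as np
import pywt
from skimage import data

image = data.camera().astype(np.float64)          # built-in sample grayscale image

coeffs = pywt.wavedec2(image, wavelet="bior4.4", level=3)   # assumed wavelet and depth
approx, detail_levels = coeffs[0], coeffs[1:]
print("approximation subband:", approx.shape,
      "detail levels:", [d[0].shape for d in detail_levels])
```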
The effective surface area of drug particles is increased by a reduction in particle size. Since dissolution takes place at the surface of the solute, the larger the surface area, the more rapid the rate of drug dissolution. Ketoprofen is a class II drug according to the Biopharmaceutics Classification System (BCS), with low solubility and high permeability. The aim of this investigation was to increase the solubility, and hence the dissolution rate, by preparing a ketoprofen nanosuspension using the solvent evaporation method. Materials such as PVP K30, poloxamer 188, HPMC E5, HPMC E15, HPMC E50, and Tween 80 were used as stabilizers in the preparation of different…
Estimating the semantic similarity between short texts plays an increasingly prominent role in many fields related to text mining and natural language processing applications, especially with the large increase in the volume of textual data produced daily. Traditional approaches to calculating the degree of similarity between two texts, based on the words they share, do not perform well with short texts because two similar texts may be written in different terms by employing synonyms. As a result, short texts should be compared semantically. In this paper, a semantic similarity measurement method between texts is presented which combines knowledge-based and corpus-based semantic information to build a semantic network that represents…
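One way to picture the idea of mixing knowledge-based and corpus-based evidence is sketched below: a WordNet path-similarity score is blended with the cosine similarity of word vectors. The tiny vectors and the mixing weight alpha are placeholders and not the paper's formulation.

```python
# Blend of a knowledge-based score (WordNet) and a corpus-based score (cosine).
import numpy as np
from nltk.corpus import wordnet as wn   # requires: nltk.download("wordnet")

def knowledge_sim(w1, w2):
    """Best WordNet path similarity over all synset pairs (0 if none found)."""
    scores = [s1.path_similarity(s2) or 0.0
              for s1 in wn.synsets(w1) for s2 in wn.synsets(w2)]
    return max(scores, default=0.0)

def corpus_sim(v1, v2):
    """Cosine similarity between two word vectors."""
    return float(np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2)))

# Placeholder pretrained vectors and a hypothetical mixing weight.
vectors = {"car": np.array([0.90, 0.10, 0.30]),
           "automobile": np.array([0.85, 0.15, 0.32])}
alpha = 0.5
sim = (alpha * knowledge_sim("car", "automobile")
       + (1 - alpha) * corpus_sim(vectors["car"], vectors["automobile"]))
print(f"combined similarity: {sim:.3f}")
```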
With the development of high-speed network technologies, there has been a recent rise in the transfer of significant amounts of sensitive data across the Internet and other open channels. The data are encrypted using the same key for both the Triple Data Encryption Standard (TDES) and the Advanced Encryption Standard (AES), with the block cipher modes Cipher Block Chaining (CBC) and Electronic CodeBook (ECB). Block ciphers are often used for secure data storage on fixed hard drives and portable devices and for safe network data transport. Therefore, to assess the security of the encryption method, it is necessary to become familiar with and evaluate the algorithms of cryptographic systems. Block cipher users need to be sure that the ciphers…
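To make the two modes concrete, the sketch below encrypts the same padded plaintext with AES under ECB and CBC using the Python cryptography package; the key, IV, and plaintext are illustrative values, and TDES would follow the same pattern with its own cipher algorithm.

```python
# AES under ECB vs. CBC on the same PKCS7-padded plaintext.
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes
from cryptography.hazmat.primitives import padding

key = os.urandom(32)          # 256-bit AES key (illustrative)
iv = os.urandom(16)           # CBC needs a random initialization vector
plaintext = b"sensitive data sent over an open channel"

# Pad to the 128-bit AES block size, as both modes require full blocks.
padder = padding.PKCS7(128).padder()
padded = padder.update(plaintext) + padder.finalize()

ecb_enc = Cipher(algorithms.AES(key), modes.ECB()).encryptor()
cbc_enc = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
ecb_ct = ecb_enc.update(padded) + ecb_enc.finalize()
cbc_ct = cbc_enc.update(padded) + cbc_enc.finalize()

# Identical plaintext blocks leak patterns under ECB but not under CBC.
print("ECB:", ecb_ct.hex())
print("CBC:", cbc_ct.hex())
```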
Biometrics represent the most practical method for swiftly and reliably verifying and identifying individuals based on their unique biological traits. This study addresses the increasing demand for dependable biometric identification systems by introducing an efficient approach to automatically recognize ear patterns using Convolutional Neural Networks (CNNs). Despite the widespread adoption of facial recognition technologies, the distinct features and consistency inherent in ear patterns provide a compelling alternative for biometric applications. Employing CNNs in our research automates the identification process, enhancing accuracy and adaptability across various ear shapes and orientations. The ear, being visible and easily captured in…
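A hedged sketch of the kind of compact CNN classifier the abstract refers to is shown below in PyTorch; the layer sizes, the 64x64 grayscale input, and the number of subjects are assumptions for illustration, not the study's architecture.

```python
# Small illustrative CNN for ear-image classification.
import torch
import torch.nn as nn

class EarCNN(nn.Module):
    def __init__(self, num_subjects: int = 10):   # number of subjects is assumed
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_subjects)

    def forward(self, x):
        x = self.features(x)                      # -> (batch, 32, 16, 16)
        return self.classifier(torch.flatten(x, 1))

model = EarCNN()
dummy_batch = torch.randn(4, 1, 64, 64)           # four fake 64x64 ear images
print(model(dummy_batch).shape)                   # torch.Size([4, 10])
```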