The last two decades have seen a marked increase in illegal activities on the Dark Web. Rapid evolution and the use of sophisticated protocols make it difficult for security agencies to identify and investigate these activities with conventional methods. Moreover, tracing criminals and terrorists poses a great challenge, bearing in mind that cybercrimes are no less serious than real-life crimes. At the same time, computer security communities and law enforcement pay a great deal of attention to detecting and monitoring illegal sites on the Dark Web. Retrieving relevant information is not an easy task because of the vastness and ever-changing nature of the Dark Web; as a result, web crawlers play a vital role in this task. Data mining techniques are then applied to extract useful patterns that can help security agencies limit and eradicate cybercrimes. The aim of this paper is to present a survey for researchers interested in this topic. We start by discussing the internet layers and the properties of the Deep Web, followed by the technical characteristics of The Onion Routing (TOR) network, and finally the approaches for accessing, extracting and processing Dark Web data. Understanding the Dark Web, its properties and its threats is vital for internet users; we hope this paper helps toward that goal.
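One small building block of the crawling stage described above is growing the frontier of onion-service addresses found in fetched pages. The sketch below is illustrative only: it assumes the Tor v3 onion-address format (56 base32 characters) and uses a hypothetical helper name, not any tool from the surveyed literature.

```python
import re

# Tor v3 onion-service URLs: 56 characters drawn from the base32
# alphabet [a-z2-7], optionally followed by a path. This pattern and
# the function name are illustrative assumptions for a crawler frontier.
ONION_RE = re.compile(r"https?://[a-z2-7]{56}\.onion(?:/[^\s\"'<>]*)?")

def extract_onion_links(html: str) -> list[str]:
    """Return the unique .onion URLs found in an HTML/text snippet."""
    seen, links = set(), []
    for url in ONION_RE.findall(html):
        if url not in seen:
            seen.add(url)
            links.append(url)
    return links
```

A real Dark Web crawler would fetch these URLs through a Tor SOCKS proxy and re-queue the newly discovered addresses; the extraction step itself is this simple.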
A hand gesture recognition system provides a robust and innovative solution for nonverbal communication through human–computer interaction. Deep learning models have excellent potential for use in recognition applications. To overcome related issues, most previous studies have proposed new model architectures or fine-tuned pre-trained models. Furthermore, these studies relied on a single standard dataset for both training and testing; thus, the accuracy they report is reasonable. Unlike those works, the current study investigates two deep learning models with intermediate layers to recognize static hand gesture images. Both models were tested on different datasets, adjusted to suit each dataset, and then trained under different m…
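The "intermediate layers" idea above can be sketched as a forward pass through a tiny classifier: a convolution, a ReLU intermediate layer, then a dense softmax over gesture classes. This is a minimal NumPy illustration of the layer structure, not the paper's models; the image size, kernel, and class count are assumptions.

```python
import numpy as np

def conv2d(img, kernel):
    """'Valid' 2-D correlation of a grayscale image with one kernel."""
    kh, kw = kernel.shape
    out = np.zeros((img.shape[0] - kh + 1, img.shape[1] - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i+kh, j:j+kw] * kernel)
    return out

def predict(img, kernel, W, b):
    """conv -> ReLU (intermediate layer) -> flatten -> dense -> softmax."""
    h = np.maximum(conv2d(img, kernel), 0.0)   # intermediate feature map
    z = h.reshape(-1) @ W + b                  # dense layer
    e = np.exp(z - z.max())                    # numerically stable softmax
    return e / e.sum()
```

Real gesture models stack many such layers and learn the kernel and dense weights by backpropagation; the sketch only shows how an image flows through them.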
This study aims to develop a recommendation engine methodology to enhance the model's effectiveness and efficiency. The proposed model is commonly used to assign or propose a limited number of developers with the required skills and expertise to address and resolve a bug report. Managing collections within bug repositories is the responsibility of software engineers addressing specific defects. Identifying the optimal allocation of personnel to activities is challenging when dealing with software defects, which requires a substantial workforce of developers. The purpose of this analysis is to examine new scientific methodologies in order to enhance comprehension of the results. Additionally, developer priorities were discussed, especially th…
Cloud computing is a mass platform for serving high-volume data from multiple devices and numerous technologies. Cloud tenants demand fast access to their data without any disruption. Therefore, cloud providers struggle to ensure that every piece of data is secured and always accessible. Hence, an appropriate replication strategy capable of selecting essential data is required in cloud replication environments. This paper proposes a Crucial File Selection Strategy (CFSS) to address poor response time in a cloud replication environment. A cloud simulator called CloudSim is used to conduct the necessary experiments, and results are presented as evidence of the improvement in replication performance. The obtained an…
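The core of any "crucial file" selection is a ranking of candidate files under a storage budget. The sketch below illustrates that idea with an assumed access-per-byte score and a greedy capacity fill; it is not the CFSS formula from the paper, whose actual criteria are not reproduced here.

```python
# stats maps file name -> (access count, size); the scoring rule
# (accesses divided by size) is an assumption for illustration.

def crucial_files(stats, capacity):
    """Pick files to replicate within a storage capacity, ranking
    by access frequency per unit size, highest first."""
    ranked = sorted(stats, key=lambda f: stats[f][0] / stats[f][1],
                    reverse=True)
    chosen, used = [], 0
    for f in ranked:
        size = stats[f][1]
        if used + size <= capacity:
            chosen.append(f)
            used += size
    return chosen
```

In a simulator such as CloudSim, a strategy like this would run periodically per data centre, replicating only the selected files to cut response time.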
This research focuses on reinterpreting 2D seismic data using suitable software (Petrel 2017) for the area between Al-Razzazah Lake and the Euphrates River, belonging to the Karbala'a and Al-Anbar Governorates, central Iraq. The subsurface structural features were delineated and the structures of the Najmah and Zubair Formations evaluated. The structural interpretation showed that the studied area is affected by a normal fault trending NW-SE with a small displacement. In addition, time and depth maps showed monocline (nose) structures located in the western part of the studied area.
In this study, dynamic encryption techniques are explored as an image cipher method to generate S-boxes similar to AES S-boxes with the help of a private key belonging to the user, enabling images to be encrypted and decrypted using these S-boxes. The study consists of two stages: the dynamic S-box generation method and the encryption-decryption method. S-boxes should have a non-linear structure, and for this reason the Knuth-Durstenfeld shuffle algorithm (K/DSA), one of the pseudo-random techniques, is used to generate S-boxes dynamically. The biggest advantage of this approach is the produ…
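The key-dependent generation step can be sketched as follows: derive a seed from the user's private key, then shuffle the 256 byte values with the Knuth-Durstenfeld (Fisher-Yates) algorithm. `random.Random` stands in here for whatever keyed generator the paper actually uses, and the hashing of the key to a seed is an assumption.

```python
import hashlib
import random

def generate_sbox(key: bytes) -> list[int]:
    """Build a 256-entry S-box as a key-seeded permutation of 0..255."""
    seed = int.from_bytes(hashlib.sha256(key).digest(), "big")
    rng = random.Random(seed)
    sbox = list(range(256))
    for i in range(255, 0, -1):          # Knuth-Durstenfeld shuffle
        j = rng.randrange(i + 1)
        sbox[i], sbox[j] = sbox[j], sbox[i]
    return sbox

def invert_sbox(sbox):
    """Inverse permutation, needed for decryption."""
    inv = [0] * 256
    for i, v in enumerate(sbox):
        inv[v] = i
    return inv
```

Because the shuffle is a permutation, every S-box produced this way is bijective and therefore invertible, which is what makes decryption with the inverse S-box possible.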
This research aims at the structural analysis of 2D reflection seismic data for the Judaida subsurface structure located in Kirkuk province, northern Iraq. It lies 60 km southwest of the Kirkuk oil field and 35 km southwest of the Jambur oil field; the Daquq River passes through the study area. The reflectors in the seismic section were picked and identified using synthetic seismograms generated from the log data of the Jd-1 well. Three main seismic reflectors were chosen: Fatha, Jeribe, and the Euphrates. These sedimentary formations were deposited during the Middle Miocene, Lower Miocene, and Early-Mid Miocene, respectively. Time and depth maps were drawn for these three reflectors by processing average data f…
In this research we numerically solved the Boltzmann transport equation in order to calculate transport parameters such as the drift velocity W, D/μ (the ratio of the diffusion coefficient to the mobility) and the momentum-transfer collision frequency νm, for the purpose of determining the magnetic drift velocity WM and the magnetic deflection coefficient ψ for low-energy electrons moving in an electric field E crossed with a magnetic field B (E×B) in nitrogen, argon, helium and their gas mixtures, as a function of E/N (the ratio of electric field strength to the gas number density), E/P300 (the ratio of electric field strength to the gas pressure) and D/μ, covering different ranges of E/P300 at a temperature of 300 K. The results show…
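For orientation, in the simple momentum-transfer (Drude-type) picture of an electron swarm in crossed E and B fields, the drift components and deflection angle take the textbook form below; this is a standard relation, not necessarily the exact Boltzmann-equation formulation used in the paper.

```latex
% Drift along E, the E x B magnetic drift, and the deflection angle,
% with cyclotron frequency \omega = eB/m and collision frequency \nu_m:
\[
W = \frac{eE}{m}\,\frac{\nu_m}{\nu_m^2 + \omega^2},
\qquad
W_M = \frac{eE}{m}\,\frac{\omega}{\nu_m^2 + \omega^2},
\qquad
\tan\psi = \frac{W_M}{W} = \frac{\omega}{\nu_m},
\qquad
\omega = \frac{eB}{m}.
\]
```

The ratio ω/νm thus controls how strongly the magnetic field deflects the swarm, which is why νm is computed alongside W and D/μ.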
It has increasingly been recognised that future developments in geospatial data handling will centre on geospatial data on the web: Volunteered Geographic Information (VGI). The evaluation of VGI data quality, including positional and shape similarity, has become a recurrent subject in the scientific literature over the last ten years. The OpenStreetMap (OSM) project is the most popular of the leading VGI platforms. It is an online geospatial database that produces and supplies freely editable geospatial datasets worldwide. The goal of this paper is to present a comprehensive overview of the quality assurance of OSM data. In addition, the credibility of open-source geospatial data is discussed, highlight…
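One widely used positional-quality measure for VGI is the great-circle offset between matched OSM features and reference survey points. The sketch below illustrates that check with the standard haversine formula; the coordinates and the averaging choice are illustrative, not taken from any specific OSM quality study.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    r = 6371000.0                       # mean Earth radius, metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def mean_positional_offset(osm_pts, ref_pts):
    """Average offset between paired (lat, lon) points, in metres."""
    d = [haversine_m(a[0], a[1], b[0], b[1])
         for a, b in zip(osm_pts, ref_pts)]
    return sum(d) / len(d)
```

Quality studies typically report this mean offset (or its RMSE) per region, which makes OSM's positional accuracy comparable against authoritative datasets.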
Data-centric techniques, such as data aggregation via a modified fuzzy clustering algorithm combined with a Voronoi diagram, called the modified Voronoi Fuzzy Clustering Algorithm (VFCA), are presented in this paper. In the modified algorithm, the sensed area is divided into a number of Voronoi cells by applying a Voronoi diagram; these cells are clustered by the fuzzy C-means (FCM) method to reduce the transmission distance. An appropriate cluster head (CH) is then elected for each cluster. Three parameters are used in this election: the node's energy, the distance between the CH and its neighbouring sensors, and the packet-loss value. Furthermore, data aggregation is employed at each CH to reduce the amount of data transmission, which le…
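The clustering and election steps can be sketched compactly: fuzzy C-means groups sensor positions, then each cluster elects as head the node with the best combination of energy, distance to the cluster centre, and packet loss. The election weights below are assumptions for illustration, and the Voronoi partitioning stage is omitted; this is not the paper's VFCA as published.

```python
import numpy as np

def fcm(points, c, m=2.0, iters=50, seed=0):
    """Basic fuzzy C-means: returns memberships u and cluster centres."""
    rng = np.random.default_rng(seed)
    u = rng.random((len(points), c))
    u /= u.sum(axis=1, keepdims=True)          # fuzzy memberships
    for _ in range(iters):
        um = u ** m
        centers = um.T @ points / um.sum(axis=0)[:, None]
        d = np.linalg.norm(points[:, None] - centers[None], axis=2) + 1e-9
        u = 1.0 / (d ** (2 / (m - 1)))         # standard FCM update
        u /= u.sum(axis=1, keepdims=True)
    return u, centers

def elect_heads(points, u, centers, energy, loss):
    """Per cluster, pick the node maximising an assumed weighted score:
    high energy, low distance to centre, low packet loss."""
    heads = []
    labels = u.argmax(axis=1)
    for k in range(centers.shape[0]):
        idx = np.where(labels == k)[0]
        dist = np.linalg.norm(points[idx] - centers[k], axis=1)
        score = energy[idx] - 0.5 * dist - 0.5 * loss[idx]
        heads.append(int(idx[score.argmax()]))
    return heads
```

After election, member nodes send readings to their CH, which aggregates them before forwarding, cutting the total transmission volume the abstract refers to.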
In this paper, the time spent on, and the frequency of use of, Social Network Sites (SNS) in Android applications are investigated. In this approach, we seek to raise awareness of, and limit but not eliminate, repeated use of SNS by introducing AndroidTrack, an Android application designed to monitor usage and support valid experimental studies on improving the impact of social media on Iraqi users. Data generated from the app were aggregated and updated periodically in a Google Firebase Realtime Database. A statistical factor analysis (FA) of the users' interactions is presented.