The Tor (The Onion Routing) network was designed to enable users to browse the Internet anonymously. It is known for its anonymity and privacy protection against agents who wish to observe users' locations or trace their browsing habits. This anonymity stems from the encryption and decryption of Tor traffic: the client's traffic must be encrypted and decrypted before being sent and received, which introduces delay and can even interrupt the data flow. The exchange of cryptographic keys between network devices plays a pivotal role in enabling secure communication and ensuring the integrity of cryptographic procedures, yet this essential process is time-consuming and causes delay and discontinuity in the data flow. To overcome these delay and interruption problems, we utilize Software-Defined Networking (SDN), Machine Learning (ML), and Blockchain (BC) techniques, which support the Tor network by intelligently speeding up the exchange of public keys through proactive processing of the Tor network's security management information. Consequently, the combined network (ITor-SDN) maintains a continuous data flow to the Tor client. We simulated and emulated the proposed network using Mininet and the Shadow simulator. The analysis shows that the proposed network architecture improves the overall performance metrics by around 55%, an enhancement achieved through the ITor-SDN network combination approach.
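As a rough illustration of the proactive key handling described above, the sketch below shows a controller-side cache that stores relay public keys ahead of time, so a client does not block on a fresh key exchange when building a circuit. All class and method names are hypothetical; this is not the paper's ITor-SDN implementation.

```python
# Hypothetical sketch only: a controller-side store of relay public keys,
# filled proactively so a Tor client does not block on a key exchange.
import time
from dataclasses import dataclass
from typing import Dict, Optional


@dataclass
class RelayKey:
    relay_id: str
    public_key: bytes
    announced_at: float


class ProactiveKeyCache:
    """Keeps recently announced relay public keys ready for clients."""

    def __init__(self, ttl_seconds: float = 3600.0) -> None:
        self.ttl = ttl_seconds
        self._keys: Dict[str, RelayKey] = {}

    def announce(self, relay_id: str, public_key: bytes) -> None:
        # Called when the controller learns a relay key, e.g. from the
        # blockchain-backed records assumed in the abstract.
        self._keys[relay_id] = RelayKey(relay_id, public_key, time.time())

    def lookup(self, relay_id: str) -> Optional[bytes]:
        entry = self._keys.get(relay_id)
        if entry and time.time() - entry.announced_at < self.ttl:
            return entry.public_key
        return None


cache = ProactiveKeyCache()
cache.announce("relay-42", b"-----BEGIN PUBLIC KEY----- ...")
print(cache.lookup("relay-42") is not None)  # True: no blocking exchange needed
```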
This research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (uncontaminated) data, while the second assumed data contamination. Simulation experiments were conducted for different sample sizes, different initial parameter values, and different levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution for both natural and contaminated data.
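A minimal sketch of the Downhill Simplex (Nelder-Mead) approach to maximum-likelihood estimation is shown below. To keep it self-contained, an ordinary two-parameter Weibull density stands in for the compound exponential Weibull-Poisson density used in the paper; only the optimisation workflow is illustrated.

```python
# Sketch: maximum-likelihood estimation via the Downhill Simplex (Nelder-Mead)
# algorithm. A plain Weibull density is used as a stand-in for the paper's
# compound exponential Weibull-Poisson distribution.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

data = weibull_min.rvs(c=1.5, scale=2.0, size=200, random_state=0)


def neg_log_likelihood(params):
    shape, scale = params
    if shape <= 0 or scale <= 0:  # keep the simplex inside the valid region
        return np.inf
    return -np.sum(weibull_min.logpdf(data, c=shape, scale=scale))


result = minimize(neg_log_likelihood, x0=[1.0, 1.0], method="Nelder-Mead")
print("estimated (shape, scale):", result.x)
```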
Evaporation is one of the major components of the natural hydrological cycle, so its accurate estimation is important for planning and managing irrigation practices and for assessing water availability and requirements. The aim of this study is to investigate the ability of a fuzzy inference system to estimate monthly pan evaporation from meteorological data. The study was carried out using 261 monthly measurements of temperature (T), relative humidity (RH), and wind speed (W) available from the Emara meteorological station in southern Iraq. Three different fuzzy models comprising various combinations of the monthly climatic variables (temperature, wind speed, and relative humidity) were developed…
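A small Mamdani-style sketch of such a fuzzy inference system is given below, assuming the scikit-fuzzy package. The membership functions, rules, and input values are illustrative choices, not the models fitted in the study.

```python
# Sketch: a tiny Mamdani fuzzy system estimating pan evaporation from
# temperature, relative humidity and wind speed (all rules are illustrative).
import numpy as np
import skfuzzy as fuzz
from skfuzzy import control as ctrl

temperature = ctrl.Antecedent(np.arange(0, 51, 1), "temperature")   # deg C
humidity = ctrl.Antecedent(np.arange(0, 101, 1), "humidity")        # %
wind = ctrl.Antecedent(np.arange(0, 16, 1), "wind")                 # m/s
evaporation = ctrl.Consequent(np.arange(0, 16, 1), "evaporation")   # mm/day

temperature["low"] = fuzz.trimf(temperature.universe, [0, 0, 25])
temperature["high"] = fuzz.trimf(temperature.universe, [20, 50, 50])
humidity["low"] = fuzz.trimf(humidity.universe, [0, 0, 60])
humidity["high"] = fuzz.trimf(humidity.universe, [40, 100, 100])
wind["low"] = fuzz.trimf(wind.universe, [0, 0, 6])
wind["high"] = fuzz.trimf(wind.universe, [3, 15, 15])
evaporation["low"] = fuzz.trimf(evaporation.universe, [0, 0, 8])
evaporation["high"] = fuzz.trimf(evaporation.universe, [6, 15, 15])

rules = [
    ctrl.Rule(temperature["high"] & humidity["low"], evaporation["high"]),
    ctrl.Rule(wind["high"], evaporation["high"]),
    ctrl.Rule(temperature["low"] | humidity["high"], evaporation["low"]),
]

sim = ctrl.ControlSystemSimulation(ctrl.ControlSystem(rules))
sim.input["temperature"] = 35
sim.input["humidity"] = 30
sim.input["wind"] = 4
sim.compute()
print("estimated pan evaporation:", round(sim.output["evaporation"], 2))
```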
Shear wave velocity is an important feature in seismic exploration that can be utilized in reservoir development strategy and characterization. Its applications in petrophysics, seismic analysis, and geomechanics for predicting rock elastic and inelastic properties are essential for evaluating wellbore stability, fracturing orientation, and the identification of matrix minerals and gas-bearing formations. However, shear wave velocity is usually obtained from core analysis, which is an expensive and time-consuming process, and the dipole sonic imager tool is not commonly available in all wells. In this study, a statistical method is presented to predict shear wave velocity from wireline log data. The model concentrates on predicting shear wave velocity from…
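To illustrate the statistical idea, the sketch below fits a multiple linear regression that predicts shear wave velocity from a few wireline-log curves. The log values and resulting coefficients are placeholders, not the paper's fitted model.

```python
# Sketch: least-squares regression of shear wave velocity (Vs) on wireline logs.
import numpy as np
from numpy.linalg import lstsq

# columns: compressional velocity Vp (km/s), bulk density RHOB (g/cc),
# neutron porosity NPHI (fraction) -- synthetic placeholder samples
X = np.array([
    [3.2, 2.35, 0.22],
    [3.8, 2.45, 0.18],
    [4.1, 2.55, 0.12],
    [4.6, 2.60, 0.09],
    [5.0, 2.65, 0.05],
])
vs = np.array([1.7, 2.0, 2.3, 2.6, 2.9])  # measured Vs from a key well (km/s)

# add an intercept column and solve the least-squares problem
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = lstsq(A, vs, rcond=None)
print("intercept and coefficients:", coef)
print("predicted Vs:", A @ coef)
```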
Wireless sensor applications are susceptible to energy constraints, and most of the energy is consumed in communication between wireless nodes. Clustering and data aggregation are the two widely used strategies for reducing energy usage and increasing the lifetime of wireless sensor networks. In target tracking applications, a large amount of redundant data is produced regularly, so deploying effective data aggregation schemes is vital to eliminate this redundancy. This work conducts a comparative study of research approaches that employ clustering techniques for efficiently aggregating data in target tracking applications, since selecting an appropriate clustering algorithm may yield positive results in the data aggregation…
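The sketch below illustrates the cluster-head aggregation idea in general terms: nodes are grouped by k-means on their coordinates and each cluster forwards a single averaged reading to the sink. The positions and readings are synthetic, and k-means merely stands in for whichever clustering algorithm a given approach uses.

```python
# Sketch: cluster-head style aggregation -- one averaged value per cluster
# is forwarded instead of every raw sample.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
positions = rng.uniform(0, 100, size=(60, 2))   # node coordinates (m)
readings = rng.normal(25.0, 2.0, size=60)       # target-related measurements

kmeans = KMeans(n_clusters=5, n_init=10, random_state=1).fit(positions)
for cluster_id in range(5):
    members = readings[kmeans.labels_ == cluster_id]
    # the cluster head sends a single aggregate to the sink
    print(f"cluster {cluster_id}: {len(members)} nodes -> aggregate {members.mean():.2f}")
```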
This research aims to clarify the importance of an accounting information system that uses artificial intelligence to detect earnings manipulation. The research problem stems from the widespread manipulation of earnings in economic entities, especially at the local level, exacerbated by the high rates of financial and administrative corruption in Iraq resulting from fraudulent accounting practices. Since earnings manipulation involves intentional fraudulent acts, preventive measures must be implemented to detect and deter such practices. The main hypothesis of the research assumes that an accounting information system based on artificial intelligence cannot effectively detect the manipulation of profits in Iraqi economic entities. The researchers…
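As a purely illustrative stand-in for such a system, the sketch below trains a simple logistic-regression classifier on a handful of financial-ratio features to flag suspected earnings manipulation. The feature choices, labels, and figures are hypothetical and do not reproduce the paper's system.

```python
# Sketch: a toy classifier flagging possible earnings manipulation from
# made-up financial-ratio features.
import numpy as np
from sklearn.linear_model import LogisticRegression

# features: [receivables/sales change, gross margin change, accruals/assets]
X = np.array([
    [0.02, 0.01, 0.03],
    [0.01, 0.00, 0.02],
    [0.35, 0.20, 0.15],
    [0.40, 0.25, 0.18],
    [0.05, 0.02, 0.04],
    [0.30, 0.22, 0.20],
])
y = np.array([0, 0, 1, 1, 0, 1])  # 1 = manipulation suspected in the labelled sample

model = LogisticRegression().fit(X, y)
new_entity = np.array([[0.28, 0.18, 0.16]])
print("manipulation probability:", model.predict_proba(new_entity)[0, 1])
```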
Information hiding strategies have recently gained popularity in a variety of fields. Digital audio, video, and images are increasingly being labelled with distinct but undetectable marks that may contain a hidden copyright notice or serial number, or even directly help to prevent unauthorized duplication. This approach is extended here to medical images by hiding secret information in them using the structure of a different file format; the hidden information may be related to the patient. In this paper, a method for hiding secret information in DICOM images is proposed based on the Discrete Wavelet Transform (DWT). First, all slices of a 3D image are segmented into blocks of a specific size and the host image is assembled depending on a generated key…
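A simplified sketch of DWT-based embedding is shown below, assuming the PyWavelets package. It hides a few bits in the signs of high-frequency coefficients of a single synthetic slice; the block segmentation, key-driven host selection, and DICOM handling described in the abstract are omitted.

```python
# Sketch: hide bits in the HH (diagonal detail) coefficients of one slice
# using a 2-D Haar DWT; the slice is synthetic, not a DICOM image.
import numpy as np
import pywt

slice_ = np.random.default_rng(0).integers(0, 4096, size=(64, 64)).astype(float)
secret_bits = [1, 0, 1, 1, 0, 0, 1, 0]

LL, (LH, HL, HH) = pywt.dwt2(slice_, "haar")

# embed each bit in the sign of a detail coefficient (a magnitude floor of 1
# keeps the sign readable after reconstruction)
flat = HH.flatten()
for i, bit in enumerate(secret_bits):
    magnitude = max(abs(flat[i]), 1.0)
    flat[i] = magnitude if bit else -magnitude
stego = pywt.idwt2((LL, (LH, HL, flat.reshape(HH.shape))), "haar")

# extraction: repeat the DWT and read the signs back
_, (_, _, HH_back) = pywt.dwt2(stego, "haar")
recovered = [1 if c > 0 else 0 for c in HH_back.flatten()[: len(secret_bits)]]
print(recovered == secret_bits)  # True
```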
In this study, we briefly review the ARIMA(p, d, q), EWMA, and DLM (dynamic linear modelling) procedures in order to accommodate the autocorrelation structure of the data. We consider recursive estimation and prediction algorithms based on Bayes and Kalman filtering (KF) techniques for correlated observations. We investigate the effect of these procedures on the MSE and compare them using generated data.
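A minimal sketch of the Kalman-filter recursions for a scalar local-level dynamic linear model is given below. The noise variances and simulated series are arbitrary; the snippet only illustrates the recursive predict/update cycle that is compared against the ARIMA and EWMA procedures.

```python
# Sketch: Kalman filtering of a scalar local-level DLM (random walk + noise).
import numpy as np

rng = np.random.default_rng(0)
n = 100
level = np.cumsum(rng.normal(0, 0.3, n)) + 10.0   # latent state (random walk)
y = level + rng.normal(0, 1.0, n)                 # noisy, serially dependent observations

q, r = 0.3**2, 1.0**2        # state and observation noise variances
m, p = y[0], 1.0             # initial state estimate and its variance
one_step_ahead = np.empty(n)

for t in range(n):
    # predict step
    m_pred, p_pred = m, p + q
    one_step_ahead[t] = m_pred
    # update step with the new observation
    k = p_pred / (p_pred + r)          # Kalman gain
    m = m_pred + k * (y[t] - m_pred)
    p = (1 - k) * p_pred

print("one-step-ahead MSE:", round(np.mean((one_step_ahead - y) ** 2), 3))
```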
In this paper, an integrated quantum neural network (QNN), which is a class of feedforward neural networks (FFNNs), is built by merging quantum computing (QC) with an artificial neural network (ANN) classifier. It is used as a data classification technique, with the iris flower data serving as the classification signals. For this purpose, independent component analysis (ICA) is used as a feature extraction technique after normalization of these signals. The architecture of the QNN has inherently built-in fuzziness: the hidden units of these networks develop quantized representations of the sample information provided by the training data set at various graded levels of certainty. Experimental results presented here show that…
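The classical part of this pipeline can be sketched as follows: the iris measurements are normalised, ICA extracts independent components, and an ordinary feedforward network stands in for the quantum neural network, which is not reproduced here.

```python
# Sketch: normalisation + ICA feature extraction on iris, followed by a
# plain feedforward classifier standing in for the QNN.
from sklearn.datasets import load_iris
from sklearn.decomposition import FastICA
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

scaler = StandardScaler().fit(X_train)            # normalisation step
ica = FastICA(n_components=3, random_state=0)     # feature extraction step
Z_train = ica.fit_transform(scaler.transform(X_train))
Z_test = ica.transform(scaler.transform(X_test))

clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
clf.fit(Z_train, y_train)
print("test accuracy:", clf.score(Z_test, y_test))
```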
... Show MoreIn this study, the mobile phone traces concern an ephemeral event which represents important densities of people. This research aims to study city pulse and human mobility evolution that would be arise during specific event (Armada festival), by modelling and simulating human mobility of the observed region, depending on CDRs (Call Detail Records) data. The most pivot questions of this research are: Why human mobility studied? What are the human life patterns in the observed region inside Rouen city during Armada festival? How life patterns and individuals' mobility could be extracted for this region from mobile DB (CDRs)? The radius of gyration parameter has been applied to elaborate human life patterns with regards to (work, off) days for
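The radius of gyration mentioned above can be computed as the root-mean-square distance of a user's visited locations from their centre of mass; a minimal sketch with made-up coordinates follows.

```python
# Sketch: radius of gyration for one user's visited locations.
import numpy as np


def radius_of_gyration(points: np.ndarray) -> float:
    """Root-mean-square distance of visited points from their centre of mass."""
    center = points.mean(axis=0)
    return float(np.sqrt(np.mean(np.sum((points - center) ** 2, axis=1))))


# visited cell-tower positions for one user, projected to metres (x, y)
visits = np.array([
    [0.0, 0.0],
    [120.0, 80.0],
    [300.0, 150.0],
    [90.0, 60.0],
])
print("radius of gyration (m):", round(radius_of_gyration(visits), 1))
```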
Blockchain technology relies on cryptographic techniques that provide various advantages, such as trustworthiness, collaboration, organization, identification, integrity, and transparency. Meanwhile, data analytics refers to the process of applying techniques to analyze big data and comprehend the relationships between data points in order to draw meaningful conclusions. The field of data analytics in Blockchain is relatively new, and few studies have examined the challenges involved in Blockchain data analytics. This article presents a systematic analysis of how data analytics affects Blockchain performance, with the aim of investigating the current state of Blockchain-based data analytics techniques in research fields and…