The Tor (The Onion Router) network was designed to enable users to browse the Internet anonymously. It is known for protecting anonymity and privacy against adversaries who attempt to locate users or track their browsing habits. This anonymity stems from the encryption and decryption of Tor traffic: a client's traffic must be encrypted and decrypted before being sent and received, which introduces delay and can even interrupt the data flow. The exchange of cryptographic keys between network devices plays a pivotal role in establishing secure communication and ensuring the integrity of cryptographic procedures, yet this essential process is time-consuming and further contributes to delay and discontinuity in the data flow. To overcome these delay and interruption problems, we utilize Software-Defined Networking (SDN), Machine Learning (ML), and Blockchain (BC) techniques, which support the Tor network in intelligently speeding up public key exchange through proactive processing of Tor security management information. The resulting combined network (ITor-SDN) preserves data flow continuity for the Tor client. We simulated and emulated the proposed network using Mininet and the Shadow simulator. The analysis shows that the proposed architecture improves overall performance metrics by around 55%, an enhancement achieved through the seamless execution of the proposed ITor-SDN combination.
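As an illustration only, and not the authors' implementation, the following Python sketch shows the proactive idea the abstract describes: a controller-side service pre-fetches and caches relay public keys ahead of client requests, so circuit setup never blocks on a key exchange. All names and data structures here are hypothetical.

```python
# Minimal sketch (not the ITor-SDN implementation) of proactive key caching:
# the controller fetches relay public keys off the data path, so clients
# never wait on a per-hop key exchange when building a circuit.
import time

class KeyDirectoryController:
    """Hypothetical SDN controller app that proactively caches relay keys."""
    def __init__(self, relays):
        self.relays = relays          # relay_id -> key-fetch callable
        self.cache = {}               # relay_id -> (public_key, fetched_at)

    def refresh(self):
        # Proactive step: fetch every relay's key ahead of client requests.
        for relay_id, fetch_key in self.relays.items():
            self.cache[relay_id] = (fetch_key(), time.time())

    def get_key(self, relay_id):
        # Client-facing lookup: served from cache, no blocking key exchange.
        key, _ = self.cache[relay_id]
        return key

# Stand-in relays whose key fetch would normally cost a network round trip.
relays = {f"relay{i}": (lambda i=i: f"pubkey-{i}") for i in range(3)}
ctrl = KeyDirectoryController(relays)
ctrl.refresh()                        # done ahead of time, off the data path
print(ctrl.get_key("relay1"))         # instant lookup: no mid-circuit delay
```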
The net profit reported in the annual financial statements of companies listed on financial markets is considered one of the sources of information that users of accounting information rely upon in making their investment decisions. At the same time, it is relied upon in calculating the bonuses (incentives) granted to management; this gives company management an incentive to manipulate those figures in order to increase the bonuses tied to earnings. Such practices are called earnings management practices. Management's manipulation of earnings figures misleads the users of financial statements who depend on reported earnings in their decisions.
In its theoretical framework, this study dealt with the subjects of high-commitment management and organizational excellence, coming in response to the growing developments and changes in the field of management. It analyzes the correlation and effect between high-commitment management, which has attracted much attention recently due to the intensifying rivalry between organizations driven by external factors such as globalization and the liberalization of world markets, and the achievement of organizational excellence.
The practical framework, on the other hand, dealt with the analysis of correlation and effect between the study's variables. The problem
The intensity distribution of comet ISON C/2013 is studied by means of its histogram. This distribution reveals four distinct regions corresponding to the background, tail, coma, and nucleus. A one-dimensional temperature distribution fit is achieved using two mathematical equations related to the coordinates of the comet's center. A quiver plot of the intensity gradient shows clearly that the arrows point toward the comet's maximum intensity.
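For concreteness, a minimal Python sketch of the two analysis steps the abstract describes, an intensity histogram and a quiver plot of the intensity gradient, is given below; the image file name is hypothetical and the paper's fitting equations are not reproduced.

```python
# Sketch of the abstract's analysis: histogram the intensity to separate the
# comet's regions, then quiver-plot the gradient of the intensity field.
import numpy as np
import matplotlib.pyplot as plt

img = plt.imread("comet_ison.png")   # hypothetical image of the comet
if img.ndim == 3:
    img = img.mean(axis=2)           # collapse RGB to a single intensity band

# Histogram: its regions correspond to background, tail, coma, and nucleus.
plt.figure(); plt.hist(img.ravel(), bins=256); plt.xlabel("intensity")

# Gradient quiver: arrows point toward increasing intensity (the nucleus).
gy, gx = np.gradient(img)
step = 8                             # subsample so arrows stay readable
ys, xs = np.mgrid[0:img.shape[0]:step, 0:img.shape[1]:step]
plt.figure()
plt.quiver(xs, ys, gx[::step, ::step], gy[::step, ::step])
plt.gca().invert_yaxis()             # match image coordinates
plt.show()
```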
Cloud storage provides scalable, low-cost resources that achieve economies of scale through a cross-user architecture. As the amount of outsourced data grows explosively, data deduplication, a technique that eliminates data redundancy, becomes essential. Data storage is the most important cloud service, and to protect the privacy of data owners, data are stored in the cloud in encrypted form. Encrypted data, however, introduce new challenges for cloud data deduplication, which is crucial for data storage. Traditional deduplication schemes cannot operate on encrypted data, and existing solutions for encrypted data deduplication suffer from security weaknesses. This paper proposes combining compressive sensing and video deduplication to maximize storage efficiency.
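As background for why deduplicating encrypted data is hard, the sketch below shows convergent encryption, a standard baseline (not the compressive-sensing scheme proposed here): deriving the key from the content itself makes identical plaintexts encrypt identically, so duplicates remain detectable in the ciphertext domain. The toy keystream cipher is for illustration only.

```python
# Convergent encryption sketch: key = hash(content), so equal plaintexts
# yield equal ciphertexts and a shared dedup fingerprint (tag).
import hashlib

def convergent_encrypt(plaintext: bytes):
    key = hashlib.sha256(plaintext).digest()   # content-derived key
    tag = hashlib.sha256(key).digest()         # dedup fingerprint
    # Toy keystream cipher for illustration (use AES in practice).
    stream, counter = b"", 0
    while len(stream) < len(plaintext):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    ciphertext = bytes(p ^ s for p, s in zip(plaintext, stream))
    return key, tag, ciphertext

store = {}  # tag -> ciphertext, shared across users
for user_data in (b"same movie chunk", b"same movie chunk", b"other chunk"):
    key, tag, ct = convergent_encrypt(user_data)
    store.setdefault(tag, ct)                  # duplicate chunks stored once
print(len(store))  # 2: the duplicate was deduplicated despite encryption
```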
This paper addresses the nature of Spatial Data Infrastructure (SDI), considered one of the most important concepts for ensuring the effective functioning of a modern society. It comprises a set of continually developing methods and procedures that provide the geospatial base supporting a country's governmental, environmental, economic, and social activities. In general, the SDI framework integrates various elements, including standards, policies, networks, data, end users, and application areas. The transformation of previously paper-based map data into digital format, the emergence of GIS, and the Internet with its host of online applications (e.g., environmental impact analysis, navigation, and applications of VGI data)
A skip list data structure is essentially a simulation of a binary search tree. Skip list algorithms are simpler, faster, and use less space. The data structure conceptually consists of parallel sorted linked lists. Searching in a skip list is more involved than searching in a regular sorted linked list, but it is far faster in expectation. Because a skip list is a two-dimensional data structure, it is implemented as a two-dimensional network of nodes, each with four pointers. The search, insert, and delete operations take O(log n) expected time. The skip list can be modified to implement the order-statistic operations RANK and SEARCH BY RANK while maintaining the same expected time. Keywords: skip list, parallel linked list, randomized algorithm, rank.
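A minimal Python sketch of skip list search and insert is given below; it uses the common one-forward-pointer-per-level node layout rather than the four-pointer nodes mentioned above, and the level cap and promotion probability are illustrative choices.

```python
# Skip list sketch: each node is promoted one level up with probability P,
# so higher levels skip over roughly half the nodes of the level below,
# giving O(log n) expected search and insert.
import random

MAX_LEVEL = 16
P = 0.5  # probability of promoting a node one level up

class Node:
    def __init__(self, key, level):
        self.key = key
        self.forward = [None] * (level + 1)  # one "right" pointer per level

class SkipList:
    def __init__(self):
        self.level = 0
        self.head = Node(None, MAX_LEVEL)    # sentinel head spans all levels

    def _random_level(self):
        lvl = 0
        while random.random() < P and lvl < MAX_LEVEL:
            lvl += 1
        return lvl

    def search(self, key):
        node = self.head
        for i in range(self.level, -1, -1):  # descend level by level
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]       # move right while keys are smaller
        node = node.forward[0]
        return node is not None and node.key == key

    def insert(self, key):
        update = [self.head] * (MAX_LEVEL + 1)
        node = self.head
        for i in range(self.level, -1, -1):
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
            update[i] = node                 # last node before key on level i
        lvl = self._random_level()
        self.level = max(self.level, lvl)
        new = Node(key, lvl)
        for i in range(lvl + 1):             # splice new node into each level
            new.forward[i] = update[i].forward[i]
            update[i].forward[i] = new

sl = SkipList()
for k in (3, 7, 1, 9):
    sl.insert(k)
print(sl.search(7), sl.search(4))            # True False
```

Both operations run in O(log n) expected time because each level skips over roughly half of the nodes on the level below it.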
Bayesian models are widely used in recent research across many scientific fields. This research presents a new Bayesian model for parameter estimation and forecasting using the Gibbs sampler algorithm. Posterior distributions are generated using the inverse-gamma distribution and the multivariate normal distribution as prior distributions. The new method was used to investigate and summarize the posterior distribution of the Bayesian statistics. The theory and derivation of the posterior distribution are explained in detail in this paper. The proposed approach is applied to three simulated datasets of 100, 300, and 500 samples. The procedure was also extended to a real dataset, the rock intensity dataset. The actual dataset was collected
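A minimal sketch of a Gibbs sampler for this kind of model is shown below, assuming a univariate normal likelihood with a normal prior on the mean and an inverse-gamma prior on the variance; the hyperparameters and data are illustrative stand-ins, not the paper's rock intensity data.

```python
# Gibbs sampler sketch: alternate draws from the two full conditionals of a
# normal model with a normal prior on the mean and an IG prior on the variance.
import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(5.0, 2.0, size=100)           # stand-in data
n, ybar = len(y), y.mean()

mu0, tau2 = 0.0, 100.0                        # N(mu0, tau2) prior on the mean
a0, b0 = 2.0, 2.0                             # IG(a0, b0) prior on the variance

mu, sigma2 = ybar, y.var()                    # initial values
draws = []
for _ in range(5000):
    # mu | sigma2, y  ~  Normal (conjugate update)
    var_post = 1.0 / (n / sigma2 + 1.0 / tau2)
    mean_post = var_post * (n * ybar / sigma2 + mu0 / tau2)
    mu = rng.normal(mean_post, np.sqrt(var_post))
    # sigma2 | mu, y  ~  Inverse-Gamma (sampled as 1 / Gamma)
    a_post = a0 + n / 2.0
    b_post = b0 + 0.5 * np.sum((y - mu) ** 2)
    sigma2 = 1.0 / rng.gamma(a_post, 1.0 / b_post)
    draws.append((mu, sigma2))

post = np.array(draws[1000:])                 # discard burn-in
print(post.mean(axis=0))                      # posterior means of (mu, sigma2)
```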
Today, artificial intelligence has become one of the most important sciences for creating intelligent computer programs that simulate the human mind. The goal of artificial intelligence in the medical field is to assist doctors and health-care workers in diagnosing diseases and providing clinical treatment, reducing the rate of medical error, and saving lives. The main and most widely used technologies are expert systems, machine learning, and big data. This article provides a brief overview of these three techniques to help readers understand them and their importance.
In this paper, an image compression technique based on the zonal transform method is presented. The DCT, Walsh, and Hadamard transform techniques are also implemented. These transforms are applied to SAR images using different block sizes, and the effects of applying them are investigated. The main shortcoming of this radar imagery system is the presence of speckle noise, which affects the compression results.
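A minimal sketch of zonal transform coding with the DCT is given below: each image block is transformed and only a low-frequency zone of coefficients is retained before inverse transformation. The block size, zone shape, and random stand-in image are illustrative; the paper additionally tests Walsh and Hadamard transforms.

```python
# Zonal coding sketch: compress each block by keeping only the coefficients
# inside a fixed low-frequency zone of the 2-D DCT.
import numpy as np
from scipy.fft import dctn, idctn

def zonal_compress(img: np.ndarray, block: int = 8, keep: int = 4) -> np.ndarray:
    h, w = img.shape
    out = np.zeros_like(img, dtype=float)
    # Triangular low-frequency zone: keep coefficients with u + v < keep.
    u, v = np.meshgrid(range(block), range(block), indexing="ij")
    mask = (u + v) < keep
    for i in range(0, h - block + 1, block):
        for j in range(0, w - block + 1, block):
            coeffs = dctn(img[i:i+block, j:j+block], norm="ortho")
            out[i:i+block, j:j+block] = idctn(coeffs * mask, norm="ortho")
    return out

img = np.random.rand(64, 64)          # stand-in for a SAR image tile
rec = zonal_compress(img)
mse = np.mean((img - rec) ** 2)
print(f"MSE after zonal coding: {mse:.4f}")
```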