The Tor (The Onion Router) network was designed to enable users to browse the Internet anonymously. It is valued for the anonymity and privacy it provides against agents who seek to observe users' locations or track their browsing habits. This anonymity stems from the encryption and decryption of Tor traffic: a client's traffic must be encrypted and decrypted before being sent and after being received, which introduces delay and can even interrupt the data flow. The exchange of cryptographic keys between network devices plays a pivotal role in enabling secure communication and preserving the integrity of cryptographic procedures, but this essential process is time-consuming and causes delay and discontinuity in the data flow. To overcome these delay and interruption problems, we combine Software-Defined Networking (SDN), Machine Learning (ML), and Blockchain (BC) techniques, which help the Tor network intelligently speed up public key exchange through proactive processing of Tor security management information. The resulting combined network (ITor-SDN) maintains continuous data flow to the Tor client. We simulated and emulated the proposed network using Mininet and the Shadow simulator. The analysis shows that the proposed architecture improves the overall performance metrics by around 55%, achieved through the seamless execution of the ITor-SDN network combination approach.
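The abstract describes the proactive handling of key material only at a high level, and the exact ITor-SDN design is not reproduced here. As a loose, hedged illustration of the general idea of proactive key distribution, the sketch below (all class and method names are hypothetical) shows a controller-style service that pre-fetches and caches relay public keys so a client can assemble circuit key material without waiting on per-request key exchanges.

```python
# Hypothetical illustration of proactive public-key caching; not the paper's actual ITor-SDN code.
import time
from typing import Callable, Dict, Tuple

class ProactiveKeyCache:
    """Pre-fetches and caches relay public keys so clients avoid per-circuit key-exchange delays."""

    def __init__(self, fetch_key: Callable[[str], bytes], ttl_seconds: int = 3600):
        self._fetch_key = fetch_key                  # e.g., a directory/blockchain lookup (assumed)
        self._ttl = ttl_seconds
        self._cache: Dict[str, Tuple[bytes, float]] = {}  # relay_id -> (public_key, fetched_at)

    def refresh(self, relay_ids):
        """Proactively pull keys for the relays a client is predicted to use."""
        for relay_id in relay_ids:
            self._cache[relay_id] = (self._fetch_key(relay_id), time.time())

    def get(self, relay_id: str) -> bytes:
        """Return a cached key if still fresh, otherwise fall back to an on-demand fetch."""
        entry = self._cache.get(relay_id)
        if entry and time.time() - entry[1] < self._ttl:
            return entry[0]
        key = self._fetch_key(relay_id)
        self._cache[relay_id] = (key, time.time())
        return key

# Usage sketch: a dummy fetcher stands in for the real directory/blockchain lookup.
if __name__ == "__main__":
    cache = ProactiveKeyCache(fetch_key=lambda rid: f"pubkey-of-{rid}".encode())
    cache.refresh(["relay-A", "relay-B", "relay-C"])  # done ahead of circuit construction
    print(cache.get("relay-B"))                       # served from cache, no exchange delay
```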
Most systems dealing with natural language processing (NLP) and artificial intelligence (AI) can assist in making automated or automatically supported decisions. However, these systems may struggle to identify the information (characterization) required to elicit a decision by extracting or summarizing relevant information from large text documents or massive bodies of content. When these documents are obtained online, for instance from social networks and social media, such sites exhibit a remarkable growth in textual content. The main objective of the present study is to conduct a survey and present the latest developments in the implementation of text-mining techniques.
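The survey itself covers many techniques; as one minimal, generic example of the kind of extraction the abstract refers to (not taken from any surveyed system), the sketch below ranks sentences by TF-IDF weight to produce a short extractive summary.

```python
# Minimal, generic example of extractive summarization with TF-IDF; not from the surveyed systems.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer  # assumes scikit-learn is installed

def summarize(text: str, n_sentences: int = 2) -> str:
    """Rank sentences by their mean TF-IDF weight and return the top-scoring ones in original order."""
    sentences = [s.strip() for s in text.split(".") if s.strip()]
    if len(sentences) <= n_sentences:
        return text
    tfidf = TfidfVectorizer(stop_words="english").fit_transform(sentences)
    scores = np.asarray(tfidf.mean(axis=1)).ravel()      # one score per sentence
    top = sorted(np.argsort(scores)[-n_sentences:])      # keep document order
    return ". ".join(sentences[i] for i in top) + "."

print(summarize(
    "Text mining extracts structure from unstructured documents. "
    "Social media produces huge volumes of text every day. "
    "Decision support systems can use mined features to rank options. "
    "The weather was pleasant yesterday."
))
```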
Machine learning offers significant advantages for many problems in the oil and gas industry, especially for resolving complex challenges in reservoir characterization. Permeability is one of the most difficult petrophysical parameters to predict using conventional logging techniques. Clarifications of the workflow methodology are presented alongside comprehensive models in this study. The purpose of this study is to provide a more robust technique for predicting permeability; previous studies on the Bazirgan field have attempted this, but their estimates were vague, and the methods they describe are obsolete and make no concessions to real or rigid conditions in solving the permeability computation.
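The abstract does not specify the exact algorithm or input logs used in the Bazirgan study. As a generic, hedged illustration of ML-based permeability prediction, the sketch below trains a random-forest regressor on common well-log features (GR, RHOB, NPHI, DT, assumed here for illustration) against synthetic core-measured permeability.

```python
# Generic illustration of ML permeability prediction; the feature set, model, and data below are
# assumptions for illustration, not the workflow used in the Bazirgan field study.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
# Synthetic stand-ins for well logs: gamma ray, bulk density, neutron porosity, sonic transit time.
X = np.column_stack([
    rng.uniform(20, 150, n),     # GR (API)
    rng.uniform(2.0, 2.8, n),    # RHOB (g/cc)
    rng.uniform(0.05, 0.35, n),  # NPHI (v/v)
    rng.uniform(50, 140, n),     # DT (us/ft)
])
# Synthetic "core permeability" in millidarcies, loosely tied to porosity on a log scale.
y = 10 ** (6 * X[:, 2] - 1 + rng.normal(0, 0.3, n))

X_tr, X_te, y_tr, y_te = train_test_split(X, np.log10(y), test_size=0.2, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("R^2 on held-out log-permeability:", round(r2_score(y_te, model.predict(X_te)), 3))
```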
Elliptic Curve Cryptography (ECC) is a public-key cryptosystem based on algebraic models in the form of elliptic curves. In ECC, encryption usually requires the data to first be encoded onto the elliptic curve, which serves as a preprocessing step. Similarly, after decryption a post-processing step must map (decode) the corresponding point on the curve back to the original data. Memory Mapping (MM) and Koblitz Encoding (KE) are the commonly used encoding models, but both have drawbacks: MM needs more memory for processing and KE needs more computational resources. To overcome these issues, an enhanced Koblitz encoding model is proposed.
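The enhanced scheme itself is not reproduced in this excerpt. As background, the sketch below shows the standard (baseline) Koblitz encoding idea on a toy curve: a message integer m is mapped to an x-coordinate m·k + j for the smallest offset j that yields a valid curve point, and decoding recovers m as ⌊x/k⌋. The curve parameters are toy values chosen for illustration.

```python
# Standard (baseline) Koblitz encoding/decoding on a toy curve y^2 = x^3 + a*x + b over F_p.
# Toy parameters for illustration only; the paper's enhanced scheme is not reproduced here.
p, a, b = 10007, 2, 3   # small prime field (p % 4 == 3), assumed curve coefficients
k = 20                  # Koblitz auxiliary parameter; message m must satisfy m*k < p

def sqrt_mod_p(n, p):
    """Modular square root valid when p % 4 == 3; returns None if n is a non-residue."""
    n %= p
    r = pow(n, (p + 1) // 4, p)
    return r if (r * r) % p == n else None

def koblitz_encode(m):
    """Map message integer m to a curve point by scanning x = m*k + j for j = 0..k-1."""
    for j in range(k):
        x = m * k + j
        rhs = (x ** 3 + a * x + b) % p
        y = sqrt_mod_p(rhs, p)
        if y is not None:
            return (x, y)
    raise ValueError("encoding failed; increase k")

def koblitz_decode(point):
    """Recover m by discarding the offset j: m = floor(x / k)."""
    return point[0] // k

pt = koblitz_encode(123)
print(pt, koblitz_decode(pt))   # decodes back to 123
```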
In this research, a comparison is made between robust M-estimators for the cubic smoothing splines technique, which avoid the problem of non-normality in the data or contamination of the errors, and the traditional estimation method for cubic smoothing splines. Two comparison criteria, MADE and WASE, are used across different sample sizes and contamination levels to estimate the time-varying coefficient functions for balanced longitudinal data, which consist of observations obtained from (n) independent subjects, each measured repeatedly at a set of (m) specific time points, since the repeated measurements within a subject are typically correlated.
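The specific M-estimators and the MADE/WASE evaluation from the study are not reproduced here. As a minimal sketch of one common way to robustify a cubic smoothing spline, the snippet below iteratively reweights the fit with Huber-type weights so that contaminated observations are down-weighted.

```python
# Minimal sketch of a robust (Huber-reweighted) cubic smoothing spline versus the ordinary fit.
# This illustrates the general M-estimation idea only; the paper's estimators are not reproduced.
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(1)
x = np.linspace(0, 1, 80)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.15, x.size)
y[::10] += rng.choice([-3, 3], size=y[::10].size)   # inject outliers (contaminated errors)

def robust_spline(x, y, c=1.345, iters=5, s=None):
    """Iteratively reweighted cubic smoothing spline with Huber weights."""
    w = np.ones_like(y)
    for _ in range(iters):
        spl = UnivariateSpline(x, y, w=w, k=3, s=s)
        r = y - spl(x)
        scale = 1.4826 * np.median(np.abs(r - np.median(r))) + 1e-12  # robust MAD scale
        u = np.abs(r) / scale
        w = np.where(u <= c, 1.0, c / u)                              # Huber weight function
    return spl

ordinary = UnivariateSpline(x, y, k=3)
robust = robust_spline(x, y)
print("max |difference| between fits:", float(np.max(np.abs(ordinary(x) - robust(x)))))
```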
Introduction: Methadone hydrochloride (MDN) is an effective pharmacological substitution treatment for opioid dependence, adopted in different countries as methadone maintenance treatment (MMT) programmes. However, MDN can exacerbate the addiction problem if it is abused and injected intravenously, and frequent visits to MMT centres can reduce patient compliance. The overall aim of this study is to develop a novel extended-release capsule of MDN using the sol-gel silica (SGS) technique, which has the potential to counteract medication-tampering techniques and their associated health risks and to reduce frequent visits to MMT centres. Methods: For MDN recrystallisation, a closed container method (CCM) and a hot-stage method (HSM) were conducted.
For many years, the damage caused by the construction industry, such as the unreasonable consumption of resources and the production of large amounts of construction waste, has been overlooked. With the growth of global awareness of sustainable development issues, sustainable construction practices have been adopted that take the environment and human safety into account. The research aims to propose a management system for construction practices that could be adopted when constructing different types of sustainable buildings, besides formulating flowcharts that clarify all the required phases of the sustainable building life cycle. The research includes two parts: a theoretical part, which generally addresses sustainability concepts in the construction industry.
XML is being incorporated into the foundation of e-business data applications. This paper addresses the problem of freeform information stored in any organization and shows how XML, used with this new approach, makes searching efficient and time-saving. The paper introduces a new solution and methodology developed to capture and manage such unstructured freeform information (multi-information), relying on XML schema technologies, a neural network concept, and an object-oriented relational database, in order to provide a practical solution for efficiently managing a multi freeform information system.
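The paper's concrete schema and neural-network component are not shown in this excerpt. As a minimal, generic sketch of the capture-and-search idea, the snippet below wraps freeform records in a small XML structure and queries them with the standard library's XPath subset; the element and attribute names are hypothetical.

```python
# Generic sketch of capturing freeform records as XML and searching them; the element names and
# schema here are hypothetical and do not come from the paper.
import xml.etree.ElementTree as ET

root = ET.Element("FreeformStore")

def capture(category: str, text: str) -> None:
    """Wrap an unstructured note in a typed XML record so it can be indexed and searched."""
    record = ET.SubElement(root, "Record", attrib={"category": category})
    ET.SubElement(record, "Content").text = text

def search(category: str, keyword: str):
    """Return the content of records in a category whose text contains the keyword."""
    hits = []
    for record in root.findall(f".//Record[@category='{category}']"):
        content = record.findtext("Content", default="")
        if keyword.lower() in content.lower():
            hits.append(content)
    return hits

capture("memo", "Quarterly supplier review scheduled for March.")
capture("memo", "New XML schema approved for invoice records.")
capture("report", "Warehouse audit completed without findings.")
print(search("memo", "schema"))   # -> ['New XML schema approved for invoice records.']
```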
Interface evaluation has been the subject of extensive study and research in human-computer interaction (HCI). According to specialists in the field, it is a crucial tool for promoting the idea that user interaction with computers should resemble casual conversations and interactions between individuals. Researchers in HCI initially focused on making various computer interfaces more usable, thus improving the user experience. This study's objectives were to evaluate and enhance the user interface of the University of Baghdad's online academic management system using effectiveness, time-based efficiency, and satisfaction rates derived from a task-based questionnaire process. We created a variety of interfaces for the evaluation.
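The study's exact questionnaire and task set are not reproduced here. The sketch below computes commonly used forms of the effectiveness and time-based efficiency metrics named in the abstract; the numbers are made up for illustration and are not the study's data.

```python
# Common forms of the effectiveness and time-based efficiency usability metrics; example data only.
def effectiveness(success_flags):
    """Percentage of attempted tasks completed successfully."""
    return 100.0 * sum(success_flags) / len(success_flags)

def time_based_efficiency(success_matrix, time_matrix):
    """Sum of (success/time) over every user-task pair, averaged over N tasks x R users (goals/sec)."""
    R = len(success_matrix)        # number of users
    N = len(success_matrix[0])     # number of tasks
    total = sum(
        success_matrix[j][i] / time_matrix[j][i]
        for j in range(R) for i in range(N)
    )
    return total / (N * R)

# Two users, three tasks: 1 = success, 0 = failure; times are in seconds.
success = [[1, 1, 0],
           [1, 0, 1]]
times   = [[30, 45, 60],
           [25, 50, 40]]

print("Effectiveness (%):", effectiveness([f for row in success for f in row]))
print("Time-based efficiency (goals/sec):", round(time_based_efficiency(success, times), 4))
```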
This research focuses on clarifying the relationship between strategic decisions for operations management and organizational performance excellence. The research arises from a problem expressed through several applied questions. A special questionnaire was prepared for this purpose and distributed to a sample of (72) respondents at the top and middle management levels in the General Company for Mining Industries and Aquatic Insulation and the General Company for Batteries Industry. The research tested a number of hypotheses concerning the correlation and regression among the research variables, as well as the differences among them.