The Tor (The Onion Routing) network was designed to enable users to browse the Internet anonymously. It is known for protecting users' anonymity and privacy against agents who seek to observe users' locations or track their browsing habits. This anonymity stems from the layered encryption and decryption of Tor traffic: the client's traffic must be encrypted and decrypted before it is sent and after it is received, which introduces delay and can even interrupt the data flow. The exchange of cryptographic keys between network devices plays a pivotal role in securing communication and ensuring the integrity of cryptographic procedures, but this essential process is time-consuming and causes delay and discontinuity in the data flow. To overcome these delay and interruption problems, we utilize Software-Defined Networking (SDN), Machine Learning (ML), and Blockchain (BC) techniques, which help the Tor network speed up public-key exchange intelligently through proactive processing of Tor security-management information. Consequently, the combined network (ITor-SDN) maintains data-flow continuity for the Tor client. We simulated and emulated the proposed network using Mininet and Shadow. The analysis shows that the proposed architecture improves overall performance metrics by around 55%, an improvement achieved through the ITor-SDN network combination approach.
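The following Python sketch is only an illustration of the idea behind proactive key distribution, not the paper's ITor-SDN implementation: a controller-side, hash-chained ledger (all class, key, and relay names are hypothetical) publishes relay public keys ahead of time so a client can look them up locally instead of waiting for an on-path exchange.

```python
# Minimal sketch (hypothetical names) of proactive public-key publication:
# an SDN-controller-side, hash-chained ledger caches relay public keys so
# a Tor client can fetch them without an on-path key exchange.
# This is NOT the paper's implementation.
import hashlib
import json
import time


class KeyLedger:
    """Append-only, hash-chained store of (relay_id, public_key) records."""

    def __init__(self):
        self.blocks = []

    def add_key(self, relay_id, public_key_pem):
        prev_hash = self.blocks[-1]["hash"] if self.blocks else "0" * 64
        record = {"relay": relay_id, "key": public_key_pem,
                  "ts": time.time(), "prev": prev_hash}
        record["hash"] = hashlib.sha256(
            json.dumps(record, sort_keys=True).encode()).hexdigest()
        self.blocks.append(record)

    def lookup(self, relay_id):
        # Latest key wins; returns None if the relay is unknown.
        for record in reversed(self.blocks):
            if record["relay"] == relay_id:
                return record["key"]
        return None


# Controller proactively publishes keys; the client reads them locally.
ledger = KeyLedger()
ledger.add_key("relay-A", "-----BEGIN PUBLIC KEY----- ...")
print(ledger.lookup("relay-A"))
```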
Learning the vocabulary of a language has a great impact on acquiring that language. Many scholars in the field of language learning emphasize the importance of vocabulary as part of the learner's communicative competence, considering it the heart of language. One of the best methods of learning vocabulary is to focus on high-frequency words. The present article takes a corpus-based approach to the study of vocabulary, whereby the research data are analyzed quantitatively using the software program AntWordProfiler. This program analyses new input data against already stored, reliable reference corpora. The aim of this article is to find out whether the vocabularies used in the English textbook for Intermediate Schools in Iraq are con
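As an illustration of the kind of lexical profiling AntWordProfiler performs, the Python sketch below (the sample text and the tiny base list are placeholders, not the study's corpora) computes what share of a text's tokens is covered by a high-frequency word list.

```python
# Illustrative sketch (placeholder data) of lexical-coverage profiling:
# what share of a textbook's tokens falls inside a high-frequency base list.
import re
from collections import Counter


def tokenize(text):
    return re.findall(r"[a-z]+", text.lower())


def coverage(text, base_list_words):
    tokens = tokenize(text)
    counts = Counter(tokens)
    covered = sum(n for w, n in counts.items() if w in base_list_words)
    return covered / max(len(tokens), 1)


# Stand-in for a real 1,000-word high-frequency list.
base_list = {"the", "of", "and", "to", "in", "is", "it"}
sample = "The aim of the unit is to introduce the new words in context."
print(f"Coverage: {coverage(sample, base_list):.1%}")
```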
Progress in computer networks and the emergence of new technologies in this field lead to new protocols and frameworks that provide new computer-network-based services. E-government services, a modernized version of conventional government, have been created through the steady evolution of technology and the growing need of societies for numerous services. Government services are deeply related to citizens' daily lives; therefore, it is important to evolve with technological developments and to move from traditional methods of managing government work to cutting-edge technical approaches that improve the effectiveness of government systems in providing services to citizens. Blockchain technology is amon
The objective of an Optimal Power Flow (OPF) algorithm is to find a steady-state operating point that minimizes generation cost, losses, etc., while maintaining acceptable system performance in terms of limits on generators' real and reactive powers, line flow limits, etc. The OPF solution includes an objective function; a common choice is the active-power generation cost. A linear programming method is proposed to solve the OPF problem. The Linear Programming (LP) approach transforms the nonlinear optimization problem into an iterative algorithm that, in each iteration, solves a linear optimization problem obtained by linearizing both the objective function and the constraints. A computer program, written in MATLAB environme
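The toy Python sketch below (not the paper's MATLAB program) shows the successive-LP idea on a two-generator dispatch: the nonlinear loss term is fixed at the previous iterate, a linear program is solved, and the process repeats until the dispatch stops changing. The cost and loss coefficients are invented for illustration.

```python
# Toy successive-LP sketch: linearize the (assumed quadratic) loss term
# around the previous dispatch and re-solve a linear program.
import numpy as np
from scipy.optimize import linprog

cost = np.array([20.0, 25.0])                       # $/MWh for two generators
p_min, p_max = np.array([10.0, 10.0]), np.array([200.0, 150.0])
load = 180.0                                        # MW


def losses(p):
    # Crude quadratic loss model, invented for the example.
    return 1e-4 * (p[0] ** 2 + p[1] ** 2)


p = np.array([90.0, 90.0])                          # initial guess
for _ in range(20):
    demand = load + losses(p)                       # losses fixed at previous iterate
    res = linprog(cost,
                  A_eq=[[1.0, 1.0]], b_eq=[demand],
                  bounds=list(zip(p_min, p_max)),
                  method="highs")
    if np.allclose(res.x, p, atol=1e-3):
        break
    p = res.x

print("Dispatch (MW):", p, "Cost ($/h):", cost @ p)
```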
The evolution of the Internet of Things (IoT) has led to connecting billions of heterogeneous physical devices in order to improve the quality of human life by collecting data from their environment. However, this creates a need to store huge amounts of data, requiring large storage and high computational capabilities. Cloud computing can be used to store such big data. The data of IoT devices are transferred using two types of protocols: Message Queuing Telemetry Transport (MQTT) and Hypertext Transfer Protocol (HTTP). This paper aims to build a high-performance and more reliable system through efficient use of resources. Thus, load balancing in cloud computing is used to dynamically distribute the workload across nodes to avoid overloading any individual r
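As a simplified illustration of the load-balancing idea, independent of whether messages arrive over MQTT or HTTP, the Python sketch below (node names and the unit-cost-per-message assumption are hypothetical) dispatches each incoming message to the currently least-loaded cloud node.

```python
# Simplified least-loaded dispatch of incoming IoT messages across cloud
# nodes (hypothetical node names; one unit of load assumed per message).
import heapq


class LeastLoadedBalancer:
    def __init__(self, nodes):
        # Min-heap of (current_load, node_name) pairs.
        self.heap = [(0, n) for n in nodes]
        heapq.heapify(self.heap)

    def dispatch(self, message):
        load, node = heapq.heappop(self.heap)
        heapq.heappush(self.heap, (load + 1, node))
        return node


balancer = LeastLoadedBalancer(["node-1", "node-2", "node-3"])
for i in range(7):
    print(f"msg {i} -> {balancer.dispatch({'sensor': i})}")
```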
Power-electronic converters are essential elements for the effective interconnection of renewable energy sources to the power grid, as well as for integrating energy storage units, vehicle charging stations, microgrids, etc. Converter models that accurately represent their wideband operation and their interconnection with other active and passive grid components and systems are necessary for reliable steady-state and transient analyses during normal or abnormal grid operating conditions. This paper introduces two Laplace-domain approaches to model buck and boost DC-DC converters for electromagnetic transient studies. The first approach is analytical, where the converter is represented by a two-port admittance model via mo
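For reference only, and not the paper's two-port admittance model, the widely used averaged small-signal control-to-output transfer function of an ideal buck converter with inductance L, capacitance C, load resistance R, and input voltage V_in illustrates the Laplace-domain style of modeling assumed here:

```latex
G_{vd}(s) \;=\; \frac{\hat{v}_o(s)}{\hat{d}(s)}
         \;=\; \frac{V_{in}}{L C\, s^{2} + (L/R)\, s + 1}
```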
A geographic information system (GIS) is a very effective management and analysis tool, and its analyses rely on location-referenced data. The use of artificial neural networks (ANNs) for the interpretation of natural resource data has been shown to be beneficial. Back-propagation neural networks are among the most widespread designs. Combining geographic information systems with artificial neural networks provides a way to decrease the cost of landscape-change studies by shortening the time required to evaluate data. Numerous ANN designs and types have been created, most of them running in PC-based service domains. Using the ArcGIS Network Analyst add-on, service regions can be located around any network
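The short Python sketch below is a generic back-propagation example on synthetic data (the raster-like features and the "changed" label are invented), illustrating the kind of ANN that could be coupled with GIS layers; it is not the study's model.

```python
# Generic back-propagation sketch on synthetic, raster-like features.
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((200, 3))                     # e.g. three normalized raster layers per cell
y = (X[:, 0] + 0.5 * X[:, 1] > 0.8).astype(float).reshape(-1, 1)  # toy "changed" label

W1 = rng.normal(0, 0.5, (3, 8))              # input -> hidden weights
W2 = rng.normal(0, 0.5, (8, 1))              # hidden -> output weights
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for epoch in range(2000):                    # plain gradient descent with backprop
    h = sigmoid(X @ W1)
    out = sigmoid(h @ W2)
    d_out = (out - y) * out * (1 - out)      # squared-error gradient at the output
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= 0.1 * h.T @ d_out / len(X)
    W1 -= 0.1 * X.T @ d_h / len(X)

print("Training accuracy:", ((out > 0.5) == y).mean())
```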
Future generations of wireless communication systems are expected to evolve toward massive ubiquitous connectivity and ultra-reliable low-latency communications (URLLC) with extremely high data rates. Massive multiple-input multiple-output (m-MIMO) is a crucial transmission technique for fulfilling the high-data-rate demands of upcoming wireless systems. However, in frequency-division-duplex (FDD) m-MIMO where users have different channel correlations, obtaining a downlink (DL) training sequence (TS) that is feasible for fast channel estimation, i.e., that meets the low-latency requirements of future wireless systems, is very challenging. Therefore, a low-complexity solution for
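The Python sketch below (toy dimensions, random channels) is not the paper's solution; it only illustrates why the DL training-sequence length matters in FDD m-MIMO: a plain least-squares estimate from T pilot symbols degrades sharply once T falls below the number of antennas M, which is what motivates exploiting channel correlation.

```python
# Toy illustration: least-squares DL channel estimation from T pilots
# for an M-antenna array, y = X h + n, with h_hat = pinv(X) y.
import numpy as np

rng = np.random.default_rng(1)
M, T, snr = 64, 32, 100.0                       # antennas, pilot length, linear SNR

h = (rng.normal(size=M) + 1j * rng.normal(size=M)) / np.sqrt(2)
X = (rng.normal(size=(T, M)) + 1j * rng.normal(size=(T, M))) / np.sqrt(2 * M)
noise = (rng.normal(size=T) + 1j * rng.normal(size=T)) / np.sqrt(2 * snr)

y = X @ h + noise
h_hat = np.linalg.pinv(X) @ y                   # minimum-norm least-squares estimate

nmse = np.linalg.norm(h - h_hat) ** 2 / np.linalg.norm(h) ** 2
print(f"NMSE with T={T} pilots for M={M} antennas: {nmse:.2f}")
```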