Blockchain is an innovative technology that has gained interest across all sectors in the era of digital transformation, as it manages transactions and records them in a database. With the growth of financial transactions and a rapidly developing society with expanding businesses, many people seeking a financially independent life move away from large corporations and organizations to form startups and small businesses. Recently, the increasing demand on employees and institutions to prepare and manage contracts, papers, and verification processes, together with the risk of human error, led to the emergence of the smart contract. Smart contracts were developed to save time and provide more confidence between dealing parties, to cover the security aspects of digital management, and to resolve negotiation concerns. In this paper, a simple prototype of a smart contract integrated with a blockchain has been implemented and simulated on a local server with a set of nodes. Several security objectives, such as confidentiality, authorization, integrity, and non-repudiation, have been achieved in the proposed system. Besides, the paper discusses the importance of the blockchain technique, how it contributes to the management of transactions, and how it was implemented in highly transparent real-estate scenarios. The smart contract was employed in creating a distributed ledger to eliminate the need for centralization. Elliptic-curve public-key cryptography has been adopted as an alternative to RSA in the signature generation/verification process and the encryption protocol. For secure transactions, Secure Sockets Layer (SSL) has also been adopted as a secure layer in the web browser. The results have been investigated and evaluated from different aspects, and the implementation was carried out in a restricted environment.
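The tamper evidence that a hash-linked distributed ledger provides can be illustrated with a minimal sketch. The following Python code is purely illustrative: the field names, JSON serialization, and validation rule are assumptions for the example, not the paper's actual implementation.

```python
# Minimal sketch of a hash-linked ledger of real-estate transactions.
# Hypothetical structure for illustration only.
import hashlib
import json
import time

def block_hash(block):
    """Deterministic SHA-256 digest of a block's contents."""
    payload = json.dumps(block, sort_keys=True).encode("utf-8")
    return hashlib.sha256(payload).hexdigest()

class Ledger:
    def __init__(self):
        # A fixed genesis block anchors the chain.
        self.chain = [{"index": 0, "prev_hash": "0" * 64,
                       "timestamp": 0, "data": "genesis"}]

    def add_transaction(self, data):
        prev = self.chain[-1]
        block = {"index": prev["index"] + 1,
                 "prev_hash": block_hash(prev),
                 "timestamp": time.time(),
                 "data": data}
        self.chain.append(block)
        return block

    def is_valid(self):
        # Each block must reference the hash of its predecessor.
        return all(self.chain[i]["prev_hash"] == block_hash(self.chain[i - 1])
                   for i in range(1, len(self.chain)))

ledger = Ledger()
ledger.add_transaction({"seller": "A", "buyer": "B", "property": "lot-42"})
ledger.add_transaction({"seller": "B", "buyer": "C", "property": "lot-42"})
print(ledger.is_valid())        # True
ledger.chain[1]["data"] = "tampered"
print(ledger.is_valid())        # False: later hashes no longer match
```

Because every block embeds the hash of its predecessor, altering any recorded transaction invalidates all blocks after it, which is the integrity property the abstract refers to.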
Experiments compared the time and computational cost of the ECC and RSA algorithms as functions of key size and length: a 160-bit ECC key corresponds to a 1024-bit RSA key, which in our measurements amounted to 40% for ECC against 30% for RSA. As a result, although the ECC algorithm is mathematically more complex, its key is smaller and key generation is faster, so it achieves a high level of security.
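The 160-bit-ECC-to-1024-bit-RSA correspondence above follows the widely cited NIST comparable-strength table (SP 800-57), in which both keys sit at roughly the 80-bit security level. A small sketch of that lookup:

```python
# NIST SP 800-57 comparable key-strength table (well-known public values),
# used here to show why a 160-bit ECC key matches a 1024-bit RSA key.
NIST_EQUIVALENT_STRENGTH = {
    # security bits : (ECC key size, RSA modulus size)
    80:  (160, 1024),
    112: (224, 2048),
    128: (256, 3072),
    192: (384, 7680),
    256: (512, 15360),
}

def key_size_ratio(security_bits):
    """How many times larger the RSA modulus is than the equivalent ECC key."""
    ecc, rsa = NIST_EQUIVALENT_STRENGTH[security_bits]
    return rsa / ecc

print(key_size_ratio(80))    # 6.4: 1024-bit RSA vs 160-bit ECC
print(key_size_ratio(256))   # 30.0: the gap widens at higher security levels
```

The ratio grows with the security level, which is the main practical argument for ECC on constrained devices.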
In this paper, the non-dimensional magnetohydrodynamics (MHD) problem of nanoparticle-laden Jeffery-Hamel flow (JHF) has been analyzed. The fundamental equations of this problem reduce to a third-order ordinary differential equation. The study investigates the effect of the angle between the plates, the Reynolds number, the nanoparticle volume-fraction parameter, and the magnetic number on the velocity distribution, using an analytical technique known as the perturbation iteration scheme (PIS). The effect of these parameters is similar in the converging and diverging channels, except for the magnetic number, which behaves differently in the divergent channel. Furthermore, the resulting solutions show good convergence and high accuracy for the d
Interface bonding between asphalt layers has been a topic of international investigation over the last thirty years. Over this period, a number of researchers have developed their own techniques and used them to examine the characteristics of pavement interfaces. It is clear that test findings will not always be comparable, owing to the lack of a globally standardized methodology for interface bonding. Several studies have also shown that factors such as temperature, loading conditions, and materials have an impact on interface qualities. This study aims to address this problem by thoroughly reviewing interface bond testing in a way that might serve as the basis for a uniform strategy. First, a general explanation of how the bonding strength
The structure of the interrogation process in cross-examinations is said to be diverse and complex in terms of question-response typology, because the counsel has to extract the truth from an opposing party's witness whose views are expected to advocate that party's position regarding the case. Accordingly, this study, which is basically quantitative in nature, aims to investigate what the examining party intends to obtain from these questions and which of them are the most prevalently used. It also aims to measure the degree of cooperativity in witnesses' responses. To this end, three transcripts of cross-examination have been analyzed using a pragmatically oriented approach. The approach draws on Stenstrom (1984) and Arch
Due to the large-scale development of satellite and network communication technologies, there is a significant demand for preserving the secure storage and transmission of data over the internet and shared network environments. New challenges have appeared related to the protection of critical and sensitive data from illegal usage and unauthorized access. In this paper, we address the issues described above and develop new techniques to eliminate the associated problems. To achieve this, we propose the design of a new sensor node that tracks the location of cars and collects all information about the locations they visit, followed by encryption in the sensor node and storage in the database. A microcontroller of Arduino es
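The node-side pipeline described above (collect a location fix, protect it, store it in the database) can be sketched with Python's standard library. Since the standard library offers no block cipher, this sketch substitutes an HMAC-SHA256 integrity tag for the paper's encryption step, so it demonstrates tamper detection rather than confidentiality; the key, record fields, and coordinates are all hypothetical.

```python
# Stand-in for the node-side protect-and-store step: an HMAC-SHA256 tag
# replaces the paper's encryption, so the database can detect tampering.
# Key, schema, and coordinates are hypothetical.
import hmac
import hashlib
import json
import sqlite3

SECRET_KEY = b"node-shared-secret"   # assumed pre-shared key

def protect(record):
    payload = json.dumps(record, sort_keys=True)
    tag = hmac.new(SECRET_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return payload, tag

def verify(payload, tag):
    expected = hmac.new(SECRET_KEY, payload.encode(), hashlib.sha256).hexdigest()
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(expected, tag)

# Store a protected location fix in an in-memory database.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE fixes (payload TEXT, tag TEXT)")
payload, tag = protect({"car": "CAR-01", "lat": 33.31, "lon": 44.37})
db.execute("INSERT INTO fixes VALUES (?, ?)", (payload, tag))

stored_payload, stored_tag = db.execute("SELECT * FROM fixes").fetchone()
print(verify(stored_payload, stored_tag))           # True
print(verify(stored_payload + " ", stored_tag))     # False: tampered
```

In a real deployment the tag would be computed on the microcontroller before transmission; a cipher such as AES would additionally be needed for confidentiality.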
The Weibull distribution is considered one of the Type-I Generalized Extreme Value (GEV) distributions, and it plays a crucial role in modeling extreme events in various fields, such as hydrology, finance, and the environmental sciences. Bayesian methods play a strong, decisive role in estimating the parameters of the GEV distribution due to their ability to incorporate prior knowledge and handle small sample sizes effectively. In this research, we compare several shrinkage Bayesian estimation methods based on the squared-error and linear exponential (LINEX) loss functions; they were adopted and compared by the Monte Carlo simulation method. The performance of these methods is assessed based on their accuracy and computational efficiency in estimati
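A stripped-down version of such a Monte Carlo comparison under squared-error loss might look as follows. The shrinkage weight, prior guess, sample size, and shape parameter below are illustrative assumptions, not the paper's settings, and the shrinkage estimator shown is a simple convex combination rather than any of the paper's Bayesian estimators.

```python
# Toy Monte Carlo comparison of a plain estimator of the Weibull scale
# (shape assumed known) against a shrinkage estimator pulled toward a
# prior guess, scored by empirical MSE (squared-error loss).
import random

def mle_scale(sample, shape):
    # Closed-form MLE of the scale when the shape is known.
    return (sum(x ** shape for x in sample) / len(sample)) ** (1.0 / shape)

def shrinkage_scale(sample, shape, prior_guess, weight=0.3):
    # Convex combination of the MLE and a prior point estimate.
    return (1 - weight) * mle_scale(sample, shape) + weight * prior_guess

def empirical_mse(estimator, true_scale=2.0, shape=1.5, n=20, reps=2000):
    random.seed(0)   # same samples for every estimator -> fair comparison
    errs = []
    for _ in range(reps):
        sample = [random.weibullvariate(true_scale, shape) for _ in range(n)]
        errs.append((estimator(sample) - true_scale) ** 2)
    return sum(errs) / reps

mse_plain = empirical_mse(lambda s: mle_scale(s, 1.5))
mse_shrink = empirical_mse(lambda s: shrinkage_scale(s, 1.5, prior_guess=2.0))
print(mse_plain, mse_shrink)
```

Because the prior guess here coincides with the true scale, each squared error is scaled by (1 - weight)^2, so the shrinkage estimator's empirical MSE is strictly smaller in this toy setting; with a badly misspecified prior the comparison can reverse, which is exactly what such simulations probe.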
Porosity plays an essential role in petroleum engineering: it controls fluid storage in aquifers, while the connectivity of the pore structure controls fluid flow through reservoir formations. To quantify the relationships between porosity, storage, transport, and rock properties, however, the pore structure must be measured and quantitatively described. Porosity estimation from digital images using image processing is essential for reservoir rock analysis, since it concisely describes the sample's 2D porosity. The usual procedure relies on binarization, which uses a pixel-value threshold to convert color and grayscale images to binary images. The idea is to treat the blue regions entirely as pores and transform them to white in r
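The binarization step can be illustrated on a tiny synthetic grayscale image: pixels at or below a threshold are treated as pore space, and the 2D porosity is simply the pore-pixel fraction. The threshold and pixel values below are invented for illustration, not taken from the paper's samples.

```python
# Threshold binarization of a synthetic grayscale image, followed by
# 2D porosity as the fraction of pore pixels. Values are illustrative.
THRESHOLD = 100  # assumed grayscale cutoff separating pores from grains

def binarize(image, threshold=THRESHOLD):
    """1 marks a pore pixel, 0 marks solid grain."""
    return [[1 if px <= threshold else 0 for px in row] for row in image]

def porosity(binary_image):
    pore = sum(sum(row) for row in binary_image)
    total = sum(len(row) for row in binary_image)
    return pore / total

image = [
    [ 30, 200, 210,  40],
    [220,  90, 250, 230],
    [ 60, 240,  70, 215],
]
binary = binarize(image)
print(porosity(binary))   # 5 pore pixels out of 12 -> 0.4166...
```

Real workflows add pre-processing (denoising, color-to-gray conversion) and choose the threshold automatically, e.g. from the image histogram, but the porosity computation itself reduces to this pixel count.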
Deepfake is a type of artificial-intelligence technique used to create convincing image, audio, and video hoaxes, and it concerns celebrities and ordinary people alike because deepfakes are easy to manufacture. Deepfakes are hard to recognize by people and by current approaches, especially high-quality ones. As a defense against deepfake techniques, various methods to detect deepfakes in images have been suggested. Most of them had limitations, such as working only with one face per image, or requiring the face to be forward-facing with both eyes and the mouth open, depending on which part of the face they worked on. Beyond that, few focus on the impact of pre-processing steps on the detection accuracy of the models. This paper introduces a framework design focused on this asp
A DEM (Digital Elevation Model) describes the topography of the earth's surface (such as terrain relief and ocean floors) mathematically, with elevations expressed as functions of position in either geographical coordinates (the latitude-longitude system) or rectangular coordinate systems (X, Y, Z). A DEM is therefore an array of numbers that represents the spatial distribution of terrain characteristics. In this paper, contour lines at different intervals were obtained from a high-resolution (1 m) digital elevation model of Al-Khamisah, Thi-Qar Governorate. The altitudes range between 1 m and 8.5 m, so the area is characterized by varying heights within a small spatial region, represented by multiple spots with flat surfaces.
As the smallest functional parts of the muscle, motor units (MUs) are considered the basic building blocks of the neuromuscular system. Monitoring MU recruitment, de-recruitment, and firing rate (by either invasive or surface techniques) leads to an understanding of motor control strategies and of their pathological alterations. EMG signal decomposition is the process of identifying and classifying individual motor unit action potentials (MUAPs) in the interference pattern detected with either intramuscular or surface electrodes. Signal processing techniques have been used in EMG signal decomposition to investigate fundamental and physiological issues. Many techniques have been developed to decompose intramuscularly detec
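In its simplest form, the decomposition idea above reduces to two steps: detect candidate spikes, then assign each detected waveform to the closest stored template. The following sketch, with a synthetic signal, two hypothetical MU templates, and nearest-neighbour matching on squared distance, is a toy illustration of that idea, not any of the published decomposition algorithms.

```python
# Toy EMG decomposition: threshold-based peak detection, then nearest-
# template classification of each detected waveform. Signal, templates,
# and threshold are synthetic stand-ins for real EMG data.
def detect_peaks(signal, threshold):
    """Indices of local maxima at or above the threshold."""
    return [i for i in range(1, len(signal) - 1)
            if signal[i] >= threshold
            and signal[i] > signal[i - 1]
            and signal[i] >= signal[i + 1]]

def classify(waveform, templates):
    """Label of the template with the smallest squared distance."""
    def dist(t):
        return sum((a - b) ** 2 for a, b in zip(waveform, t))
    return min(templates, key=lambda label: dist(templates[label]))

templates = {"MU1": [0, 3, 8, 3, 0], "MU2": [0, 1, 4, 1, 0]}
signal = [0, 0, 3, 8, 3, 0, 0, 0, 1, 4, 1, 0, 0]

peaks = detect_peaks(signal, threshold=2)
print(peaks)                               # [3, 9]
for i in peaks:
    window = signal[i - 2:i + 3]           # 5-sample window centred on peak
    print(classify(window, templates))     # MU1, then MU2
```

Real decomposition must additionally handle superimposed MUAPs, waveform variability, and firing statistics, which is where the more elaborate intramuscular and surface techniques diverge.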
Renal function tests are commonly used in clinical practice to look for renal disease; the most common include serum urea, uric acid, and creatinine. Heart failure patients have a higher incidence of renal function test abnormalities than individuals without heart failure. Fifty adult male subjects were divided into two groups: 25 healthy subjects as controls (group 1) and 25 subjects with heart failure (group 2). Our results indicate that serum uric acid, urea, and creatinine values were significantly elevated (P≤0.05) in the patient group (2) compared with the healthy group (1). The results also showed an effect of age category on uric acid, blood urea nitrogen, and creatinine values (P≤0.05), and there were no si