The consensus algorithm is the core mechanism of a blockchain and is used to ensure data consistency among blockchain nodes. The PBFT (Practical Byzantine Fault Tolerance) consensus algorithm is widely used in consortium chains because it tolerates Byzantine faults. However, PBFT still suffers from random master node selection and high communication complexity. This study proposes an improved consensus algorithm, IBFT, based on node trust values and the BLS (Boneh-Lynn-Shacham) aggregate signature. In IBFT, multi-level indicators are used to calculate a trust value for each node, and a subset of nodes is selected to take part in network consensus on the basis of this calculation. The master node is the node with the highest trust value among them. IBFT also incorporates the BLS signature process into the information interaction between nodes, so communication complexity is reduced while node-to-node information exchange remains secure. Simulation results demonstrate that the IBFT consensus method improves transaction throughput by 61% and reduces latency by 13% compared with the PBFT algorithm.
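The trust-value-based master selection described above can be sketched in a few lines. The indicator names and weights below are illustrative assumptions, not the paper's exact formula:

```python
# Hypothetical sketch of IBFT-style trust scoring and master selection.
# Indicator names (honest_ratio, participation, latency) and the weights
# are assumptions for illustration only.

def trust_value(node, weights=(0.4, 0.3, 0.3)):
    """Weighted sum of multi-level indicators, each normalized to [0, 1]."""
    w_hist, w_part, w_lat = weights
    return (w_hist * node["honest_ratio"]      # fraction of correct votes
            + w_part * node["participation"]   # consensus rounds joined
            + w_lat * (1.0 - node["latency"])) # normalized response delay

def select_consensus_set(nodes, k):
    """Pick the k most trusted nodes; the top one becomes the master."""
    ranked = sorted(nodes, key=trust_value, reverse=True)
    consensus_nodes = ranked[:k]
    return consensus_nodes[0], consensus_nodes

nodes = [
    {"id": "A", "honest_ratio": 0.99, "participation": 0.90, "latency": 0.10},
    {"id": "B", "honest_ratio": 0.80, "participation": 0.95, "latency": 0.30},
    {"id": "C", "honest_ratio": 0.95, "participation": 0.60, "latency": 0.05},
    {"id": "D", "honest_ratio": 0.50, "participation": 0.40, "latency": 0.60},
]
master, group = select_consensus_set(nodes, k=3)
```

Low-trust nodes such as "D" are excluded from consensus entirely, which is what shrinks the communication graph relative to PBFT.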
The rapid growth of the Internet of Things and its widespread adoption have produced massive quantities of data that must be processed and sent to the cloud. The delay in processing this data and the time it takes to send it to the cloud led to the emergence of fog computing, a new generation of the cloud in which the fog serves as an extension of cloud services at the edge of the network, reducing latency and traffic. Distributing computational resources so as to minimize makespan and running costs is one of the open challenges of fog computing. This paper provides a new approach for improving the task scheduling problem in a Cloud-Fog environment…
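A common baseline for the scheduling problem this abstract describes is a greedy heuristic that assigns each task to whichever node (fog or cloud) would finish it earliest. The sketch below is a minimal illustration with invented node speeds and task lengths, not the paper's algorithm:

```python
# Illustrative greedy scheduler for a Cloud-Fog setting: each task goes
# to the node with the earliest completion time.  Speeds (MIPS) and task
# lengths (MI) are made-up example values.

def greedy_schedule(tasks, nodes):
    """tasks: list of (name, length); nodes: dict name -> speed.
    Returns (assignment, makespan), where makespan is the latest finish."""
    finish = {n: 0.0 for n in nodes}
    assignment = {}
    # Placing the longest tasks first tends to balance load better.
    for name, length in sorted(tasks, key=lambda t: -t[1]):
        best = min(nodes, key=lambda n: finish[n] + length / nodes[n])
        finish[best] += length / nodes[best]
        assignment[name] = best
    return assignment, max(finish.values())

tasks = [("t1", 400), ("t2", 300), ("t3", 200), ("t4", 100)]
nodes = {"fog1": 100.0, "fog2": 80.0, "cloud": 200.0}
assignment, makespan = greedy_schedule(tasks, nodes)
```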
In the last few years, the Internet of Things (IoT) has gained remarkable attention in both the academic and industrial worlds. The main goal of the IoT is to connect everyday objects with different capabilities to the Internet in an interconnected fashion so that they can share resources and carry out assigned tasks. Most IoT objects are heterogeneous in terms of available energy, processing ability, memory storage, etc. One of the most important challenges facing IoT networks is therefore energy-efficient task allocation. An efficient task allocation protocol in an IoT network should ensure a fair and efficient distribution of resources so that all objects can collaborate dynamically with limited energy. The canonical…
In this research, the weights used to estimate simple linear regression parameters are estimated with Generalized Least Squares when the dependent variable is a two-class attribute variable (the heteroscedasticity problem), following a sequential Bayesian approach instead of the classical approach used before. The Bayesian approach provides a mechanism for tackling observations one by one in a sequential way; i.e., each new observation adds a new piece of information to the estimate of the success probability of the Bernoulli trials that represent the dependent variable in the simple linear regression equation, in addition to the information deduced from past experience…
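The one-observation-at-a-time updating described above can be sketched with the standard conjugate Beta-Bernoulli update; the uniform Beta(1, 1) prior and the trial sequence below are assumptions for illustration:

```python
# Sequential Bayesian estimation of a Bernoulli success probability:
# a Beta(a, b) prior is updated one observation at a time, so each new
# trial adds information without reprocessing the earlier data.

def update(a, b, x):
    """One Bernoulli observation x in {0, 1} updates Beta(a, b)."""
    return a + x, b + 1 - x

a, b = 1.0, 1.0                      # uniform prior, an assumed choice
for x in [1, 0, 1, 1, 0, 1, 1, 1]:  # trials arriving sequentially
    a, b = update(a, b, x)

posterior_mean = a / (a + b)         # point estimate of the probability
```

After 6 successes and 2 failures the posterior is Beta(7, 3), whose mean 0.7 is the sequential estimate of the success probability.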
Russian Policy in the Greater Middle East, or (The Art of Establishing Friendly Relations with All Countries of the World)
Wireless sensor networks (WSNs) represent one of the key technologies in Internet of Things (IoT) networks. Since WSNs have finite energy sources, there is ongoing research into new strategies for minimizing power consumption or enhancing traditional techniques. In this paper, a novel Gaussian mixture model (GMM) algorithm is proposed for energy saving in mobile wireless sensor networks (MWSNs). Performance evaluation of the clustering process with the GMM algorithm shows a remarkable energy saving in the network of up to 92%. In addition, a comparison with another clustering strategy that uses the K-means algorithm has been made, and the developed method has outperformed K-means, saving energy…
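As a minimal illustration of the kind of GMM clustering involved, the sketch below fits a two-component 1-D mixture with expectation-maximization; the synthetic data and component count are assumptions, not the paper's setup:

```python
# Minimal EM fit of a two-component 1-D Gaussian mixture, as a sketch of
# GMM clustering; data and number of components are illustrative only.
import math
import random

def em_gmm_1d(xs, iters=50):
    """Fit a 2-component GMM to xs with expectation-maximization."""
    mu = [min(xs), max(xs)]          # crude but well-separated init
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each point.
        resp = []
        for x in xs:
            p = [pi[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(x - mu[k]) ** 2 / (2 * var[k]))
                 for k in range(2)]
            s = sum(p)
            resp.append([pk / s for pk in p])
        # M-step: re-estimate weights, means, and variances.
        for k in range(2):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(xs)
            mu[k] = sum(r[k] * x for r, x in zip(resp, xs)) / nk
            var[k] = max(1e-6, sum(r[k] * (x - mu[k]) ** 2
                                   for r, x in zip(resp, xs)) / nk)
    return mu, var, pi

random.seed(0)
xs = ([random.gauss(0.0, 0.5) for _ in range(100)]    # cluster near 0
      + [random.gauss(5.0, 0.5) for _ in range(100)]) # cluster near 5
mu, var, pi = em_gmm_1d(xs)
```

In a MWSN setting the points would be sensor positions (typically 2-D) and each fitted component would define a cluster whose head aggregates traffic, which is where the energy saving comes from.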
ABSTRACT Background: Tuberculosis remains a worldwide infectious disease in spite of advances in the health care system. Tuberculous lymphadenitis is the most prevalent form of extrapulmonary tuberculosis, with a predilection for cervical lymph nodes. Objectives: To evaluate the reliability of grey-scale ultrasonography together with color Doppler in the diagnosis of cervical tuberculous lymphadenitis and in the evaluation of early therapeutic response. Subjects and methods: From July 2015 to May 2016, at Al-Karama teaching hospital, Kut city, Wasit, Iraq, 25 patients (14 males and 11 females) with ages ranging from 6 to 50 years were studied. Ultrasonography examination was done for all patients, and grey-scale criteria (distribution, size, shape, echogenicity, echogenic hilum…
Background/Aim: Endometrial abnormalities represent a diagnostic challenge because their imaging features overlap with those of normal endometrium. The aim of this study was to assess the accuracy of dynamic contrast-enhanced and diffusion-weighted magnetic resonance imaging (MRI) in the evaluation of endometrial lesions in comparison with T2, and to assess the validity of local staging and the degree of myometrial invasion in malignancy. Methods: Forty patients with abnormal vaginal bleeding or a sonographically thickened endometrium were recruited. MRI examination of the pelvis was performed on a 1.5 T scanner with a pelvic array coil. Conventional T1 and T2, dynamic contrast-enhanced (DCE), and diffusion-weighted imaging (DWI) sequences were performed. Results: The mean age of patients…
Survival analysis is the analysis of data in the form of times from a time origin until the occurrence of an end event. In medical research, the time origin is the date of registration of the individual or patient in a study, such as a clinical trial comparing two or more types of medicine, and the endpoint is the death of the patient or the individual's loss to follow-up. The data resulting from this process are called survival times; when the end event is not death, the resulting data are called time-to-event data. That is, survival analysis comprises the statistical steps and procedures for analyzing data when the variable of interest is the time to an event. It could be d…
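The censoring described above (individuals lost to follow-up before the endpoint) is what distinguishes survival data from ordinary durations. A standard way to estimate the survival curve from such data, sketched here with invented (time, event) pairs where event=1 is an observed endpoint and event=0 is censored, is the Kaplan-Meier estimator; the abstract does not name a specific estimator, so this is illustrative:

```python
# Compact Kaplan-Meier sketch: S(t) drops only at observed event times,
# while censored subjects still count as "at risk" up to their exit time.

def kaplan_meier(data):
    """data: list of (time, event). Returns [(time, S(t))] at event times."""
    at_risk = len(data)
    curve, s = [], 1.0
    for t in sorted({t for t, _ in data}):
        deaths = sum(1 for ti, ei in data if ti == t and ei == 1)
        if deaths:
            s *= (at_risk - deaths) / at_risk
            curve.append((t, s))
        at_risk -= sum(1 for ti, _ in data if ti == t)  # events + censored
    return curve

data = [(2, 1), (3, 0), (4, 1), (5, 1), (5, 0), (8, 0)]
curve = kaplan_meier(data)
```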
Material obtained from the demolition of concrete structures and the milling of flexible pavements has the highest potential for recyclability. This study aimed to evaluate the performance of hot mix asphalt with the concurrent use of recycled asphalt pavement (RAP) and recycled concrete aggregate (RCA). Contents of RAP and RCA were varied from 0% to 50%, fixing the total recycled-materials percentage at 50%. A penetration-grade 40/50 virgin binder and waste engine oil (WEO) as a rejuvenator were used in the present study. A series of tests, such as scanning electron microscopy (SEM), Marshall stability, the indirect tensile strength test, IDEAL CT, the uniaxial compression test, and the resilient modulus test, was carried out to assess the performance of…
Aim: To determine the effectiveness of women's self-care instructions on their post-cesarean-section care in Baghdad teaching hospital.
Methodology: The present study used a quasi-experimental design in the maternity wards of Baghdad teaching hospital. The sample was collected and followed up for the period from 15 January 2014 until 15 May 2014. A nonprobability (purposive) sample of (100) women post cesarean section was divided into two groups: (50) women considered the study group and another (50) women considered the control group. A questionnaire designed as a tool to collect data fit for the purpose of the study includes demographic variables and reproductive variables…