The consensus algorithm is the core mechanism of blockchain and ensures data consistency among blockchain nodes. The PBFT (Practical Byzantine Fault Tolerance) consensus algorithm is widely used in consortium chains because it tolerates Byzantine faults. However, PBFT still suffers from random master node selection and high communication complexity. This study proposes an improved consensus algorithm, IBFT, based on node trust values and the BLS (Boneh-Lynn-Shacham) aggregate signature. In IBFT, multi-level indicators are used to calculate a trust value for each node, and on that basis a subset of nodes is selected to take part in network consensus; the node with the highest trust value among them becomes the master node. The BLS signature process is incorporated into the information interaction between nodes, which reduces communication complexity while keeping node-to-node information exchange secure. Simulation results show that, compared with the PBFT algorithm, IBFT improves transaction throughput by 61% and reduces latency by 13%.
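The trust-value-based selection described above can be sketched as follows. This is a minimal illustration, not the paper's actual formula: the indicator names (honest_ratio, latency, participation) and the weights are assumptions made for the example.

```python
# Hypothetical sketch of trust-value-based node selection for IBFT.
# Indicators and weights are illustrative assumptions only.

def trust_value(node, weights=(0.5, 0.3, 0.2)):
    """Combine multi-level indicators into a single trust score."""
    w_hist, w_lat, w_part = weights
    return (w_hist * node["honest_ratio"]        # historical behaviour
            + w_lat * (1.0 - node["latency"])    # responsiveness (normalised)
            + w_part * node["participation"])    # participation level

def select_consensus_set(nodes, k):
    """Pick the k most trusted nodes; the most trusted becomes master."""
    ranked = sorted(nodes, key=trust_value, reverse=True)
    committee = ranked[:k]
    master = committee[0]
    return master, committee

nodes = [
    {"id": "n1", "honest_ratio": 0.95, "latency": 0.10, "participation": 0.8},
    {"id": "n2", "honest_ratio": 0.80, "latency": 0.30, "participation": 0.9},
    {"id": "n3", "honest_ratio": 0.99, "latency": 0.05, "participation": 0.7},
    {"id": "n4", "honest_ratio": 0.60, "latency": 0.50, "participation": 0.5},
]
master, committee = select_consensus_set(nodes, k=3)
```

In a real deployment the committee would then run the consensus rounds, with the master's proposals signed and aggregated via BLS.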
This research aims to study and improve the specifications of vibration-resistant rubber. In this paper, seven different rubber recipes were prepared based on mixtures of natural rubber (NR) as the essential part, in addition to synthetic rubbers (IIR, BRcis, SBR, CR) at different ratios. Mechanical tests such as tensile strength, hardness, friction, resistance to compression, fatigue, and creep were performed, in addition to a rheological test. Furthermore, scanning electron microscopy (SEM) was used to examine the structural morphology of the rubber. After studying and analyzing the results, we found that the recipe containing (BRcis) at 40% from th
The research examines the advantage of moving from floor trading (FT) to electronic trading (ET) in the Iraqi Stock Exchange (ISE). Three hypotheses are tested: first, that market depth differs significantly between the periods before and after the adoption of ET; second, that market liquidity likewise differs significantly before and after the adoption of ET; and third, that market depth and liquidity affect the performance of the ISE. An event-study design is adopted, with 74 observations distributed equally over the research period, which extends from 2006 to 2012; the event window is 5-7-2009. The hypothesis tests show that all three null hypotheses are rejected and their alternatives accepted, because the ET
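The before/after comparison at the heart of such an event study can be sketched with a two-sample Welch t statistic on market-depth observations. The figures below are made up for illustration and are not ISE data.

```python
# Illustrative before/after significance test for an event study.
# Market-depth numbers are hypothetical, not from the ISE sample.
import statistics

def welch_t(before, after):
    """Welch's t statistic for the difference of two sample means."""
    m1, m2 = statistics.mean(before), statistics.mean(after)
    v1, v2 = statistics.variance(before), statistics.variance(after)
    n1, n2 = len(before), len(after)
    return (m2 - m1) / ((v1 / n1 + v2 / n2) ** 0.5)

depth_before = [1.2, 1.0, 1.3, 0.9, 1.1]
depth_after  = [1.8, 2.0, 1.7, 1.9, 2.1]
t = welch_t(depth_before, depth_after)  # large |t| suggests a real shift
```

The statistic would then be compared against the t distribution to decide whether to reject the null hypothesis of no change.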
Insurance is one of the financial institutions that influence development programs, yet the insurance sector still suffers from many problems, whether of awareness, of legislation, or of how to employ it as an economic and social resource in the service of society, institutions, and individuals. Since insurance provides a service that must conform to specific standards, it has become necessary to study mechanisms for improving the quality of the insurance service, as well as to study the reality of the marketing process for the insurance service and the challenges facing it, internal
In this study, a genetic algorithm was used to predict the reaction kinetics of the Iraqi heavy naphtha catalytic reforming process located in the Al-Doura refinery in Baghdad. A one-dimensional steady-state model was derived to describe a commercial catalytic reforming unit consisting of four catalytic reforming reactors in series.
Experimental data (reformate composition and outlet temperature) for each of the four reactors, collected at different operating conditions, were used to estimate the parameters of the proposed kinetic model. The kinetic model involves 24 components, with 1 to 11 carbon atoms for paraffins and 6 to 11 carbon atoms for naphthenes and aromatics, and 71 reactions. The pre-exponential Arrhenius constants and a
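The parameter-estimation step can be sketched with a minimal genetic algorithm fitting a single Arrhenius rate constant k = A·exp(−E/(R·T)) to observed rates. The data, GA settings, and single-reaction scope are illustrative assumptions; the actual model involves 24 components and 71 reactions.

```python
# Minimal GA sketch for Arrhenius parameter estimation.
# Synthetic data and GA settings are illustrative assumptions.
import math
import random

R = 8.314  # J/(mol*K)

def rate(A, E, T):
    return A * math.exp(-E / (R * T))

# Synthetic "observations" generated from known parameters.
A_true, E_true = 1.0e5, 6.0e4
temps = [700.0, 750.0, 800.0, 850.0]
obs = [rate(A_true, E_true, T) for T in temps]

def sse(ind):
    """Sum of squared errors between model and observed rates."""
    A, E = ind
    return sum((rate(A, E, T) - y) ** 2 for T, y in zip(temps, obs))

def evolve(pop_size=60, gens=200, seed=1):
    rng = random.Random(seed)
    pop = [(rng.uniform(1e4, 1e6), rng.uniform(4e4, 8e4))
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=sse)                      # lower error = fitter
        survivors = pop[:pop_size // 2]        # elitist selection
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = rng.sample(survivors, 2)    # blend crossover
            w = rng.random()
            child = (w * a[0] + (1 - w) * b[0],
                     w * a[1] + (1 - w) * b[1])
            if rng.random() < 0.2:             # multiplicative mutation
                child = (child[0] * rng.uniform(0.9, 1.1),
                         child[1] * rng.uniform(0.9, 1.1))
            children.append(child)
        pop = survivors + children
    return min(pop, key=sse)

best = evolve()
```

The real study would replace the single-rate objective with the full reactor model's predicted reformate composition and outlet temperature.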
Background: Diabetes and hypertension are cardiovascular risk factors that can promote the development of atherosclerosis in the cardiovascular system; their effects can be detected and measured by ultrasound and Doppler study. These risk factors are associated with increased intima-media thickness, resistive index (RI), and pulsatility index (PI) of the right common carotid artery. Method: We studied 20 patients with both diabetes and hypertension and 20 patients with diabetes only, examining the right carotid arteries of the two groups. In this sample we measured the lumen diameter of the right carotid artery, intima-media thickness (IMT), peak systolic velocity, end diastolic velocity, pulsatility index, and resistance index, which were
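The Doppler indices named above follow standard definitions: RI = (PSV − EDV)/PSV and PI = (PSV − EDV)/mean velocity. The velocities below (cm/s) are illustrative values, not data from this study.

```python
# Standard Doppler index definitions; input velocities are hypothetical.

def resistive_index(psv, edv):
    """RI = (peak systolic - end diastolic) / peak systolic velocity."""
    return (psv - edv) / psv

def pulsatility_index(psv, edv, mean_v):
    """PI = (peak systolic - end diastolic) / time-averaged mean velocity."""
    return (psv - edv) / mean_v

psv, edv, mean_v = 80.0, 20.0, 40.0   # cm/s, illustrative
ri = resistive_index(psv, edv)        # 0.75
pi = pulsatility_index(psv, edv, mean_v)  # 1.5
```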
The aim of this paper is to present a new method for solving a system of linear initial value problems of ordinary differential equations using an approximation technique: two-point osculatory interpolation, which fits equal numbers of derivatives at the endpoints of the interval [0, 1]. The results are compared with those of conventional methods and appear to converge faster and more accurately.
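The simplest instance of a two-point osculatory interpolant on [0, 1] is the cubic that matches the value and first derivative of the solution at both endpoints. The sketch below demonstrates it on the test problem y′ = y, y(0) = 1 (exact solution eˣ), which is an illustrative choice, not the paper's example.

```python
# Two-point osculatory (cubic Hermite) interpolation on [0, 1],
# demonstrated on y' = y, y(0) = 1. Test problem is illustrative.
import math

def osculatory_cubic(y0, dy0, y1, dy1):
    """Cubic matching y and y' at x = 0 and x = 1 (Hermite basis)."""
    def p(x):
        h00 = 2*x**3 - 3*x**2 + 1
        h10 = x**3 - 2*x**2 + x
        h01 = -2*x**3 + 3*x**2
        h11 = x**3 - x**2
        return y0*h00 + dy0*h10 + y1*h01 + dy1*h11
    return p

# For y' = y, the derivative equals the function at each endpoint.
p = osculatory_cubic(1.0, 1.0, math.e, math.e)
err = abs(p(0.5) - math.exp(0.5))  # small mid-interval error
```

Fitting more derivatives at each endpoint raises the polynomial degree and, for smooth solutions, shrinks the mid-interval error further.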
The research deals with the design of a cost accounting system for transport services and its role in improving the efficiency of pricing decisions through the application of activity-based costing (ABC). The main activities were defined and cost drivers established to measure the cost of each service, for the purpose of providing management with appropriate information for pricing decisions. The research problem lies in the failure of some public companies in the service sector to adopt a cost accounting system for calculating the cost of services, as well as in the lack of identification of productive and service activities, so that they cannot make the appropriate decision t
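The ABC mechanics described above reduce to simple arithmetic: pool costs per activity, divide by the cost-driver volume to get a rate, then charge each service for the driver units it consumes. All figures below are hypothetical.

```python
# Activity-based costing arithmetic; all figures are hypothetical.

activity_cost = {"dispatch": 50_000.0, "maintenance": 30_000.0}  # pooled cost
driver_volume = {"dispatch": 1_000, "maintenance": 600}          # driver units

# Cost-driver rate = pooled activity cost / total driver volume.
rate = {a: activity_cost[a] / driver_volume[a] for a in activity_cost}

# Driver units consumed by one transport service.
consumption = {"dispatch": 3, "maintenance": 2}
service_cost = sum(rate[a] * consumption[a] for a in consumption)
```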
The conventional FCM (fuzzy c-means) algorithm does not fully utilize the spatial information in an image. In this research, we use an FCM algorithm that incorporates spatial information into the membership function for clustering. The spatial function is the summation of the membership functions in the neighborhood of each pixel under consideration. The advantages of the method are that it is less sensitive to noise than other techniques and that it yields more homogeneous regions than other methods. This makes it a powerful method for segmenting noisy images.
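The spatial membership update described above can be sketched as follows: for each pixel, a spatial function sums the memberships of its neighbors per cluster, and the new membership weights the original one by it. The exponents p = q = 1 and the 1-D toy image are illustrative assumptions; a real run would alternate this step with the standard FCM centroid and membership updates.

```python
# Sketch of the spatial membership update for spatial FCM.
# Exponents and toy data are illustrative assumptions.

def spatial_update(u, neighbors, p=1, q=1):
    """u[c][j]: membership of pixel j in cluster c; neighbors[j]: indices."""
    n_clusters, n_pix = len(u), len(u[0])
    new_u = [[0.0] * n_pix for _ in range(n_clusters)]
    for j in range(n_pix):
        # Spatial function: summed neighborhood membership per cluster.
        h = [sum(u[c][k] for k in neighbors[j]) for c in range(n_clusters)]
        weights = [u[c][j] ** p * h[c] ** q for c in range(n_clusters)]
        total = sum(weights)
        for c in range(n_clusters):
            new_u[c][j] = weights[c] / total   # renormalize per pixel
    return new_u

# 1-D toy image: pixel 2 is a noisy outlier inside a cluster-0 region.
u = [[0.9, 0.9, 0.2, 0.9, 0.9],
     [0.1, 0.1, 0.8, 0.1, 0.1]]
nbrs = [[0, 1], [0, 1, 2], [1, 2, 3], [2, 3, 4], [3, 4]]
u2 = spatial_update(u, nbrs)  # pixel 2 is pulled toward cluster 0
```

Because the neighborhood sum dominates for isolated noisy pixels, repeated application smooths the segmentation without blurring genuine region boundaries, which is the noise-robustness the text claims.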