The consensus algorithm is the core mechanism of blockchain and ensures data consistency among blockchain nodes. The PBFT (Practical Byzantine Fault Tolerance) consensus algorithm is widely used in consortium chains because it tolerates Byzantine faults. However, PBFT still suffers from random master node selection and high communication complexity. This study proposes an improved consensus algorithm, IBFT, based on node trust values and the BLS (Boneh-Lynn-Shacham) aggregate signature. In IBFT, multi-level indicators are used to calculate the trust value of each node, and a subset of nodes is selected to take part in network consensus based on this calculation. The node with the highest trust value among them is chosen as the master node, and the BLS signature process is embedded into the information interaction between nodes. Consequently, communication complexity is reduced while node-to-node information exchange remains secure. Simulation results demonstrate that, compared with the PBFT algorithm, the IBFT consensus algorithm increases transaction throughput by 61% and reduces latency by 13%.
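The abstract does not specify which multi-level indicators feed the trust value, so the following is a minimal sketch of trust-based node selection under assumed indicators (uptime, historical vote validity, responsiveness) and assumed weights; none of these names come from the paper.

```python
# Minimal sketch of trust-based consensus node selection in the spirit of the
# abstract. The indicators and weights are hypothetical illustrations; the
# paper's actual multi-level indicators are not specified in this excerpt.
from dataclasses import dataclass

@dataclass
class Node:
    node_id: str
    uptime: float          # fraction of time online, 0..1 (assumed indicator)
    valid_votes: float     # fraction of historically valid votes (assumed)
    latency_score: float   # normalized responsiveness, 0..1 (assumed)

WEIGHTS = {"uptime": 0.4, "valid_votes": 0.4, "latency_score": 0.2}  # assumed

def trust_value(n: Node) -> float:
    """Weighted sum of multi-level indicators (illustrative only)."""
    return (WEIGHTS["uptime"] * n.uptime
            + WEIGHTS["valid_votes"] * n.valid_votes
            + WEIGHTS["latency_score"] * n.latency_score)

def select_consensus_set(nodes, k):
    """Pick the k most trusted nodes; the highest-trust node is the master."""
    ranked = sorted(nodes, key=trust_value, reverse=True)
    consensus_nodes = ranked[:k]
    master = consensus_nodes[0]
    return master, consensus_nodes

if __name__ == "__main__":
    nodes = [Node("A", 0.99, 0.97, 0.90), Node("B", 0.95, 0.99, 0.80),
             Node("C", 0.80, 0.85, 0.95), Node("D", 0.99, 0.90, 0.99)]
    master, members = select_consensus_set(nodes, k=3)
    print("master:", master.node_id,
          "consensus set:", [n.node_id for n in members])
```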
This research explains developments in the structure of government expenditure over the period 1990-2014. This span comprises two sub-periods with very different conditions: the first (1990-2002) was characterized by economic sanctions that denied the Iraqi economy its oil revenues, while the second (2003-2014) was marked by abundant resource rents following the lifting of the ban on oil exports. An Autoregressive Distributed Lag (ARDL) model was used to measure the impact of government expenditure, both current and investment, on oil GDP (gross domestic product) and non-oil GDP. The study found no significant relationship between current expenditure and non-oil and oil GDP in both periods.
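The abstract names an ARDL specification but not its lag order or data; as a minimal sketch, an ARDL(1,1) relation between non-oil GDP and current expenditure can be estimated by OLS on lagged series. The variable names and synthetic data below are hypothetical, not the study's.

```python
# Illustrative ARDL(1,1) regression: non-oil GDP on current government
# expenditure. Data, lag order, and variable names are hypothetical; the
# study's actual specification is not given in this excerpt.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
T = 25  # e.g., annual observations 1990-2014
exp_current = rng.normal(100, 10, T).cumsum()                  # synthetic expenditure
gdp_nonoil = 0.3 * exp_current + rng.normal(0, 5, T).cumsum()  # synthetic GDP

df = pd.DataFrame({"gdp": gdp_nonoil, "exp": exp_current})
df["gdp_lag1"] = df["gdp"].shift(1)   # y_{t-1}
df["exp_lag1"] = df["exp"].shift(1)   # x_{t-1}
df = df.dropna()

# ARDL(1,1): gdp_t = c + a*gdp_{t-1} + b0*exp_t + b1*exp_{t-1} + e_t
X = sm.add_constant(df[["gdp_lag1", "exp", "exp_lag1"]])
model = sm.OLS(df["gdp"], X).fit()
print(model.summary())

# Long-run multiplier of expenditure on GDP implied by the ARDL(1,1):
a = model.params["gdp_lag1"]
b0, b1 = model.params["exp"], model.params["exp_lag1"]
print("long-run effect:", (b0 + b1) / (1 - a))
```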
Load balancing in computer networks is one of the subjects that has attracted the most research attention in the last decade. Load balancing reduces processing time and memory usage, the two main concerns of network companies today and the two factors that determine whether an approach is worth applying. There are two kinds of load balancing: distributing jobs among servers before processing starts, where each job stays at its server until the end of the process, is called static load balancing, while moving jobs during processing is called dynamic load balancing. In this research, two algorithms are designed and implemented: the History Usage (HU) algorithm, which statically balances the load of a loaded …
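The details of the HU algorithm are cut off in this excerpt; the sketch below only illustrates the general idea the abstract describes, static assignment driven by historical usage, with assumed unit job costs and hypothetical server names.

```python
# Minimal sketch of static, history-based load balancing in the spirit of the
# abstract's History Usage (HU) idea. The actual HU algorithm is not detailed
# in this excerpt; here each job is assigned up front to the server with the
# lowest historical usage and never migrates afterwards (static balancing).
from typing import Dict, List

def static_assign(jobs: List[str], history: Dict[str, float]) -> Dict[str, str]:
    """Assign each job to the server with the least accumulated usage."""
    usage = dict(history)        # copy so the caller's history is untouched
    placement = {}
    for job in jobs:
        server = min(usage, key=usage.get)   # least-used server so far
        placement[job] = server
        usage[server] += 1.0                 # assumed unit cost per job
    return placement

if __name__ == "__main__":
    past_usage = {"srv1": 12.0, "srv2": 7.5, "srv3": 9.0}  # hypothetical history
    print(static_assign(["j1", "j2", "j3", "j4"], past_usage))
```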
The investor needs a clear strategy before entering the financial market, that is, an entrepreneurial and novel plan to increase his share of the profits. The importance of such a strategy is that it tells the investor when to enter the market, when to exit it, at what price to buy or sell a stock, and with what amount of money to start. Fortunately, the investor does not need to invent his own investment strategy: effective methods of buying and selling have been developed over the years, and once the investor understands how these methods work, he can choose the most appropriate ones and adapt them to fit his investment style.
The objective of this research is to test the impact of internal corporate governance instruments on working capital management and the reflection of each on firm performance. For this purpose, four main hypotheses were formulated. The first found a significant effect of both major shareholders' ownership and board of directors size on net working capital, with a positive association. The second showed a significant effect of net working capital on economic value added, with an inverse relationship, while the third explored a significant effect of major shareholders' ownership …
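The hypotheses rest on net working capital and economic value added; the sketch below records the standard textbook formulas for both, with illustrative figures. The paper's exact operationalization is not given in this excerpt and may differ.

```python
# Standard definitions of the two measures named in the abstract.
# Textbook formulas only; the numbers are purely illustrative.

def net_working_capital(current_assets: float, current_liabilities: float) -> float:
    """NWC = current assets - current liabilities."""
    return current_assets - current_liabilities

def economic_value_added(nopat: float, invested_capital: float, wacc: float) -> float:
    """EVA = NOPAT - (invested capital * WACC)."""
    return nopat - invested_capital * wacc

print(net_working_capital(1_500_000, 900_000))         # 600000.0
print(economic_value_added(250_000, 2_000_000, 0.10))  # 50000.0
```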
In this research, kernel estimators (nonparametric density estimators) were used to estimate a two-response (binary) logistic regression, comparing the Nadaraya-Watson method with the local scoring algorithm. The optimal smoothing parameter (bandwidth) λ was estimated by cross-validation and generalized cross-validation; the optimal bandwidth λ has a clear effect on the estimation process and plays a key role in smoothing the curve so that it approaches the real curve. The goal of using the kernel estimator is to modify the observations so as to obtain estimators with characteristics close to the properties of the real parameters, based on medical data for patients with chronic …
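As a minimal sketch of the two ingredients the abstract names, the code below implements a Nadaraya-Watson estimator for a binary response and selects the bandwidth λ by leave-one-out cross-validation. A Gaussian kernel and synthetic data are assumed; the paper's kernel choice and medical data are not given in this excerpt.

```python
# Nadaraya-Watson estimation with leave-one-out cross-validation for the
# bandwidth (lambda). Gaussian kernel and synthetic binary data assumed.
import numpy as np

def nw_estimate(x0, x, y, bw):
    """Nadaraya-Watson: kernel-weighted mean of y around the point x0."""
    w = np.exp(-0.5 * ((x0 - x) / bw) ** 2)
    return np.sum(w * y) / np.sum(w)

def loo_cv_score(x, y, bw):
    """Leave-one-out CV: mean squared error over held-out observations."""
    errs = []
    for i in range(len(x)):
        mask = np.arange(len(x)) != i
        errs.append((y[i] - nw_estimate(x[i], x[mask], y[mask], bw)) ** 2)
    return np.mean(errs)

rng = np.random.default_rng(1)
x = rng.uniform(0, 3, 80)
# Binary (two-response) outcome generated from a logistic curve:
y = (rng.uniform(size=80) < 1 / (1 + np.exp(-(x - 1.5)))).astype(float)

bandwidths = np.linspace(0.05, 1.0, 20)
best_bw = min(bandwidths, key=lambda b: loo_cv_score(x, y, b))
print("CV-optimal bandwidth:", best_bw)
print("P(y=1 | x=1.5) approx:", nw_estimate(1.5, x, y, best_bw))
```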
This research studies the effect of adding micro, nano, and hybrid (1:1 ratio) Al2O3 and TiO2 particles to epoxy resin on its thermal conductivity, before and after immersion in HCl acid (0.3 N) for 14 days, at weight fractions of 0.02, 0.04, 0.06, and 0.08 and a specimen thickness of 6 mm. The thermal conductivity results revealed that for epoxy reinforced with Al2O3 and with the (TiO2+Al2O3) mixture, conductivity increases with increasing weight fraction, whereas the thermal conductivity (k) values for micro and nano TiO2 decrease with increasing reinforcement weight fraction. After immersion in the acidic solution (HCl), the k values were higher than before immersion.