The consensus algorithm is the core mechanism of a blockchain: it ensures data consistency among blockchain nodes. The PBFT (Practical Byzantine Fault Tolerance) consensus algorithm is widely used in consortium chains because it tolerates Byzantine faults. However, PBFT still suffers from random master-node selection and high communication complexity. This study proposes an improved consensus algorithm, IBFT, based on node trust values and BLS (Boneh-Lynn-Shacham) aggregate signatures. In IBFT, multi-level indicators are used to calculate a trust value for each node, and a subset of nodes is selected on that basis to take part in network consensus. The node with the highest trust value among them is chosen as the master node, and the BLS signature process is embedded in the information exchange between nodes. Consequently, communication complexity is reduced while node-to-node information exchange remains secure. Simulation results show that, compared with the PBFT algorithm, IBFT increases transaction throughput by 61% and reduces latency by 13%.
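The trust-based node selection described in this abstract can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual method: the indicator names (`uptime`, `valid_votes`, `latency_score`) and the weights are assumptions standing in for the paper's unspecified multi-level indicators.

```python
# Hypothetical sketch of IBFT-style node selection: multi-level indicators
# are combined into a trust value, the top-k nodes join consensus, and the
# highest-trust node becomes the master. Indicators and weights are
# illustrative assumptions, not the paper's formula.
from dataclasses import dataclass

@dataclass
class Node:
    node_id: str
    uptime: float         # fraction of time online, in [0, 1]
    valid_votes: float    # fraction of historically valid consensus messages
    latency_score: float  # normalized responsiveness, higher is better

def trust_value(n: Node, w=(0.4, 0.4, 0.2)) -> float:
    """Weighted combination of multi-level indicators (assumed weights)."""
    return w[0] * n.uptime + w[1] * n.valid_votes + w[2] * n.latency_score

def select_consensus_set(nodes, k):
    """Top-k nodes by trust value join consensus; the first is the master."""
    ranked = sorted(nodes, key=trust_value, reverse=True)
    consensus = ranked[:k]
    return consensus[0], consensus  # (master node, consensus set)
```

In this sketch the master is simply the maximum-trust member of the selected set, matching the abstract's description; the BLS aggregation step is omitted, since it requires pairing-based cryptography beyond a short example.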
In multiple linear regression analysis, the problems of multicollinearity and autocorrelation have drawn the attention of many researchers; because these two problems can appear together and jointly degrade estimation, some researchers have developed methods that address them simultaneously. This research compares the performance of the principal components two-parameter (PCTP) estimator, the (r-k) class estimator, and the r-(k,d) class estimator through a simulation study, using the mean square error (MSE) criterion to find the best way to address the two problems together. The results showed that the r-(k,d) class estimator is the best estimator.
To obtain good estimates with more accurate results, we must choose an appropriate estimation method. Most of the equations in classical methods are nonlinear, and finding analytical solutions to such equations is very difficult; some estimators are inefficient because of problems in solving these equations. In this paper, we estimate the survival function of censored data using one of the most important artificial intelligence algorithms, the genetic algorithm, to obtain optimal estimates for the parameters of the two-parameter Weibull distribution, which in turn leads to optimal estimates of the survival function. The genetic algorithm is employed within the method of moments, the least squares method, and the weighted least squares method.
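A genetic-algorithm search of the kind this abstract describes can be sketched as below. This is a generic illustration, not the paper's implementation: the population size, mutation rate, parameter ranges, and the least-squares fitness criterion (against an empirical survival curve) are all assumptions.

```python
# Minimal sketch: a genetic algorithm searches for two-parameter Weibull
# values (shape k, scale lam) minimizing a least-squares criterion against
# an empirical survival estimate. All GA settings are illustrative.
import math
import random

def weibull_survival(t, k, lam):
    """Survival function S(t) = exp(-(t/lam)^k) of the Weibull distribution."""
    return math.exp(-(t / lam) ** k)

def lsq_fitness(params, times, emp_surv):
    """Sum of squared deviations from the empirical survival values."""
    k, lam = params
    if k <= 0 or lam <= 0:
        return float("inf")
    return sum((weibull_survival(t, k, lam) - s) ** 2
               for t, s in zip(times, emp_surv))

def genetic_estimate(times, emp_surv, pop=40, gens=200, seed=0):
    rng = random.Random(seed)
    popl = [(rng.uniform(0.1, 5), rng.uniform(0.1, 10)) for _ in range(pop)]
    for _ in range(gens):
        popl.sort(key=lambda p: lsq_fitness(p, times, emp_surv))
        parents = popl[:pop // 2]            # selection: keep the fitter half
        children = []
        while len(children) < pop - len(parents):
            a, b = rng.sample(parents, 2)
            child = ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)   # crossover
            if rng.random() < 0.3:                            # mutation
                child = (abs(child[0] + rng.gauss(0, 0.2)),
                         abs(child[1] + rng.gauss(0, 0.2)))
            children.append(child)
        popl = parents + children
    return min(popl, key=lambda p: lsq_fitness(p, times, emp_surv))
```

Because the fitter half of each generation is carried over unchanged, the best fitness never worsens, which is the property that makes such a search usable when the estimating equations have no closed-form solution.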
The question of estimation has attracted great interest in engineering, statistical applications, and various applied and human sciences; the methods it provides have helped to identify many random processes accurately.
In this paper, methods for estimating the reliability function, the hazard (risk) function, and the distribution parameters were applied, namely the method of moments and the maximum likelihood method. An experimental study was conducted using simulation to compare the methods and show which of them performs best in practical application, based on observations generated from the Rayleigh logarithmic distribution (RL) with different sample sizes.
Support vector machines (SVMs) are supervised learning models that analyze data for classification or regression. For classification, an SVM selects an optimal hyperplane that separates two classes. SVMs achieve very good accuracy and are extremely robust compared with other classification methods such as logistic regression, random forests, k-nearest neighbors, and the naïve Bayes model. However, working with large datasets can cause problems such as long training times and inefficient results. In this paper, the SVM is modified using a stochastic gradient descent process. The modified method, stochastic gradient descent SVM (SGD-SVM), was checked using two simulated datasets. Since the classification of different ca
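The core idea of an SGD-trained SVM can be sketched as below: each training sample contributes a subgradient of the regularized hinge loss, so the model updates one sample at a time instead of solving the full quadratic program. The learning rate, regularization strength, and epoch count are illustrative assumptions, not the paper's settings.

```python
# Minimal sketch of a linear SVM trained with stochastic gradient descent
# on the L2-regularized hinge loss. Hyperparameters are illustrative.
import random

def sgd_svm(X, y, eta=0.01, lam=0.001, epochs=100, seed=0):
    """X: list of feature lists; y: labels in {-1, +1}. Returns (w, b)."""
    rng = random.Random(seed)
    w, b = [0.0] * len(X[0]), 0.0
    for _ in range(epochs):
        idx = list(range(len(X)))
        rng.shuffle(idx)                       # stochastic: random sample order
        for i in idx:
            margin = y[i] * (sum(wj * xj for wj, xj in zip(w, X[i])) + b)
            if margin < 1:   # hinge loss active: step toward this sample
                w = [wj + eta * (y[i] * xj - lam * wj)
                     for wj, xj in zip(w, X[i])]
                b += eta * y[i]
            else:            # only the L2 penalty contributes to the gradient
                w = [wj - eta * lam * wj for wj in w]
    return w, b

def predict(w, b, x):
    """Classify by the sign of the decision function w·x + b."""
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1
```

Because each update touches a single sample, one pass costs O(n·d) rather than the superlinear cost of batch SVM solvers, which is the large-dataset advantage the abstract refers to.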
This research deals with financial reporting for the impairment of non-current assets from the viewpoint of international accounting standards, especially IAS 36, "Impairment of Assets". The research problem focuses on non-compliance with the requirements of IAS 36, which would negatively affect the quality of accounting information and its characteristics, especially the relevance of accounting information; this confirms the necessity of such information having the three sub-characteristics in order to be useful for the decisions of users represented
The CO2 laser (10,600 nm) is a recent method for managing challenging skin scars resulting from trauma, burns, and surgical wounds. The aim of this study was to evaluate the efficacy and safety of the fractional CO2 laser (10,600 nm) in the treatment of skin scars. Materials and Methods: Twenty patients with different types of scars were treated with the fractional CO2 (10,600 nm) laser; ten of them were given additional intralesional triamcinolone. Results: All twenty patients showed some improvement in scar texture, height, and pliability, and all ten patients who received intralesional triamcinolone after laser treatment reported complete satisfaction. Conclusion: The fractional CO2 (10,600 nm) laser can be used as an alternative, effective treatment for skin scars.
This research aims to create lightweight concrete (LWC) mixtures containing waste from local sources, such as expanded polystyrene (EPS) beads and waste plastic fibers (WPFs), all of which are cheap or free in the Republic of Iraq. The fresh, hardened, and mechanical properties of the LWC were investigated and the results evaluated. Three mixtures were made with different proportions of plastic fibers (0.4%, 0.8%, 1.2%), in addition to lightweight concrete mixtures containing steel fibers (0.4%, 0.8%, 1.2%) and a lightweight concrete mixture containing 20% EPS. The study found that adding WPFs reduced the density (lightweight) of the concrete mixtures because EPS tends
The article aims to study the liquidity that a bank should optimally provide, the profitability it should achieve, and the effect of both liquidity and profitability on the value of the bank. Hence the research problem, which concerns the extent of the effect of liquidity and profitability on the value of the bank. The importance of the research stems from the main role that commercial banks play in a country's economy. This requires identifying liquidity in a broad way and its most important components, and how to
Value engineering is an analytical study of projects or services that uses a specific procedure and a multidisciplinary working group to identify and classify the project's functions, either to perform these functions better, to lessen the total project cost, or both. The main aim of value engineering is to find innovative alternatives without affecting the basic requirements of the project; its methodology is based on the functional balance between the three elements of production: performance, quality, and cost. This methodology, based on functional analysis, has shown high potential in solving problems facing the production process and achieving better investment of available resources.