This paper considers a new double integral transform, the double Sumudu-Elzaki transform (DSET). The DSET is combined with a semi-analytical technique, the variational iteration method, to form DSETVIM and obtain numerical solutions of nonlinear partial differential equations with fractional-order derivatives. The combined method reduces the number of calculations required, so it accelerates the computation of the solution. The suggested technique is tested on four problems, and the results demonstrate that solving these types of equations with DSETVIM is more advantageous and efficient.
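For orientation, a double Sumudu-Elzaki transform can be written down from the familiar single-variable definitions. The form below is only a sketch, assuming the Sumudu transform is taken with respect to x and the Elzaki transform with respect to t; the variables u and v are the corresponding transform parameters, and this is not necessarily the paper's exact notation.

```latex
% Sketch of a double Sumudu-Elzaki transform, assuming the standard
% single-variable Sumudu (in x) and Elzaki (in t) kernels:
\[
  \mathcal{S}_x\,\mathcal{E}_t\!\left[f(x,t)\right](u,v)
  \;=\; \frac{v}{u}\int_{0}^{\infty}\!\int_{0}^{\infty}
        f(x,t)\,e^{-x/u}\,e^{-t/v}\,\mathrm{d}x\,\mathrm{d}t .
\]
```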
The consensus algorithm is the core mechanism of blockchain and is used to ensure data consistency among blockchain nodes. The Practical Byzantine Fault Tolerance (PBFT) consensus algorithm is widely used in consortium chains because it tolerates Byzantine faults. However, PBFT still suffers from random master node selection and complicated communication. This study proposes an improved consensus algorithm, IBFT, based on node trust values and the BLS (Boneh-Lynn-Shacham) aggregate signature. In IBFT, multi-level indicators are used to calculate a trust value for each node, and a subset of nodes is selected to take part in network consensus on the basis of this calculation. The master node is chosen
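As a rough illustration of the trust-value idea described above (not the paper's actual formula), the sketch below aggregates a few hypothetical per-node indicators into a weighted score and keeps the highest-scoring nodes for consensus; the indicator names, weights, and selection rule are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class Node:
    node_id: str
    honesty: float          # illustrative indicators, each normalised to [0, 1]
    activity: float
    responsiveness: float

# hypothetical weights for the multi-level indicators
WEIGHTS = {"honesty": 0.5, "activity": 0.3, "responsiveness": 0.2}

def trust_value(node: Node) -> float:
    """Aggregate the per-node indicators into a single weighted trust score."""
    return (WEIGHTS["honesty"] * node.honesty
            + WEIGHTS["activity"] * node.activity
            + WEIGHTS["responsiveness"] * node.responsiveness)

def select_consensus_nodes(nodes: list[Node], k: int) -> tuple[list[Node], Node]:
    """Keep the k most trusted nodes for consensus; the top-ranked node acts as master."""
    ranked = sorted(nodes, key=trust_value, reverse=True)
    return ranked[:k], ranked[0]
```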
In many scientific fields, Bayesian models are commonly used in recent research. This research presents a new Bayesian model for estimating parameters and forecasting using the Gibbs sampler algorithm. Posterior distributions are generated using the inverse gamma distribution and the multivariate normal distribution as prior distributions. The new method is used to investigate and summarize the posterior distribution in Bayesian statistics. The theory and derivation of the posterior distribution are explained in detail in this paper. The proposed approach is applied to three simulated datasets with sample sizes of 100, 300, and 500. The procedure is also extended to a real dataset, the rock intensity dataset. The actual dataset is collected
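To make the sampling scheme concrete, here is a minimal Gibbs sampler for a Bayesian linear regression with a multivariate normal prior on the coefficients and an inverse gamma prior on the error variance. This is a generic sketch rather than the paper's exact model; the default hyperparameters and variable names are illustrative.

```python
import numpy as np

def gibbs_sampler(X, y, n_iter=5000, mu0=None, V0=None, a0=2.0, b0=1.0, seed=0):
    """Gibbs sampling for y = X @ beta + eps, eps ~ N(0, sigma2 * I),
    with priors beta ~ N(mu0, V0) and sigma2 ~ InvGamma(a0, b0)."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    mu0 = np.zeros(p) if mu0 is None else mu0
    V0 = np.eye(p) * 100.0 if V0 is None else V0      # vague prior by default
    V0_inv = np.linalg.inv(V0)
    XtX, Xty = X.T @ X, X.T @ y

    beta, sigma2 = np.zeros(p), 1.0
    draws = {"beta": np.empty((n_iter, p)), "sigma2": np.empty(n_iter)}
    for i in range(n_iter):
        # beta | sigma2, y  ~  N(m, S)
        S = np.linalg.inv(XtX / sigma2 + V0_inv)
        m = S @ (Xty / sigma2 + V0_inv @ mu0)
        beta = rng.multivariate_normal(m, S)
        # sigma2 | beta, y  ~  InvGamma(a0 + n/2, b0 + 0.5 * ||y - X beta||^2)
        resid = y - X @ beta
        sigma2 = 1.0 / rng.gamma(a0 + n / 2, 1.0 / (b0 + 0.5 * resid @ resid))
        draws["beta"][i], draws["sigma2"][i] = beta, sigma2
    return draws
```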
Naproxen (NPX)-imprinted polymer liquid electrodes were constructed using precipitation polymerization. The molecularly imprinted (MIP) and non-imprinted (NIP) polymers were synthesized using NPX as a template. In the precipitation polymerization, styrene (STY) was used as the monomer, N,N-methylenediacrylamide (N,N-MDAM) as the cross-linker, and benzoyl peroxide (BPO) as the initiator. The molecularly imprinted and non-imprinted membranes were prepared using acetophenone (AOPH) and dioctyl phthalate (DOP) as plasticizers in a PVC matrix. The slopes and detection limits of the liquid electrodes ranged from (-18.1, -17.72) mV/decade and (4.0 x 10-
Most studies on deep beams have addressed reinforced concrete deep beams; only a few investigate the response of prestressed deep beams, and, to the best of our knowledge, no study has investigated the response of full-scale (T-section) prestressed deep beams with large web openings. An experimental and numerical study was conducted to investigate the shear strength of ordinary reinforced and partially prestressed full-scale (T-section) deep beams containing large web openings, to examine the effect of the presence of prestressing on the deep beam response, and to better understand the effects of prestressing location and opening-depth-to-beam-depth ratio on deep beam performance
Flow-production systems whose components are connected in series (electricity plants, cement plants, water desalination plants) may not have fixed maintenance scheduling procedures because problems occur at different times. Contemporary software and artificial intelligence (AI) technologies are used to fulfill the research objectives by developing a predictive maintenance program. Data from the fifth thermal unit of the Al Dora/Baghdad power station are used in this study. The research was conducted in three stages. First, missing data without temporal sequences were processed: the data were filled as an hour-by-hour time series and the times were recorded as system working hours, making the volume of the data relatively
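The first stage (filling the gaps hour by hour) can be illustrated with a short pandas sketch; the file name and column names are hypothetical, and the interpolation choice is only one plausible option rather than the procedure used in the study.

```python
import pandas as pd

# hypothetical sensor log of the thermal unit; the "timestamp" column name is an assumption
df = pd.read_csv("unit5_sensor_log.csv", parse_dates=["timestamp"])
df = df.set_index("timestamp").sort_index()

# rebuild a continuous hour-by-hour index so every missing hour becomes an explicit row
full_index = pd.date_range(df.index.min(), df.index.max(), freq="H")
df = df.reindex(full_index)

# fill the introduced gaps by time-based interpolation (one plausible choice)
df = df.interpolate(method="time")

# express timestamps as cumulative system working hours since the first record
df["working_hour"] = (df.index - df.index[0]) / pd.Timedelta(hours=1)
```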
In this paper, we design a new, efficient stream cipher cryptosystem that depends on a chaotic map to encrypt (decrypt) different types of digital images. The designed encryption system passed all basic efficiency criteria (randomness, MSE, PSNR, histogram analysis, and key space) applied to the key extracted from the random generator as well as to the digital images after completing the encryption process.
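Since the abstract does not specify which chaotic map is used, the sketch below illustrates the general idea with the logistic map: iterate the map from a secret initial condition, quantise each iterate to a byte, and XOR the resulting keystream with the image pixels (running the same routine again decrypts). The key values and function names are illustrative.

```python
import numpy as np

def logistic_keystream(length: int, x0: float = 0.61803, r: float = 3.99) -> np.ndarray:
    """Byte keystream from the logistic map x_{n+1} = r * x_n * (1 - x_n);
    (x0, r) play the role of the secret key in this illustration."""
    x, stream = x0, np.empty(length, dtype=np.uint8)
    for i in range(length):
        x = r * x * (1.0 - x)
        stream[i] = int(x * 256) % 256
    return stream

def xor_image(pixels: np.ndarray, key=(0.61803, 3.99)) -> np.ndarray:
    """XOR flattened uint8 pixels with the chaotic keystream; applying it twice restores the image."""
    flat = pixels.reshape(-1)
    ks = logistic_keystream(flat.size, *key)
    return np.bitwise_xor(flat, ks).reshape(pixels.shape)
```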