Frequent, continuous weather records are essential for forecasting, numerical model development, and research, but interruptions in data recording occur for various reasons. This study therefore aims to find a way to treat (impute) these missing data and to assess the accuracy of the imputation by comparing the imputed values with the original values. The mean method was used to treat daily and monthly missing temperature data. When the monthly temperature data for five Iraqi stations (Baghdad, Hilla, Basra, Nasiriya, and Samawa) were treated over the full period (1980-2020), the percentage of agreement between the original and the treated values did not exceed 80%. The period was therefore divided into four sub-periods; most of the agreement values then increased, reaching 70%-100% in summer, and decreased somewhat in winter. For the daily treatment with the mean method at the Baghdad and Basra stations (2010-2020), most of the agreement values in summer ranged from 70% to 100%, while in winter the agreement often decreased. This method therefore gives high accuracy when treating monthly and daily temperatures in summer and lower accuracy in winter.
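The mean-method imputation described in this abstract can be sketched in a few lines. This is an illustrative sketch only: the function name and the sample readings are hypothetical, and the paper's exact procedure (for example, the window over which the mean is taken) may differ.

```python
# Hypothetical sketch of mean-method imputation for missing temperature
# readings; not the authors' exact procedure.
import statistics

def impute_with_mean(series):
    """Replace None entries with the mean of the observed values."""
    observed = [v for v in series if v is not None]
    mean = statistics.mean(observed)
    return [mean if v is None else v for v in series]

# Hypothetical daily summer readings (degrees C) with two gaps.
daily_temps = [34.1, None, 35.0, 33.8, None, 34.5]
filled = impute_with_mean(daily_temps)
```

Accuracy would then be judged, as in the study, by comparing such filled values against the withheld originals.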
Cryptography is the process of transforming messages to prevent unauthorized access to data. One of the main problems, and an important part of cryptography with secret-key algorithms, is the key: the key plays an essential role in achieving a higher level of secure communication. To increase the level of security in any communication, both parties must hold a copy of the secret key, which, unfortunately, is not easy to achieve. The Triple Data Encryption Standard algorithm is weak due to its weak key generation, so the key must be reconfigured to make the algorithm more secure, effective, and strong; an enhanced encryption key strengthens the security of the Triple Data Encryption Standard algorithm. This paper proposes a combination of two efficient encryption algorithms
In this study, the mean free path and positron elastic-inelastic scattering are modeled for the elements hydrogen (H), carbon (C), nitrogen (N), oxygen (O), phosphorus (P), sulfur (S), chlorine (Cl), potassium (K), and iodine (I). Despite the enormous amounts of data required, the Monte Carlo (MC) method was applied, allowing a very accurate simulation of positron interaction collisions in living cells. The MC simulation of positron interactions with breast, liver, and thyroid tissue at normal incidence angles is reported, with energies ranging from 45 eV to 0.2 MeV. The model provides a straightforward analytic formula for the random sampling of positron scattering. ICRU Report 44 was used to compile the elemental composition data. In this
Blockchain technology relies on cryptographic techniques that provide various advantages, such as trustworthiness, collaboration, organization, identification, integrity, and transparency. Meanwhile, data analytics refers to the process of applying techniques to analyze big data and to understand the relationships between data points in order to draw meaningful conclusions. The field of data analytics in Blockchain is relatively new, and few studies have examined the challenges involved in Blockchain data analytics. This article presents a systematic analysis of how data analytics affects Blockchain performance, with the aim of investigating the current state of Blockchain-based data analytics techniques in research fields and
A simple method has been used to detect the presence or absence of heavy metals in an aqueous solution. Fluorescein was used as the base colorimetric material; this was doped with CuCl2, and the final solution showed a clear change in color. This change was correlated with changes in both the pH and the electrical conductivity of the solution. An obvious change in the optical spectra was also observed. Therefore, this simple method can be proposed as a way to detect heavy metals in any solution.
The main aim of this paper is to study how different estimators of the two unknown parameters (shape and scale) of a generalized exponential distribution behave for different sample sizes and different parameter values. In particular, Maximum Likelihood, Percentile, and Ordinary Least Squares estimators were implemented for different sample sizes (small, medium, and large) under several contrasting initial values for the two parameters. Two performance indicators, Mean Square Error and Mean Percentile Error, were used, and comparisons between the estimation methods were carried out using the Monte Carlo simulation technique. It was obse
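The Monte Carlo comparison of estimators by Mean Square Error described above can be sketched as follows. For brevity this sketch uses a plain exponential distribution rather than the paper's generalized exponential (whose fits require numerical optimization), and the two estimators shown are simplified stand-ins for the ML and percentile methods.

```python
# Monte Carlo MSE comparison sketch (illustrative; a plain exponential is
# used instead of the paper's generalized exponential distribution).
import math
import random

def mc_mse(estimator, true_scale=2.0, n=50, reps=2000, seed=1):
    """Estimate the Mean Square Error of `estimator` for the scale parameter."""
    rng = random.Random(seed)
    errs = []
    for _ in range(reps):
        sample = [rng.expovariate(1.0 / true_scale) for _ in range(n)]
        errs.append((estimator(sample) - true_scale) ** 2)
    return sum(errs) / reps

# ML estimator of the exponential scale: the sample mean.
mle = lambda xs: sum(xs) / len(xs)
# Percentile-style estimator: sample median rescaled by ln 2.
pctl = lambda xs: sorted(xs)[len(xs) // 2] / math.log(2)

mse_mle, mse_pctl = mc_mse(mle), mc_mse(pctl)
```

For the exponential, the sample mean is the efficient estimator, so one expects the smaller MSE from the ML-style estimator, mirroring the kind of ranking the paper reports.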
In this paper, the collocation method is used to solve ordinary differential equations with retarded arguments; some examples are presented in order to illustrate this approach.
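The collocation idea can be sketched on a simple (non-delay) test problem, y' = -y with y(0) = 1: approximate the solution by a quadratic, force the differential equation to hold exactly at two collocation points, and solve the resulting 2x2 linear system. This is illustrative only; the paper treats equations with retarded arguments, which this sketch does not.

```python
# Collocation sketch for y' = -y, y(0) = 1 on [0, 1].
# Ansatz: y(t) = 1 + a*t + b*t**2  (satisfies y(0) = 1 automatically).
# Enforcing the residual y' + y = 0 at a point t gives:
#   a*(1 + t) + b*(2*t + t**2) = -1
import math

t1, t2 = 1.0 / 3.0, 2.0 / 3.0

# 2x2 linear system A [a, b]^T = rhs from the two collocation points.
A = [[1 + t1, 2 * t1 + t1 ** 2],
     [1 + t2, 2 * t2 + t2 ** 2]]
rhs = [-1.0, -1.0]

# Solve by Cramer's rule.
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
a = (rhs[0] * A[1][1] - A[0][1] * rhs[1]) / det
b = (A[0][0] * rhs[1] - rhs[0] * A[1][0]) / det

y1 = 1 + a + b          # collocation approximation of y(1), = 11/29
exact = math.exp(-1.0)  # exact solution e**-1
```

Even with only two collocation points, the quadratic approximation of y(1) lands within about 0.012 of the exact value.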
In the present research, a crane frame has been investigated using the finite element method. Damage is simulated by reducing the stiffness of selected elements by 10% and 20% at the mid-span of the vertical column of the crane frame. A beam with a single-edge, non-propagating crack is used. Six damage cases are modeled for the crane frame by introducing cracked elements at different locations, with crack-depth-to-beam-height ratios (a/h) of 0.1 and 0.2. A FEM program coded in Matlab 6.5 was used to run the numerical simulation of the damage scenarios. The results showed a decrease in the five natural frequencies relative to the undamaged beam, which means
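The qualitative effect reported above — reduced stiffness lowering the natural frequencies — can be illustrated with a single-degree-of-freedom sketch. The stiffness and mass values below are hypothetical and are not taken from the paper's crane-frame model.

```python
# Single-degree-of-freedom illustration: natural frequency falls as
# stiffness is reduced. Numbers are hypothetical, not the paper's model.
import math

def natural_freq_hz(k, m):
    """Natural frequency of a spring-mass oscillator, f = sqrt(k/m) / (2*pi)."""
    return math.sqrt(k / m) / (2 * math.pi)

k_intact = 1.0e6   # hypothetical stiffness, N/m
mass = 10.0        # hypothetical mass, kg

f_intact = natural_freq_hz(k_intact, mass)
f_10 = natural_freq_hz(0.90 * k_intact, mass)  # 10% stiffness reduction
f_20 = natural_freq_hz(0.80 * k_intact, mass)  # 20% stiffness reduction

# Frequency scales with sqrt(k): a 10% stiffness loss gives about a
# 5.1% frequency drop (1 - sqrt(0.9)).
drop_10 = 1 - f_10 / f_intact
```

In a multi-degree-of-freedom FEM model the same mechanism lowers each natural frequency, which is why the frequency shifts serve as damage indicators.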
The primary objective of this paper is to improve a biometric authentication and classification model using the ear as a distinctive part of the face, since it does not change with time and is unaffected by facial expressions. The proposed model is a new scenario for enhancing ear-recognition accuracy by modifying the AdaBoost algorithm to optimize adaptive learning. To overcome the limitations of image illumination, occlusion, and image-registration problems, the Scale-Invariant Feature Transform technique was used to extract features. Several consecutive phases were used to improve classification accuracy: image acquisition, preprocessing, filtering, smoothing, and feature extraction. To assess the proposed
In the present study, the effect of new cross-section fin geometries on overall thermal/fluid performance was investigated. The cross-sections comprised the original base geometries (triangular, square, circular, and elliptical pin fins) with additional exterior fins along the sides of the original fins. The extra fins include a rectangular extra fin of 2 mm (height) by 4 mm (width) and a triangular extra fin of 2 mm (base) by 4 mm (height). The entropy generation minimization (EGM) method allows the combined effect of thermal resistance and pressure drop to be assessed through their simultaneous interaction with the heat sink. A general dimensionless expression for the entropy generation rate is obtained by con
In this research, the researcher addressed the concept of uncertainty in terms of its types and the theories of its treatment and measurement; three types of uncertainty were taken up: indeterminacy, volatility, and inconsistency