In civil engineering, the adoption of Falling Weight Deflectometers (FWDs) is a response to an increasingly technology-driven practice. FWDs are devices used to evaluate the structural properties of a pavement. This paper assesses the concepts of data processing, storage, and analysis with FWDs. The device enables operators and field practitioners to understand the vertical deflection response of a pavement subjected to an impulse load. The resulting deflection bowl, analyzed together with measured or assumed layer thicknesses, supports backcalculation of the pavement's stiffness. Outcomes of the backcalculation process yield the strains, stresses, and moduli in the individual layers, along with layer-thickness sensitivity, isotropic layer moduli, and estimates of subgrade CBR. Because the analysis assumes elastic, low-strain conditions, it also supports determination of the resilient modulus and the characterization of unbound granular materials. FWD data processing, analysis, and storage therefore matter in civil engineering because they inform the design of new pavements and of rehabilitation options.
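As a sketch of the first step such analyses take before full layer-by-layer backcalculation, the surface (composite) modulus can be estimated from the centre deflection via the Boussinesq relation; the plate radius, load, and Poisson's ratio below are illustrative assumptions, not values from the paper:

```python
import math

def surface_modulus(load_kN: float, plate_radius_m: float,
                    center_deflection_um: float, poisson: float = 0.35) -> float:
    """Boussinesq surface modulus E0 (MPa) from the FWD centre deflection.

    E0 = 2 * sigma0 * a * (1 - nu^2) / d0, where sigma0 is the contact
    stress under the loading plate, a the plate radius, and d0 the
    deflection measured under the load centre.
    """
    area = math.pi * plate_radius_m ** 2      # plate area, m^2
    sigma0 = (load_kN * 1e3) / area           # contact stress, Pa
    d0 = center_deflection_um * 1e-6          # centre deflection, m
    e0_pa = 2.0 * sigma0 * plate_radius_m * (1.0 - poisson ** 2) / d0
    return e0_pa / 1e6                        # convert Pa -> MPa

# Hypothetical drop: 50 kN on a 150 mm radius plate, 400 um centre deflection
print(f"E0 = {surface_modulus(50.0, 0.15, 400.0):.0f} MPa")
```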
The research aims to measure the technical and scale efficiency (SE) of the departments of the College of Administration and Economics at the University of Baghdad over an eight-year period, from the academic year 2013-2014 to 2018-2019, using Data Envelopment Analysis with both input and output orientations, in order to maintain the college's distinguished competitive position and to identify and address weaknesses in performance. The research problem lies in diagnosing which specializations are most accepted in the labor market and determining the reasons for students' reluctance to enter some departments. Furthermore, the Win4DEAp program was used to measure technical and scale efficiency (SE) and to rely on…
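The abstract does not spell out the model, but technical and scale efficiency in DEA are conventionally obtained from input-oriented constant-returns (CCR) and variable-returns (BCC) programs, with SE = TE(CRS) / TE(VRS); the following is a minimal sketch using scipy.optimize.linprog with hypothetical department data:

```python
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y, k, vrs=False):
    """Input-oriented DEA efficiency of unit k.

    X: (m inputs x n units), Y: (s outputs x n units).
    Minimise theta s.t. X @ lam <= theta * X[:, k], Y @ lam >= Y[:, k],
    lam >= 0 (and sum(lam) == 1 under variable returns to scale).
    Decision vector: [theta, lam_1 .. lam_n].
    """
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                      # minimise theta
    A_in = np.hstack([-X[:, [k]], X])                # X@lam - theta*x_k <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y])        # -(Y@lam) <= -y_k
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, k]]
    A_eq = np.r_[0.0, np.ones(n)].reshape(1, -1) if vrs else None
    b_eq = [1.0] if vrs else None
    bounds = [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    return res.x[0]

# Hypothetical departments: 2 inputs (staff, budget), 1 output (graduates)
X = np.array([[20.0, 35.0, 30.0, 25.0], [1.0, 2.0, 1.5, 1.2]])
Y = np.array([[120.0, 180.0, 150.0, 140.0]])
for k in range(X.shape[1]):
    te_crs = dea_efficiency(X, Y, k)
    te_vrs = dea_efficiency(X, Y, k, vrs=True)
    print(f"unit {k}: TE(CRS)={te_crs:.3f}  TE(VRS)={te_vrs:.3f}  "
          f"SE={te_crs / te_vrs:.3f}")
```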
This research aims to analyze and simulate real biochemical test data to uncover the relationships among the tests and how each of them affects the others. The data were acquired from an Iraqi private biochemical laboratory; however, they are high-dimensional, contain a high rate of null values, and cover a large number of patients. Several experiments were applied to these data, beginning with unsupervised techniques such as hierarchical clustering and k-means, but the results were not clear. A preprocessing step was then performed to make the dataset analyzable by supervised techniques such as Linear Discriminant Analysis (LDA), Classification And Regression Tree (CART), Logistic Regression (LR), K-Nearest Neighbor (K-NN), and Naïve Bayes (NB)…
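As a sketch of the kind of pipeline the abstract describes (imputation of null values followed by the five supervised classifiers), the following uses scikit-learn with synthetic stand-in data, since the laboratory data are not public; scikit-learn's DecisionTreeClassifier serves as the CART-style tree, and all names and numbers are illustrative assumptions:

```python
import numpy as np
from sklearn.impute import SimpleImputer
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB

# X: test results with nulls (np.nan), y: diagnostic label -- stand-ins
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 12))
X[rng.random(X.shape) < 0.2] = np.nan       # ~20% missing values
y = rng.integers(0, 2, size=500)

models = {
    "LDA": LinearDiscriminantAnalysis(),
    "CART": DecisionTreeClassifier(),
    "LR": LogisticRegression(max_iter=1000),
    "K-NN": KNeighborsClassifier(),
    "NB": GaussianNB(),
}
for name, clf in models.items():
    # Impute nulls, standardize, then classify -- all inside each CV fold
    pipe = make_pipeline(SimpleImputer(strategy="median"),
                         StandardScaler(), clf)
    scores = cross_val_score(pipe, X, y, cv=5)
    print(f"{name:5s} accuracy = {scores.mean():.3f} +/- {scores.std():.3f}")
```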
Density functional theory calculations are employed to investigate the impact of the edifenphos molecule on the reactivity and electronic sensitivity of a pure calcium oxide (CaO) nanocluster. Edifenphos adsorbs strongly on the CaO nanocluster through the sulfur head of the adsorbate, with an adsorption energy of about −84.40 kcal/mol. The adsorption narrows the energy gap (Eg) of CaO from 4.67 to 3.56 eV and thereby increases its electrical conductance. Moreover, the work function of the CaO nanocluster is significantly affected, which changes the field-emission electron current. Finally, the recovery time is calculated to be around 99 ms at ambient temperature…
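The abstract does not state how the recovery time is computed; sensor-oriented DFT studies of this kind conventionally estimate it from transition-state theory, so the standard expression is reproduced here as an assumption only:

```latex
\tau \;=\; \nu_0^{-1}\,\exp\!\left(\frac{E_a}{k_B T}\right)
```

where ν₀ is the attempt frequency (often taken in the 10¹²-10¹⁶ Hz range in such studies), E_a the desorption barrier, k_B the Boltzmann constant, and T the temperature; a larger adsorption energy thus implies an exponentially longer recovery time.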
Cyber-attacks continue to grow, creating a need for stronger image-protection schemes. This paper presents DGEN, a Dynamic Generative Encryption Network that combines Generative Adversarial Networks with a context-adaptive key system. Unlike a fixed cipher such as AES, the method can potentially adjust itself as new threats appear, aiming to resist brute-force, statistical, and quantum attacks. The design injects randomness, uses learning, and derives keys that depend on each image, targeting strong security and flexibility at low computational cost. The scheme was evaluated on several public image datasets, where DGEN outperformed AES, chaos-based methods, and other GAN-based approaches. Entropy reached 7.99 bits per pixel…
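The 7.99 bits-per-pixel figure is the standard Shannon-entropy metric for 8-bit images; here is a minimal sketch of how that metric is computed, using a random stand-in ciphertext since DGEN itself is not reproduced here:

```python
import numpy as np

def shannon_entropy(img: np.ndarray) -> float:
    """Shannon entropy (bits/pixel) of an 8-bit grayscale image.

    An ideally encrypted image approaches 8 bits/pixel, i.e. a
    uniform histogram over the 256 gray levels.
    """
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    p = p[p > 0]                      # empty bins contribute 0 * log 0 = 0
    return float(-(p * np.log2(p)).sum())

# A uniformly random "ciphertext" image scores close to 8 bits/pixel
rng = np.random.default_rng(0)
cipher = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)
print(f"entropy = {shannon_entropy(cipher):.4f} bits/pixel")
```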
Grain size and shape are important yield indicators, and wheat grain width offers a hint for re-examining the visual markers of grain weight. A digital vernier caliper was used to measure length, width, and thickness. The dataset consisted of 1296 wheat grains, with measurements for each grain, and the average weight (We) per twenty-four grains was measured and recorded. The length (L), width (W), thickness (T), weight (We), and volume (V) were determined, and these features were used to develop two mathematical models based on multiple regression. The results of the weight model demonstrated that the length and width of the grain…
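The abstract does not give the exact form of the two models; as a minimal sketch of a multiple regression of weight on length, width, and thickness, the following uses synthetic stand-in measurements (all values hypothetical):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Stand-in data: L, W, T in mm; target We in mg (synthetic, for illustration)
rng = np.random.default_rng(1)
L = rng.normal(6.5, 0.4, 200)
W = rng.normal(3.3, 0.3, 200)
T = rng.normal(2.9, 0.25, 200)
We = 2.0 * L + 9.0 * W + 6.0 * T + rng.normal(0.0, 1.5, 200)

X = np.column_stack([L, W, T])           # predictors
model = LinearRegression().fit(X, We)    # ordinary least squares
print("coefficients (L, W, T):", np.round(model.coef_, 3))
print("intercept:", round(model.intercept_, 3))
print("R^2:", round(model.score(X, We), 3))
```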
The research aims to identify the factors that affect product quality using the Failure Mode and Effect Analysis (FMEA) tool and to suggest measures to reduce deviations or defects in the production process. I used the case study approach to achieve these goals, choosing the air filter product line in the air filters factory of Al-Zawraa General Company as the research sample because of the many defects of varying impact that had emerged and the continuing demand for the product. I collected data and information from the factory records for two years (2018-2019) and used Pareto charts and fishbone diagrams, as well as the FMEA tool, to analyze the data and generate results.
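FMEA prioritizes failure modes by the Risk Priority Number, RPN = Severity × Occurrence × Detection; the following is a minimal sketch with hypothetical air-filter defects (the ratings below are illustrative, not the factory's):

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    """One FMEA worksheet row (1-10 scales, per common FMEA practice)."""
    name: str
    severity: int     # impact of the failure effect
    occurrence: int   # how often the cause arises
    detection: int    # 10 = hardest to detect before reaching the customer

    @property
    def rpn(self) -> int:
        # Risk Priority Number: RPN = S x O x D
        return self.severity * self.occurrence * self.detection

# Hypothetical defects on an air-filter line, for illustration only
modes = [
    FailureMode("torn filter paper", severity=8, occurrence=5, detection=4),
    FailureMode("weak frame glue", severity=6, occurrence=7, detection=5),
    FailureMode("mislabeled part", severity=3, occurrence=2, detection=2),
]
# Highest RPN first: these failure modes get corrective action priority
for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"{m.name:20s} RPN = {m.rpn}")
```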
In this paper, we discuss the performance of Bayesian computational approaches for estimating the parameters of a logistic regression model, with Markov Chain Monte Carlo (MCMC) algorithms as the base estimation procedure. We present two algorithms, Random Walk Metropolis (RWM) and Hamiltonian Monte Carlo (HMC), and apply both approaches to a real data set.
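As an illustration of the RWM algorithm named above, here is a minimal numpy sketch targeting the logistic-regression posterior; the Gaussian prior, step size, and synthetic data are assumptions, not the paper's settings:

```python
import numpy as np

def log_posterior(beta, X, y, prior_sd=10.0):
    """Log posterior: Bernoulli likelihood + independent N(0, prior_sd^2) prior."""
    z = X @ beta
    log_lik = np.sum(y * z - np.logaddexp(0.0, z))   # stable log(1 + e^z)
    log_prior = -0.5 * np.sum((beta / prior_sd) ** 2)
    return log_lik + log_prior

def rwm(X, y, n_iter=10_000, step=0.05, seed=0):
    """Random Walk Metropolis with an isotropic Gaussian proposal."""
    rng = np.random.default_rng(seed)
    beta = np.zeros(X.shape[1])
    lp = log_posterior(beta, X, y)
    draws, accepted = np.empty((n_iter, beta.size)), 0
    for t in range(n_iter):
        proposal = beta + step * rng.standard_normal(beta.size)
        lp_new = log_posterior(proposal, X, y)
        if np.log(rng.uniform()) < lp_new - lp:      # Metropolis accept step
            beta, lp, accepted = proposal, lp_new, accepted + 1
        draws[t] = beta
    return draws, accepted / n_iter

# Synthetic data in place of the paper's real data set
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(300), rng.normal(size=(300, 2))])
true_beta = np.array([-0.5, 1.0, -2.0])
y = rng.random(300) < 1.0 / (1.0 + np.exp(-X @ true_beta))
draws, rate = rwm(X, y.astype(float))
print("acceptance rate:", round(rate, 2))
print("posterior means:", draws[2000:].mean(axis=0).round(2))  # burn-in dropped
```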
This article describes how to predict different types of multiple reflections in pre-stack seismic data. The characteristics of multiple reflections can be expressed as a combination of the characteristics of primary reflections. Multiples always have lower stacking velocities than primaries, and this is the basis for separating them during Normal Move-Out (NMO) correction. The muting procedure is applied in the time-velocity analysis domain, and the semblance plot is used to diagnose the presence of multiples and to judge the muting dimensions. This processing procedure is used to eliminate internal multiples from real 2D seismic data from southern Iraq in two stages. The first is conventional Normal Move-Out correction and velocity auto-picking, and…
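The velocity separation rests on the hyperbolic move-out relation t(x) = sqrt(t0² + x²/v²); here is a minimal sketch of NMO correction for a single trace, with hypothetical offset and velocity values (a primary picked with the right velocity flattens, while a lower-velocity multiple stays curved and can be muted):

```python
import numpy as np

def nmo_traveltime(t0: np.ndarray, offset: float, v: float) -> np.ndarray:
    """Hyperbolic move-out: t(x) = sqrt(t0^2 + x^2 / v^2)."""
    return np.sqrt(t0 ** 2 + (offset / v) ** 2)

def nmo_correct(trace, t0, offset, v):
    """Flatten one trace by reading each zero-offset time t0 at t(x)."""
    t = nmo_traveltime(t0, offset, v)
    return np.interp(t, t0, trace, left=0.0, right=0.0)

# Example: 2 s record sampled at 4 ms, one reflector at t0 = 1.0 s
dt = 0.004
t0 = np.arange(0.0, 2.0, dt)
trace = np.zeros_like(t0)
event_idx = np.searchsorted(t0, nmo_traveltime(np.array([1.0]), 1500.0, 2500.0))
trace[event_idx] = 1.0                       # spike at the recorded time t(x)
flat = nmo_correct(trace, t0, offset=1500.0, v=2500.0)
print("event restored near t0 = 1.0 s:", round(t0[np.argmax(flat)], 3))
```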