Rapid worldwide urbanization and drastic population growth have increased the demand for new road construction, which consumes substantial amounts of natural resources such as aggregates. The use of recycled concrete aggregate could be one possible way to offset the aggregate shortage and reduce environmental pollution. This paper reports an experimental study of unbound granular material using recycled concrete aggregate for pavement subbase construction. Five percentages of recycled concrete aggregate, obtained from two different sources with originally designed compressive strengths of 20–30 MPa and 31–40 MPa, were tested at three particle size levels (coarse, fine, and extra fine) for their properties: the optimum moisture content and dry density, California bearing ratio, and resilient modulus. A characterization of the resilient modulus of the mixes under complex stress conditions was performed. The characterized modulus model was used in the nonlinear analysis of the pavement structure under traffic loading using the KENLAYER software. Consequently, the two critical responses, i.e., the tensile strain at the bottom of the asphalt layer and the vertical compressive strain at the top of the subgrade, were computed and compared for pavement structures with varying types and percentages of recycled concrete aggregate in the subbase layer.
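The abstract does not name the resilient modulus model that was characterized; a common choice for unbound granular materials is the k–θ model, M_r = k1·(θ/p_a)^k2, where θ is the bulk stress and p_a is atmospheric pressure. A minimal Python sketch, with illustrative constants (not values from the study), of how such a characterized model feeds a nonlinear layer analysis:

```python
# Hedged sketch: the k-theta resilient modulus model, a common form for
# unbound granular materials. The abstract does not name the exact model;
# k1 and k2 below are illustrative regression constants, not study results.

def resilient_modulus(theta_kpa, k1=800.0, k2=0.5, p_a=101.325):
    """Resilient modulus (MPa) for bulk stress theta (kPa)."""
    return k1 * (theta_kpa / p_a) ** k2

# A nonlinear analysis (as in KENLAYER) iterates: assume a modulus,
# compute the stress state in the layer, update the modulus from the
# model, and repeat until the modulus converges.
m = resilient_modulus(200.0)
```

The stress-dependence is the point: a higher bulk stress stiffens the granular layer, which is why a single fixed modulus understates the subbase behavior.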
Longitudinal data are becoming increasingly common, especially in the medical and economic fields, and various methods have been developed to analyze this type of data.
In this research, the focus was on compiling and analyzing these data: cluster analysis plays an important role in identifying and grouping co-expressed profiles over time, which are then fitted with a nonparametric smoothing cubic B-spline model. This model provides continuous first and second derivatives, resulting in a smoother curve with fewer abrupt changes in slope; it is also more flexible and can capture more complex patterns and fluctuations in the data.
The balanced longitudinal data profile was compiled into subgroup
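A cubic B-spline is built from basis functions that have continuous first and second derivatives, which is what gives the fitted curve its smoothness. As a minimal sketch (not the paper's estimation code), the Cox–de Boor recursion that defines those basis functions:

```python
def bspline_basis(i, k, t, knots):
    """Cox-de Boor recursion: value of the i-th degree-k B-spline basis at t.
    For k = 3 this yields the cubic basis used in smoothing splines."""
    if k == 0:
        return 1.0 if knots[i] <= t < knots[i + 1] else 0.0
    left = 0.0
    if knots[i + k] != knots[i]:
        left = ((t - knots[i]) / (knots[i + k] - knots[i])
                * bspline_basis(i, k - 1, t, knots))
    right = 0.0
    if knots[i + k + 1] != knots[i + 1]:
        right = ((knots[i + k + 1] - t) / (knots[i + k + 1] - knots[i + 1])
                 * bspline_basis(i + 1, k - 1, t, knots))
    return left + right

# On a uniform knot vector 0..8 the cubic bases form a partition of unity
# on the interior span, so a spline fit reproduces constants exactly.
knots = list(range(9))
total = sum(bspline_basis(i, 3, 4.0, knots) for i in range(5))
```

A fitted smoothing spline is then a linear combination of these bases, with coefficients chosen per cluster of profiles.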
A medical-service platform is a mobile application through which patients are provided with doctors' diagnoses based on information gleaned from medical images. The content of these diagnostic results must not be illegitimately altered during transmission and must be returned to the correct patient. In this paper, we present a solution to these problems using blind, reversible, and fragile watermarking based on authentication of the host image. In our proposed algorithm, the binary version of the Bose–Chaudhuri–Hocquenghem (BCH) code of the patient medical report (PMR) and the binary patient medical image (PMI) after fuzzy exclusive-or (F-XoR) are used to produce the patient's unique mark using a secret sharing scheme (SSS). The patient's un
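The paper's exact F-XoR and secret-sharing construction is not reproduced here; as a hedged illustration of the general idea, a minimal 2-of-2 XOR secret-sharing split, where neither share alone reveals the protected mark:

```python
import secrets

def xor_split(secret: bytes):
    """2-of-2 XOR secret sharing: one share is uniformly random,
    the other is the secret XORed with it. Both are needed to recover."""
    share1 = secrets.token_bytes(len(secret))
    share2 = bytes(a ^ b for a, b in zip(secret, share1))
    return share1, share2

def xor_recover(share1: bytes, share2: bytes) -> bytes:
    """XOR the shares back together to reconstruct the secret."""
    return bytes(a ^ b for a, b in zip(share1, share2))
```

Because a single share is statistically independent of the secret, an attacker holding one share learns nothing; the same property underpins fragile-watermark authentication marks.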
Face recognition is a crucial biometric technology used in various security and identification applications. Ensuring accuracy and reliability in facial recognition systems requires robust feature extraction and secure processing methods. This study presents an accurate facial recognition model using a feature extraction approach within a cloud environment. First, the facial images undergo preprocessing, including grayscale conversion, histogram equalization, Viola-Jones face detection, and resizing. Then, features are extracted using a hybrid approach that combines Linear Discriminant Analysis (LDA) and Gray-Level Co-occurrence Matrix (GLCM). The extracted features are encrypted using the Data Encryption Standard (DES) for security
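The GLCM half of the hybrid feature extractor can be sketched in a few lines. This is a generic co-occurrence computation (the offset, number of gray levels, and image values below are illustrative), not the study's implementation:

```python
def glcm(image, dx=1, dy=0, levels=4):
    """Gray-Level Co-occurrence Matrix: counts pairs of gray levels
    separated by the offset (dx, dy). `image` is a 2-D list of ints
    quantized to the range [0, levels)."""
    m = [[0] * levels for _ in range(levels)]
    rows, cols = len(image), len(image[0])
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dy, c + dx
            if 0 <= r2 < rows and 0 <= c2 < cols:
                m[image[r][c]][image[r2][c2]] += 1
    return m
```

Texture descriptors such as contrast, energy, and homogeneity are then computed from the normalized matrix and concatenated with the LDA projection to form the feature vector.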
This paper presents a fast and robust approach to English text encryption and decryption based on the Pascal matrix. The technique encrypts Arabic or English text, or both, and shows the result of applying this method to the plain text (original message): the intelligible plaintext is transformed into unintelligible ciphertext in order to secure information from unauthorized access and theft. An encryption scheme usually uses a pseudo-random encryption key generated by an algorithm; here, all of this is done using the Pascal matrix. Encryption and decryption are implemented in MATLAB, with Notepad++ used to write the input text.
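The abstract does not detail how the Pascal matrix is applied, but one property that makes it attractive for such a scheme is that the lower-triangular Pascal matrix has determinant 1, so it has an exact integer inverse and matrix-vector ciphering is losslessly reversible. A hypothetical sketch (in Python rather than the paper's MATLAB):

```python
from math import comb

def pascal_lower(n):
    """Lower-triangular Pascal matrix: L[i][j] = C(i, j)."""
    return [[comb(i, j) if j <= i else 0 for j in range(n)] for i in range(n)]

def pascal_lower_inv(n):
    """Exact integer inverse: entries (-1)**(i + j) * C(i, j)."""
    return [[(-1) ** (i + j) * comb(i, j) if j <= i else 0
             for j in range(n)] for i in range(n)]

def matvec(m, v):
    return [sum(m[i][j] * v[j] for j in range(len(v))) for i in range(len(m))]

# Hypothetical use, not the paper's exact scheme: encode character codes
# as a vector, multiply by the Pascal matrix to cipher, and by its exact
# integer inverse to decipher without any rounding loss.
plain = [ord(ch) for ch in "HI!"]
cipher = matvec(pascal_lower(3), plain)
recovered = matvec(pascal_lower_inv(3), cipher)
```

Because both the matrix and its inverse are integer-valued, decryption is exact for arbitrary text lengths (processed block by block).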
A two-time-step stochastic multi-variable multi-site hydrological data forecasting model was developed and verified using a case study. The philosophy of this model is to use the cross-variable correlations, cross-site correlations, and two-step time-lag correlations simultaneously to estimate the parameters of the model, which are then modified using the mutation process of a genetic algorithm optimization model. The objective function to be minimized is the Akaike test value. The case study involves four variables and three sites. The variables are the monthly air temperature, humidity, precipitation, and evaporation; the sites are Sulaimania, Chwarta, and Penjwin, located in northern Iraq. The model performance was
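The objective function minimized here is the Akaike criterion; for a model with k parameters and maximized log-likelihood ln L, it is AIC = 2k − 2 ln L, with lower values preferred because they balance fit against complexity. A minimal sketch (the numbers below are illustrative, not the study's):

```python
def aic(n_params, log_likelihood):
    """Akaike Information Criterion: AIC = 2k - 2 ln(L).
    Lower is better; the 2k term penalizes extra parameters."""
    return 2 * n_params - 2 * log_likelihood

# Illustrative comparison of two candidate parameterizations with the
# same likelihood: the simpler one wins on AIC.
simple = aic(2, -10.0)
complex_ = aic(3, -10.0)
```

In the genetic-algorithm loop, each mutated parameter set would be scored this way and the lowest-AIC candidates retained.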
In this paper, an experimental study has been conducted of the temperature distribution in a space conditioned with a Ventilated Hollow Core Slab (TermoDeck) system. The experiments were carried out on a model room with dimensions of 1 m × 1.2 m × 1 m, built according to a suitable scale factor of 1/4. The temperature distribution was measured by 59 thermocouples fixed at several locations in the test room. Two cases were considered in this work: the first during the unoccupied period at night (without external load), and the other during the day period with an external load of 800 W/m² according to solar heat gain calculations during the summer season in Iraq. All results confirm the use of the TermoDeck system for ventilation and cooling/heat
Machine learning offers significant advantages for many problems in the oil and gas industry, especially when it comes to resolving complex challenges in reservoir characterization. Permeability is one of the most difficult petrophysical parameters to predict using conventional logging techniques. The workflow methodology is clarified alongside comprehensive models in this study. The purpose of this study is to provide a more robust technique for predicting permeability; previous studies of the Bazirgan field have attempted to do so, but their estimates have been vague, and the methods they apply are outdated and poorly suited to the permeability computation. To
Data-Driven Requirements Engineering (DDRE) represents a vision for a shift from the static, traditional methods of requirements engineering to dynamic, data-driven, user-centered methods. Given the data available and the increasingly complex requirements of software systems, whose functions must adapt to changing needs to gain the trust of their users, such an approach is needed as part of a continuous software engineering process. This need drives the emergence of new challenges in the discipline of requirements engineering. The problem addressed in this study was data discrepancies, which hampered the needs-elicitation process, so that software development ultimately revealed discrepancies and could not meet the need
E-learning packages are content and instructional methods delivered on a computer
(whether over the Internet or an intranet), designed to build knowledge and skills related to
individual or organizational goals. This definition addresses the what (training delivered
in digital form), the how (content and instructional methods that help learn the content),
and the why (improving organizational performance by building job-relevant knowledge and
skills in workers).
This paper presents the design and implementation of a learning package for the Prolog
programming language, built using the Visual Basic.NET 2010 programming language in
conjunction with Microsoft Office Access 2007. The package also introduces several
fac