In civil engineering, the adoption of Falling Weight Deflectometers (FWDs) is a response to an increasingly technology-driven practice. FWDs are devices that help evaluate the physical properties of a pavement. This paper has assessed the concepts of data processing, storage, and analysis with FWDs. The device plays an important role in enabling operators and field practitioners to understand the vertical deflection response of a pavement subjected to an impulse load. The resulting data and its analysis support the backcalculation of layer stiffness, with initial analyses of the deflection bowl carried out in conjunction with measured or assumed layer thicknesses. Outcomes of the backcalculation process lead, in turn, to an understanding of the strains, stresses, and moduli in the individual layers, as well as layer thickness sensitivity, the determination of isotropic layer moduli, and estimates of subgrade CBR. Under elastic, low-strain conditions, the approach also supports determination of the resilient modulus and analysis of unbound granular materials. Hence, FWD data processing, analysis, and storage are significant in civil engineering because they inform the design of new pavements and other rehabilitation design options.
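For illustration only, the sketch below conveys the spirit of backcalculation under heavy simplifying assumptions: the layered pavement is replaced by a single equivalent half-space (Boussinesq solution) whose modulus is fitted to a deflection bowl by least squares. The load, plate radius, geophone offsets, and deflections are invented example values, not the multi-layer procedure described above.

```python
# Minimal backcalculation sketch: fit one equivalent half-space modulus to an
# assumed FWD deflection bowl. All numerical values are illustrative assumptions.
import numpy as np
from scipy.optimize import least_squares

P = 40_000.0            # FWD impulse load, N (assumed)
a = 0.15                # loading-plate radius, m (assumed)
nu = 0.35               # Poisson's ratio (assumed)
p = P / (np.pi * a**2)  # contact pressure, Pa

offsets = np.array([0.0, 0.3, 0.6, 0.9, 1.2, 1.5])        # geophone offsets, m
measured = np.array([520, 310, 190, 130, 95, 72]) * 1e-6  # deflections, m (illustrative)

def forward(E, r):
    """Surface deflection of a homogeneous half-space with modulus E (Pa)."""
    r_safe = np.maximum(r, a)                           # avoid dividing by zero under the plate
    d_outer = P * (1.0 - nu**2) / (np.pi * E * r_safe)  # Boussinesq point-load solution
    d_centre = 2.0 * p * a * (1.0 - nu**2) / E          # centre of a flexible circular load
    return np.where(r < a, d_centre, d_outer)

def residuals(log10_E):
    return forward(10.0 ** log10_E[0], offsets) - measured

fit = least_squares(residuals, x0=[8.0])                # start from 10^8 Pa = 100 MPa
E_eq = 10.0 ** fit.x[0]
print(f"Equivalent half-space modulus ~ {E_eq / 1e6:.0f} MPa")
```

In practice the forward model would be a layered elastic program, and each layer modulus would be adjusted iteratively until the computed bowl matches the measured one.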
Due to the urgent need for technologies for continuous glucose monitoring in individuals with diabetes, significant research has been applied using microwave techniques. This work presents a novel technique based on a single-port microwave circuit, an antenna structure based on a metamaterial (MTM) transmission-line defected patch, for sensing the blood glucose level noninvasively. The proposed antenna measures blood glucose through field leakage that penetrates to the blood through the skin. The proposed sensor is constructed from a closed loop connected to an interdigital capacitor to magnify the electric-field fringing at the patch center. The proposed antenna sensor i
The aim of the research is to clarify the meanings and connotations of the term "Semitic" and to identify the peoples that fell under this name according to historical data, biblical texts, and Qur'anic accounts. It also considers how the label has been used to gain international sympathy on the one hand and, on the other, to control the land of Palestine and confer international legitimacy on growing their entity through global support and sympathy with their alleged slogan of "anti-Semitism", which revolves around the oppression of the Jews.
This study seeks to identify the role that the leadership trend plays in the management of health institutions in Iraq and its impact on improving the quality of the health service provided, by analyzing the opinions of staff working in the Iraqi health sector. A survey list was used as the main tool for collecting primary data. The analysis covered 60 members of the medical staff, of whom 40 are doctors and 20 hold the rank of assistant physician, as well as 60 members of the administrative cadre across their various job ranks and administrative specializations (department manager, auditor, observer, accountant, statistician, secretary). The statistical software SPSS was relied upon for data ana
Most medical datasets suffer from missing data, due to the expense of some tests or to human error while recording them. This issue affects the performance of machine learning models because the values of some features will be missing. Therefore, a specific type of method is needed for imputing these missing data. In this research, the salp swarm algorithm (SSA) is used for generating and imputing the missing values in the Pima Indian diabetes disease (PIDD) dataset; the proposed algorithm is called ISSA. The obtained results showed that the classification performance of three different classifiers, namely support vector machine (SVM), K-nearest neighbour (KNN), and Naïve Bayes
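The sketch below illustrates only the surrounding evaluation pipeline, not the ISSA imputer itself: a simple mean imputer stands in for the salp-swarm-based imputation, and the file name pima_diabetes.csv, its "Outcome" column, and the treatment of zeros as missing values are assumptions.

```python
# Minimal evaluation sketch for imputation on the Pima Indian Diabetes dataset.
# A mean imputer is a placeholder for ISSA; file name and columns are assumed.
import numpy as np
import pandas as pd
from sklearn.impute import SimpleImputer
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB

df = pd.read_csv("pima_diabetes.csv")            # assumed local copy of PIDD
X = df.drop(columns="Outcome").values
y = df["Outcome"].values

# In PIDD, physiologically impossible zeros (e.g. glucose, BMI) denote missing
# values; for brevity this sketch applies the rule to every feature column.
X = np.where(X == 0, np.nan, X)

for name, clf in [("SVM", SVC()),
                  ("KNN", KNeighborsClassifier()),
                  ("Naive Bayes", GaussianNB())]:
    model = make_pipeline(SimpleImputer(strategy="mean"),   # stand-in for ISSA
                          StandardScaler(), clf)
    score = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy ~ {score:.3f}")
```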
Anomaly detection is still a difficult task. To address this problem, we propose to strengthen the DBSCAN algorithm by converting all data into a graph concept frame (CFG). As is well known, DBSCAN groups data points of the same kind into clusters, while points lying outside a cluster are treated as noise or anomalies. DBSCAN can thus detect abnormal points that lie beyond a certain threshold (outliers). However, not all anomalies are of this kind, i.e. unusual points far from a specific group; there is also a type of data that does not occur repeatedly but is considered abnormal with respect to the known group. The analysis showed that DBSCAN using the
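As a point of reference for how plain DBSCAN flags the first kind of anomaly, the short sketch below (synthetic data, illustrative eps and min_samples values) reports the points DBSCAN labels as noise; it does not reproduce the graph concept frame (CFG) conversion proposed in the work.

```python
# Minimal DBSCAN anomaly-flagging sketch: points labelled -1 (noise) are anomalies.
# The data and the eps/min_samples values are illustrative only.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)
cluster = rng.normal(loc=0.0, scale=0.5, size=(200, 2))   # one dense cluster
outliers = rng.uniform(low=-6, high=6, size=(10, 2))      # scattered anomalies
X = np.vstack([cluster, outliers])

labels = DBSCAN(eps=0.5, min_samples=5).fit_predict(X)
anomalies = X[labels == -1]
print(f"{len(anomalies)} of {len(X)} points flagged as noise/anomalies")
```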
The advancements in Information and Communication Technology (ICT) over the previous decades have significantly changed the way people transmit or store their information over the Internet or networks. One of the main challenges, therefore, is keeping this information safe against attacks. Many researchers and institutions have realized the importance and benefits of cryptography in achieving efficient and effective secure communication. This work adopts a novel technique for a secure data cryptosystem based on chaos theory. The proposed algorithm generates a 2-dimensional key matrix, with the same dimensions as the original image, containing random numbers obtained from the 1-dimensional logistic chaotic map for given con
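A minimal sketch of the key-generation idea follows: a 2-D key matrix with the image's dimensions is filled from the 1-D logistic map. The combining step (byte-wise XOR) and the parameter values r and x0 are assumptions for illustration, not necessarily the paper's exact scheme.

```python
# Sketch: build a 2-D key matrix from the 1-D logistic map and mask an image with it.
# The XOR masking step and the values of r and x0 are illustrative assumptions.
import numpy as np

def logistic_key(shape, r=3.99, x0=0.61):
    """Iterate x_{n+1} = r * x_n * (1 - x_n) and quantize the sequence to bytes."""
    n = shape[0] * shape[1]
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        x[i] = r * x[i - 1] * (1.0 - x[i - 1])
    return (x * 255).astype(np.uint8).reshape(shape)

image = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)  # stand-in grayscale image
key = logistic_key(image.shape)
cipher = image ^ key          # encryption by byte-wise XOR (assumed combining step)
restored = cipher ^ key       # XOR with the same key matrix recovers the image
assert np.array_equal(restored, image)
```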
In data transmission, a change in a single bit of the received data may lead to misunderstanding or even disaster. Each bit in the sent information has high priority, especially information such as the address of the receiver. Detecting an error on every single bit change is therefore a key issue in the data transmission field.
The ordinary single-parity detection method can detect an odd number of errors efficiently but fails with an even number of errors. Other detection methods, such as two-dimensional parity and checksum, showed better results but still failed to cope with an increasing number of errors; a minimal sketch of these baseline schemes is given after this abstract.
Two novel methods were suggested to detect binary bit-change errors when transmitting data over a noisy medium. Those methods were: 2D-Checksum me
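As referenced above, the sketch below illustrates the two classical baselines, single (even) parity and two-dimensional parity, and shows a double bit flip that row parity misses but the column parities catch. It does not implement the proposed 2D-Checksum method.

```python
# Baseline error-detection sketch: single (even) parity and 2-D parity over a block.
from typing import List, Tuple

def parity_bit(bits: List[int]) -> int:
    """Even parity: 1 if the number of set bits is odd, else 0."""
    return sum(bits) % 2

def two_d_parity(block: List[List[int]]) -> Tuple[List[int], List[int]]:
    """Row parities and column parities of a rectangular bit block."""
    rows = [parity_bit(row) for row in block]
    cols = [parity_bit(list(col)) for col in zip(*block)]
    return rows, cols

data = [[1, 0, 1, 1],
        [0, 1, 1, 0],
        [1, 1, 0, 0]]
rows, cols = two_d_parity(data)

# Flip two bits in the same row: the row parity cannot see an even number of
# flips, but the affected column parities change and expose the error.
corrupted = [row[:] for row in data]
corrupted[0][0] ^= 1
corrupted[0][2] ^= 1
print("row parity unchanged:", parity_bit(corrupted[0]) == rows[0])   # True
print("column parities changed:", two_d_parity(corrupted)[1] != cols)  # True
```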
Among the applications of quantitative methods that have received explicit attention during the previous period (the last two centuries) is the traveling salesman method. This interest arose from the actual need of many production sectors and companies that distribute their products, whether locally made or imported, to customers or to other industrial sectors. Most productive sectors and distribution companies have always aspired to increase profits, imports, production quantities, export quantities, and so on; on the other hand, during the distribution process they want to follow the routes that are best, shortest, or most appropriate.
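To make the route-selection idea concrete, the sketch below builds a tour with a simple nearest-neighbour heuristic over hypothetical depot and customer coordinates; the solution method actually used in the work is not specified here.

```python
# Nearest-neighbour travelling-salesman sketch on illustrative coordinates.
import math

points = [(0, 0), (2, 3), (5, 4), (1, 7), (6, 1)]   # hypothetical depot + customers (x, y)

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def nearest_neighbour_tour(pts, start=0):
    """Greedily visit the closest unvisited point, then return to the start."""
    unvisited = set(range(len(pts))) - {start}
    tour = [start]
    while unvisited:
        last = tour[-1]
        nxt = min(unvisited, key=lambda j: dist(pts[last], pts[j]))
        tour.append(nxt)
        unvisited.remove(nxt)
    tour.append(start)
    return tour

tour = nearest_neighbour_tour(points)
length = sum(dist(points[a], points[b]) for a, b in zip(tour, tour[1:]))
print("tour:", tour, "total length:", round(length, 2))
```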
Sheet piles are necessary in hydraulic structures as seepage cut-offs to reduce seepage. In this research, a computational methodology was followed by building a numerical model in the Geo-Studio program to check the efficiency of concrete sheet piles as a cut-off, or reducer, of seepage over time when the sheet piles face the drawdown technique. The Al-Kifil regulator was chosen as a case study; an accurate model was built with the help of observed readings from the measuring devices, which was satisfactory and helped in checking the sheet-pile efficiency. Three scenarios were adopted (with and without the drawdown technique), and it was found that in the short term there is no effect of the drawdown technique on