In civil engineering, the adoption of Falling Weight Deflectometers (FWDs) is a response to an increasingly technology-driven field. FWDs are devices that evaluate the physical properties of a pavement. This paper assesses data processing, storage, and analysis with FWDs. The device plays an important role in enabling operators and field practitioners to understand the vertical deflection response of a pavement subjected to an impulse load. The resulting data and its analysis support backcalculation of layer stiffness, with initial analysis of the deflection bowl carried out against measured or assumed layer thicknesses. Outcomes of the backcalculation process in turn yield the strains, stresses, and moduli in the individual layers, along with layer-thickness sensitivity, the determination of isotropic layer moduli, and estimates of subgrade CBR. Imposing elastic, low-strain conditions allows the resilient modulus to be determined and unbound granular materials to be analysed. FWD data processing, analysis, and storage are therefore significant in civil engineering because they inform the design of new pavements and of rehabilitation options.
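As a minimal illustration of the backcalculation idea described above, the sketch below computes the composite surface modulus from the centre deflection of a single FWD drop using the standard Boussinesq expression for a uniformly loaded flexible plate on an elastic half-space. The load, plate radius, deflection, and Poisson's ratio are hypothetical values chosen for the example, not figures from the paper.

```python
import math

# Hypothetical FWD drop: 50 kN impulse on a 150 mm radius loading plate,
# centre deflection d0 = 500 micron; Poisson's ratio assumed to be 0.35.
load_n = 50_000.0
plate_radius_m = 0.15
d0_m = 500e-6
poisson = 0.35

# Contact pressure under the loading plate.
sigma0 = load_n / (math.pi * plate_radius_m ** 2)

# Boussinesq surface modulus for a uniformly loaded flexible plate on an
# elastic half-space: E0 = 2 * (1 - nu^2) * sigma0 * a / d0.
e0_pa = 2.0 * (1.0 - poisson ** 2) * sigma0 * plate_radius_m / d0_m
print(f"surface modulus = {e0_pa / 1e6:.0f} MPa")
```

Full backcalculation iterates layer moduli in a multilayer elastic model until the computed deflection bowl matches all geophone readings; the single-deflection formula above is only the starting point.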
The research deals with two important subjects: computer-aided process planning (CAPP) and product quality, with the quality dimensions defined by the producing organization. The goal of the research is to highlight the role of CAPP technology in improving the quality of the rotor product in the engines factory of the General Company for Electrical Industries. The research follows a case-study approach, with direct visits by the researcher to the work site to apply the operational paths generated by a specialized computer program designed by the researcher. The research is divided into four axes: the first concerns the general structure of the research, the second the theoretical review, the t…
The aerodynamic characteristics of a forward-swept-wing aircraft have been studied theoretically and experimentally. A low-order panel method with the Dirichlet boundary condition was used to solve the steady, inviscid, compressible flow case. Experimentally, a wooden model was manufactured to carry out the tests. The primary objective of the experimental work was the measurement of the wake dimensions and orientation, the velocity defect along the wake, and the wake thickness. A blower-type, low-speed (open-jet) wind tunnel was used in the experimental work. The mean velocity at the test section was 9.3 m/s, and the Reynolds number based on the mean aerodynamic chord and the mean velocity was 0.46×10⁵. The measurements sho…
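The two figures quoted in the abstract can be cross-checked with the definition of the Reynolds number, Re = V·c/ν. The sketch below back-solves for the implied mean aerodynamic chord, assuming a kinematic viscosity of air of about 1.5×10⁻⁵ m²/s at room temperature (an assumption; the paper does not state it).

```python
# Consistency check on the reported figures: Re = V * c / nu.
velocity = 9.3      # m/s, mean test-section velocity from the abstract
reynolds = 0.46e5   # Reynolds number reported in the abstract
nu_air = 1.5e-5     # m^2/s, assumed kinematic viscosity of air

# Implied mean aerodynamic chord of the wind-tunnel model.
chord = reynolds * nu_air / velocity
print(f"implied MAC = {chord * 1000:.0f} mm")
```

The result, a chord on the order of 7 cm, is a plausible size for a small open-jet wind-tunnel model, so the reported velocity and Reynolds number are mutually consistent under this viscosity assumption.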
This paper identifies and describes the textual densities of ideational metaphors through the application of GM theory (Halliday, 1994) to the textual analysis of two twentieth-century English short stories: one American (The Mansion (1910-11), by Henry Jackson van Dyke Jr.) and one British (Home (1951), by William Somerset Maugham). One aim is to obtain textually verifiable statistical evidence for the observed dominance of GM nominalization in academic and scientific texts rather than in fiction (e.g. Halliday and Martin, 1993). Another aim is to explore any significant differentiation in GM's use by the two short-story writers. The research has been carried out by identifying, describing, and statistically analysi…
A database is an arrangement of data organized and distributed in a way that allows the client to access the stored data simply and conveniently. In the era of big data, however, traditional data-analytics methods may not be able to manage and process such large amounts of data. To develop an efficient way of handling big data, this work studies the use of the MapReduce technique to handle big data distributed on the cloud. The approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r…
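The MapReduce model named in the abstract can be sketched in a few lines: a map phase emits key–value pairs, a shuffle phase groups values by key, and a reduce phase collapses each group. The records, channel names, and the per-channel mean used below are hypothetical stand-ins for the paper's EEG data, not its actual pipeline.

```python
from collections import defaultdict

# Hypothetical records: (channel, voltage) pairs standing in for EEG samples.
records = [("Fp1", 12.0), ("Fp2", 9.5), ("Fp1", 11.0), ("Cz", 7.5), ("Fp2", 10.5)]

# Map phase: emit (key, value) pairs, one per input record.
def map_phase(record):
    channel, voltage = record
    yield channel, voltage

# Shuffle phase: group all emitted values by key, as Hadoop does
# between the map and reduce stages.
groups = defaultdict(list)
for record in records:
    for key, value in map_phase(record):
        groups[key].append(value)

# Reduce phase: collapse each key's values -- here, to the channel mean.
def reduce_phase(key, values):
    return key, sum(values) / len(values)

means = dict(reduce_phase(k, v) for k, v in groups.items())
print(means)
```

On a real Hadoop cluster the map and reduce functions run in parallel across nodes and the shuffle is handled by the framework, which is what makes the model attractive for data too large for a single machine.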
Abstract:
The Central Bank is the backbone of the banking system as a whole, and one of the most important functions it performs in order to maintain that system is the supervision and control of banks, using several tools and methods. Among the most important of these tools is its creation of the compliance-observer function, which obliges commercial banks to appoint, under certain conditions, a person in the bank to perform this function, and grants that person powers that help build a sound and compliant banking system. The role of the compliance observer is to follow up on the bank's compliance with the instructions and decisions issued by…
In this research a (4×4) factorial experiment was studied, applied in a randomized complete block design. The design of experiments is used to study the effect of treatments on experimental units and thus obtain data representing the experiment's observations. Differences in how these treatments are applied under varying environmental and experimental conditions cause noise that affects the observation values and thus increases the mean square error of the experiment. To reduce this noise, multilevel wavelet shrinkage was used as a filter for the observations, by proposing an improved threshold that takes the different transformation levels into account based on the logarithm of the b…
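As a minimal sketch of the wavelet-shrinkage idea, the code below implements a multilevel Haar transform with soft thresholding of the detail coefficients and reconstructs the filtered signal. It uses a single universal threshold across levels, whereas the paper proposes a level-dependent threshold; the signal, noise level, and wavelet choice here are illustrative assumptions, not the paper's data.

```python
import numpy as np

def haar_denoise(x, levels=3):
    """Soft-threshold Haar wavelet denoiser; len(x) must be divisible by 2**levels."""
    x = np.asarray(x, dtype=float)
    details = []
    approx = x
    # Analysis: repeatedly split into approximation and detail coefficients.
    for _ in range(levels):
        a = (approx[0::2] + approx[1::2]) / np.sqrt(2.0)
        d = (approx[0::2] - approx[1::2]) / np.sqrt(2.0)
        details.append(d)
        approx = a
    # Universal threshold, noise scale estimated from the finest details.
    sigma = np.median(np.abs(details[0])) / 0.6745
    lam = sigma * np.sqrt(2.0 * np.log(x.size))
    details = [np.sign(d) * np.maximum(np.abs(d) - lam, 0.0) for d in details]
    # Synthesis: invert the transform level by level.
    for d in reversed(details):
        rec = np.empty(2 * approx.size)
        rec[0::2] = (approx + d) / np.sqrt(2.0)
        rec[1::2] = (approx - d) / np.sqrt(2.0)
        approx = rec
    return approx

rng = np.random.default_rng(0)
clean = np.repeat([0.0, 5.0, 2.0, 7.0], 64)           # 256-sample piecewise-constant signal
noisy = clean + rng.normal(0.0, 0.5, clean.size)       # additive experimental noise
denoised = haar_denoise(noisy, levels=3)
```

Because the true detail coefficients of a piecewise-constant signal are mostly zero, shrinking the small (noise-dominated) coefficients toward zero reduces the mean square error of the filtered observations, which is the effect the abstract describes.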
The research aims to evaluate the reality of applying the curriculum axis, one of the eight Iraqi academic accreditation standards, in a sample of governmental and private universities and colleges in Iraq; to identify the main and secondary reasons behind that reality; and to propose mechanisms and procedures that help reduce the gaps. The research problem lies in the weak availability of the requirements of the curriculum axis in the universities and colleges of the study sample, owing to weak documentation and implementation, with interest in them still below the level of ambition. To arrive at scientific facts, the researchers adopted the comparativ…
Business organizations have faced many challenges in recent times, the most important of which is information technology, because it is widespread and easy to use. Its use has increased the amount of data that business organizations deal with to an unprecedented degree. The amount of data available through the internet is a problem for which many parties seek solutions: why is it available there in such huge, random quantities? Many forecasts suggested that by 2017 the devices connected to the internet would number an estimated three times the population of the Earth, and in 2015 more than one and a half billion gigabytes of data were transferred every minute globally. Thus the so-called data mining emerged as a…
Concurrently with the technological development the world is witnessing, the crime of money laundering has evolved faster and through multiple methods, and its economic, political, and social impacts have grown. Because of the danger of this phenomenon, the international community in recent years has been keen to treat the combating of money laundering as a general indicator by which the compliance of states, their banks, and their financial institutions with the international requirements in this area is verified. Hence the growing interest of governments in laws and procedures that contribute to reducing the phenomenon of money laundering and protecting the economy and the banking and financial sectors…