The purpose of this paper is to develop a hybrid conceptual model for building information modelling (BIM) adoption in facilities management (FM) by integrating the task-technology fit (TTF) and the unified theory of acceptance and use of technology (UTAUT) theories. The study also aims to identify the factors influencing BIM adoption and use in FM, to identify gaps in the existing literature, and to provide a holistic picture of recent research on technology acceptance and adoption in the construction industry and the FM sector.
This paper reports an evaluation of the properties of medium-quality concrete incorporating recycled coarse aggregate (RCA). Concrete specimens were prepared with various RCA replacement percentages (25%, 50%, 75%, and 100%). The workability, mechanical properties, and durability (in terms of abrasion resistance) of the cured concrete were examined at different ages. The results reveal insignificant differences between the recycled concrete (RC) and the reference concrete in the mechanical and durability-related measurements. Meanwhile, the workability of the RC decreased markedly once the RCA replacement reached 75% and 100%. The ultrasonic pulse velocity (UPV) results depend strongly on the porosity of the concrete, and the RC exhibited higher porosity.
The use of parametric models and their associated estimation methods requires that many preliminary conditions be met for those models to represent the population under study adequately. This has prompted researchers to search for more flexible alternatives, namely nonparametric models, and many researchers have studied the permanence (survival) function and its nonparametric estimation methods.
For the purpose of statistical inference about the parameters of the lifetime distribution in the presence of censored data, the experimental section of this thesis compares nonparametric methods for estimating the permanence function.
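The abstract does not name the specific nonparametric method, so the following is a minimal sketch assuming the Kaplan-Meier product-limit estimator, a standard nonparametric estimate of the permanence (survival) function under right-censoring; the lifetime values in the usage example are hypothetical placeholders, not data from the thesis.

```python
import numpy as np

def kaplan_meier(times, observed):
    """Kaplan-Meier product-limit estimate of the survival (permanence) function.

    times    : lifetimes (event or censoring times)
    observed : 1 if the failure was observed, 0 if the observation is right-censored
    Returns the distinct event times and the estimated S(t) just after each of them.
    """
    times = np.asarray(times, dtype=float)
    observed = np.asarray(observed, dtype=int)

    event_times, survival = [], []
    s = 1.0
    for t in np.unique(times[observed == 1]):
        at_risk = np.sum(times >= t)                 # subjects still under observation at t
        deaths = np.sum((times == t) & (observed == 1))
        s *= 1.0 - deaths / at_risk                  # product-limit update
        event_times.append(t)
        survival.append(s)
    return np.array(event_times), np.array(survival)

# Hypothetical censored lifetime data (1 = failure observed, 0 = censored)
t = [3, 5, 5, 8, 12, 12, 16, 20]
d = [1, 1, 0, 1, 1, 0, 1, 0]
print(kaplan_meier(t, d))
```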
In many video and image processing applications, the frames are partitioned into blocks, which are extracted and processed sequentially. In this paper, we propose a fast algorithm for calculating features of overlapping image blocks. We assume the features are projections of the block onto separable 2D basis functions (usually orthogonal polynomials), where we benefit from the symmetry with respect to the spatial variables. The main idea is based on the construction of auxiliary matrices that virtually extend the original image and make it possible to avoid time-consuming computation in loops. These matrices can be pre-calculated, stored, and used repeatedly since they are independent of the image itself. We validated the proposed approach experimentally.
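As a rough illustration of the feature definition (not the paper's auxiliary-matrix acceleration itself), the sketch below projects each overlapping block onto a separable 2D orthonormal polynomial basis in the straightforward loop-based way that the proposed algorithm is designed to avoid; the basis construction, block size, and step are assumptions.

```python
import numpy as np

def orthonormal_poly_basis(n, k):
    """Columns form an orthonormal polynomial basis of degrees 0..k-1 on n sample points."""
    x = np.linspace(-1.0, 1.0, n)
    vander = np.vander(x, k, increasing=True)   # [1, x, x^2, ...]
    q, _ = np.linalg.qr(vander)                 # Gram-Schmidt orthonormalisation
    return q                                    # shape (n, k)

def block_features(image, block_size, step, k):
    """Features of overlapping blocks as projections onto a separable 2D basis.

    This is the direct, loop-based computation; the paper's auxiliary matrices
    are meant to replace exactly this kind of per-block loop.
    """
    b = orthonormal_poly_basis(block_size, k)   # same 1D basis for rows and columns
    h, w = image.shape
    feats = {}
    for r in range(0, h - block_size + 1, step):
        for c in range(0, w - block_size + 1, step):
            block = image[r:r + block_size, c:c + block_size]
            feats[(r, c)] = b.T @ block @ b     # separability: two small matrix products
    return feats

# Hypothetical usage on a random image
img = np.random.rand(64, 64)
features = block_features(img, block_size=8, step=4, k=4)
```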
This research aims to analyze and simulate real biochemical test data to uncover the relationships among the tests and how each of them affects the others. The data were acquired from an Iraqi private biochemical laboratory. However, these data have many dimensions, a high rate of null values, and a large number of patient records. Several experiments were then applied to these data, beginning with unsupervised techniques such as hierarchical clustering and k-means, but the results were not clear. A preprocessing step was therefore performed to make the dataset analyzable by supervised techniques such as Linear Discriminant Analysis (LDA), Classification And Regression Tree (CART), Logistic Regression (LR), K-Nearest Neighbor (K-NN), and Naïve Bayes (NB).
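A minimal sketch of the supervised comparison stage, assuming a scikit-learn pipeline with mean imputation of the null values and 5-fold cross-validation; the dataset is not reproduced here, and the preprocessing choices are assumptions rather than the authors' exact procedure.

```python
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB

def compare_classifiers(X, y):
    """Cross-validated accuracy of the supervised models named in the abstract.

    X may contain null values; they are imputed with the column mean before scaling.
    """
    models = {
        "LDA": LinearDiscriminantAnalysis(),
        "CART": DecisionTreeClassifier(),
        "LR": LogisticRegression(max_iter=1000),
        "K-NN": KNeighborsClassifier(),
        "NB": GaussianNB(),
    }
    scores = {}
    for name, model in models.items():
        pipe = make_pipeline(SimpleImputer(strategy="mean"), StandardScaler(), model)
        scores[name] = cross_val_score(pipe, X, y, cv=5).mean()
    return scores
```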
Krawtchouk polynomials (KPs) and their moments are promising techniques for applications in information theory, coding theory, and signal processing. This is due to the special capabilities of KPs in feature extraction and classification processes. The main challenge in existing KP recurrence algorithms is numerical error, which occurs during the computation of the coefficients for large polynomial sizes, particularly when the KP parameter (p) deviates from 0.5 towards 0 or 1. To this end, this paper proposes a new recurrence relation to compute the coefficients of KPs at high orders. In particular, this paper discusses the development of a new algorithm and presents a new mathematical model for computing the KP coefficients.
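The paper's new recurrence relation is not reproduced here; for reference, the sketch below implements the classical three-term recurrence in the order n for the (un-normalised) Krawtchouk polynomials K_n(x; p, N), which is the kind of direct computation that accumulates numerical error at high orders and for p far from 0.5.

```python
import numpy as np

def krawtchouk_classical(order, p, N):
    """Values K_n(x; p, N) for n = 0..order and x = 0..N via the classical
    three-term recurrence in n (un-normalised Krawtchouk polynomials).

    This direct recurrence is shown for illustration; it is not the paper's
    proposed algorithm and loses precision at high orders and extreme p.
    """
    assert 0 < p < 1 and order <= N
    x = np.arange(N + 1, dtype=float)
    K = np.zeros((order + 1, N + 1))
    K[0] = 1.0
    if order >= 1:
        K[1] = 1.0 - x / (p * N)
    for n in range(1, order):
        # p(N-n) K_{n+1} = [p(N-n) + n(1-p) - x] K_n - n(1-p) K_{n-1}
        K[n + 1] = ((p * (N - n) + n * (1 - p) - x) * K[n]
                    - n * (1 - p) * K[n - 1]) / (p * (N - n))
    return K
```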
In this study, the performance of an adaptive optics (AO) system was analyzed through a numerical computer simulation implemented in MATLAB. Phase screens were generated by turning computer-generated random numbers into two-dimensional arrays of phase values on a grid of sample points with the required statistics. Von Kármán turbulence was generated from its power spectral density. Several simulated point spread functions (PSFs) and modulation transfer functions (MTFs) for different values of the Fried coherence diameter (r0) were used to represent different levels of atmospheric turbulence. To evaluate the effectiveness of the optical system (telescope), the Strehl ratio (S) was computed. The compensation procedure for an AO system was also simulated.
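A minimal sketch of the FFT-based phase-screen generation step, assuming the standard von Kármán phase power spectral density with Fried parameter r0 and outer scale L0; the grid size, sampling, and parameter values in the example are placeholders, and this is not the authors' MATLAB code.

```python
import numpy as np

def von_karman_phase_screen(n, delta, r0, L0, seed=None):
    """FFT-based random phase screen with a von Karman power spectral density.

    n     : grid size (samples per side)
    delta : grid spacing [m]
    r0    : Fried coherence diameter [m]
    L0    : outer scale of turbulence [m]
    Returns an n x n array of phase values [rad].
    """
    rng = np.random.default_rng(seed)
    df = 1.0 / (n * delta)                              # frequency spacing [1/m]
    fx = np.fft.fftshift(np.fft.fftfreq(n, d=delta))
    fx, fy = np.meshgrid(fx, fx)
    f = np.hypot(fx, fy)

    # von Karman phase PSD: 0.023 r0^(-5/3) (f^2 + 1/L0^2)^(-11/6)
    psd = 0.023 * r0 ** (-5.0 / 3.0) * (f ** 2 + 1.0 / L0 ** 2) ** (-11.0 / 6.0)
    psd[n // 2, n // 2] = 0.0                           # remove the piston (f = 0) term

    # random Fourier coefficients with the prescribed spectrum
    cn = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) * np.sqrt(psd) * df
    return np.real(np.fft.ifft2(np.fft.ifftshift(cn)) * n ** 2)

# Hypothetical example: 256 x 256 screen, 1 cm sampling, r0 = 10 cm, L0 = 25 m
phz = von_karman_phase_screen(256, 0.01, 0.10, 25.0, seed=0)
```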
Copper telluride thin films of 700 nm and 900 nm thickness were prepared by thermal evaporation onto cleaned Si substrates kept at 300 K under a vacuum of about 4×10⁻⁵ mbar. X-ray diffraction (XRD) analysis and atomic force microscopy (AFM) measurements were used to study the structural properties. The sensitivity (S) of the fabricated sensors to NO2 and H2 was measured at room temperature. The experimental relationship between S and the thickness of the sensitive film was investigated, and higher S values were recorded for the thicker sensors. The results showed that the best sensitivity was obtained for the 900 nm thick Cu2Te film exposed to H2 gas.
This article presents a spectroscopic study of quasars, one of the main types of active galaxies; more precisely, it focuses on the correlation between the main engine of quasi-stellar objects (QSOs), the central supermassive black hole (SMBH) mass, and other physical properties (e.g. the star formation rate, SFR). Twelve objects were randomly selected from The Half Million Quasars (HMQ) Catalogue, published in 2015, and the data were collected from the Sloan Digital Sky Survey (SDSS) DR16. The redshifts of these objects range between 0.05 and 0.17. The results show a clear linear proportionality between the SMBH mass and the SFR, as well as a direct proportionality involving the luminosity.
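A minimal sketch of how such a linear relation could be quantified, assuming an ordinary least-squares fit and Pearson correlation in log space; the function takes the measured arrays as input (no values from the HMQ/SDSS DR16 sample are reproduced here) and is not the authors' analysis.

```python
import numpy as np

def linear_relation(smbh_mass, sfr):
    """Least-squares slope, intercept, and Pearson correlation between
    log10(SMBH mass) and log10(SFR) for a sample of quasars.

    smbh_mass, sfr : 1-D arrays of equal length supplied by the user
                     (e.g. the twelve HMQ/SDSS DR16 objects).
    """
    x = np.log10(np.asarray(smbh_mass, dtype=float))
    y = np.log10(np.asarray(sfr, dtype=float))
    slope, intercept = np.polyfit(x, y, 1)       # best-fit line in log-log space
    r = np.corrcoef(x, y)[0, 1]                  # Pearson correlation coefficient
    return {"slope": slope, "intercept": intercept, "pearson_r": r}
```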