Cost estimation is one of the most important tasks in construction project management. Precise estimation of the construction cost affects the success and quality of a construction project. Elemental estimation is a particularly important stage for the project team because it represents one of the key project elements and forms the basis for construction and engineering strategies and execution plans. Elemental estimation, carried out at an early stage, estimates construction costs from minimal project detail and thus gives a cost indication at the initial design stage of a project. This paper studies the factors that affect elemental cost estimation, as well as the relations between these factors, using the Analytic Hierarchy Process (AHP) method. Conclusions and recommendations are drawn for improving elemental estimation accuracy in project management.
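To make the AHP step concrete, the following minimal sketch (not the paper's implementation; the factor names and pairwise judgements are hypothetical) derives priority weights from a Saaty-scale pairwise comparison matrix and checks the consistency ratio:

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three cost-estimation factors
# (e.g. design completeness, historical data quality, estimator experience).
# Entry [i, j] says how strongly factor i is preferred over factor j on Saaty's 1-9 scale.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# The principal eigenvector of the comparison matrix gives the AHP priority weights.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = eigvecs[:, k].real
weights = weights / weights.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI.
n = A.shape[0]
lambda_max = eigvals.real[k]
CI = (lambda_max - n) / (n - 1)
RI = 0.58  # random consistency index for n = 3
CR = CI / RI

print("priority weights:", np.round(weights, 3))
print("consistency ratio:", round(CR, 3))  # CR < 0.1 is usually considered acceptable
```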
This book is intended as a textbook for an undergraduate course in multivariate analysis and is designed for use in a semester system. To achieve its goals, the book is divided into the following chapters (as in the first edition, 2019). Chapter One introduces matrix algebra. Chapter Two is devoted to solving systems of linear equations, with quadratic forms and characteristic roots and vectors. Chapter Three discusses partitioned matrices and how to obtain the inverse, Jacobian, and Hessian matrices. Chapter Four deals with the multivariate normal distribution (MVN). Chapter Five concerns joint, marginal, and conditional normal distributions, independence, and correlations. Revised new chapters have also been added (as the curr
The purpose of this article is to identify and assess the importance of risk factors in the tendering phase of construction projects. A construction project cannot succeed without the identification and categorization of these risk elements. In this article, a questionnaire on likelihood and impact was designed and distributed to a panel of specialists to analyze the risk factors. A risk matrix was also used to research, explore, and identify the risks that influence the tendering phase of construction projects. The score of each risk is calculated from the probability and impact values assigned to it, and the risk matrix is created by combining the probability and impact criteria. To determine the main risk elements for the tender phase of
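As a simple illustration of the scoring step (the risk names, ratings, and band thresholds below are assumptions, not the article's data), a risk score can be computed as probability × impact and banded with a risk matrix:

```python
# Hypothetical tender-phase risks rated on a 1-5 probability and 1-5 impact scale.
risks = {
    "incomplete tender documents": (4, 5),
    "inaccurate quantity take-off": (3, 4),
    "short tendering period":       (2, 3),
}

def band(score: int) -> str:
    # Assumed matrix bands; real studies calibrate these thresholds with experts.
    if score >= 15:
        return "high"
    if score >= 6:
        return "medium"
    return "low"

for name, (probability, impact) in risks.items():
    score = probability * impact  # risk score = probability x impact
    print(f"{name}: score={score}, band={band(score)}")
```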
A sensitive spectrophotometric method was developed for the estimation of cefdinir (CFD), a cephalosporin. The study involves two methods. The first prepares azo dyes by reacting the CFD diazonium salt with 4-tert-butylphenol (4-TBP) and 2-naphthol (2-NPT) in alkaline medium; the resulting colored dyes are measured at λmax 490 and 535 nm, respectively. Beer's law was obeyed over the concentration range 3–100 μg.mL⁻¹. The limits of detection were 0.246 and 0.447 μg.mL⁻¹, and the molar absorptivities were 0.6129×10⁴ and 0.3361×10⁴ L.mol⁻¹.cm⁻¹ for (CFD-4-TBP) and (CFD-2-NPT), respectively. The second method preconcentrates the cefdinir dyes by cloud point extraction in the presence of Triton
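A hedged sketch of the underlying Beer's-law calculation (the calibration points are invented for illustration; only an approximate molar mass of cefdinir, ~395.4 g/mol, is assumed), showing how the calibration slope yields the molar absorptivity and the limit of detection:

```python
import numpy as np

# Hypothetical calibration data: concentrations in ug/mL and measured absorbances (b = 1 cm cell).
conc_ug_ml = np.array([3, 10, 25, 50, 75, 100], dtype=float)
absorbance = np.array([0.021, 0.070, 0.176, 0.352, 0.529, 0.705])

# Linear fit of Beer's law, A = slope * C + intercept.
slope, intercept = np.polyfit(conc_ug_ml, absorbance, 1)

# Molar absorptivity in L mol^-1 cm^-1: convert the slope from per (ug/mL) to per (mol/L).
# 1 ug/mL = 1e-3 g/L, so epsilon = slope * M * 1000.
molar_mass = 395.4  # approximate molar mass of cefdinir, g/mol (assumed)
epsilon = slope * molar_mass * 1000

# Limit of detection from the residual scatter of the calibration: LOD = 3.3 * sigma / slope.
residuals = absorbance - (slope * conc_ug_ml + intercept)
sigma = residuals.std(ddof=2)
lod = 3.3 * sigma / slope

print(f"slope = {slope:.5f} A per ug/mL, epsilon = {epsilon:.0f} L/(mol.cm), LOD = {lod:.3f} ug/mL")
```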
In this article, the probability density function of the Rayleigh distribution is handled with the maximum likelihood estimator method and the moment estimator method; the crisp survival function and crisp hazard function are then constructed to obtain an interval estimate for the scale parameter using a linear trapezoidal membership function. A new procedure is proposed to obtain fuzzy numbers for the scale parameter of the Rayleigh distribution. Two algorithms based on ranking functions are applied to convert the fuzzy numbers into crisp numbers. The survival and hazard functions are then computed using a real data application.
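A minimal sketch of the crisp estimation step (the data values are hypothetical): maximum likelihood and moment estimates of the Rayleigh scale parameter, followed by the corresponding survival and hazard functions:

```python
import numpy as np

# Hypothetical positive observations assumed to follow a Rayleigh distribution.
x = np.array([1.2, 2.5, 0.9, 3.1, 1.8, 2.2, 1.5])

sigma_mle = np.sqrt(np.sum(x**2) / (2 * len(x)))   # MLE: sigma^2 = sum(x_i^2) / (2n)
sigma_mom = x.mean() * np.sqrt(2 / np.pi)          # moments: E[X] = sigma * sqrt(pi/2)

def survival(t, sigma):
    return np.exp(-t**2 / (2 * sigma**2))          # S(t) = exp(-t^2 / (2 sigma^2))

def hazard(t, sigma):
    return t / sigma**2                            # h(t) = f(t) / S(t) = t / sigma^2

t = 2.0
print(f"sigma (MLE) = {sigma_mle:.3f}, sigma (moments) = {sigma_mom:.3f}")
print(f"S({t}) = {survival(t, sigma_mle):.3f}, h({t}) = {hazard(t, sigma_mle):.3f}")
```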
Mass transfer was examined at a stationary rectangular copper electrode (cathode) using the reduction of cupric ions as the electrochemical reaction. The influence of electrolyte temperature (25, 45, and 65 °C) and cupric ion concentration (4, 8, and 12 mM) on the mass transfer coefficient was investigated using the limiting current technique. The mass transfer coefficient, and hence the Sherwood number, was correlated as Sh =
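For reference, the limiting-current relation underlying this analysis is k = I_lim/(n F A C_b), with the Sherwood number defined as Sh = kL/D; the sketch below uses assumed operating values, not the study's measurements:

```python
# Mass transfer coefficient from the limiting current, and the resulting Sherwood number.
# All numerical values are illustrative assumptions.
F = 96485.0    # Faraday constant, C/mol
n_e = 2        # electrons transferred in Cu2+ + 2e- -> Cu
I_lim = 0.015  # limiting current, A (assumed)
A = 4.0e-4     # electrode area, m^2 (assumed)
C_b = 8.0      # bulk Cu2+ concentration, mol/m^3 (i.e. 8 mM)
L = 0.02       # characteristic length of the electrode, m (assumed)
D = 7.2e-10    # diffusivity of Cu2+, m^2/s (typical literature value)

k = I_lim / (n_e * F * A * C_b)   # mass transfer coefficient, m/s
Sh = k * L / D                    # Sherwood number

print(f"k = {k:.3e} m/s, Sh = {Sh:.1f}")
```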
Advances in gamma imaging technology mean that it is now technologically feasible to conduct stereoscopic gamma imaging in a hand-held unit. This paper derives an analytical model for stereoscopic pinhole imaging that can be used to predict performance for a wide range of camera configurations. Investigation of this concept through Monte Carlo and benchtop studies, for an example configuration, shows camera-source distance measurements with a mean deviation between calculated and actual distances of <5 mm for imaging distances of 50–250 mm. By combining this technique with stereoscopic optical imaging, we are then able to calculate the depth of a radioisotope source beneath a surface.
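The stereoscopic depth idea can be illustrated with a generic parallel-camera triangulation (this is not the paper's analytical pinhole model; the baseline, focal length, and image coordinates below are illustrative only):

```python
def depth_from_disparity(baseline_mm: float, focal_px: float,
                         u_left_px: float, u_right_px: float) -> float:
    """Generic parallel-camera triangulation: Z = baseline * focal / disparity.

    baseline_mm: separation between the two pinholes (mm)
    focal_px:    effective focal length expressed in pixels
    u_*_px:      horizontal image coordinate of the source in each view (pixels)
    """
    disparity = u_left_px - u_right_px
    if disparity <= 0:
        raise ValueError("source must appear displaced between the two views")
    return baseline_mm * focal_px / disparity

# Illustrative numbers only (not the paper's configuration): ~143 mm source depth.
print(depth_from_disparity(baseline_mm=40.0, focal_px=500.0,
                           u_left_px=320.0, u_right_px=180.0))
```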
A fault is an error that affects system behaviour. A software metric is a value that represents the degree to which software processes work properly and where faults are more likely to occur. In this research, we study the effects of removing redundancy and applying a log transformation, based on threshold values, for identifying fault-prone classes of software. The study also compares the metric values of the original dataset with those obtained after removing redundancy and applying the log transformation. An e-learning dataset and a system dataset were taken as case studies. The fault ratio ranged from 1%–31% and 0%–10% for the original datasets, and 1%–10% and 0%–4% after removing redundancy and applying the log transformation, respectively. These results impacted direct
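A brief sketch of the two preprocessing steps discussed above (the column names and values are hypothetical): removing redundant records and log-transforming the metric columns before fault-proneness analysis:

```python
import numpy as np
import pandas as pd

# Hypothetical class-level metrics with a fault label.
df = pd.DataFrame({
    "loc":   [120, 120, 45, 300, 45],
    "wmc":   [10, 10, 3, 25, 3],
    "fault": [1, 1, 0, 1, 0],
})

deduplicated = df.drop_duplicates()   # remove redundant (duplicate) records

metric_cols = ["loc", "wmc"]
transformed = deduplicated.copy()
transformed[metric_cols] = np.log1p(transformed[metric_cols])  # log(1 + x) keeps zeros finite

print(f"rows: {len(df)} -> {len(deduplicated)} after removing redundancy")
print(transformed.round(3))
```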