Delays occur commonly in construction projects, and assessing their impact is often a contentious issue. Several delay analysis methods are available, but no single method is universally applicable in all situations. The selection of the proper analysis method depends upon a variety of factors, including the information available, the time of analysis, the capabilities of the methodology, and the time, funds, and effort allocated to the analysis. This paper presents a computerized schedule analysis program that uses the daily windows analysis method, as it is recognized as one of the most credible methods and is one of the few techniques more likely to be accepted by courts than any other. A simple case study has been implemented to demonstrate the accuracy and usefulness of the proposed delay analysis model. The results of the study indicate that the outcomes of delay analyses are often not predictable and that each method may yield different results. The study also revealed that, depending on the time and resources available and the accessibility of project control documentation, one method may be more practical or cost-effective than another.
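The daily windows idea described above can be sketched as follows: the schedule is re-computed by the critical path method (CPM) after each day's delay events, and any slip in the project finish date within that window is attributed to the party responsible for those events. The activity network, durations, delay events, and helper names (`cpm_finish`, `daily_windows`) below are invented for illustration and are not the program the paper presents.

```python
def cpm_finish(durations, preds):
    """Project finish date via a forward-pass critical path calculation."""
    finish = {}
    def ef(act):                      # earliest finish of an activity
        if act not in finish:
            start = max((ef(p) for p in preds.get(act, [])), default=0)
            finish[act] = start + durations[act]
        return finish[act]
    return max(ef(a) for a in durations)

def daily_windows(durations, preds, events):
    """events: list of (day, activity, extra_days, party), applied in day order.

    Each window's slip in the project finish date is attributed to the
    party responsible for that window's delay event.
    """
    durs = dict(durations)
    attributed = {}
    baseline = cpm_finish(durs, preds)
    for day, act, extra, party in sorted(events):
        durs[act] += extra
        new_finish = cpm_finish(durs, preds)
        slip = new_finish - baseline          # delay caused in this window
        attributed[party] = attributed.get(party, 0) + slip
        baseline = new_finish
    return attributed
```

For example, with activity A (5 days) followed by B (3 days) and C (4 days), an owner-caused 2-day delay to A and a later contractor-caused 3-day delay to C are attributed 2 and 3 days of project slip, respectively.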
The penalized least squares method is a popular method for dealing with high-dimensional data, where the number of explanatory variables is larger than the sample size. The penalized least squares method offers high prediction accuracy and performs estimation and variable selection at once. It yields a sparse model, that is, a model with few variables, which can be interpreted easily. However, penalized least squares is not robust: it is very sensitive to the presence of outlying observations. To deal with this problem, a robust loss function can be used to obtain the robust penalized least squares method and a robust penalized estimator and ...
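As a rough illustration of the robust penalized least squares idea, the sketch below combines the Huber loss (which bounds the influence of outlying observations) with an L1 penalty, solved by proximal gradient descent (ISTA). The function names, tuning constants, and solver choice are illustrative assumptions, not the specific estimator studied above.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of the L1 penalty: shrinks coefficients toward zero."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def robust_lasso(X, y, lam=0.1, delta=1.345, step=None, iters=500):
    """Huber loss + L1 penalty via proximal gradient descent (a sketch)."""
    n, p = X.shape
    if step is None:
        step = n / np.linalg.norm(X, 2) ** 2   # 1 / Lipschitz constant
    beta = np.zeros(p)
    for _ in range(iters):
        r = y - X @ beta
        psi = np.clip(r, -delta, delta)        # Huber influence function
        grad = -X.T @ psi / n                  # gradient of the Huber loss
        beta = soft_threshold(beta - step * grad, step * lam)
    return beta
```

Because the influence function is clipped at `delta`, a handful of gross outliers in `y` cannot drag the fit, while the soft-thresholding step keeps the solution sparse.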
Diabetes is one of the increasingly common chronic diseases, affecting millions of people around the world. Diabetes diagnosis, prediction, proper treatment, and management are essential. Machine learning-based prediction techniques for diabetes data analysis can help in the early detection and prediction of the disease and of its consequences, such as hypo-/hyperglycemia. In this paper, we explored a diabetes dataset collected from the medical records of one thousand Iraqi patients. We applied three classifiers: the multilayer perceptron, K-nearest neighbors (KNN), and the Random Forest. We conducted two experiments: the first used all 12 features of the dataset, where the Random Forest outperformed the others with 98.8% accuracy; the second used only five att…
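The two experiments above can be sketched with scikit-learn. The Iraqi-patient dataset is not reproduced here, so a synthetic binary-classification set stands in; the feature counts mirror the abstract, but the reduced feature subset, model settings, and accuracies are illustrative, not the paper's results.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in: 1000 records, 12 features, binary outcome
X, y = make_classification(n_samples=1000, n_features=12,
                           n_informative=5, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# Experiment 1: all 12 features
rf_all = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
acc_all = accuracy_score(y_te, rf_all.predict(X_te))

# Experiment 2: keep only the 5 most important features
top5 = np.argsort(rf_all.feature_importances_)[-5:]
rf_top = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr[:, top5], y_tr)
acc_top = accuracy_score(y_te, rf_top.predict(X_te[:, top5]))
```

Comparing `acc_all` and `acc_top` shows how much predictive power survives the feature reduction, which is the question the second experiment poses.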
The research discusses the problem of salaries in the public sector, in terms of analyzing the salary structure and the possibility of benefiting from the information the analysis provides for strategic planning. The General Authority for Groundwater, one of the formations of the Ministry of Water Resources and centrally funded, was adopted as the field of research, represented by the salary structure of its (1,117) employees. The salary structure was analyzed for the period (2014-2019) using a quantitative approach, relying on a number of statistical tools, including arithmetic means, upper limits, lower limits, p…
A Mobile Ad hoc Network (MANET) is a collection of mobile nodes that forms, on the fly, a temporary wireless multi-hop network in a self-organizing way, without relying on any established infrastructure. In a MANET, a pair of nodes exchange messages either over a direct wireless link or over a sequence of wireless links including one or more intermediate nodes; for this purpose, an efficient routing protocol is required. This paper introduces a performance study of three MANET protocols (AODV, GRP, and OSPFv3). The study is timely, as wireless communication plays an important role in today's applications and the field of mobile ad hoc networks has become very popular among researchers in recent years. This study w…
Most companies use social media data for business. Sentiment analysis automatically gathers, analyzes, and summarizes this type of data. Managing unstructured social media data is difficult, and noisy data is a challenge to sentiment analysis. Since over 50% of the sentiment analysis process is data pre-processing, processing big social media data is challenging too; if pre-processing is carried out correctly, data accuracy may improve. The sentiment analysis workflow is also highly dependent on pre-processing. Because no pre-processing technique works well in all situations or with all data sources, choosing the most important ones is crucial, and prioritization is an excellent technique for doing so. As one of many Multi-Criteria Decision Mak…
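A few of the pre-processing steps alluded to above (lowercasing; stripping URLs, mentions, hashtags, and punctuation; removing stopwords) can be sketched as below. The stopword list is a toy placeholder and the regular expressions are illustrative choices, not the paper's pipeline.

```python
import re

STOPWORDS = {"the", "a", "an", "is", "are", "to", "of", "and", "in", "on"}  # toy list

def preprocess(text):
    """Minimal sketch of common social-media text pre-processing steps."""
    text = text.lower()
    text = re.sub(r"https?://\S+", " ", text)   # strip URLs
    text = re.sub(r"[@#]\w+", " ", text)        # strip mentions and hashtags
    text = re.sub(r"[^a-z\s]", " ", text)       # strip punctuation and digits
    tokens = text.split()
    return [t for t in tokens if t not in STOPWORDS]
```

Which of these steps matter most varies by data source, which is exactly why the paper frames their selection as a prioritization problem.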
Video copyright protection is the most widely acknowledged method of preventing data piracy. This paper proposes a blind video copyright protection technique based on the Fast Walsh-Hadamard Transform (FWHT), the Discrete Wavelet Transform (DWT), and the Arnold map. The proposed method chooses only the frames with maximum and minimum energy features to host the watermark, and it exploits the advantages of both the FWHT and the DWT for watermark embedding. The Arnold map encrypts watermarks before the embedding process and decrypts them after extraction. The results show that the proposed method can achieve a fast embedding time, good transparency, and robustness against various …
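The FWHT side of such an embedding can be illustrated with a minimal sketch: transform a block of samples, perturb one mid-band coefficient according to the watermark bit, and invert the transform. The block size, coefficient index, and strength `alpha` are illustrative assumptions; the actual method also involves the DWT, energy-based frame selection, and Arnold-map encryption, all omitted here.

```python
import numpy as np

def fwht(a):
    """Fast Walsh-Hadamard transform (unnormalized; length a power of 2)."""
    a = np.asarray(a, dtype=float).copy()
    h = 1
    while h < len(a):
        for i in range(0, len(a), h * 2):
            for j in range(i, i + h):
                x, y = a[j], a[j + h]
                a[j], a[j + h] = x + y, x - y   # butterfly step
        h *= 2
    return a

def embed_bit(block, bit, alpha=4.0):
    """Embed one watermark bit into a mid-band FWHT coefficient (a sketch)."""
    coeffs = fwht(block)
    coeffs[len(coeffs) // 2] += alpha if bit else -alpha
    return fwht(coeffs) / len(block)            # inverse FWHT = forward / N
```

Extraction would re-transform the watermarked block and compare the mid-band coefficient against the original, recovering the bit from the sign of the difference.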
Inefficient wastewater disposal and wastewater discharge into water bodies have led to increasing pollution. Pollutants in a river raise the biological oxygen demand (BOD), total suspended solids (SS), total dissolved solids (TDS), and chemical oxygen demand (COD), and, together with toxic metals, render the water unsuitable for consumption and even a significant risk to human health. Over the last few years, water conservation has been the subject of growing awareness and concern throughout the world, so this research reviews studies on the importance of the water quality of treated wastewater disposed into water bodies and on modern technology for managing w…
Zigbee, standardized as IEEE 802.15.4, is an advisable way to build a wireless personal area network (WPAN), which demands the low power consumption that the Zigbee technique can provide. Our paper measures the efficiency of Zigbee, involving the Physical Layer (PHY) and the Media Access Control (MAC) sub-layer, which allow simple interaction between the sensors. We model and simulate two different scenarios. In the first, we tested the topological characteristics and performance of the IEEE 802.15.4 standard in terms of throughput, node-to-node delay, and number of routers for three network layouts (star, mesh, and cluster tree) using the OPNET simulator. The second scenario investigates the self-healing feature on a mesh …
The objective of all planning research is to plan for human comfort and safety, and one of the most significant natural dangers to which humans are exposed is earthquake risk; therefore, earthquake risks must be anticipated, and with the advancement of global technology it is possible to obtain information on earthquake hazards. GIS has been utilized extensively in environmental assessment research due to its high potential, and it is a crucial application in seismic risk assessment. This paper examines the methodologies used in recent GIS-based seismic risk studies, their primary environmental impacts on urban areas, and the complexity of the relationship between the applied methodological approaches and the resulting env…
Khalid Sh. Sharhan, Naseer Shukur Hussein, International Journal of Development in Social Science and Humanities, 2021.