Project delays are among the most persistent challenges facing the construction industry, owing to the sector's complexity and the interdependence of its underlying delay-risk factors. Machine learning offers a suitable set of techniques for tackling such complex systems. This study aimed to develop a well-organized predictive tool that analyzes and learns delay causes from historical construction project data using decision tree and naïve Bayesian classification algorithms. An intensive review of the available data was conducted to identify the real causes of construction project delays. The results show that delayed interim payments rank first among the delay factors attributable to the employer, while the least influential factor, attributable to the contractor as the second party to the contract, is the repeated, unjustified stoppage of work at the site without permission or notice to the client's representatives. The developed model was applied to about 97 projects and used as a prediction model; the decision tree model achieved the higher prediction accuracy.
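As an illustration of the modeling step described above, here is a minimal sketch (not the authors' code) of training the two named classifiers on a hypothetical table of delay-factor records; the file name delay_projects.csv, the column names, and the binary target "delayed" are assumptions made for the example.

```python
# Hedged sketch: decision tree vs. naive Bayes on hypothetical delay-factor data.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score

df = pd.read_csv("delay_projects.csv")          # one row per project (hypothetical file)
X = df.drop(columns=["delayed"])                # delay-factor scores / project attributes
y = df["delayed"]                               # 1 = delayed, 0 = on time

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, clf in [("decision tree", DecisionTreeClassifier(max_depth=5, random_state=0)),
                  ("naive Bayes", GaussianNB())]:
    clf.fit(X_tr, y_tr)
    print(name, "accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```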
This paper deals with Bayesian estimation of the parameters of the Gamma distribution under a generalized weighted loss function, based on Gamma and Exponential priors for the shape and scale parameters, respectively. Moment and maximum likelihood estimators, together with Lindley's approximation, are used in the Bayesian estimation. The estimators are compared in terms of their mean squared errors (MSEs) by means of a Monte Carlo simulation.
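For reference, a minimal sketch of the sampling model and the two priors named in the abstract, assuming the usual shape/scale parameterization of the Gamma distribution; the hyperparameters a, b, and λ are generic placeholders, and the generalized weighted loss function is as defined in the paper.

```latex
\[
  f(x \mid \alpha, \beta) = \frac{x^{\alpha - 1} e^{-x/\beta}}{\beta^{\alpha}\,\Gamma(\alpha)},
  \qquad x > 0,\ \alpha, \beta > 0,
\]
\[
  \pi(\alpha) \propto \alpha^{a - 1} e^{-b\alpha} \ \text{(Gamma prior on the shape)},
  \qquad
  \pi(\beta) \propto e^{-\lambda \beta} \ \text{(Exponential prior on the scale)}.
\]
```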
Abstract
For sparse system identification, recently suggested algorithms are the ℓ0-norm Least Mean Square (ℓ0-LMS), Zero-Attracting LMS (ZA-LMS), Reweighted Zero-Attracting LMS (RZA-LMS), and p-norm LMS (p-LMS) algorithms, which modify the cost function of the conventional LMS algorithm by adding a coefficient-sparsity constraint. Accordingly, the proposed algorithms are named p-ZA-LMS, …
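For reference, a minimal NumPy sketch of the zero-attracting LMS recursion mentioned above (the baseline ZA-LMS, not the paper's proposed variant); the filter length, step size mu, and zero-attraction weight rho are illustrative choices.

```python
# Hedged sketch: baseline ZA-LMS identifying a sparse FIR system from toy data.
import numpy as np

def za_lms(x, d, num_taps=16, mu=0.01, rho=5e-4):
    """Estimate a sparse FIR system from input x and desired output d."""
    w = np.zeros(num_taps)
    for n in range(num_taps, len(x)):
        u = x[n - num_taps + 1:n + 1][::-1]   # [x[n], x[n-1], ..., x[n-num_taps+1]]
        e = d[n] - w @ u                      # a-priori estimation error
        w += mu * e * u - rho * np.sign(w)    # LMS step plus zero-attracting (l1) term
    return w

# Toy usage: a sparse 16-tap system driven by white noise.
rng = np.random.default_rng(0)
h = np.zeros(16)
h[[2, 9]] = [1.0, -0.5]                       # sparse "true" system
x = rng.standard_normal(5000)
d = np.convolve(x, h)[:len(x)] + 0.01 * rng.standard_normal(len(x))
print(np.round(za_lms(x, d), 2))
```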
The purpose of this paper is to solve the unbalanced transportation problem with stochastic demand using heuristic algorithms, minimizing the cost of transporting gasoline for the Oil Products Distribution Company of the Iraqi Ministry of Oil. The most important conclusion is that the results confirm that the random transportation problem with uncertain demand can be solved with a stochastic programming model. The most obvious finding to emerge from this work is that the genetic algorithm was able to handle the unbalanced transportation problem, and that the model can be applied by the Oil Products Distribution Company in the Iraqi Ministry of Oil to m…
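As a small illustration of balancing and solving such a transportation problem, the sketch below uses made-up costs, supplies, and demands, and a linear-programming solver rather than the paper's genetic algorithm or its company data.

```python
# Hedged sketch: an unbalanced transportation problem, balanced with a dummy
# destination and solved as a linear program (illustrative numbers only).
import numpy as np
from scipy.optimize import linprog

cost = np.array([[4.0, 6.0, 9.0],
                 [5.0, 3.0, 7.0]])          # 2 depots x 3 stations (made-up costs)
supply = np.array([60.0, 40.0])
demand = np.array([30.0, 25.0, 20.0])       # total demand < total supply: unbalanced

# Balance the problem with a zero-cost dummy station that absorbs the surplus.
cost = np.hstack([cost, np.zeros((2, 1))])
demand = np.append(demand, supply.sum() - demand.sum())

m, n = cost.shape
A_eq, b_eq = [], []
for i in range(m):                          # each depot ships exactly its supply
    row = np.zeros(m * n)
    row[i * n:(i + 1) * n] = 1
    A_eq.append(row)
    b_eq.append(supply[i])
for j in range(n):                          # each station receives exactly its demand
    col = np.zeros(m * n)
    col[j::n] = 1
    A_eq.append(col)
    b_eq.append(demand[j])

res = linprog(cost.ravel(), A_eq=np.array(A_eq), b_eq=b_eq, bounds=(0, None))
print(res.x.reshape(m, n))                  # optimal shipment plan
print(res.fun)                              # minimum total cost
```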
Today, with the increasing use of social media, many researchers have become interested in extracting topics from Twitter. Tweets are short, unstructured, and messy text, which makes it difficult to find topics in them. Topic modeling algorithms such as Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA) were originally designed to derive topics from long documents such as articles and books, and they are often less effective when applied to short-text content like Twitter. Fortunately, Twitter has many features that represent the interaction between users, and tweets carry rich user-generated hashtags as keywords. In this paper, we exploit the hashtag feature to improve the topics learned.
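A minimal sketch of the hashtag-pooling idea described above (illustrative, not the paper's pipeline): tweets that share a hashtag are concatenated into one pseudo-document before fitting LDA, so the model works on longer texts than individual tweets.

```python
# Hedged sketch: pool tweets by hashtag, then fit LDA on the pooled documents.
from collections import defaultdict
import re
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

tweets = [
    "new #python release looks great",
    "loving the speed of #python 3.12",
    "#football final tonight, huge match",
    "what a goal in the #football game",
]

pools = defaultdict(list)                      # hashtag -> pooled tweet texts
for t in tweets:
    for tag in re.findall(r"#(\w+)", t) or ["_no_tag"]:
        pools[tag].append(t)
docs = [" ".join(ts) for ts in pools.values()]

X = CountVectorizer(stop_words="english").fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
print(lda.components_.shape)                   # (topics, vocabulary terms)
```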
Existing literature suggests that construction worker safety could be improved using emerging technologies. However, the application of safety technologies in the construction industry is limited. One reason for this constrained adoption is the lack of empirical information for mitigating the risk of a failed adoption. The purpose of this paper is to fill this research gap by identifying key factors that predict the successful adoption of safety technologies.
In total, 26 key technology adoption predictors
Lung cancer is one of the most serious and prevalent diseases, causing many deaths each year. Although CT scan images are most commonly used in the diagnosis of cancer, assessing the scans is an error-prone and time-consuming task. Machine learning and AI-based models can identify and classify types of lung cancer quite accurately, which aids early-stage detection and can increase the survival rate. In this paper, a Convolutional Neural Network (CNN) is used to classify adenocarcinoma, squamous cell carcinoma, and normal-case CT scan images from the Chest CT Scan Images Dataset, using different combinations of hidden layers and parameters in the CNN models. The proposed model was trained on 1000 CT scan images of cancerous and non-cancerous cases.
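As a rough illustration of the kind of CNN described above, here is a minimal Keras sketch for the three classes (adenocarcinoma, squamous cell carcinoma, normal); the layer sizes, input resolution, and directory layout are assumptions, not the paper's configuration.

```python
# Hedged sketch: a small 3-class CNN for CT slices (illustrative architecture).
import tensorflow as tf
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(128, 128, 1)),           # grayscale CT slices (assumed size)
    layers.Rescaling(1.0 / 255),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.3),
    layers.Dense(3, activation="softmax"),       # adenocarcinoma / squamous / normal
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()

# Hypothetical training data layout: ct_scans/train/<class_name>/*.png
# train_ds = tf.keras.utils.image_dataset_from_directory(
#     "ct_scans/train", image_size=(128, 128), color_mode="grayscale")
# model.fit(train_ds, epochs=10)
```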
Abstract:
Objectives: The present study aims to evaluate the effectiveness of an educational program on nurses' knowledge of the early prediction of intensive care unit-acquired weakness.
Methodology: A pre-experimental study design (a comparison of two groups) was used, applying a pre- and post-test to the study sample through an educational program in the intensive care unit of Al-Zahra Teaching Hospital in Kut city, Wasit Governorate. The study was conducted from 28 April 2022 to 15 August 2022 using a purposive (non-probability) sample. The study sample consisted of (52) nurses.
