Project delays are among the most persistent challenges facing the construction sector, owing to the industry's complexity and the interdependence of its underlying delay risks. Machine learning offers a well-suited set of techniques for attacking such complex systems. This study aimed to develop an efficient predictive tool that examines and learns from delay sources in historical construction-project data using decision tree and naïve Bayesian classification algorithms. An intensive review of the available data was conducted to identify the real causes of construction project delays. The results show that postponement of interim payments is the leading delay factor attributable to the employer, while the least influential factor, attributable to the contractor (the second party to the contract), is abandonment of the job site: repeated, unjustified stoppage of work on site without permission or notice to the client's representatives. The developed model was applied to about 97 projects and used as a prediction model; the decision tree model achieved the higher prediction accuracy.
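As an illustration of the kind of comparison the study describes, here is a minimal scikit-learn sketch that trains and scores both classifiers on synthetic data; the features, labels, and split are hypothetical placeholders, not the paper's actual project dataset.

```python
# Minimal sketch of the two classifiers the study compares (decision tree vs.
# naive Bayes). The feature matrix and labels are synthetic stand-ins for the
# paper's project data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.random((97, 4))        # e.g. payment delay, site stoppages, ... (illustrative)
y = rng.integers(0, 2, 97)     # 1 = project delayed, 0 = on time (illustrative)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

for name, model in [("decision tree", DecisionTreeClassifier(max_depth=4)),
                    ("naive Bayes", GaussianNB())]:
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"{name}: accuracy = {acc:.2f}")
```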
An oil spill is a leakage from pipelines, vessels, oil rigs, or tankers that releases petroleum products into the marine environment or onto land; it can occur naturally or through human action, and it causes severe damage and financial loss. Satellite imagery is one of the most powerful tools currently used to capture vital information about the Earth's surface, but the complexity and sheer volume of the data make it challenging and time-consuming for humans to process. With the advancement of deep learning techniques, these processes can now be automated to extract vital information from real-time satellite images. This paper applied three deep-learning algorithms to satellite image classification…
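For context, a minimal sketch of a convolutional classifier of the general kind applied to satellite tiles is shown below; the architecture, input size, and class count are illustrative assumptions, not the three models the paper evaluates.

```python
# Minimal sketch of a small CNN image classifier; shapes and layers are
# illustrative, not the paper's architectures.
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    def __init__(self, num_classes=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)

    def forward(self, x):
        x = self.features(x)              # (N, 32, 16, 16) for 64x64 input
        return self.classifier(x.flatten(1))

model = SmallCNN()
logits = model(torch.randn(4, 3, 64, 64))  # batch of 4 fake 64x64 RGB tiles
print(logits.shape)                         # torch.Size([4, 3])
```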
Assessing the accuracy of classification algorithms is paramount, as it provides insight into their reliability and effectiveness in solving real-world problems. Accuracy assessment is essential in any remote sensing-based classification exercise, since classification maps invariably contain misclassified pixels and classification errors. In this study, two satellite images of Duhok province, Iraq, were captured at regular intervals and analyzed with spatial analysis tools to produce supervised classifications. Several processes, such as smoothing, were applied to enhance the categorization. The classification results indicate that Duhok province divides into four classes: vegetation cover, buildings, …
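A minimal sketch of the accuracy-assessment step described here, comparing classified pixels against reference labels via a confusion matrix; the sample labels are synthetic, and the last two class names are assumed since the abstract is truncated.

```python
# Minimal sketch of accuracy assessment for a supervised classification:
# confusion matrix, overall accuracy, and Cohen's kappa over sample pixels.
# The pixel labels are synthetic; the last two class names are assumptions.
import numpy as np
from sklearn.metrics import confusion_matrix, accuracy_score, cohen_kappa_score

classes = ["vegetation", "buildings", "water", "bare soil"]  # last two assumed
rng = np.random.default_rng(1)
reference = rng.integers(0, 4, 500)              # ground-truth sample pixels
predicted = np.where(rng.random(500) < 0.85,     # ~85% agreement, synthetic
                     reference, rng.integers(0, 4, 500))

print(confusion_matrix(reference, predicted))
print("overall accuracy:", accuracy_score(reference, predicted))
print("kappa:", cohen_kappa_score(reference, predicted))
```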
With the rapid development of computers and network technologies, the security of information on the internet is increasingly compromised, and many threats can affect its integrity. Much research has focused on providing solutions to these threats. Machine learning and data mining are widely used in anomaly-detection schemes to decide whether a malicious activity is taking place on a network. In this paper, a hierarchical classification scheme for anomaly-based intrusion detection is proposed, using two levels of feature selection and classification. In the first level, a global feature vector is selected for detecting the basic attack categories (DoS, U2R, R2L, and Probe). In the second level, four local feature vectors…
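A minimal sketch of the two-level idea: a global classifier assigns traffic to a broad category, then a per-category classifier refines the decision on its own feature subset. The feature indices, data, and subtype labels are illustrative placeholders, not the paper's selected features.

```python
# Minimal sketch of hierarchical (two-level) intrusion detection. All feature
# subsets and data here are synthetic placeholders.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(2)
X = rng.random((1000, 20))
y_global = rng.integers(0, 5, 1000)   # 0=normal, 1=DoS, 2=U2R, 3=R2L, 4=Probe
y_local = rng.integers(0, 3, 1000)    # finer attack subtypes (synthetic)

global_features = [0, 1, 2, 3, 4]                                 # assumed global vector
local_features = {c: [5 + c, 6 + c, 7 + c] for c in range(1, 5)}  # per-attack subsets

global_clf = DecisionTreeClassifier().fit(X[:, global_features], y_global)
local_clfs = {c: DecisionTreeClassifier().fit(X[y_global == c][:, f],
                                              y_local[y_global == c])
              for c, f in local_features.items()}

def classify(x):
    c = int(global_clf.predict(x[global_features].reshape(1, -1))[0])
    if c == 0:
        return "normal"
    sub = local_clfs[c].predict(x[local_features[c]].reshape(1, -1))[0]
    return f"attack class {c}, subtype {sub}"

print(classify(X[0]))
```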
The theater has a living environment that resembles, or realistically simulates, the environment of real life on the stage, where place, light, and living beings represent a picture of the life scene. For a long time the theater merely conveyed that image, but as the world developed industrially and technologically, the perception of this picture evolved alongside intellectual progress, with each element acquiring advantages and philosophical aims consistent with the evolution of form. Theatrical lighting, colors, and scenery have become parts of the composition of a new living component in form and content. On this basis, this research is titled…
Recommender systems are tools for making sense of the huge amount of data available in the internet world. Collaborative filtering (CF) is one of the most widely and successfully used knowledge-discovery methods in recommendation systems. Memory-based collaborative filtering uses facts about existing users to predict new items for the target user. Similarity measures are the core operations in collaborative filtering, and prediction accuracy depends largely on the similarity calculations. In this study, a combination of weighted parameters and traditional similarity measures is used to calculate relationships among users over the MovieLens rating matrix, and the advantages and disadvantages of each measure are identified. From the study, a n…
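As one concrete example of combining a weight with a traditional measure, here is a minimal sketch of Pearson similarity with significance weighting; the overlap threshold is a common heuristic, not necessarily the study's parameter.

```python
# Minimal sketch of a weighted user-user similarity for memory-based CF:
# Pearson correlation on co-rated items, scaled down when two users share
# few ratings (significance weighting). The threshold of 50 co-rated items
# is a common heuristic, not necessarily the study's value.
import numpy as np

def weighted_pearson(u, v, threshold=50):
    mask = ~np.isnan(u) & ~np.isnan(v)     # items rated by both users
    n = mask.sum()
    if n < 2:
        return 0.0
    cu, cv = u[mask] - u[mask].mean(), v[mask] - v[mask].mean()
    denom = np.sqrt((cu**2).sum() * (cv**2).sum())
    sim = (cu * cv).sum() / denom if denom else 0.0
    return sim * min(n, threshold) / threshold  # penalize small overlaps

ratings = np.array([[4, np.nan, 3, 5],
                    [5, 2, np.nan, 4]])
print(weighted_pearson(ratings[0], ratings[1]))
```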
There is no doubt that the field of education and teaching, in any country and under any ruling system, is among the most specific and sensitive fields, because it concerns the building of the human being. Since the human being is both the aim and the means at the same time, and is the strategic capital, the way he is raised and educated, the choice of educators, the methods of work, and the aims pursued are all serious matters.
The educational process has an aim determined by society for itself through the establishments working in this field, both official and public. And when society feels that these establishments have failed to achieve its d…
Document clustering is the process of organizing a particular electronic corpus of documents into subgroups with similar text features. A number of conventional algorithms have previously been applied to document clustering, and there are ongoing endeavors to enhance clustering performance by employing evolutionary algorithms, an emerging topic that has gained increasing attention in recent years. The aim of this paper is to present an up-to-date, self-contained review fully devoted to document clustering via evolutionary algorithms. It first provides a comprehensive inspection of the document clustering model, revealing its various components and related concepts. It then presents and analyzes the principal research work…
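A minimal sketch of the core model such a review covers: a genetic algorithm evolving cluster assignments over document vectors, with within-cluster scatter as the (negated) fitness. All operators and parameters here are illustrative choices, not any surveyed paper's settings.

```python
# Minimal sketch of evolutionary document clustering: a genetic algorithm
# over cluster-assignment chromosomes. Document vectors, operators, and
# parameters are all illustrative.
import numpy as np

rng = np.random.default_rng(3)
docs = rng.random((30, 8))        # stand-in for TF-IDF document vectors
K, POP, GENS = 3, 40, 60

def fitness(assign):              # lower within-cluster scatter -> higher fitness
    sse = sum(((docs[assign == k] - docs[assign == k].mean(0))**2).sum()
              for k in range(K) if (assign == k).any())
    return -sse

pop = rng.integers(0, K, (POP, len(docs)))
for _ in range(GENS):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-POP // 2:]]          # truncation selection
    cuts = rng.integers(1, docs.shape[0], POP // 2)
    kids = np.array([np.concatenate([parents[i][:c], parents[-i - 1][c:]])
                     for i, c in enumerate(cuts)])          # one-point crossover
    mut = rng.random(kids.shape) < 0.02                     # point mutation
    kids[mut] = rng.integers(0, K, mut.sum())
    pop = np.vstack([parents, kids])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("best assignment:", best)
```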
The need for means of transmitting data confidentially and securely has become one of the most important subjects in the world of communications. The search therefore began for techniques that achieve not only confidentiality of the information sent, but also high transmission speed and minimal energy consumption; encryption using DNA was developed to fulfill all these requirements [1]. The proposed system aims to achieve strong protection of data sent over the internet through the following objectives: 1. The message is encrypted using one of the DNA methods with a key generated by the Diffie-Hellman Ephemeral algorithm; part of this key is secret, and this makes the pro…
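A toy sketch of the two named ingredients, an ephemeral Diffie-Hellman exchange plus a two-bits-per-nucleotide DNA encoding of an XOR-encrypted message; the small modulus, hash-derived key, and A/C/G/T mapping are illustrative assumptions, not the system's actual scheme.

```python
# Toy sketch: ephemeral Diffie-Hellman key agreement, then a DNA encoding
# (2 bits per nucleotide) of an XOR-encrypted message. The tiny modulus and
# repeating XOR keystream are for illustration only; real DH uses >=2048-bit
# groups and a proper cipher.
import secrets, hashlib

P, G = 0xFFFFFFFB, 5                                # toy prime modulus
a, b = secrets.randbelow(P), secrets.randbelow(P)   # fresh (ephemeral) secrets
shared = pow(pow(G, b, P), a, P)                    # both sides derive the same value
key = hashlib.sha256(str(shared).encode()).digest()

BASES = "ACGT"                                      # 2 bits -> 1 nucleotide

def encrypt_to_dna(msg: bytes) -> str:
    cipher = bytes(m ^ key[i % len(key)] for i, m in enumerate(msg))
    return "".join(BASES[(byte >> s) & 3] for byte in cipher for s in (6, 4, 2, 0))

def decrypt_from_dna(dna: str) -> bytes:
    q = [BASES.index(c) for c in dna]
    cipher = bytes((q[i] << 6) | (q[i + 1] << 4) | (q[i + 2] << 2) | q[i + 3]
                   for i in range(0, len(q), 4))
    return bytes(c ^ key[i % len(key)] for i, c in enumerate(cipher))

dna = encrypt_to_dna(b"hello")
print(dna, decrypt_from_dna(dna))
```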