OpenStreetMap (OSM) is the most prominent example of an online volunteered mapping application. Most such platforms are open-source spatial databases collected by non-expert volunteers using a variety of data collection methods. The OSM project aims to provide a free digital map of the whole world. The heterogeneity of data collection methods makes the accuracy of OSM databases unreliable, so they must be treated with caution in any engineering application. This study aims to assess the horizontal positional accuracy of three spatial data sources for Baghdad city: the OSM road network database, a high-resolution Satellite Image (SI), and a high-resolution Aerial Photo (AP), with respect to an analogue formal road network dataset obtained from the Mayoralty of Baghdad (MB). The methodology of the U.S. National Standard for Spatial Data Accuracy (NSSDA) was applied to measure the degree of agreement between each data source and the formal (MB) dataset in terms of horizontal positional accuracy by computing RMSE and NSSDA values. The study concluded that none of the three data sources agrees with the MB dataset, in either study site (AL-Aadhamiyah and AL-Kadhumiyah), in terms of positional accuracy.
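As background on the reported statistics: under the NSSDA, horizontal accuracy is reported at the 95% confidence level as 1.7308 × RMSE_r, where RMSE_r combines the x and y checkpoint errors (the factor assumes roughly equal error in the two components). A minimal Python sketch of that computation, using purely hypothetical checkpoint coordinates, might look like this:

```python
import numpy as np

def nssda_horizontal(x_test, y_test, x_ref, y_ref):
    """Horizontal RMSE and NSSDA statistic (per FGDC-STD-007.3-1998).

    x_test/y_test: checkpoint coordinates from the tested source (e.g. OSM, SI, AP).
    x_ref/y_ref:   coordinates of the same checkpoints in the reference dataset
                   (here the MB formal road network).
    """
    dx = np.asarray(x_test) - np.asarray(x_ref)
    dy = np.asarray(y_test) - np.asarray(y_ref)
    rmse_r = np.sqrt(np.mean(dx**2 + dy**2))   # combined horizontal RMSE
    nssda = 1.7308 * rmse_r                    # 95% confidence level
    return rmse_r, nssda

# Hypothetical UTM checkpoints (metres), purely for illustration
rmse_r, nssda = nssda_horizontal([435010.2, 435120.8], [3695001.1, 3695230.4],
                                 [435009.5, 435121.6], [3695000.3, 3695231.0])
print(f"RMSE_r = {rmse_r:.2f} m, NSSDA(95%) = {nssda:.2f} m")
```

The NSSDA also recommends using at least 20 well-distributed checkpoints per tested dataset.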
In this research, a study is presented on the effect of several environmental factors on the performance of an already constructed quality inspection system, which was designed using a transfer learning approach based on convolutional neural networks. The system comprises two sets of layers: a set transferred from an already trained model (DenseNet121) and a set of custom classification layers. It was designed to discriminate between damaged and undamaged helical gears according to the configuration of the gear, regardless of its dimensions, and the model showed good performance discriminating between the two products under ideal conditions with high-resolution images.
This study therefore aimed at testing the system's performance under poor s…
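A minimal Keras-style sketch of the kind of architecture described above: a frozen DenseNet121 feature extractor plus a small custom binary classification head. The input resolution, head sizes, and training settings here are illustrative assumptions, not the authors' exact configuration:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Transferred layers: DenseNet121 pre-trained on ImageNet, used as a frozen
# feature extractor (the "transferred layers set" of the abstract).
base = tf.keras.applications.DenseNet121(
    include_top=False, weights="imagenet", input_shape=(224, 224, 3))
base.trainable = False

# Custom classification layers: sizes are assumptions for illustration only.
model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.3),
    layers.Dense(1, activation="sigmoid"),   # damaged vs. undamaged gear
])

model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])
```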
Support vector machine (SVM) is a popular supervised learning algorithm based on margin maximization. It has a high training cost and does not scale well to a large number of data points. We propose a multiresolution algorithm, MRH-SVM, that trains SVM on a hierarchical data aggregation structure, which also serves as a common data input to other learning algorithms. The proposed algorithm learns SVM models using high-level data aggregates and only visits data aggregates at more detailed levels where support vectors reside. In addition to performance improvements, the algorithm has advantages such as the ability to handle data streams and datasets with imbalanced classes. Experimental results show significant performance improvements in compa…
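The abstract does not give the algorithm's details; as a rough illustration of the general idea (fit on coarse aggregates first, then descend only into the aggregates that produced support vectors), a heavily simplified two-level sketch in Python with scikit-learn might look like the following. This is an assumed simplification, not the paper's MRH-SVM:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

def two_level_svm(X, y, n_aggregates=50):
    """Simplified two-level illustration: fit an SVM on data aggregates, then
    refit using the raw points of only those aggregates whose centroids ended
    up as support vectors.  y must hold integer class labels."""
    # Level 1: aggregate the data (cluster centroids + majority label per cluster).
    km = KMeans(n_clusters=n_aggregates, n_init=10).fit(X)
    centroids = km.cluster_centers_
    centroid_labels = np.array(
        [np.bincount(y[km.labels_ == c]).argmax() for c in range(n_aggregates)])
    coarse = SVC(kernel="rbf").fit(centroids, centroid_labels)

    # Level 2: expand only the aggregates that contain support vectors.
    sv_clusters = coarse.support_        # indices of SV centroids = cluster ids
    mask = np.isin(km.labels_, sv_clusters)
    return SVC(kernel="rbf").fit(X[mask], y[mask])
```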
Hartha Formation is an overburden horizon in the X-oilfield that generates a lot of Non-Productive Time (NPT) associated with drilling mud losses. This study was conducted to investigate the loss events in this formation and to provide geological interpretations based on datasets from nine wells in the field of interest. The interpretation was based on different analyses, including wireline logs, cuttings descriptions, image logs, and analog data. Seismic and coherency data were also used to formulate the geological interpretations and to calibrate them against the loss events of the Hartha Fm.
The results revealed that the upper part of the Hartha Fm. was identified as an interval capable of creating potentia…
This research relies on the relationship between the reflected spectrum, the nature of each target, its area, and the percentage of its presence among the other targets within the target area. Changes in land cover have been detected for different years using satellite images based on Modified Spectral Angle Mapper (MSAM) processing, where Landsat satellite images are processed using two software packages (MATLAB 7.11 and ERDAS IMAGINE 2014). The proposed supervised classification method (MSAM), implemented in MATLAB, together with a supervised Maximum Likelihood Classifier in ERDAS IMAGINE, has been used to obtain the most precise results and to detect environmental changes between periods. Despite using two classificatio…
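For context, the classical Spectral Angle Mapper assigns each pixel to the class whose reference spectrum makes the smallest angle α = arccos(t·r / (‖t‖‖r‖)) with the pixel spectrum; the modified variant (MSAM) used in the paper is not reproduced here. A minimal Python sketch of the classical rule:

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Classical SAM angle (radians) between a pixel spectrum and a reference
    spectrum; a smaller angle means greater spectral similarity."""
    cos_a = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos_a, -1.0, 1.0))

def sam_classify(image, references):
    """image: (rows, cols, bands); references: (n_classes, bands).
    Returns the per-pixel index of the class with the minimum spectral angle."""
    rows, cols, bands = image.shape
    pixels = image.reshape(-1, bands)
    norms = np.linalg.norm(pixels, axis=1) + 1e-12
    angles = np.stack(
        [np.arccos(np.clip(pixels @ ref / (norms * np.linalg.norm(ref)), -1.0, 1.0))
         for ref in references], axis=1)
    return angles.argmin(axis=1).reshape(rows, cols)
```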
Crime is an unlawful activity of any kind and is punished by law. Crime affects a society's quality of life and economic development. With a large rise in crime globally, there is a need to analyze crime data in order to bring down the crime rate. This helps the police and the public to take the required measures and restrict crime more effectively. The purpose of this research is to develop predictive models that can aid in crime pattern analysis and thus support the Boston department's crime prevention efforts. The geographical location factor has been adopted in our model because it is an influential factor in several situations, whether travelling to a specific area or livin…
Each sport has its own energy requirements that differ from those of other sports, and a different training method is used in each, so the trainer must first rely on the principle of specificity in training, that is, specificity according to the working energy system: identifying the energy system that controls the event and how the muscles use the available energy to perform according to the energy-production systems. The serve is the first volleyball skill with which a team starts the match in order to gain points directly; it became clear that there is a weakness in skill performance, especially the serve, which is the key to victory for volle…
Twitter data analysis is an emerging field of research that uses data collected from Twitter to address many issues such as disaster response, sentiment analysis, and demographic studies. The success of data analysis relies on collecting accurate data that are representative of the studied group or phenomenon in order to get the best results. Various Twitter analysis applications rely on collecting the locations of the users sending the tweets, but this information is not always available. There have been several attempts at estimating location-based aspects of a tweet; however, there is a lack of work investigating data collection methods that focus on location. In this paper, we investigate the two methods for obtaining location-based dat…
In this paper, certain types of regularity of topological spaces are highlighted; these fall within the study of generalizations of separation axioms. One of the important separation axioms is what is called regularity, and the spaces that have this property are not few, the most important of them being the Euclidean spaces. Restricting this important concept to general topology therefore places it within a narrow framework, which motivates the use of generalized open sets to obtain further good characteristics and to preserve the properties achieved in general topology. Perhaps the reader will realize through the research that our generalization preserves most of these characteristics, the most important of which is the hereditary property. Two t…
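For reference, the classical regularity axiom that such work generalizes can be stated as follows (a standard textbook formulation, not the paper's generalized-open-set variant):

```latex
% Classical regularity (together with T_1 this yields a T_3 space).
\begin{definition}
A topological space $(X,\tau)$ is \emph{regular} if for every closed set
$F \subseteq X$ and every point $x \in X \setminus F$ there exist disjoint
open sets $U, V \in \tau$ such that $x \in U$ and $F \subseteq V$.
\end{definition}
```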
A fast-moving infrared excess source (G2), which is widely interpreted as a core-less gas and dust cloud, approaches Sagittarius A* (Sgr A*) on a presumably elliptical orbit. VLT…