OpenStreetMap (OSM), recognised for its current and readily accessible spatial database, frequently serves regions lacking precise data at the necessary granularity. Global collaboration among OSM contributors poses challenges to data quality and uniformity, exacerbated by the sheer volume of input and indistinct data annotation protocols. This study presents a methodological improvement in the spatial accuracy of OSM datasets centred on Baghdad, Iraq, utilising data derived from OSM services and satellite imagery. The analysis focuses on two geometric correction methods: a two-dimensional polynomial affine transformation, which involves twelve adjustment coefficients, and a two-dimensional polynomial conformal transformation, which involves six. Analysis within the selected region exposed variances in positional accuracy, with distinctions evident between the Easting (E) and Northing (N) coordinates. Empirical results indicated that the conformal transformation method reduced the total Root Mean Square Error (RMSE) of the amended OSM data by 4.434 meters, while the affine transformation method yielded a total RMSE reduction of 4.053 meters. The deployment of these proposed techniques substantiates a marked enhancement in the geometric fidelity of OSM data. The refined datasets have significant applications, extending to the representation of roadmaps, the analysis of traffic flow, and the facilitation of urban planning initiatives.
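The kind of correction described above can be sketched as a least-squares fit over ground control points. The snippet below is a generic illustration of a six-coefficient 2D affine fit; the coordinates are invented for demonstration and are not the study's dataset or its exact procedure:

```python
import numpy as np

# Hypothetical control points, invented for illustration: OSM coordinates
# (x, y) and matching satellite-derived reference coordinates (E, N).
osm = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
ref = np.array([[2.0, 3.0], [102.5, 3.4], [1.6, 103.1], [103.0, 104.2]])

def fit_affine(src, dst):
    """Least-squares fit of a six-coefficient 2D affine transform:
    E = a0 + a1*x + a2*y,  N = b0 + b1*x + b2*y."""
    A = np.column_stack([np.ones(len(src)), src[:, 0], src[:, 1]])
    coef_e, *_ = np.linalg.lstsq(A, dst[:, 0], rcond=None)
    coef_n, *_ = np.linalg.lstsq(A, dst[:, 1], rcond=None)
    return coef_e, coef_n

def apply_affine(coefs, pts):
    coef_e, coef_n = coefs
    A = np.column_stack([np.ones(len(pts)), pts[:, 0], pts[:, 1]])
    return np.column_stack([A @ coef_e, A @ coef_n])

corrected = apply_affine(fit_affine(osm, ref), osm)
# Total RMSE of the corrected coordinates against the reference points.
rmse = np.sqrt(np.mean(np.sum((corrected - ref) ** 2, axis=1)))
```

A conformal (similarity) variant would constrain the coefficients to preserve angles, which is why it needs fewer parameters than the affine form.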
This paper is devoted to investigating the effect of an internal curing technique on the properties of self-compacting concrete (SCC). In this study, SCC is produced using silica fume (SF) as a partial replacement of cement by weight at 5%, while sand is partially replaced by volume with saturated fine lightweight aggregate (LWA), namely thermostone chips, as the internal curing material at three percentages (5%, 10% and 15%), under two external curing conditions: water and air. The experimental work was divided into three parts: in the first part, the workability tests of fresh SCC were conducted. The second part included conducting the compressive strength test and the modulus of rupture test at ages of 7, 28 and 90 days. The third part i
The resort to the eloquence of the poetic image as a style reveals the poet's creativity in dealing with external influences and in reflecting them through emotional images that express an intense emotional imagination. This imagination stems from a genuine poetic experience, tasted by the recipient before the creator of the poetic text.
Motif templates are the input for many bioinformatics systems, such as codon finding, transcription, translation, sequential pattern mining, and bioinformatics database analysis. The size of a motif ranges from one base up to several megabases; therefore, typing errors increase with the size of the motif. In addition, when structured motifs are submitted to bioinformatics systems, the specifications of the motif components are required, i.e. the simple motifs, the gaps, and the lower bound and upper bound of each gap. The motifs can be DNA, RNA, or protein. In this research, a motif parser and visualization module is designed, depending on a proposed context-free grammar (CFG) and a color-based human recognition system. The CFG describes the m
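A parser of this kind can be sketched with a small grammar for structured motifs. The notation below (runs of bases for simple motifs, `[lo,hi]` for a gap with its lower and upper bound) is an assumption for illustration, not the paper's actual grammar:

```python
import re

# Assumed motif notation: simple motifs are runs of bases, and a gap is
# written "[lo,hi]", e.g. "ACGT[2,5]TTGA". This is illustrative only.
TOKEN = re.compile(r"(?P<motif>[ACGTU]+)|\[(?P<lo>\d+),(?P<hi>\d+)\]")

def parse_motif(text):
    """Parse a structured motif string into (kind, value) components."""
    parts, pos = [], 0
    while pos < len(text):
        m = TOKEN.match(text, pos)
        if not m:
            raise ValueError(f"motif syntax error at position {pos}")
        if m.group("motif"):
            parts.append(("motif", m.group("motif")))
        else:
            lo, hi = int(m.group("lo")), int(m.group("hi"))
            if lo > hi:
                raise ValueError("gap lower bound exceeds upper bound")
            parts.append(("gap", (lo, hi)))
        pos = m.end()
    return parts
```

For example, `parse_motif("ACGT[2,5]TTGA")` yields a simple motif, a bounded gap, and a second simple motif, and malformed input is reported with its position, which addresses the typing-error problem the abstract mentions.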
Generally, radiologists analyse Magnetic Resonance Imaging (MRI) by visual inspection to detect and identify the presence of tumours or abnormal tissue in brain MR images. The huge number of such MR images makes this visual interpretation process not only laborious and expensive but often erroneous. Furthermore, the sensitivity of the human eye and brain in elucidating such images decreases as the number of cases increases, especially when only some slices contain information about the affected area. Therefore, an automated system for the analysis and classification of MR images is mandatory. In this paper, we propose a new method for abnormality detection from T1-weighted MRI of human head scans using three planes, including the axial plane, co
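Working with the three anatomical planes amounts to slicing a 3D volume along its three axes. The sketch below uses a toy numpy array and assumes axis 0 is axial, axis 1 coronal, and axis 2 sagittal; a real T1-weighted volume would instead be loaded from DICOM or NIfTI files:

```python
import numpy as np

# Toy 3D "scan" volume standing in for a loaded T1-weighted dataset.
volume = np.arange(4 * 5 * 6).reshape(4, 5, 6)  # (slices, rows, cols)

def extract_planes(vol, i, j, k):
    """Return the axial, coronal and sagittal slices through voxel (i, j, k),
    assuming axis 0 = axial, axis 1 = coronal, axis 2 = sagittal."""
    axial    = vol[i, :, :]   # fix the slice index
    coronal  = vol[:, j, :]   # fix the row index
    sagittal = vol[:, :, k]   # fix the column index
    return axial, coronal, sagittal

ax, co, sa = extract_planes(volume, 2, 1, 3)
```

Each returned slice is a 2D image that can then be fed to whatever per-plane feature extraction or classifier the pipeline uses.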
This paper introduces a non-conventional approach with multi-dimensional random sampling to solve a cocaine abuse model with statistical probability. The mean Latin hypercube finite difference (MLHFD) method is proposed for the first time via hybrid integration of the classical numerical finite difference (FD) formula with the Latin hypercube sampling (LHS) technique, to create a random distribution for the model parameters, which are dependent on time t. The LHS technique gives the MLHFD method the advantage of producing fast variation of the parameter values via a number of multidimensional simulations (100, 1000 and 5000). The generated Latin hypercube sample, which is random or non-deterministic in nature, is further integ
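The LHS component can be sketched generically: each dimension is cut into equal strata, one point is drawn per stratum, and the strata are shuffled independently per dimension. This is a plain Latin hypercube sampler, not the MLHFD method itself:

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, seed=None):
    """Generic Latin hypercube sampler on [0, 1)^d: every dimension is
    stratified into n_samples equal intervals, with exactly one draw per
    interval, shuffled independently across dimensions."""
    rng = np.random.default_rng(seed)
    # One uniform draw inside each of the n_samples strata, per dimension.
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_dims))) / n_samples
    for d in range(n_dims):
        u[:, d] = rng.permutation(u[:, d])  # decouple the dimensions
    return u

samples = latin_hypercube(100, 3, seed=0)
```

Unlike plain Monte Carlo draws, every one of the 100 strata in every dimension is guaranteed to be covered, which is what makes the small simulation counts (100, 1000, 5000) effective.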
Big data of different types, such as texts and images, is rapidly generated from the internet and other applications. Dealing with this data using traditional methods is not practical, since it comes in various sizes and types and with different processing speed requirements. Data analytics has therefore become an important tool, because only meaningful information is analyzed and extracted, which makes it essential for big data applications to analyze and extract useful information. This paper presents several innovative methods that use data analytics techniques to improve the analysis process and data management. Furthermore, this paper discusses how the revolution of data analytics based on artificial intelligence algorithms might provide
Reliability analysis methods are used to evaluate the safety of reinforced concrete structures by evaluating the limit state function 𝑔(𝑋𝑖). For implicit limit state functions and nonlinear analysis, advanced reliability analysis methods are needed. Monte Carlo simulation (MCS) can be used in this case; however, as the number of input variables increases, the time required for MCS also increases, making it a time-consuming method, especially for complex problems with implicit performance functions. In such cases, MCS-based FORM (First Order Reliability Method) and Artificial Neural Network-based FORM (ANN-FORM) have been proposed as alternatives. However, it is important to note that both MCS-FORM and ANN-FORM can also be time-consuming
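The plain MCS step can be illustrated on a toy explicit limit state (the paper's functions are implicit, so this is only a stand-in): with g = R - S, failure occurs when g < 0, and the failure probability is estimated as the fraction of failed samples. The distributions and their parameters below are invented for illustration:

```python
import numpy as np

# Toy explicit limit state, not the paper's implicit one: g = R - S with
# resistance R ~ N(200, 20) and load effect S ~ N(150, 25); failure if g < 0.
rng = np.random.default_rng(42)
n = 200_000
R = rng.normal(200.0, 20.0, n)
S = rng.normal(150.0, 25.0, n)
g = R - S

pf = np.mean(g < 0)          # crude MCS estimate of the failure probability
beta = g.mean() / g.std()    # simple reliability index for a normal-like g
```

The cost problem the abstract describes is visible here: halving the standard error of `pf` requires four times as many samples, and each sample of an implicit g(X) would mean one full nonlinear structural analysis.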
In light of the increasing demand for energy consumption due to the complexity of modern life and its requirements, which is reflected in architecture in type and size, environmental challenges have emerged in the need to reduce emissions and power consumption within the construction sector. This has urged designers to improve the environmental performance of buildings by adopting new design approaches and investing in digital technology to facilitate design decision-making in less time, effort and cost. This does not stop at the limits of acceptable efficiency, but extends to the level of the highest performance, which is not provided by the traditional approaches adopted by researchers and local institutions in their studies and architectural practices, limit
A database is characterized as an arrangement of data that is organized and distributed in a way that allows the client to access the stored data in a simple and more convenient manner. However, in the era of big data, traditional methods of data analytics may not be able to manage and process such large amounts of data. In order to develop an efficient way of handling big data, this work studies the use of the Map-Reduce technique to handle big data distributed on the cloud. This approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear enhancement for managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r
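The Map-Reduce idea can be sketched as a single-process functional analogue of what Hadoop distributes across nodes: a map phase computes partial results per data shard, and a reduce phase merges them. The labels below stand in for EEG records and are purely illustrative:

```python
from collections import Counter
from functools import reduce

# Toy partitioned dataset standing in for EEG records spread across nodes;
# the band labels are illustrative only.
partitions = [
    ["alpha", "beta", "alpha"],
    ["beta", "delta"],
    ["alpha", "delta", "delta"],
]

def map_phase(partition):
    """Map step: each node counts labels in its own shard."""
    return Counter(partition)

def reduce_phase(a, b):
    """Reduce step: merge two partial counts into one."""
    return a + b

partial = [map_phase(p) for p in partitions]   # runs in parallel on Hadoop
totals = reduce(reduce_phase, partial, Counter())
```

The response-time gain reported above comes from the map phase running concurrently on the nodes holding each shard, so only the small partial results travel over the network.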
In the present work, different remote sensing techniques have been used to analyse remote sensing data spectrally using ENVI software. The majority of algorithms used in spectral processing can be organized as target detection, change detection and classification. In this paper, several methods of target detection have been studied, such as the matched filter and constrained energy minimization.
Water body mapping has been obtained, and the results showed changes in the study area over the period 1995-2000. The results obtained from applying constrained energy minimization were also more accurate than those of the other method when compared with the real situation.
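Constrained energy minimization admits a compact sketch: the filter weights minimise the average output energy over the image subject to a unit response on the target spectrum d, giving w = R⁻¹d / (dᵀR⁻¹d) with R the sample correlation matrix. The data cube and target signature below are synthetic, not the study's imagery:

```python
import numpy as np

# Synthetic band-by-pixel data cube (bands x pixels); the target spectrum d
# is an assumed signature, purely illustrative.
rng = np.random.default_rng(1)
bands, pixels = 5, 400
X = rng.random((bands, pixels))
d = np.array([0.9, 0.1, 0.8, 0.2, 0.7])
X[:, 0] = d                       # plant one target pixel for the demo

def cem_filter(X, d):
    """Constrained energy minimization: minimise the mean output energy
    subject to a unit response on the target signature d."""
    R = X @ X.T / X.shape[1]      # sample correlation matrix
    Rinv_d = np.linalg.solve(R, d)
    w = Rinv_d / (d @ Rinv_d)     # w = R^-1 d / (d^T R^-1 d)
    return w @ X                  # detector output per pixel

scores = cem_filter(X, d)
```

Pixels matching the target spectrum score near 1 while the background is suppressed, which is the behaviour that made CEM the stronger detector in the comparison above.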