OpenStreetMap (OSM), recognised for its current and readily accessible spatial database, frequently serves regions lacking precise data at the necessary granularity. Global collaboration among OSM contributors presents challenges to data quality and uniformity, exacerbated by the sheer volume of input and indistinct data annotation protocols. This study presents a methodological improvement in the spatial accuracy of OSM datasets centred on Baghdad, Iraq, utilising data derived from OSM services and satellite imagery. An analytical focus was placed on two geometric correction methods: a two-dimensional polynomial affine transformation and a two-dimensional polynomial conformal transformation. The former involves twelve coefficients for adjustment, while the latter encompasses six. Analysis within the selected region exposed variances in positional accuracy, with distinctions evident between Easting (E) and Northing (N) coordinates. Empirical results indicated that the conformal transformation method reduced the Root Mean Square Error (RMSE) of the amended OSM data by 4.434 metres. By contrast, the affine transformation method reduced the total RMSE by 4.053 metres. The deployment of these proposed techniques substantiates a marked enhancement in the geometric fidelity of OSM data. The refined datasets have significant applications, extending to the representation of roadmaps, the analysis of traffic flow, and the facilitation of urban planning initiatives.
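The general workflow can be sketched as follows. This is a minimal illustration of a first-order affine correction fitted by least squares, not the paper's higher-order polynomial variants; the control-point coordinates are invented for demonstration.

```python
import numpy as np

# Hypothetical control points: the same features in OSM and in a
# satellite-derived reference, as (E, N) coordinates in metres.
osm = np.array([[435010.2, 3688120.5], [435550.8, 3688900.1],
                [436200.4, 3687750.9], [435880.0, 3689310.7]])
ref = np.array([[435014.9, 3688116.1], [435555.1, 3688895.8],
                [436204.7, 3687746.2], [435884.3, 3689306.5]])

# First-order affine: [E', N'] = [E, N, 1] @ C, with C a 3x2 coefficient
# matrix (six coefficients), solved by least squares over control points.
design = np.hstack([osm, np.ones((len(osm), 1))])
coeffs, *_ = np.linalg.lstsq(design, ref, rcond=None)

corrected = design @ coeffs
residuals = corrected - ref
rmse = np.sqrt((residuals ** 2).mean())   # total RMSE over E and N
print(f"total RMSE after affine correction: {rmse:.3f} m")
```

A second-order polynomial version simply extends the design matrix with quadratic terms (E², N², EN), which is how the twelve-coefficient variant arises.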
Motif templates are the input to many bioinformatics systems, such as codon finding, transcription, translation, sequential pattern mining, and bioinformatics database analysis. Motif sizes range from a single base up to several megabases; consequently, typing errors increase with motif size. In addition, when structured motifs are submitted to bioinformatics systems, the specifications of the motif components are required, i.e. the simple motifs, the gaps, and the lower and upper bound of each gap. Motifs can be DNA, RNA, or protein. In this research, a motif parser and visualization module is designed based on a proposed context-free grammar (CFG) and the human colour-recognition system. The CFG describes the m
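The idea of validating a structured motif template against a grammar can be sketched as below. The notation used here (simple motifs separated by `x(lower,upper)` gaps) is an assumption for illustration; the paper's actual CFG is not reproduced.

```python
import re

# Assumed template notation: simple motifs over {A,C,G,T,U} separated by
# gaps written x(lower,upper), e.g. "ACGT-x(2,5)-TTA" means motif ACGT,
# then a gap of 2 to 5 bases, then motif TTA.
TOKEN = re.compile(r"([ACGTU]+)|x\((\d+),(\d+)\)")

def parse_motif(template):
    """Split a template into ('motif', seq) and ('gap', lo, hi) components,
    rejecting anything the grammar does not accept."""
    parts = []
    for piece in template.split("-"):
        m = TOKEN.fullmatch(piece)
        if not m:
            raise ValueError(f"invalid component: {piece!r}")
        if m.group(1):                              # simple motif
            parts.append(("motif", m.group(1)))
        else:                                       # bounded gap
            lo, hi = int(m.group(2)), int(m.group(3))
            if lo > hi:
                raise ValueError("lower bound exceeds upper bound")
            parts.append(("gap", lo, hi))
    return parts

print(parse_motif("ACGT-x(2,5)-TTA"))
```

A visualization layer could then assign each component type a distinct colour, which is the recognition aid the abstract alludes to.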
This paper introduces a non-conventional approach with multi-dimensional random sampling to solve a cocaine abuse model with statistical probability. The mean Latin hypercube finite difference (MLHFD) method is proposed for the first time via hybrid integration of the classical numerical finite difference (FD) formula with the Latin hypercube sampling (LHS) technique to create a random distribution for the model parameters, which are dependent on time. The LHS technique gives the MLHFD method the advantage of producing fast variation of the parameters' values via a number of multidimensional simulations (100, 1000 and 5000). The generated Latin hypercube sample, which is random or non-deterministic in nature, is further integ
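The LHS step the abstract describes can be sketched in a few lines: each dimension is divided into equal-probability strata and exactly one point is drawn per stratum. This is a generic LHS sketch, not the paper's MLHFD implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def latin_hypercube(n_samples, n_dims, rng):
    """Stratified sample: one point per 1/n_samples-wide interval
    in every dimension, with a random permutation across dimensions."""
    jitter = rng.random((n_samples, n_dims))                 # position inside each stratum
    perm = np.argsort(rng.random((n_samples, n_dims)), axis=0)  # random stratum order
    return (perm + jitter) / n_samples                       # values in [0, 1)

sample = latin_hypercube(1000, 3, rng)
# Each column covers [0, 1) with exactly one point per stratum, so even
# 100 simulations spread over the full parameter range.
```

The unit-cube sample would then be rescaled to each model parameter's range before being fed to the FD solver.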
Wireless sensor applications are susceptible to energy constraints. Most of the energy is consumed in communication between wireless nodes. Clustering and data aggregation are the two most widely used strategies for reducing energy usage and increasing the lifetime of wireless sensor networks. In target tracking applications, a large amount of redundant data is produced regularly. Hence, the deployment of effective data aggregation schemes is vital to eliminating data redundancy. This work aims to conduct a comparative study of various research approaches that employ clustering techniques for efficiently aggregating data in target tracking applications, as the selection of an appropriate clustering algorithm may reflect positive results in the data aggregati
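The redundancy-elimination idea can be illustrated with a toy cluster-head aggregation step. This is an assumed generic scheme for illustration, not any specific protocol from the surveyed literature.

```python
from statistics import mean

# Assumed scenario: members of each cluster report near-duplicate readings
# of a tracked target; the cluster head forwards one aggregate per cluster
# instead of relaying every raw packet to the sink.
cluster_readings = {
    "head_A": [24.1, 24.3, 24.2, 24.2],   # four members, redundant data
    "head_B": [23.8, 23.9],
}

aggregates = {head: round(mean(vals), 2) for head, vals in cluster_readings.items()}
# Each cluster of n readings costs one long-range transmission instead of n.
packets_saved = sum(len(vals) - 1 for vals in cluster_readings.values())
print(aggregates, f"packets saved: {packets_saved}")
```

The choice of clustering algorithm determines how well members with correlated readings end up under the same head, which is why it affects aggregation quality.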
Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance. Unfortunately, many applications have too little or inadequate data to train DL frameworks. Usually, manual labeling is needed to provide labeled data, which typically involves human annotators with a vast background of knowledge. This annotation process is costly, time-consuming, and error-prone. Every DL framework is typically fed a significant amount of labeled data from which to learn representations automatically. Ultimately, a larger amount of data generally yields a better DL model, though performance is also application-dependent. This issue is the main barrier for
In light of the increasing demand for energy due to the complexity of modern life and its requirements, which is reflected in architecture in both type and size, environmental challenges have emerged around the need to reduce emissions and power consumption within the construction sector. This has urged designers to improve the environmental performance of buildings by adopting new design approaches and investing in digital technology to facilitate design decision-making in less time, effort and cost. Such approaches do not stop at the limits of acceptable efficiency but extend to the level of (the highest performance), which is not provided by the traditional approaches adopted by researchers and local institutions in their studies and architectural practices, limit
Reliability analysis methods are used to evaluate the safety of reinforced concrete structures by evaluating the limit state function 𝑔(𝑋𝑖). For implicit limit state functions and nonlinear analysis, advanced reliability analysis methods are needed. Monte Carlo simulation (MCS) can be used in this case; however, as the number of input variables increases, the time required for MCS also increases, making it a time-consuming method, especially for complex problems with implicit performance functions. In such cases, MCS-based FORM (First Order Reliability Method) and Artificial Neural Network-based FORM (ANN-FORM) have been proposed as alternatives. However, it is important to note that both MCS-FORM and ANN-FORM can also be time-con
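The basic MCS estimator the abstract builds on can be sketched with an explicit limit state. The distributions and the simple g(R, S) = R − S form below are illustrative assumptions; the paper targets implicit limit state functions, but the estimator is identical.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative limit state: g = R - S (resistance minus load effect);
# failure occurs when g < 0. Distribution parameters are assumed.
n = 100_000
R = rng.normal(30.0, 3.0, n)   # resistance, assumed normal
S = rng.normal(20.0, 4.0, n)   # load effect, assumed normal
g = R - S

pf = np.mean(g < 0)            # MCS estimate of the failure probability
print(f"estimated Pf = {pf:.4f}")
```

Because the estimator's error shrinks only as 1/√n, small failure probabilities need very many samples, which is exactly the cost that motivates the FORM- and ANN-based alternatives discussed here.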
Research Objectives: The research aims to highlight the approach of Imam Al-Qaradawi to contemporary jurisprudence in recent issues of the jurisprudence of minorities, to set out the foundations of the jurisprudence of minorities, and to present some of Imam Al-Qaradawi's practical applications.
Study Methodology: The researcher applied the inductive, analytical and comparative approach by tracking the scientific material related to the subject of the study from the books of Al-Qaradawi in the first place, then by comparing the legal provisions with what had been stated in the four schools of jurisprudence.
Findings: The interest and need of Muslim minorities in non-
Pore pressure is the pressure of the fluid filling the pore space of a formation. When pore pressure is higher than hydrostatic pressure, it is termed abnormal pore pressure, or overpressure. Abnormal pressure leads to many severe problems, such as well kicks and blowouts during drilling; prediction of this pressure is therefore essential to reduce cost and to avoid the drilling problems that occur when it is encountered. The purpose of this paper is the determination of pore pressure in all layers, including the three formations (Yamama, Suliay, and Gotnia), in a deep exploration oil well in the West Qurna field, specifically well no. WQ-15 in the south of Iraq. In this study, a new appro
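The hydrostatic baseline that defines overpressure can be sketched as below. The gradient value and the tolerance threshold are illustrative assumptions, not field data or the paper's method.

```python
# Assumed fresh-water hydrostatic gradient of ~9.8 kPa per metre of depth.
WATER_GRADIENT_MPA_PER_M = 0.0098

def hydrostatic_pressure(depth_m):
    """Hydrostatic (normal) pore pressure in MPa at a given depth."""
    return WATER_GRADIENT_MPA_PER_M * depth_m

def is_overpressured(measured_mpa, depth_m, tolerance=1.05):
    """Flag a measured pore pressure that exceeds the hydrostatic
    baseline by more than an assumed 5 % tolerance."""
    return measured_mpa > tolerance * hydrostatic_pressure(depth_m)

print(hydrostatic_pressure(3000))       # ~29.4 MPa at 3 km depth
print(is_overpressured(45.0, 3000))     # well above hydrostatic
```

In practice the comparison would be run layer by layer against pressures predicted from logs, which is the setting the abstract describes.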
Since the beginning of the last century, the competition for water resources has intensified dramatically, especially between countries that have no agreements in place for the water resources they share. Such is the situation with the Euphrates River, which flows through three countries (Turkey, Syria, and Iraq) and represents the main water resource for them. The comprehensive hydrologic investigation needed to derive optimal operations therefore requires reliable forecasts. This study aims to analyse and create a forecasting model for data generation from the Turkish perspective, using the recorded inflow data of the Ataturk reservoir for the period (Oct. 1961 - Sep. 2009). Based on 49 years of real inflow data
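The idea of fitting a stochastic data-generation model to a long inflow record can be sketched with a simple AR(1) fit. The series below is synthetic and the model choice is an assumption for illustration; the paper's actual model and the Ataturk record are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic "monthly inflow" series: 49 years * 12 months, generated from a
# known AR(1) process so the fit can be checked (parameters are invented).
true_phi, mu = 0.7, 500.0
x = np.empty(588)
x[0] = mu
for t in range(1, len(x)):
    x[t] = mu + true_phi * (x[t - 1] - mu) + rng.normal(0, 50)

# Least-squares AR(1) coefficient on the mean-centred series.
d = x - x.mean()
phi_hat = (d[1:] @ d[:-1]) / (d[:-1] @ d[:-1])

# One-step-ahead forecast from the last observation.
forecast = x.mean() + phi_hat * (x[-1] - x.mean())
print(f"estimated phi = {phi_hat:.2f}, one-step forecast = {forecast:.1f}")
```

Once fitted to a historical record, the same recursion run forward with random innovations generates synthetic inflow sequences for reservoir-operation studies.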