OpenStreetMap (OSM), recognised for its current and readily accessible spatial database, frequently serves regions lacking precise data at the necessary granularity. Global collaboration among OSM contributors poses challenges to data quality and uniformity, exacerbated by the sheer volume of input and indistinct data annotation protocols. This study presents a methodological improvement in the spatial accuracy of OSM datasets centred over Baghdad, Iraq, utilising data derived from OSM services and satellite imagery. The analysis focused on two geometric correction methods: a two-dimensional polynomial affine transformation, which involves twelve adjustment coefficients, and a two-dimensional polynomial conformal transformation, which involves six. Analysis within the selected region exposed variances in positional accuracy, with distinctions evident between Easting (E) and Northing (N) coordinates. Empirical results indicated that the conformal transformation method reduced the Root Mean Square Error (RMSE) of the amended OSM data by 4.434 meters, while the affine transformation method exhibited a further reduction in total RMSE, by 4.053 meters. The deployment of these techniques substantiates a marked enhancement in the geometric fidelity of OSM data. The refined datasets have significant applications, extending to the representation of road maps, the analysis of traffic flow, and the facilitation of urban planning initiatives.
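As a minimal sketch of the underlying idea, the snippet below fits a first-order (six-coefficient) 2D affine transformation to control points by least squares and reports the RMSE; the study's twelve-coefficient polynomial variant extends the same design matrix with higher-order terms. The control-point coordinates and shift used here are hypothetical, for illustration only.

```python
import numpy as np

def fit_affine_2d(src, dst):
    """Least-squares fit of a first-order 2D affine transformation (6 coefficients).

    src, dst: (n, 2) arrays of (E, N) control-point coordinates.
    The study's twelve-coefficient model adds higher-order polynomial terms
    to the same design matrix; this first-order form is a simplified sketch.
    """
    A = np.column_stack([np.ones(len(src)), src[:, 0], src[:, 1]])
    coef_e, *_ = np.linalg.lstsq(A, dst[:, 0], rcond=None)
    coef_n, *_ = np.linalg.lstsq(A, dst[:, 1], rcond=None)
    return coef_e, coef_n

def rmse(src, dst, coef_e, coef_n):
    """Total RMSE of the fitted transformation over the control points."""
    A = np.column_stack([np.ones(len(src)), src[:, 0], src[:, 1]])
    pred = np.column_stack([A @ coef_e, A @ coef_n])
    return np.sqrt(np.mean(np.sum((pred - dst) ** 2, axis=1)))

# Hypothetical control points: OSM coordinates vs. satellite-derived reference (UTM metres).
osm_pts = np.array([[445120.0, 3684210.0], [445560.0, 3684830.0],
                    [446010.0, 3685120.0], [446420.0, 3684540.0]])
ref_pts = osm_pts + np.array([3.1, -2.4])   # synthetic shift, for illustration only
ce, cn = fit_affine_2d(osm_pts, ref_pts)
print("RMSE (m):", round(rmse(osm_pts, ref_pts, ce, cn), 3))
```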
Purpose – Cloud computing (CC) and its services have enabled the information centers of organizations to adapt their informatic and technological infrastructure, making it more suitable for developing flexible information systems that respond to the informational and knowledge needs of their users. In this context, cloud-data governance has become more complex and dynamic, requiring an in-depth understanding of the data management strategy at these centers in terms of organizational structure and regulations, people, technology, processes, and roles and responsibilities. Therefore, our paper discusses these dimensions as challenges facing information centers with respect to their data governance and the impact …
Cryptography is the process of transforming a message to prevent unauthorized access to data. One of the main problems, and an important element of secret-key cryptography, is the key. The key plays a central role in achieving a higher level of secure communication: to increase the security of any communication, both parties must hold a copy of the secret key, which, unfortunately, is not easy to achieve. The Triple Data Encryption Standard algorithm is weak due to its weak key generation, so the key must be reconfigured to make the algorithm more secure, effective, and strong; a stronger encryption key enhances the security of the Triple Data Encryption Standard algorithm. This paper proposes a combination of two efficient encryption algorithms to …
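Since the exact key-strengthening scheme is not shown in this excerpt, the sketch below illustrates one common approach under assumed choices: a 24-byte 3DES key derived from a shared secret with SHA-256, parity-adjusted, and used in CBC mode via the pycryptodome library. The function names and the derivation step are illustrative assumptions, not the paper's proposed combination.

```python
# pip install pycryptodome
import hashlib
from Crypto.Cipher import DES3
from Crypto.Util.Padding import pad, unpad

def derive_3des_key(secret: str) -> bytes:
    """Derive a 24-byte 3DES key from a shared secret (assumed strengthening step).

    SHA-256 spreads the secret's entropy over the whole key, and
    adjust_key_parity() sets the DES parity bits.
    """
    raw = hashlib.sha256(secret.encode()).digest()[:24]
    return DES3.adjust_key_parity(raw)

def encrypt(plaintext: bytes, key: bytes) -> tuple[bytes, bytes]:
    cipher = DES3.new(key, DES3.MODE_CBC)            # a random IV is generated internally
    return cipher.iv, cipher.encrypt(pad(plaintext, DES3.block_size))

def decrypt(iv: bytes, ciphertext: bytes, key: bytes) -> bytes:
    cipher = DES3.new(key, DES3.MODE_CBC, iv)
    return unpad(cipher.decrypt(ciphertext), DES3.block_size)

key = derive_3des_key("shared secret phrase")
iv, ct = encrypt(b"confidential message", key)
print(decrypt(iv, ct, key))
```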
Underwater Wireless Sensor Networks (UWSNs) have emerged as a promising technology for a wide range of ocean monitoring applications. UWSNs suffer from the unique challenges of the underwater environment, such as a dynamic and sparse network topology, which can easily lead to a partitioned network. This results in hotspot formation and the absence of a routing path from source to destination. Therefore, to optimize the network lifetime and limit the possibility of hotspot formation along the data transmission path, a traffic-aware protocol is needed. In this research, we propose a traffic-aware routing protocol called PG-RES, which is predicated on the concepts of Pressure Gradient and RESistance. The proposed …
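The PG-RES cost function is not given in this excerpt; the sketch below shows, under assumed definitions, how a pressure-gradient/resistance style forwarder choice might look: depth serves as the pressure proxy, and a resistance term built from queue load and residual energy penalises congested, energy-poor neighbours. The weights and formula are illustrative assumptions, not the paper's protocol.

```python
from dataclasses import dataclass

@dataclass
class Neighbor:
    node_id: int
    depth_m: float           # pressure proxy: shallower nodes are closer to the surface sink
    residual_energy: float   # joules remaining
    queue_load: float        # fraction of buffer occupied, 0..1

def next_hop(current_depth_m: float, neighbors: list[Neighbor],
             alpha: float = 1.0, beta: float = 1.0):
    """Pick the next hop by combining the depth (pressure) gradient with a
    resistance term; all weights here are assumptions for illustration."""
    best, best_cost = None, float("inf")
    for nb in neighbors:
        gradient = current_depth_m - nb.depth_m       # > 0 means progress toward the sink
        if gradient <= 0:
            continue                                   # skip neighbours deeper than us
        resistance = alpha * nb.queue_load + beta / max(nb.residual_energy, 1e-6)
        cost = resistance / gradient                   # prefer low resistance, large gradient
        if cost < best_cost:
            best, best_cost = nb, cost
    return best

hop = next_hop(120.0, [Neighbor(1, 90.0, 40.0, 0.2), Neighbor(2, 70.0, 5.0, 0.8)])
print(hop.node_id if hop else "no forwarder (void region)")
```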
A two-time-step stochastic multi-variable, multi-site hydrological data forecasting model was developed and verified using a case study. The philosophy of this model is to use the cross-variable correlations, the cross-site correlations, and the two-step time-lag correlations simultaneously to estimate the parameters of the model, which are then modified using the mutation process of a genetic algorithm optimization model. The objective function to be minimized is the Akaike criterion (AIC) value. The case study covers four variables and three sites. The variables are the monthly air temperature, humidity, precipitation, and evaporation; the sites are Sulaimania, Chwarta, and Penjwin, located in northern Iraq. The model performance was …
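As a hedged illustration of the parameter-refinement step, the sketch below mutates a parameter vector and keeps a candidate only when it lowers the Akaike criterion; the AIC form, mutation settings, and the toy two-lag predictor are assumptions, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def aic(residuals, n_params):
    """Akaike criterion under Gaussian residuals (assumed form): n*log(RSS/n) + 2k."""
    n = residuals.size
    return n * np.log(np.sum(residuals ** 2) / n) + 2 * n_params

def mutate(params, rate=0.1, scale=0.05):
    """GA-style mutation: perturb a random subset of the parameter vector."""
    out = params.copy()
    mask = rng.random(out.shape) < rate
    out[mask] += rng.normal(0.0, scale, size=int(mask.sum()))
    return out

def refine(params, predict, observed, generations=500):
    """Accept a mutated parameter set only when it lowers the AIC."""
    best, best_aic = params, aic(observed - predict(params), params.size)
    for _ in range(generations):
        cand = mutate(best)
        cand_aic = aic(observed - predict(cand), cand.size)
        if cand_aic < best_aic:
            best, best_aic = cand, cand_aic
    return best, best_aic

# Toy usage: a two-lag predictor for one variable at one site (illustrative data only).
x = rng.normal(size=100)
y = 0.6 * np.roll(x, 1) + 0.2 * np.roll(x, 2) + rng.normal(0, 0.1, size=100)
predict = lambda p: p[0] * np.roll(x, 1) + p[1] * np.roll(x, 2)
print(refine(np.zeros(2), predict, y)[0])
```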
Longitudinal data are becoming increasingly common, especially in the medical and economic fields, and various methods have been developed to analyze this type of data.
In this research, the focus was on compiling and analyzing such data: cluster analysis plays an important role in identifying and grouping co-expressed profiles over time, and the nonparametric smoothing cubic B-spline model is applied to them. This model provides continuous first and second derivatives, resulting in a smoother curve with fewer abrupt changes in slope; it is also more flexible and can capture more complex patterns and fluctuations in the data.
The balanced longitudinal data profiles were compiled into subgroup …
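A minimal sketch of this workflow, assuming a balanced design with a common time grid: each profile is smoothed with a cubic spline (continuous first and second derivatives) and the smoothed profiles are then grouped. The synthetic data, smoothing parameter, and choice of k-means with two clusters are illustrative assumptions, not the study's settings.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 12)                       # common (balanced) measurement times
# Hypothetical balanced longitudinal data: 30 subjects x 12 time points.
profiles = np.sin(2 * np.pi * t) * rng.choice([1.0, -1.0], size=(30, 1)) \
           + rng.normal(0, 0.15, size=(30, 12))

# Smooth each profile with a cubic spline (k=3 gives continuous 1st and 2nd derivatives).
smoothed = np.vstack([UnivariateSpline(t, y, k=3, s=0.2)(t) for y in profiles])

# Group the smoothed profiles into subgroups; cluster count and algorithm are assumptions.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(smoothed)
print(np.bincount(labels))
```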
In this study, we compared the LASSO and SCAD methods, two penalty methods for dealing with partial quantile regression models. The Nadaraya-Watson kernel was used to estimate the nonparametric part; in addition, the rule-of-thumb method was used to estimate the smoothing bandwidth (h). Penalty methods proved efficient in estimating the regression coefficients, with the SCAD method performing best according to the mean squared error (MSE) criterion after the missing data were estimated using the mean imputation method.
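A minimal sketch of the nonparametric part under stated assumptions: a Gaussian-kernel Nadaraya-Watson estimator, a Silverman-style rule-of-thumb bandwidth, and mean imputation of missing responses. The data are synthetic, and the bandwidth constant is one common convention rather than necessarily the one used in the study.

```python
import numpy as np

def rule_of_thumb_bandwidth(x: np.ndarray) -> float:
    """Silverman-style rule-of-thumb bandwidth (one common form of the 'rule of thumb')."""
    return 1.06 * np.std(x, ddof=1) * x.size ** (-1 / 5)

def nadaraya_watson(x: np.ndarray, y: np.ndarray, grid: np.ndarray, h: float) -> np.ndarray:
    """Nadaraya-Watson estimator with a Gaussian kernel evaluated on a grid."""
    u = (grid[:, None] - x[None, :]) / h
    w = np.exp(-0.5 * u ** 2)
    return (w @ y) / w.sum(axis=1)

# Hypothetical data with missing responses, handled by mean imputation.
rng = np.random.default_rng(2)
x = np.sort(rng.uniform(0, 3, 200))
y = np.sin(x) + rng.normal(0, 0.2, 200)
y[rng.random(200) < 0.1] = np.nan
y = np.where(np.isnan(y), np.nanmean(y), y)     # mean imputation of missing values

h = rule_of_thumb_bandwidth(x)
fit = nadaraya_watson(x, y, np.linspace(0, 3, 50), h)
print(round(h, 3), np.round(fit[:3], 3))
```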
Reliable data transfer and energy efficiency are essential considerations for network performance in resource-constrained underwater environments. One of the efficient approaches for data routing in underwater wireless sensor networks (UWSNs) is clustering, in which data packets are transferred from sensor nodes to a cluster head (CH). Data packets are then forwarded to a sink node in a single-hop or multi-hop manner, which can increase the energy depletion of the CH compared with other nodes. While several mechanisms have been proposed for cluster formation and CH selection to ensure efficient delivery of data packets, less attention has been given to massive data co …
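As an illustrative sketch only (not a published scheme), the snippet below elects a cluster head by a weighted score of residual energy and proximity to the sink, the kind of criterion such clustering protocols typically balance; the weights and the score itself are assumptions.

```python
from dataclasses import dataclass

@dataclass
class SensorNode:
    node_id: int
    residual_energy: float   # joules remaining
    depth_m: float           # distance proxy toward the surface sink

def elect_cluster_head(cluster: list[SensorNode],
                       w_energy: float = 0.7, w_depth: float = 0.3) -> SensorNode:
    """Pick the CH by a weighted score of residual energy and closeness to the sink.
    Weights and normalisation are assumptions for illustration only."""
    max_e = max(n.residual_energy for n in cluster)
    max_d = max(n.depth_m for n in cluster)
    def score(n: SensorNode) -> float:
        return w_energy * (n.residual_energy / max_e) + w_depth * (1 - n.depth_m / max_d)
    return max(cluster, key=score)

nodes = [SensorNode(1, 35.0, 140.0), SensorNode(2, 48.0, 155.0), SensorNode(3, 20.0, 120.0)]
print(elect_cluster_head(nodes).node_id)
```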
In this study, the optimum conditions for COD removal from petroleum refinery wastewater using a combined electrocoagulation-electro-oxidation system were attained by the Taguchi method. An L18 orthogonal array experimental design with four controllable parameters, namely NaCl concentration, current density (C.D.), pH, and electrolysis time, was employed. The chemical oxygen demand (COD) removal percentage was taken as the quality characteristic to be enhanced, and the turbidity and total dissolved solids (TDS) values were also estimated. The optimum levels of the studied parameters were determined by S/N analysis and analysis of variance (ANOVA). The optimum conditions were found to be NaCl = 2.5 …
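A minimal sketch of the S/N analysis step, assuming the standard "larger-the-better" Taguchi ratio for COD removal percentage: per-run S/N values are computed from replicate responses and averaged per factor level, and the level with the highest mean S/N is taken as optimal for that factor. The design fragment and response values below are hypothetical, not the paper's data.

```python
import numpy as np

def sn_larger_the_better(y) -> float:
    """Taguchi S/N ratio for a 'larger-the-better' response such as COD removal %."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y ** 2))

# Hypothetical fragment of an L18-style design: coded NaCl level and replicate
# COD-removal percentages for six runs (illustrative values only).
nacl_level = np.array([1, 1, 2, 2, 3, 3])
cod_removal = np.array([[68.0, 69.4], [71.5, 70.2], [80.2, 81.6],
                        [83.0, 82.1], [76.4, 75.0], [78.1, 79.3]])

sn_per_run = np.array([sn_larger_the_better(run) for run in cod_removal])
for lvl in np.unique(nacl_level):
    print(f"NaCl level {lvl}: mean S/N = {sn_per_run[nacl_level == lvl].mean():.2f} dB")
# The level with the highest mean S/N is taken as the optimum setting for that factor.
```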