It has increasingly been recognised that future developments in geospatial data handling will centre on geospatial data on the web: Volunteered Geographic Information (VGI). The evaluation of VGI data quality, including positional and shape similarity, has become a recurrent subject in the scientific literature over the last ten years. The OpenStreetMap (OSM) project is the most popular of the leading VGI platforms: an online geospatial database that produces and supplies free, editable geospatial datasets for a worldwide audience. The goal of this paper is to present a comprehensive overview of the quality assurance of OSM data. In addition, the credibility of open-source geospatial data is discussed, highlighting the difficulties and challenges of VGI data quality assessment. The conclusion is that the quality of the OSM dataset is quite difficult to control; it therefore makes sense to use OSM data for applications that do not require high-quality spatial datasets.
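As a minimal illustration of the positional and shape-similarity checks mentioned above, the sketch below compares a (hypothetical) OSM road segment against a reference segment, assuming the shapely package is available and both geometries are already in the same projected coordinate system (metres); the coordinates, sampling step, and variable names are invented for illustration only.

```python
# Minimal sketch: comparing an OSM line feature with a reference line.
# Assumes shapely is installed and both geometries share a projected CRS (metres).
from shapely.geometry import LineString

# Hypothetical coordinates for an OSM way and its authoritative counterpart.
osm_way = LineString([(0.0, 0.0), (50.0, 2.0), (100.0, 1.0)])
ref_way = LineString([(0.0, 1.5), (50.0, 0.5), (100.0, 0.0)])

# Positional similarity: average offset sampled every 10 m along the OSM geometry.
samples = [osm_way.interpolate(d) for d in range(0, int(osm_way.length) + 1, 10)]
mean_offset = sum(ref_way.distance(p) for p in samples) / len(samples)

# Shape similarity: Hausdorff distance (worst-case separation of the two lines).
shape_diff = osm_way.hausdorff_distance(ref_way)

print(f"mean positional offset: {mean_offset:.2f} m")
print(f"Hausdorff distance:     {shape_diff:.2f} m")
```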
Background: Genetic disorders are a leading cause of spontaneous abortion, neonatal death, and increased morbidity and mortality in children and adults alike. They represent a significant healthcare and psychosocial burden for the patient, the family, the healthcare system, and the community as a whole. Chromosomal abnormalities occur much more frequently than is generally appreciated: it is estimated that approximately 1 in 200 newborn infants has some form of chromosomal abnormality, and the figure is much higher in fetuses that do not survive to term. It is estimated that in 50% of first-trimester abortions, the fetus has a chromosomal abnormality. Aim of the study: This study aims to shed some light on the results of chromosomal studies per
With the development of cloud computing in recent years, data center networks have become a major topic in both industry and academia. Nevertheless, traditional methods based on manual configuration and hardware devices are burdensome, expensive, and cannot fully utilize the capability of the physical network infrastructure. Thus, Software-Defined Networking (SDN) has been promoted as one of the most promising solutions for future Internet performance. SDN is notable for two features: the separation of the control plane from the data plane, and enabling network development through programmable capabilities instead of hardware solutions. The current paper introduces an SDN-based optimized Resch
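To make the control-plane/data-plane separation concrete, here is a toy sketch, not the paper's controller and not tied to any real SDN framework, in which controller-side logic pushes match-action rules into a simple switch object that only forwards according to those rules; the FlowRule fields and port numbers are hypothetical.

```python
# Toy illustration of SDN's match-action model: the "controller" decides policy,
# the "switch" only forwards according to the rules it was given.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class FlowRule:
    match_dst: str      # destination address to match (hypothetical field)
    out_port: int       # port to forward matching packets to
    priority: int = 0

@dataclass
class Switch:
    rules: List[FlowRule] = field(default_factory=list)

    def install(self, rule: FlowRule) -> None:
        # Data plane stays simple: it just stores rules pushed by the controller.
        self.rules.append(rule)
        self.rules.sort(key=lambda r: r.priority, reverse=True)

    def forward(self, dst: str) -> Optional[int]:
        for rule in self.rules:
            if rule.match_dst == dst:
                return rule.out_port
        return None  # table miss: a real switch would ask the controller

# "Controller" logic lives outside the switch and can be reprogrammed freely.
sw = Switch()
sw.install(FlowRule(match_dst="10.0.0.2", out_port=3, priority=10))
print(sw.forward("10.0.0.2"))   # -> 3
print(sw.forward("10.0.0.9"))   # -> None (miss)
```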
This research studies dimension-reduction methods that overcome the curse of dimensionality when traditional methods fail to provide a good estimation of the parameters, so the problem must be dealt with directly. Two approaches were used to address high-dimensional data: the non-classical sliced inverse regression (SIR) method together with the proposed weighted standard SIR (WSIR) method, and principal component analysis (PCA), the general method used for reducing dimensions. Both SIR and PCA are based on forming linear combinations of a subset of the original explanatory variables, which may suffer from the problem of heterogeneity and the problem of linear
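A minimal sketch of the two reduction ideas mentioned above, on synthetic data: PCA via scikit-learn, and a bare-bones SIR written directly from its definition (slice the response, average the standardized predictors within each slice, and eigendecompose the weighted covariance of the slice means). The weighting scheme of the proposed WSIR variant is not reproduced here, and the data, slice count, and dimension are assumptions.

```python
# Sketch: PCA vs. a basic sliced inverse regression (SIR) on synthetic data.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n, p = 500, 10
X = rng.normal(size=(n, p))
y = X[:, 0] + 0.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=n)  # y depends on two directions

# --- PCA: unsupervised, ignores y ---
Z_pca = PCA(n_components=2).fit_transform(X)

# --- SIR: supervised, uses slices of y ---
def sir_directions(X, y, n_slices=10, n_dirs=2):
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    inv_sqrt = np.linalg.inv(np.linalg.cholesky(cov)).T   # whitening transform
    Z = Xc @ inv_sqrt
    # Slice the response and take within-slice means of the whitened predictors.
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    means = np.vstack([Z[idx].mean(axis=0) for idx in slices])
    weights = np.array([len(idx) for idx in slices]) / len(y)
    # Eigendecompose the weighted covariance of the slice means.
    M = (means * weights[:, None]).T @ means
    _, vecs = np.linalg.eigh(M)
    return inv_sqrt @ vecs[:, ::-1][:, :n_dirs]            # back to the original scale

Z_sir = X @ sir_directions(X, y)
print(Z_pca.shape, Z_sir.shape)   # (500, 2) (500, 2)
```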
The hydrological process has a dynamic nature characterised by randomness and complex phenomena. The application of machine learning (ML) models to river flow forecasting has grown rapidly, owing to their capacity to simulate the complex phenomena associated with hydrological and environmental processes. Four different ML models were developed for forecasting the flow of a river located in a semiarid region of Iraq. The influence of data division on the ML modelling process was investigated, with three data-division scenarios inspected: 70%–30%, 80%–20%, and 90%–10%. Several statistical indicators were computed to verify the performance of the models. The results revealed the potential of the hybridized s
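To illustrate the data-division scenarios and the statistical verification step, a short hedged sketch follows; the flow series is synthetic, a random forest stands in for the paper's ML models, and the lag features and indicator choices (RMSE, R²) are assumptions rather than the study's actual setup.

```python
# Sketch: testing 70/30, 80/20 and 90/10 data divisions for river-flow forecasting.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(1)
flow = 50 + 10 * np.sin(np.arange(1000) / 30) + rng.normal(0, 2, 1000)  # synthetic series

# Lagged values as predictors (previous 3 time steps -> next value).
lags = 3
X = np.column_stack([flow[i:len(flow) - lags + i] for i in range(lags)])
y = flow[lags:]

for train_pct in (70, 80, 90):
    split = int(train_pct / 100 * len(y))
    # Chronological split: earlier data for training, later data for testing.
    X_tr, X_te, y_tr, y_te = X[:split], X[split:], y[:split], y[split:]
    model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
    pred = model.predict(X_te)
    rmse = mean_squared_error(y_te, pred) ** 0.5
    print(f"{train_pct}%-{100 - train_pct}%: RMSE={rmse:.2f}, R2={r2_score(y_te, pred):.3f}")
```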
This research sought to present the concept of panel (cross-sectional time series) data models: crucial doubly indexed data that capture the impact of change over time, obtained from repeated observations of the measured phenomenon in different time periods. Panel data models of different types were defined (fixed, random, and mixed effects) and compared by studying and analysing the mathematical relationship between the influence of time and a set of basic variables, which are the main axes on which the research is based. These are represented by the monthly revenue of the working individual and the profits it generates, which constitute the response variable, and its relationship to a set of explanatory variables represented by the
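A brief sketch of fitting fixed-effects and random-effects panel models is given below; it assumes the linearmodels package is available and uses a synthetic worker-level panel, so the variable names (revenue, profit) and the data are stand-ins for the study's actual dataset rather than its results.

```python
# Sketch: fixed-effects vs. random-effects panel models on a synthetic panel.
import numpy as np
import pandas as pd
from linearmodels.panel import PanelOLS, RandomEffects

rng = np.random.default_rng(2)
workers, months = 50, 24
idx = pd.MultiIndex.from_product([range(workers), range(months)],
                                 names=["worker", "month"])
df = pd.DataFrame(index=idx)

worker_effect = np.repeat(rng.normal(0, 5, workers), months)   # unobserved heterogeneity
df["profit"] = rng.normal(100, 20, len(df))                    # explanatory variable
df["revenue"] = 50 + 0.8 * df["profit"] + worker_effect + rng.normal(0, 10, len(df))

# Fixed effects: the worker-specific intercepts are swept out.
fe = PanelOLS(df["revenue"], df[["profit"]], entity_effects=True).fit()

# Random effects: worker heterogeneity treated as a random intercept; add a constant.
exog_re = pd.concat([pd.Series(1.0, index=df.index, name="const"), df[["profit"]]], axis=1)
re = RandomEffects(df["revenue"], exog_re).fit()

print(fe.params, re.params, sep="\n")
```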
The Atmospheric Infrared Sounder (AIRS) on the EOS/Aqua satellite provides diverse measurements of methane (CH4) distribution at different pressure levels in the Earth's atmosphere. The focus of this research is to analyse the vertical variations of the CH4 volume mixing ratio (VMR) time-series data at four standard pressure levels (SPL) of 925, 850, 600, and 300 hPa in the troposphere above six cities in Iraq from January 2003 to September 2016. The analysis of the monthly average CH4 VMR time-series data shows a significant increase between 2003 and 2016, especially from 2009 to 2016; the minimum values of CH4 were in 2003, while the maximum values were in 2016. The vertical distribution of CH4
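To illustrate the kind of monthly-mean and trend analysis described here, a short sketch follows; the CH4 values are synthetic placeholders (the actual AIRS retrievals are not reproduced), and a simple linear fit is used as one way to quantify the reported increase over 2003–2016.

```python
# Sketch: monthly CH4 VMR time series at one pressure level, with a linear trend.
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
months = pd.date_range("2003-01", "2016-09", freq="MS")
# Synthetic stand-in for AIRS CH4 VMR (ppbv) at, e.g., 925 hPa: slow rise plus noise.
ch4 = 1750 + 0.4 * np.arange(len(months)) + rng.normal(0, 5, len(months))
series = pd.Series(ch4, index=months, name="CH4_VMR_ppbv")

annual_mean = series.resample("YS").mean()                        # yearly averages
slope, intercept = np.polyfit(np.arange(len(series)), series.values, 1)

print(annual_mean.round(1))
print(f"linear trend: {slope * 12:.2f} ppbv per year")
```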
Tight reservoirs have attracted the interest of the oil industry in recent years owing to their significant impact on global oil production. Several challenges arise when producing from these reservoirs because of their low to extra-low permeability and very narrow pore-throat radii. Selecting a development strategy for these reservoirs, including horizontal well placement, hydraulic fracture design, well completion, smart production programs, and wellbore stability, requires accurate characterization of their geomechanical parameters. Geomechanical properties, including uniaxial compressive strength (UCS), static Young’s modulus (Es), and Poisson’s ratio (υs), were measured experimentally using both static and dynamic methods
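As a companion to the static/dynamic comparison above, the sketch below computes dynamic Young's modulus and Poisson's ratio from compressional and shear velocities and bulk density using the standard isotropic-elasticity relations; the input values are illustrative, not measurements from the studied reservoir, and any static correlation would still have to be calibrated against the core data.

```python
# Sketch: dynamic elastic properties from sonic velocities and bulk density.
# Standard isotropic relations; the inputs below are illustrative only.

def dynamic_moduli(vp, vs, rho):
    """vp, vs in m/s; rho in kg/m3. Returns (E_dyn in GPa, dynamic Poisson's ratio)."""
    vp2, vs2 = vp ** 2, vs ** 2
    nu = (vp2 - 2.0 * vs2) / (2.0 * (vp2 - vs2))            # dynamic Poisson's ratio
    e = rho * vs2 * (3.0 * vp2 - 4.0 * vs2) / (vp2 - vs2)   # dynamic Young's modulus (Pa)
    return e / 1e9, nu

# Hypothetical tight-carbonate-like values.
E_dyn, nu_dyn = dynamic_moduli(vp=4500.0, vs=2600.0, rho=2650.0)
print(f"E_dyn = {E_dyn:.1f} GPa, nu_dyn = {nu_dyn:.3f}")
```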
The investigation of machine learning techniques for addressing missing well-log data has garnered considerable interest recently, especially as the oil and gas sector pursues novel approaches to improve data interpretation and reservoir characterization. For wells that have been in operation for several years, conventional measurement techniques frequently encounter challenges related to availability, including the lack of well-log data, cost considerations, and precision issues. The objective of this study is to enhance reservoir characterization by automating well-log creation using machine learning techniques, among them multi-resolution graph-based clustering and the similarity threshold method. By using cutti
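The sketch below shows the general idea of predicting a missing log from the other available logs with a regression model; it uses synthetic curves and a random forest as a generic stand-in, and does not reproduce the study's multi-resolution graph-based clustering or similarity threshold method. The log names, depth interval, and relationships are invented for illustration.

```python
# Sketch: filling a missing well log (e.g. DT) from other logs with a regressor.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(4)
depth = np.arange(1500.0, 1700.0, 0.5)                  # hypothetical depth interval, 0.5 m step
logs = pd.DataFrame({
    "GR":   60 + 20 * np.sin(depth / 15) + rng.normal(0, 3, depth.size),
    "RHOB": 2.45 + 0.1 * np.cos(depth / 25) + rng.normal(0, 0.01, depth.size),
    "NPHI": 0.20 + 0.05 * np.sin(depth / 20) + rng.normal(0, 0.005, depth.size),
}, index=depth)
# Synthetic sonic log that depends on the other curves; its lower section is "missing".
logs["DT"] = 90 - 500 * (logs["RHOB"] - 2.45) + 80 * logs["NPHI"] + rng.normal(0, 1, depth.size)
logs.loc[depth >= 1650, "DT"] = np.nan

# Train on the interval where DT exists, then predict the missing interval.
known = logs["DT"].notna()
model = RandomForestRegressor(n_estimators=300, random_state=0)
model.fit(logs.loc[known, ["GR", "RHOB", "NPHI"]], logs.loc[known, "DT"])
logs.loc[~known, "DT"] = model.predict(logs.loc[~known, ["GR", "RHOB", "NPHI"]])
print(logs["DT"].tail())
```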
The partial level density (PLD) of pre-equilibrium reactions described by Ericson's formula has been studied using different formulae for the single-particle level density g. The parameter g was taken from the equidistant spacing model (ESM) and the non-equidistant spacing model (non-ESM), and further formulae for g were derived from the relation between g and the level density parameter a. The formulae used to derive g are the Roher formula, the Egidy formula, the Yukawa formula, and the Thomas–Fermi formula. The partial level density results that depend on g from the Thomas–Fermi formula show good agreement with the experimental data.
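For reference, the particle-hole state density usually quoted as Ericson's formula, together with the equidistant-spacing relation between the single-particle level density g and the level density parameter a, can be written as below; this is the textbook form, and the exact convention used in the paper (exciton number, Pauli correction) may differ slightly.

```latex
% Ericson's formula for the density of p-particle, h-hole states at excitation energy E
\rho_{p,h}(E) \;=\; \frac{g\,(gE)^{\,p+h-1}}{p!\,h!\,(p+h-1)!},
\qquad
% Equidistant-spacing-model relation between g and the level density parameter a
g \;=\; \frac{6}{\pi^{2}}\,a .
```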