In recent years, the performance of Spatial Data Infrastructures for governments and companies has gained ample attention. Different categories of geospatial data, such as digital maps, coordinates, web maps, and aerial and satellite images, are required to realize the geospatial data components of Spatial Data Infrastructures. In general, two distinct types of geospatial data sources exist over the Internet: formal and informal. Despite the growth of informal geospatial data sources, integration between different free sources has not been achieved effectively; addressing this gap is the main contribution of this research. This article addresses the research question of how the integration of free geospatial data can be beneficial within domains such as Spatial Data Infrastructures. This was carried out by proposing a common methodology that uses road network attributes, such as lengths, centroids, start and end points, number of nodes, and directions, to integrate free and open-source geospatial datasets. The methodology was applied to a particular case study: geospatial data from OpenStreetMap and Google Earth as examples of free data sources. The results revealed possible matches between the roads of the OpenStreetMap and Google Earth datasets that can serve the development of Spatial Data Infrastructures.
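The attribute-based matching described above can be sketched in a few lines. The function names, attribute set, and tolerance values below are illustrative assumptions for a planar coordinate system, not the article's actual implementation:

```python
from math import hypot

def road_signature(road):
    """Compute simple matching attributes for a road polyline.

    `road` is a list of (x, y) vertices in planar coordinates; the
    attribute names here are illustrative, not taken from the article.
    """
    length = sum(hypot(x2 - x1, y2 - y1)
                 for (x1, y1), (x2, y2) in zip(road, road[1:]))
    cx = sum(x for x, _ in road) / len(road)
    cy = sum(y for _, y in road) / len(road)
    return {"length": length, "centroid": (cx, cy),
            "start": road[0], "end": road[-1], "nodes": len(road)}

def roads_match(a, b, len_tol=0.05, dist_tol=10.0):
    """Declare two roads a candidate match when their lengths differ by
    less than len_tol (relative) and their centroids lie within dist_tol.
    The tolerance values are hypothetical defaults."""
    sa, sb = road_signature(a), road_signature(b)
    rel_len = abs(sa["length"] - sb["length"]) / max(sa["length"], sb["length"])
    cdist = hypot(sa["centroid"][0] - sb["centroid"][0],
                  sa["centroid"][1] - sb["centroid"][1])
    return rel_len < len_tol and cdist < dist_tol
```

In practice the same road digitized in OpenStreetMap and Google Earth will have slightly different vertices, which is why the comparison uses tolerances rather than exact equality.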
Coherent noise, such as ground roll and guided waves, is present in seismic line DE21 (East Diwaniya, south-eastern Iraq); it obscures the seismic signal and degrades the quality of the data. To attenuate the coherent noise in the shot gathers and the stack of the seismic line, the AGORA filter was applied so that the signal appears as a clear hyperbola in the shot gather and the reflectors become easier to interpret later. The filter gave good results, attenuating the coherent noise at a high ratio along the whole line. Spectrum analysis confirmed the effectiveness of the AGORA filter in attenuating the coherent noise.
The frequency-dependent noise attenuation (FDNAT) filter was applied to the 2D seismic data of line DE21 in east Diwaniya, south-eastern Iraq, to improve the signal-to-noise ratio. Applying FDNAT to the seismic data gave good results and removed a large amount of random noise. This processing helps in enhancing the picking of reflector signals, so the later interpretation of the data becomes easier. Spectrum analysis was used as a quality-control measure to verify the effectiveness of the FDNAT filter in removing the random noise.
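As a generic illustration of frequency-dependent filtering (not the proprietary FDNAT algorithm itself), one can scale down spectral amplitudes outside a chosen signal band; the band limits and attenuation factor below are assumptions:

```python
import numpy as np

def bandpass_attenuate(trace, dt, f_lo, f_hi, atten=0.1):
    """Frequency-domain attenuation sketch: scale spectral amplitudes
    outside [f_lo, f_hi] Hz by `atten`. A generic illustration of
    frequency-dependent filtering, not the FDNAT implementation."""
    spec = np.fft.rfft(trace)
    freqs = np.fft.rfftfreq(len(trace), dt)
    in_band = (freqs >= f_lo) & (freqs <= f_hi)
    spec[~in_band] *= atten          # attenuate out-of-band energy
    return np.fft.irfft(spec, n=len(trace))
```

A spectrum computed before and after such filtering shows the out-of-band energy reduced, which is the kind of quality-control comparison the abstract refers to.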
In this paper, a new high-performance lossy compression technique based on the DCT is proposed. The image is partitioned into blocks of size NxN (where N is a multiple of 2), and each block is categorized as high frequency (uncorrelated) or low frequency (correlated) according to its spatial details. This is done by calculating the energy of the block, taken as the absolute sum of the differential pulse code modulation (DPCM) differences between pixels, and comparing it against a specified threshold value to determine the level of correlation. The image blocks are then scanned and converted into 1D vectors using a horizontal scan order, and the 1D-DCT is applied to each vector to produce the transform coefficients. The transformed coefficients will then be quantized.
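The block classification and transform steps can be sketched as below; the threshold value and function names are illustrative assumptions, not the paper's actual parameters:

```python
import numpy as np
from scipy.fft import dct

def classify_block(block, threshold):
    """Label a block 'high' (uncorrelated) or 'low' (correlated) by the
    absolute sum of DPCM differences between neighbouring pixels, as the
    abstract describes; the threshold value is an assumption."""
    vec = block.flatten()                        # horizontal scan order
    energy = np.abs(np.diff(vec.astype(float))).sum()
    return "high" if energy > threshold else "low"

def transform_block(block):
    """Scan the NxN block into a 1D vector and apply the 1D DCT-II."""
    return dct(block.flatten().astype(float), norm="ortho")
```

A flat block yields zero DPCM energy and a DCT whose energy is concentrated in the first coefficient, which is what makes correlated blocks cheap to encode.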
In this study, SnO2 nanoparticles were prepared from low-cost tin chloride (SnCl2.2H2O) and ethanol by adding ammonia solution via the sol-gel method, one of the lowest-cost and simplest techniques. The SnO2 nanoparticles were dried in a drying oven at 70°C for 7 hours and then calcined in an oven at 200°C for 24 hours. The structural, morphological, and optical properties of the synthesized SnO2 nanoparticles were studied using X-ray diffraction. The Scherrer expression was used to compute the nanoparticle sizes from the X-ray diffraction data, and the results needed to be scrutinized more closely. The micro-strain indicates the broadening of the diffraction peaks for the nanoparticles.
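The Scherrer expression and the micro-strain estimate are standard XRD relations: D = K·lambda / (beta·cos theta) and epsilon = beta / (4·tan theta), with beta the peak FWHM in radians. The numerical values in the example below (Cu K-alpha wavelength, a 0.5° FWHM, a 13.3° Bragg angle) are illustrative, not measurements from this study:

```python
import math

def scherrer_size(wavelength_nm, fwhm_deg, theta_deg, K=0.9):
    """Crystallite size D = K*lambda / (beta * cos(theta)); K = 0.9 is
    the usual shape factor, beta the FWHM converted to radians."""
    beta = math.radians(fwhm_deg)
    return K * wavelength_nm / (beta * math.cos(math.radians(theta_deg)))

def micro_strain(fwhm_deg, theta_deg):
    """Micro-strain estimate epsilon = beta / (4 * tan(theta))."""
    beta = math.radians(fwhm_deg)
    return beta / (4 * math.tan(math.radians(theta_deg)))

# Illustrative values: Cu K-alpha (0.15406 nm), FWHM 0.5 deg, theta 13.3 deg
D = scherrer_size(0.15406, 0.5, 13.3)
eps = micro_strain(0.5, 13.3)
```

Because both size and strain broaden the peaks, a Scherrer size computed alone can overestimate broadening effects, which is consistent with the abstract's remark that the results needed closer scrutiny.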
Climate change has been recognized as one of the major factors responsible for land degradation, which has a significant impact on diverse aspects of the environment. The present study aims to estimate how climate change can influence land degradation in the southern areas of Baghdad province (Al-Rasheed, Al-Mahmudiyah, Al-Yusufiyah, Al-Madaen, and Al-Latifiyah). Landsat-8 OLI and Landsat-5 TM satellite imagery were used to assess the extent of land degradation for the period 2010-2019. ArcGIS V.10.4 was applied to manage and analyze the satellite image dataset, together with climate-factor data obtained from European Centre for Medium-Range Weather Forecasts (ECMWF) reanalysis datasets. To achieve the work objectives, many
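Landsat-based degradation studies of this kind commonly rely on vegetation indices such as NDVI; the sketch below shows that standard computation as a generic illustration, using hypothetical band arrays rather than the study's actual imagery or workflow:

```python
import numpy as np

def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red); low or declining NDVI over time
    is a common proxy for land degradation in Landsat studies. The small
    epsilon guards against division by zero on empty pixels."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + 1e-12)
```

Differencing NDVI rasters from two acquisition dates (e.g. 2010 vs 2019) then highlights pixels where vegetation cover has declined.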
The tau-p linear noise attenuation (TPLNA) filter was applied to the 3D seismic data of the Al-Samawah area, south-west of Iraq, with the aim of attenuating linear noise. TPLNA transforms the data from the time domain to the tau-p domain in order to increase the signal-to-noise ratio. Applying TPLNA produced very good results, considering that 3D data usually contain a large amount of linear noise from different sources and in different azimuths and directions. This processing is very important for later interpretation because the signal was covered by different kinds of noise, of which the linear noise makes up a large part.
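The underlying tau-p (slant-stack) transform maps a linear event in (t, x) to a point in (tau, p), which is why linear noise can be muted cleanly in that domain. The sketch below is a minimal forward slant stack for illustration, not the TPLNA implementation:

```python
import numpy as np

def slant_stack(data, dt, offsets, slownesses):
    """Forward tau-p transform sketch: for each slowness p, delay each
    trace by p * offset and sum across offsets. `data` has shape
    (n_time_samples, n_traces). A linear event t = p0 * x collapses to a
    point at (tau = 0, p = p0) in the output."""
    nt, nx = data.shape
    taus = np.arange(nt) * dt
    out = np.zeros((nt, len(slownesses)))
    for ip, p in enumerate(slownesses):
        for ix, x in enumerate(offsets):
            t = taus + p * x                  # shifted sample times
            out[:, ip] += np.interp(t, taus, data[:, ix],
                                    left=0.0, right=0.0)
    return out
```

In a production filter the linear-noise region of the tau-p panel would be muted and the data inverse-transformed back to the time domain.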
Governmental establishments maintain historical data on job applicants for future analysis: prediction, improvement of benefits and profits, and development of organizations and institutions. In e-government, a decision can be made about job seekers after mining their information, which can lead to beneficial insights. This paper proposes the development and implementation of a system that predicts the appropriate job for an applicant to suit his or her skills, using web content classification algorithms (LogitBoost, J48, PART, Hoeffding Tree, Naive Bayes). Furthermore, the results of the classification algorithms are compared on a dataset called the "job classification data" set. Experimental results indicate
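The algorithms named above are Weka implementations (J48 is Weka's C4.5 decision tree). As a rough illustration of the comparison workflow only, the sketch below cross-validates two loosely analogous scikit-learn classifiers on a synthetic stand-in dataset; the "job classification data" set itself is not reproduced here, and LogitBoost, PART, and Hoeffding Tree have no direct scikit-learn equivalents:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

# Hypothetical stand-in for the paper's "job classification data" set.
X, y = make_classification(n_samples=300, n_features=10, random_state=0)

# CART decision tree serves as a rough analogue of Weka's J48 (C4.5).
models = [("Naive Bayes", GaussianNB()),
          ("Decision tree (J48 analogue)", DecisionTreeClassifier(random_state=0))]

for name, clf in models:
    scores = cross_val_score(clf, X, y, cv=5)     # 5-fold cross-validation
    print(f"{name}: mean accuracy = {scores.mean():.3f}")
```

Comparing mean cross-validated accuracy across the candidate classifiers is the standard way such algorithm comparisons are reported.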