With the rapid progress of information technology and computer networks, it has become very easy to reproduce and share geospatial data because of its digital form. Consequently, the use of geospatial data suffers from problems such as data authentication, ownership proof, and illegal copying, which pose a major challenge to its future use. This paper introduces a new watermarking scheme to ensure copyright protection of digital vector maps. The main idea of the proposed scheme is to transform the digital map into the frequency domain using the Singular Value Decomposition (SVD) in order to determine suitable areas in which to insert the watermark data. The digital map is partitioned into isolated parts, and watermark data are embedded in the nominated magnitudes of each part whenever definite criteria are satisfied. The efficiency of the proposed watermarking scheme is assessed with statistical measures based on two factors: fidelity and robustness. Experimental results demonstrate that the proposed scheme achieves a good trade-off between the amount of distortion and robustness, and that it resists many kinds of attacks.
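As an illustration of the general idea only (not the paper's exact block size, embedding rule, or criteria), a minimal Python sketch of quantization-based embedding of one watermark bit into the largest singular value of a block of vertex coordinates might look like this:

```python
import numpy as np

def embed_bit(block, bit, step=0.01):
    """Embed one watermark bit into a block of 2-D vertices via SVD.

    block : (n, 2) array of vertex coordinates
    bit   : 0 or 1
    step  : quantization step controlling distortion vs. robustness
    """
    U, s, Vt = np.linalg.svd(block, full_matrices=False)
    # Quantize the largest singular value so that the parity of its
    # quantization index encodes the watermark bit.
    q = np.floor(s[0] / step)
    if int(q) % 2 != bit:
        q += 1
    s[0] = q * step + step / 2
    return U @ np.diag(s) @ Vt

def extract_bit(block, step=0.01):
    """Recover the embedded bit from the parity of the quantized singular value."""
    s = np.linalg.svd(block, compute_uv=False)
    return int(np.floor(s[0] / step)) % 2

# Usage: embed a bit into one block of vector-map vertices
vertices = np.random.rand(8, 2) * 100.0
marked = embed_bit(vertices, bit=1)
print(extract_bit(marked))                 # -> 1
print(np.max(np.abs(marked - vertices)))   # distortion stays small
```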
In-situ measurements of ultraviolet (UV) and solar irradiance are very sparse in Nigeria because of cost; irradiance is therefore usually estimated from meteorological parameters. In this work, a low-cost combined UV and solar irradiance (pyranometer) device was developed using locally sourced materials. The instrument consists of a UV sensor (ML8511), a photodiode (BPW34) housed in a carefully sealed, evacuated glass bulb, the UV and solar irradiance sensor amplifiers, a 16-bit analog-to-digital converter (ADS1115), an Arduino Mega 2560, a liquid crystal display (LCD), and a microSD card for data logging. The designed amplifier has an offset voltage of 0.8676 mV. The sensitivity of the irradiance device is 86.819 W m-2/mV with a correction factor of 27.77 W m-2.
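The exact calibration equation is cut off in the abstract; a plausible linear form using the quoted offset, sensitivity, and correction factor, stated purely as an assumption, would be:

```python
# Hypothetical linear calibration built from the quoted figures; the paper's
# actual equation is not visible in the truncated abstract.
SENSITIVITY = 86.819   # W m^-2 per mV of amplifier output
CORRECTION  = 27.77    # W m^-2 additive correction factor
OFFSET_MV   = 0.8676   # amplifier offset voltage in mV

def irradiance_wm2(v_out_mv: float) -> float:
    """Convert amplifier output (mV) to solar irradiance (W m^-2)."""
    return SENSITIVITY * (v_out_mv - OFFSET_MV) + CORRECTION

print(irradiance_wm2(10.0))  # ~820 W m^-2 for a 10 mV reading
```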
This study deals with the processing of field seismic data for seismic line 7Gn 21, located within the administrative boundaries of the Najaf and Muthanna governorates in southern Iraq, with a length of 54 km. The study was carried out in the Processing Department of the Oil Exploration Company using the Omega system, which contains a large number of processing programs; with these programs, both gap and spike predictive deconvolution were applied, and a final section was produced for each type. The gap predictive deconvolution improved the shallow reflectors but gave little improvement for the deep reflectors, thus giving good continuity of the reflectors at shallow depths.
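For readers unfamiliar with the technique, a generic Wiener predictive deconvolution (not the Omega system's implementation) can be sketched as follows; gap = 1 corresponds to spiking deconvolution, while a longer gap preserves the short lags of the wavelet and mainly attacks longer-period multiples:

```python
import numpy as np
from scipy.linalg import solve_toeplitz

def predictive_decon(trace, n_filter, gap=1, prewhite=0.001):
    """Wiener predictive deconvolution of a single seismic trace."""
    # Autocorrelation of the trace (lags 0 .. len-1)
    ac = np.correlate(trace, trace, mode="full")[len(trace) - 1:]
    r = ac[:n_filter].copy()
    r[0] *= (1.0 + prewhite)            # pre-whitening for numerical stability
    g = ac[gap:gap + n_filter]          # right-hand side shifted by the gap
    f = solve_toeplitz(r, g)            # prediction filter
    # Prediction-error filter: 1 at lag 0, -f starting at lag `gap`
    pef = np.zeros(gap + n_filter)
    pef[0] = 1.0
    pef[gap:] = -f
    return np.convolve(trace, pef)[:len(trace)]

# Usage on a synthetic trace containing a decaying repeated (multiple-like) event
t = np.zeros(500); t[50], t[150], t[250] = 1.0, 0.6, 0.36
out = predictive_decon(t, n_filter=80, gap=40)
```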
It has increasingly been recognised that future developments in geospatial data handling will centre on geospatial data on the web: Volunteered Geographic Information (VGI). The evaluation of VGI data quality, including positional and shape similarity, has become a recurrent subject in the scientific literature over the last ten years. The OpenStreetMap (OSM) project is the most popular of the leading VGI platforms. It is an online geospatial database that produces and supplies free, editable geospatial datasets for worldwide use. The goal of this paper is to present a comprehensive overview of the quality assurance of OSM data. In addition, the credibility of open-source geospatial data is discussed.
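As a small illustration of the positional and shape-similarity measures mentioned above (hypothetical matched features, not data from the paper), one could compute the RMSE and Hausdorff distance between an OSM feature and its reference counterpart:

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

# Hypothetical matched features: OSM polyline vs. reference survey polyline (metres)
osm = np.array([[0.0, 0.0], [10.2, 5.1], [20.5, 9.8]])
ref = np.array([[0.3, 0.2], [10.0, 5.0], [20.0, 10.0]])

rmse = np.sqrt(np.mean(np.sum((osm - ref) ** 2, axis=1)))      # positional accuracy
hausdorff = max(directed_hausdorff(osm, ref)[0],
                directed_hausdorff(ref, osm)[0])               # shape similarity
print(f"RMSE = {rmse:.2f} m, Hausdorff = {hausdorff:.2f} m")
```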
In the last decade, the web has rapidly become an attractive platform and an indispensable part of our lives. Unfortunately, as our dependency on the web increases, programmers focus more on functionality and appearance than on security, which has drawn attackers to exploit serious security problems that target web applications and web-based information systems, e.g. through an SQL injection attack. SQL injection, in simple terms, is the process of passing SQL code into interactive web applications that employ database services. Such applications accept user input, for example through a form, and then include this input in database requests, typically SQL statements, in a way that was not intended.
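A minimal sketch of the attack and its standard remedy (illustrative only, not the paper's examples), using Python's built-in sqlite3 module:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'secret')")

user_input = "' OR '1'='1"   # malicious form input

# Vulnerable: the input is concatenated into the SQL text, so the attacker's
# quote characters change the query's logic and the check matches every row.
query = f"SELECT * FROM users WHERE name = '{user_input}'"
print(conn.execute(query).fetchall())        # returns all rows

# Safe: a parameterized query treats the input purely as data.
print(conn.execute("SELECT * FROM users WHERE name = ?", (user_input,)).fetchall())  # []
```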
A data-centric technique, data aggregation via a modified algorithm based on fuzzy clustering with a Voronoi diagram, called the modified Voronoi Fuzzy Clustering Algorithm (VFCA), is presented in this paper. In the modified algorithm, the sensed area is divided into a number of Voronoi cells by applying a Voronoi diagram, and these cells are clustered by the fuzzy C-means method (FCM) to reduce the transmission distance. An appropriate cluster head (CH) is then elected for each cluster. Three parameters are used in this election process: the energy, the distance between the CH and its neighboring sensors, and the packet-loss value. Furthermore, data aggregation is employed at each CH to reduce the amount of transmitted data.
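A compact sketch of two of the core steps, fuzzy C-means clustering and cluster-head election, is given below; the composite election score is an assumption, since the abstract does not state how the three parameters are weighted:

```python
import numpy as np

def fuzzy_c_means(points, c, m=2.0, iters=100, seed=0):
    """Plain fuzzy C-means: returns cluster centres and the membership matrix."""
    rng = np.random.default_rng(seed)
    u = rng.random((len(points), c))
    u /= u.sum(axis=1, keepdims=True)
    for _ in range(iters):
        w = u ** m
        centres = (w.T @ points) / w.sum(axis=0)[:, None]
        d = np.linalg.norm(points[:, None, :] - centres[None, :, :], axis=2) + 1e-12
        u = 1.0 / (d ** (2 / (m - 1)))
        u /= u.sum(axis=1, keepdims=True)
    return centres, u

def elect_cluster_head(nodes, energy, packet_loss, centre):
    """Score each node on residual energy, distance to the cluster centre,
    and packet loss; the node with the best composite score becomes the CH."""
    dist = np.linalg.norm(nodes - centre, axis=1)
    score = energy / energy.max() - dist / dist.max() - packet_loss / packet_loss.max()
    return int(np.argmax(score))

# Usage: cluster 2-D sensor positions into 3 clusters and elect a CH for each
sensors = np.random.rand(60, 2)
energy, loss = np.random.rand(60), np.random.rand(60)
centres, u = fuzzy_c_means(sensors, c=3)
labels = u.argmax(axis=1)
heads = [elect_cluster_head(sensors[labels == k], energy[labels == k],
                            loss[labels == k], centres[k]) for k in range(3)]
```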
Data mining is one of the most popular analysis methods in medical research. It involves finding previously unknown patterns and correlations in datasets. Data mining encompasses various areas of biomedical research, including data collection, clinical decision support, illness or safety monitoring, public health, and inquiry research. Health analytics frequently uses computational methods for data mining, such as clustering, classification, and regression. Studies of large numbers of diverse, heterogeneous documents, including biological and electronic information, have provided extensive material for medical and health studies.
Loanwords are words transferred from one language to another that become an essential part of the borrowing language. Loanwords pass from the source language to the recipient language for many reasons. Detecting these loanwords is a complicated task because there are no standard specifications for transferring words between languages, and hence accuracy is low. This work tries to improve the accuracy of detecting loanwords between Turkish and Arabic as a case study. In this paper, the proposed system contributes to finding all possible loanwords using any set of characters, either alphabetically or randomly arranged. It then processes distortion in pronunciation and solves the problem of missing letters.
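A toy illustration of matching candidate loanwords under pronunciation distortion (the character mappings and threshold here are placeholders, not the proposed system's actual rules):

```python
from difflib import SequenceMatcher

# Illustrative normalisation of Turkish diacritics before comparison
NORMALISE = str.maketrans({"ç": "c", "ş": "s", "ğ": "g", "ı": "i", "ö": "o", "ü": "u"})

def similarity(turkish_word: str, arabic_translit: str) -> float:
    """String similarity after normalisation, tolerating pronunciation
    distortion and missing letters."""
    a = turkish_word.lower().translate(NORMALISE)
    b = arabic_translit.lower().translate(NORMALISE)
    return SequenceMatcher(None, a, b).ratio()

def is_loanword_candidate(tr: str, ar: str, threshold: float = 0.8) -> bool:
    return similarity(tr, ar) >= threshold

print(is_loanword_candidate("kitap", "kitab"))   # True: Arabic kitab -> Turkish kitap
```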
Thyroid disease is a common disease affecting millions worldwide. Early diagnosis and treatment can help prevent more serious complications and improve long-term health outcomes. However, diagnosing thyroid disease can be challenging because of its variable symptoms and the limited diagnostic tests available. By processing enormous amounts of data and detecting trends that may not be immediately evident to human doctors, machine learning (ML) algorithms may be able to increase the accuracy with which thyroid disease is diagnosed. This study seeks to identify the most recent ML-based and data-driven developments and strategies for diagnosing thyroid disease, while considering the challenges associated with imbalanced thyroid disease data.
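One common way to address the imbalanced-data challenge mentioned above is class re-weighting; a minimal sketch on synthetic data follows (the dataset, features, and model are placeholders, not the techniques surveyed in the study):

```python
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report
from sklearn.datasets import make_classification

# Synthetic stand-in for a thyroid dataset: only 5% positive (diseased) cases
X, y = make_classification(n_samples=2000, n_features=10, weights=[0.95, 0.05],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# class_weight='balanced' re-weights the minority class instead of ignoring it
clf = RandomForestClassifier(class_weight="balanced", random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))
```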
Conventional clustering algorithms are incapable of managing and analyzing the rapid growth of data generated from different sources. Parallel clustering is one of the robust solutions to this problem. The Apache Hadoop architecture is one of the ecosystems that provide the capability to store and process data in a distributed and parallel fashion. In this paper, a parallel model is designed to run the k-means clustering algorithm in the Apache Hadoop ecosystem by connecting three nodes, one serving as the server (name) node and the other two as client (data) nodes. The aim is to speed up the management of data at massive scale.
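A toy, in-process sketch of one MapReduce-style k-means iteration is shown below; the real system would distribute the map and reduce steps across the Hadoop data nodes (e.g. via Hadoop Streaming) rather than running them locally:

```python
import numpy as np

def mapper(points, centroids):
    """Map step: emit (nearest-centroid id, point) pairs."""
    for p in points:
        key = int(np.argmin(np.linalg.norm(centroids - p, axis=1)))
        yield key, p

def reducer(pairs, k):
    """Reduce step: average the points assigned to each centroid."""
    dim = pairs[0][1].shape[0]
    sums, counts = np.zeros((k, dim)), np.zeros(k)
    for key, p in pairs:
        sums[key] += p
        counts[key] += 1
    return sums / np.maximum(counts, 1)[:, None]

# One iteration on random data with k = 3
data = np.random.rand(1000, 2)
centroids = data[:3].copy()
centroids = reducer(list(mapper(data, centroids)), k=3)
```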
The reduction to the pole of the aeromagnetic map of the Western Desert of Iraq has been used to outline the main basement structural features. Three selected magnetic anomalies are used to determine the depths of their magnetic sources. The depths are estimated using the slope half-slope method and corrected through the application of a published nomogram. These depths are compared with previously published depth values, providing a new look at the basement of the Western Desert in addition to the thickness map of the Paleozoic formations. The results shed light on the importance of the great depths of the basement structures and, in turn, of the sedimentary cover, which should be considered in future hydrocarbon exploration.
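As a hedged worked example of the kind of rule of thumb involved (a common Peters half-slope variant; the nomogram correction applied in the paper is not reproduced here):

```python
# Rule of thumb: depth to a magnetic source is roughly the horizontal
# separation of the two half-slope tangent points on the anomaly profile
# divided by an index factor near 1.6 (typically quoted between 1.2 and 2.0).
half_slope_separation_km = 8.0      # hypothetical reading from an anomaly profile
index_factor = 1.6
depth_km = half_slope_separation_km / index_factor
print(f"Estimated source depth ~ {depth_km:.1f} km")   # ~ 5.0 km
```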