With the rapid progress of information technology and computer networks, geospatial data have become very easy to reproduce and share because of their digital form. Consequently, the use of geospatial data suffers from problems such as data authentication, ownership proof, and illegal copying, which pose a major challenge to its future use. This paper introduces a new watermarking scheme to ensure copyright protection of digital vector maps. The main idea of the proposed scheme is to transform the digital map to the frequency domain using the Singular Value Decomposition (SVD) in order to determine suitable areas in which to insert the watermark data. The digital map is separated into isolated parts, and watermark data are embedded within the nominated magnitudes in each part when definite criteria are satisfied. The efficiency of the proposed watermarking scheme is assessed with statistical measures based on two factors, fidelity and robustness. Experimental results demonstrate that the proposed scheme represents an ideal trade-off between the amount of distortion and robustness, and that it resists many kinds of attacks.
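The singular-value embedding step described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact scheme: it assumes quantization-index modulation of the dominant singular value of each data block, and the block shape, quantization step `q`, and function names are all chosen here for illustration.

```python
import numpy as np

def embed_bit(block, bit, q=8.0):
    """Embed one watermark bit in a block by quantizing its largest
    singular value: even multiples of q encode 0, odd multiples encode 1
    (illustrative sketch, not the paper's exact criteria)."""
    U, s, Vt = np.linalg.svd(block)
    k = np.round(s[0] / q)
    if int(k) % 2 != bit:
        k += 1  # shift to the nearest multiple with the right parity
    s[0] = k * q
    return U @ np.diag(s) @ Vt

def extract_bit(block, q=8.0):
    """Recover the embedded bit from the parity of the quantized
    dominant singular value."""
    s = np.linalg.svd(block, compute_uv=False)
    return int(np.round(s[0] / q)) % 2
```

A larger `q` makes the mark survive stronger perturbations at the cost of more distortion, which is exactly the fidelity/robustness trade-off the abstract discusses.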
In this paper, a visible image watermarking algorithm based on the biorthogonal wavelet transform is proposed. The watermark (logo), a binary image, can be embedded in the host gray image using the coefficient bands of the host image transformed into the biorthogonal domain. The logo can be embedded in the top-left corner or spread over the whole host image. A scaling value (α) in the frequency domain is introduced to control the perception of the watermarked image. Experimental results show that this algorithm gives a visible logo with no losses in the recovery process of the original image, and the calculated PSNR values support this. Good robustness against attempts to remove the watermark was shown.
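The fidelity measure cited above, PSNR, has a standard definition that can be computed as follows (assuming 8-bit images with peak value 255; the function and parameter names are ours):

```python
import numpy as np

def psnr(original, processed, peak=255.0):
    """Peak signal-to-noise ratio in dB between two images of equal shape.
    Higher is better; identical images give infinity."""
    mse = np.mean((np.asarray(original, float) - np.asarray(processed, float)) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(peak ** 2 / mse)
```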
In this research, a comparison has been made between the robust M-estimators for the cubic smoothing splines technique, used to avoid the problem of non-normality or contamination in the errors, and the traditional estimation method for cubic smoothing splines, using two comparison criteria (MADE, WASE) for different sample sizes and disparity levels, in order to estimate the time-varying coefficient functions for balanced longitudinal data. Such data consist of observations obtained from (n) independent subjects, each measured repeatedly at a set of specific time points (m), since the repeated measurements within subjects are almost correlated …
Recent developments in technology and digital communications have rendered digital images increasingly susceptible to tampering and alteration by unauthorized persons. This may appear acceptable, especially when an image editing process is needed to delete or add a particular scene that improves the quality of the image. But what about images used in authorized governmental transactions? The consequences can be severe: under the law, any altered document is considered forged and may cause confusion, and any document that cannot be verified as authentic is regarded as fake and cannot be used, inflicting harm on people. The suggested work intends to reduce fraud in electronic documents …
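The abstract is truncated before describing its method, so the following is only a generic illustration of document tamper detection, not the suggested work's scheme: a keyed hash (HMAC-SHA256) binds the document bytes to a secret key, and any alteration invalidates the tag.

```python
import hashlib
import hmac

def sign_document(data: bytes, key: bytes) -> str:
    """Return an HMAC-SHA256 tag binding the document bytes to a secret key."""
    return hmac.new(key, data, hashlib.sha256).hexdigest()

def verify_document(data: bytes, key: bytes, tag: str) -> bool:
    """Check the tag in constant time; False means tampering or wrong key."""
    return hmac.compare_digest(sign_document(data, key), tag)
```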
With the turn of the twenty-first century, human societies have witnessed a huge revolution in information, the result of rapid scientific and technological developments in space science and communications. These have made the whole world like a small village, connected not by roads as in ancient times but by rapid transportation, by remote-sensing devices that roam in space observing everything on the ground, and by information networks that flood the world with a tremendous amount of information available to every inhabitant of the earth. This information has become a requirement for human life, survival, and well-being, as it has given humans the opportunity …
This paper discusses the use of the H2 and H∞ robust control approaches for designing control systems. These approaches are applied to elementary control system designs, and their respective implementations, advantages, and disadvantages are introduced. H∞ control synthesis mainly enforces closed-loop stability while covering some physical constraints and limitations, whereas noise rejection and disturbance attenuation are more naturally expressed as a performance optimization, which the H2 control synthesis problem can represent. The paper also applies these two methodologies to multi-plant systems to study the stability and performance of the designed controllers. Simulation results show that the H2 controller tracks a desirable …
Data security is a fundamental parameter in the development of communication systems. The capability to protect and secure information is essential for the growth of data security and electronic commerce. Cryptography has a significant influence on information security systems against a variety of attacks: higher complexity in secret keys increases both the security and the complexity of the cryptographic algorithms, and sufficiently new versions of cryptographic methods may help reduce security attacks. The main aim of this research is to serve the purpose of information security by adding a new security level to the Advanced Encryption Standard (AES) algorithm …
The paired-sample t-test is a classical test statistic used to test the difference between two means in paired data, but it is not robust against violation of the normality assumption. In this paper, some alternative robust tests are suggested by combining Jackknife resampling with the Wilcoxon signed-rank test for small sample sizes and with the Wilcoxon signed-rank test using the normal approximation for large sample sizes. Monte Carlo simulation experiments were employed to study the performance of these test statistics in terms of their type I error rates and power rates. All these tests were applied to different …
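The Wilcoxon signed-rank statistic at the core of the suggested tests can be computed as below. This is a plain implementation of the standard statistic W+ (zero differences dropped, tied absolute differences given average ranks); the Jackknife combination proposed in the paper is not reproduced here.

```python
def wilcoxon_signed_rank(x, y):
    """Wilcoxon signed-rank statistic W+ for paired samples x, y:
    sum of the ranks of |x_i - y_i| over the positive differences."""
    d = [a - b for a, b in zip(x, y) if a != b]  # drop zero differences
    order = sorted(range(len(d)), key=lambda i: abs(d[i]))
    ranks = [0.0] * len(d)
    i = 0
    while i < len(d):
        j = i
        # extend j over a run of tied |d| values
        while j + 1 < len(d) and abs(d[order[j + 1]]) == abs(d[order[i]]):
            j += 1
        avg_rank = (i + j) / 2 + 1  # average rank of the tied run
        for k in range(i, j + 1):
            ranks[order[k]] = avg_rank
        i = j + 1
    return sum(r for r, di in zip(ranks, d) if di > 0)
```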
Human Interactive Proofs (HIPs) are automated reverse Turing tests intended to differentiate between people and malicious computer programs. Building a good HIP system is a challenging task, since the resulting HIP must be secure against attacks and at the same time practical for humans. Text-based HIPs are one of the most popular types: they exploit the ability of humans to read text images better than Optical Character Recognition (OCR). However, current text-based HIPs are not well matched with the rapid development of computer vision techniques, since they are either very easily passed or very hard to solve, and this motivates …
Simulation Study
Abstract:
Robust statistics are known for their resistance to errors resulting from deviations from the assumed statistical properties (approximate unbiasedness and efficiency) of data drawn from a wide range of probability distributions, whether a normal distribution or a mixture of distributions with different standard deviations.
The power spectrum function plays a principal role in the analysis of stationary random processes ordered in time, which may be discrete or continuous random variables; it measures the total power of the process as a function of frequency.
Estimation methods share …
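The power spectrum of a stationary series is commonly estimated by the periodogram; the following is a minimal sketch via the discrete Fourier transform (our own illustration, not one of the estimation methods compared in the study):

```python
import cmath

def periodogram(x):
    """Periodogram estimate of the power spectrum of a real series x:
    I(f_k) = |DFT(x)_k|^2 / n at the Fourier frequencies f_k = k/n,
    for k = 0 .. n//2 (one-sided)."""
    n = len(x)
    spec = []
    for k in range(n // 2 + 1):
        s = sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
        spec.append(abs(s) ** 2 / n)
    return spec
```

A pure cosine at Fourier frequency k/n concentrates all its power in the single ordinate I(f_k), which is a quick sanity check for any implementation.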
In recent years, the number of applications utilizing mobile wireless sensor networks (WSNs) has increased, with the intent of localization for the purpose of monitoring and obtaining data from hazardous areas. The location of an event is critical in a WSN, as sensed data are almost meaningless without location information. In this paper, two Monte Carlo based localization schemes, termed MCL and MSL*, are studied. MCL obtains its location through anchor nodes, whereas MSL* uses both anchor nodes and normal nodes. The use of normal nodes increases accuracy and reduces dependency on anchor nodes, but increases communication costs. For this reason, we introduce a new approach, called low-communication-cost schemes, to reduce communication …
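The anchor-based filtering idea behind Monte Carlo localization can be sketched as follows. This is a simplified, hypothetical sketch: it draws candidate positions uniformly over the deployment region, keeps only those within radio range `r` of every heard anchor, and takes their centroid as the position estimate; the full MCL and MSL* schemes are more elaborate than this single filtering step.

```python
import math
import random

def mcl_step(anchors, heard, n_samples=300, r=1.0, region=(0.0, 10.0)):
    """One simplified Monte Carlo localization filtering step.
    anchors: dict id -> (x, y) of anchor positions.
    heard:   ids of anchors the node received beacons from (one hop).
    Keeps n_samples candidate positions consistent with the constraint
    that the node lies within range r of every heard anchor."""
    lo, hi = region
    kept = []
    while len(kept) < n_samples:
        p = (random.uniform(lo, hi), random.uniform(lo, hi))
        if all(math.dist(p, anchors[a]) <= r for a in heard):
            kept.append(p)
    # position estimate = centroid of the surviving samples
    x = sum(p[0] for p in kept) / len(kept)
    y = sum(p[1] for p in kept) / len(kept)
    return (x, y)
```

Hearing more anchors shrinks the feasible region and tightens the estimate, which is why MSL*'s extra use of normal nodes improves accuracy at the price of the communication cost discussed above.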