Data hiding is the process of encoding extra information in an image by making small modifications to its pixels. To be practical, the hidden data must be perceptually invisible yet robust to common signal processing operations. This paper introduces a scheme for hiding a signature image that can be as large as 25% of the host image data, so it can serve both for digital watermarking and for image/data hiding. The proposed algorithm applies the discrete slantlet transform, an orthogonal discrete wavelet transform with two zero moments and improved time localization, to both the host and the signature image. A scaling factor in the frequency domain controls the quality of the watermarked images. Experimental results on recovering the signature image after applying JPEG coding to the watermarked image are included.
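As a hedged illustration of additive wavelet-domain embedding (a minimal sketch only: PyWavelets has no slantlet transform, so a Haar DWT stands in for it, and alpha denotes the frequency-domain scaling factor whose symbol is garbled in the abstract):

```python
# Hedged sketch: additive embedding in the wavelet domain. A Haar DWT
# stands in for the slantlet transform; alpha is the scaling factor.
import numpy as np
import pywt

def embed(host, signature, alpha=0.1):
    """Hide `signature` (a quarter of the host's pixels, matching one
    subband and the 25% figure) by adding its scaled pixels to the
    host's diagonal detail band."""
    cA, (cH, cV, cD) = pywt.dwt2(host.astype(float), 'haar')
    cD_marked = cD + alpha * signature.astype(float)  # shapes must match
    return pywt.idwt2((cA, (cH, cV, cD_marked)), 'haar')

def extract(marked, host, alpha=0.1):
    """Recover the signature by differencing subbands against the host."""
    _, (_, _, cD_m) = pywt.dwt2(marked.astype(float), 'haar')
    _, (_, _, cD) = pywt.dwt2(host.astype(float), 'haar')
    return (cD_m - cD) / alpha
```

Because the Haar transform is orthogonal, extraction recovers the signature exactly when the marked image is undistorted; JPEG coding perturbs the difference, which is what the robustness experiments measure.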
This paper addresses the nature of Spatial Data Infrastructure (SDI), considered one of the most important concepts for ensuring the effective functioning of a modern society. An SDI comprises a continually developing set of methods and procedures providing the geospatial base that supports a country's governmental, environmental, economic, and social activities. In general, the SDI framework integrates various elements, including standards, policies, networks, data, end users, and application areas. The transformation of previously paper-based map data into digital form, the emergence of GIS, and the rise of the Internet and a host of online applications (e.g., environmental impact analysis, navigation, applications of VGI data…)
Finding orthogonal matrices of different sizes is a complex and important problem, because such matrices are used in applications like image processing and communications (e.g., CDMA and OFDM). In this paper we introduce a new method for finding orthogonal matrices by taking tensor products of two or more real- or complex-valued orthogonal matrices, and we apply it to image and communication signal processing. The output matrices are themselves orthogonal, and the construction is very simple compared to classical methods that rely on direct proofs. The results on communication signals and images are acceptable, although further research is needed.
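A minimal sketch of the underlying fact, using NumPy's kron as the tensor product (the 2x2 seed matrices below are illustrative, not taken from the paper): the Kronecker product of two orthogonal matrices is itself orthogonal, since (A⊗B)(A⊗B)ᵀ = (AAᵀ)⊗(BBᵀ) = I.

```python
import numpy as np

# Two small orthogonal seed matrices: a 2x2 rotation and a
# normalized 2x2 Hadamard matrix.
theta = np.pi / 6
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
B = np.array([[1.0,  1.0],
              [1.0, -1.0]]) / np.sqrt(2)

# Tensor (Kronecker) product: a 4x4 matrix built from the 2x2 seeds.
C = np.kron(A, B)

# Orthogonality check: C @ C.T should be the identity.
assert np.allclose(C @ C.T, np.eye(4))
print(C.round(3))
```

Repeating the product with further orthogonal factors grows the size multiplicatively while preserving orthogonality, which is what makes the construction cheap compared to building large orthogonal matrices directly.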
In this research, a 4×4 factorial experiment applied in a randomized complete block design was studied. Designed experiments are used to study the effect of treatments on experimental units and thus to obtain data representing the experiment's observations. Applying these treatments under different environmental and experimental conditions introduces noise that affects the observed values and thus inflates the mean square error of the experiment. To reduce this noise, multiple wavelet shrinkage was used to filter the observations, with a proposed improved threshold that accounts for the different transform levels based on the logarithm of the …
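As a hedged sketch of level-dependent wavelet shrinkage with PyWavelets (the rule below is the standard universal threshold with an ad-hoc per-level scaling, shown only for illustration; it is not the improved threshold proposed in the paper):

```python
import numpy as np
import pywt

def denoise(obs, wavelet='db4', levels=3):
    """Soft-threshold detail coefficients with a level-dependent threshold."""
    coeffs = pywt.wavedec(obs, wavelet, level=levels)
    # Robust noise estimate from the finest detail band.
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    out = [coeffs[0]]  # keep the approximation coefficients untouched
    for j, d in enumerate(coeffs[1:], start=1):
        # Universal threshold scaled per level: a simple stand-in for
        # the paper's log-based level-dependent rule.
        thr = sigma * np.sqrt(2 * np.log(len(obs))) / np.sqrt(j)
        out.append(pywt.threshold(d, thr, mode='soft'))
    return pywt.waverec(out, wavelet)[:len(obs)]
```

Filtering the observations this way before the analysis of variance is what lowers the mean square error relative to working with the raw noisy data.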
An in-depth experimental study of the matrix effect of antifreeze (ethylene glycol) and water contamination of engine oil was carried out using FT-IR spectroscopy. Comparing percent-by-volume concentrations of contaminants in fresh 15W-40 engine oil, a noticeable reduction in the O–H stretching signal appeared in the infrared spectrum when ethylene glycol based antifreeze was included as a contaminant. Distilled water, a 50/50 mixture of water and commercial ethylene glycol antifreeze, and straight ethylene glycol antifreeze were compared as contaminants, and a reduction in the O–H stretch signal was clearly evident whenever glycol was present. Doubling the volume of the 50/50 mixture as compared to water alone still …
Today, large amounts of geospatial data are available on the web, such as Google Map (GM), OpenStreetMap (OSM), the Flickr service, and Wikimapia; all of these services are called open-source geospatial data. Geospatial data from different sources often has variable accuracy due to different data collection methods, so its accuracy may not meet user requirements in every organization. This paper aims to develop a tool to assess the quality of GM data by comparing it with formal data, such as the spatial data of the Mayoralty of Baghdad (MB). The tool was developed in the Visual Basic language and validated on two study areas in Baghdad, Iraq (Al-Karada and Al-Kadhumiyah). The positional accuracy was assessed …
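The paper's tool is written in Visual Basic, so the following Python sketch is purely illustrative of one standard way such a tool can quantify positional accuracy: root-mean-square error over matched check points (the point format and the coordinates below are hypothetical):

```python
import numpy as np

def positional_rmse(test_pts, ref_pts):
    """RMSE of planar offsets between matched test and reference points.

    Both arrays are N x 2 (easting, northing) in the same projected CRS.
    """
    test_pts = np.asarray(test_pts, dtype=float)
    ref_pts = np.asarray(ref_pts, dtype=float)
    d = np.hypot(*(test_pts - ref_pts).T)  # per-point horizontal offset
    return float(np.sqrt(np.mean(d ** 2)))

# Hypothetical example: three points digitized from GM vs. MB reference data.
gm = [(445120.3, 3687410.8), (445210.1, 3687395.2), (445307.6, 3687450.0)]
mb = [(445118.9, 3687412.1), (445212.4, 3687396.0), (445305.0, 3687448.7)]
print(f"RMSE = {positional_rmse(gm, mb):.2f} m")
```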
It is often necessary to have circuits that display the decimal representation of a binary number, in this paper specifically on a 7-segment display. A circuit that displays the decimal equivalent of an n-bit binary number is designed, and its behavior is described in the Verilog Hardware Description Language (HDL). The HDL program is then used to configure an FPGA to implement the designed circuit.
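The paper's design is in Verilog; as a purely illustrative, language-neutral sketch of the underlying combinational logic, here is a Python lookup of 7-segment patterns for each decimal digit of an n-bit value (the gfedcba bit order and active-high polarity are assumptions, not taken from the paper):

```python
# Segment patterns for digits 0-9, bit order gfedcba, active high
# (both the bit order and the polarity are assumptions).
SEG = [0x3F, 0x06, 0x5B, 0x4F, 0x66, 0x6D, 0x7D, 0x07, 0x7F, 0x6F]

def binary_to_7seg(value, bits=8):
    """Return one segment pattern per decimal digit of an n-bit value."""
    if not 0 <= value < (1 << bits):
        raise ValueError(f"value must fit in {bits} bits")
    return [SEG[int(d)] for d in str(value)]

# 8-bit input 0b10111010 = 186 -> patterns for the digits 1, 8, 6.
print([hex(p) for p in binary_to_7seg(0b10111010)])
```

In hardware, the same mapping is typically a binary-to-BCD conversion (e.g., double-dabble) followed by one lookup table per display digit.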
Big data usually runs on large-scale, centralized key management systems. However, centralized key management raises problems such as a single point of failure, exchange of secret keys over insecure channels, third-party queries, and the key escrow problem. To avoid these problems, we propose an improved certificate-based encryption scheme that ensures data confidentiality by combining symmetric and asymmetric cryptography. The combination is implemented using the Advanced Encryption Standard (AES) and Elliptic Curve Diffie-Hellman (ECDH). The proposed scheme is an enhanced version of the Certificate-Based Encryption (CBE) scheme and preserves all its advantages.
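A minimal sketch of the AES/ECDH combination using the Python `cryptography` package (the curve, the KDF parameters, and the AES-GCM mode are assumptions; the paper's certificate-based key handling is not shown):

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Each party generates an EC key pair (the curve choice is an assumption).
alice_priv = ec.generate_private_key(ec.SECP256R1())
bob_priv = ec.generate_private_key(ec.SECP256R1())

# ECDH: both sides derive the same shared secret from the peer public key.
shared = alice_priv.exchange(ec.ECDH(), bob_priv.public_key())

# Derive a 256-bit AES key from the shared secret.
aes_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
               info=b"cbe-demo").derive(shared)

# Symmetric encryption of the bulk data with AES-GCM.
nonce = os.urandom(12)
ciphertext = AESGCM(aes_key).encrypt(nonce, b"big data record", None)

# Receiver side: same derivation from the other direction, then decrypt.
shared2 = bob_priv.exchange(ec.ECDH(), alice_priv.public_key())
key2 = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
            info=b"cbe-demo").derive(shared2)
assert AESGCM(key2).decrypt(nonce, ciphertext, None) == b"big data record"
```

The asymmetric exchange removes the need to send a secret key over an insecure channel, while the symmetric cipher keeps bulk encryption fast, which is the usual motivation for this hybrid split.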
This study aimed to investigate the role of Big Data in forecasting corporate bankruptcy through a field analysis in the Saudi business environment, testing that relationship. The study found that Big Data is a recently adopted variable in the business context with multiple accounting effects and benefits; among them is forecasting and disclosing corporate financial failures and bankruptcies, which rests on three main elements for reporting and disclosure: the firm's internal control system, external auditing, and financial analysts' forecasts. The study recommends, given that the greatest risk of Big Data is the slow adaptation of accountants and auditors to these technologies, …
The science of information security has become a concern of many researchers, whose efforts seek solutions and technologies that ensure information is transferred more securely over networks, especially the Internet, without any compromise of that information, given the risk of sending digital data between two parties over an insecure channel. This paper combines two data protection techniques. The first technique is cryptography using the Menezes-Vanstone elliptic curve ciphering system, which relies on public-key technology. The encoded data is then randomly embedded in the frame, depending on the seed used. The experimental results, using a PSNR within average …
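A minimal sketch of the seed-dependent random embedding step and the PSNR check, assuming plain LSB embedding into a grayscale frame (the Menezes-Vanstone cipher itself is not implemented here; the payload bits stand in for its ciphertext):

```python
import numpy as np

def embed_bits(frame, bits, seed):
    """Hide a bit string in pseudo-randomly chosen pixel LSBs."""
    rng = np.random.default_rng(seed)  # the seed selects the pixel positions
    idx = rng.choice(frame.size, size=len(bits), replace=False)
    out = frame.copy().ravel()
    out[idx] = (out[idx] & 0xFE) | np.asarray(bits, dtype=np.uint8)
    return out.reshape(frame.shape)

def psnr(a, b, peak=255.0):
    mse = np.mean((a.astype(float) - b.astype(float)) ** 2)
    return float(10 * np.log10(peak ** 2 / mse))

frame = np.random.default_rng(0).integers(0, 256, (64, 64), dtype=np.uint8)
payload = np.random.default_rng(1).integers(0, 2, 512)  # e.g. ciphertext bits
stego = embed_bits(frame, payload, seed=42)
print(f"PSNR = {psnr(frame, stego):.1f} dB")  # high PSNR -> imperceptible
```

Only a receiver who knows the seed can regenerate the same pseudo-random pixel positions and read the embedded bits back out.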