"Watermarking" is one method in which digital information is buried in a carrier signal;
the hidden information should be related to the carrier signal. There are many different types of
digital watermarking, including traditional watermarking that uses visible media (such as snaps,
images, or video), and a signal may be carrying many watermarks. Any signal that can tolerate
noise, such as audio, video, or picture data, can have a digital watermark implanted in it. A digital
watermark must be able to withstand changes that can be made to the carrier signal in order to
protect copyright information in media files. The goal of digital watermarking is to ensure the
integrity of data, whereas steganography focuses on making information undetectable to humans.
Watermarking doesn't alter the original digital image, unlike public-key encryption, but rather
creates a new one with embedded secured aspects for integrity. There are no residual effects of
encryption on decrypted documents. This work focuses on strong digital image watermarking
algorithms for copyright protection purposes. Watermarks of various sorts and uses were
discussed, as well as a review of current watermarking techniques and assaults. The project shows
how to watermark an image in the frequency domain using DCT and DWT, as well as in the spatial
domain using the LSB approach. When it comes to noise and compression, frequency-domain
approaches are far more resilient than LSB. All of these scenarios necessitate the use of the original
picture to remove the watermark. Out of the three, the DWT approach has provided the best results.
We can improve the resilience of our watermark while having little to no extra influence on image
quality by embedding watermarks in these places.
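As an illustration of the spatial-domain technique, the following is a minimal LSB embedding sketch in Python with NumPy. The function names (`embed_lsb`, `extract_lsb`) and the toy data are hypothetical, not this work's implementation; note that in this simplified form extraction happens to be blind, whereas the schemes evaluated here require the original image.

```python
# Minimal sketch of spatial-domain LSB watermarking, assuming an 8-bit
# grayscale cover image and a binary watermark of the same shape.
import numpy as np

def embed_lsb(cover: np.ndarray, watermark_bits: np.ndarray) -> np.ndarray:
    """Replace the least significant bit of each pixel with a watermark bit."""
    return (cover & 0xFE) | (watermark_bits & 1)

def extract_lsb(watermarked: np.ndarray) -> np.ndarray:
    """Recover the watermark by reading back the least significant bits."""
    return watermarked & 1

# Example with random toy data (illustrative only).
rng = np.random.default_rng(0)
cover = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
bits = rng.integers(0, 2, size=(64, 64), dtype=np.uint8)
wm = embed_lsb(cover, bits)
assert np.array_equal(extract_lsb(wm), bits)
```

Because only the lowest bit plane changes, the visual impact is negligible, but the watermark is easily destroyed by compression or noise, which is the weakness noted above.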
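For the frequency domain, the sketch below shows additive embedding into DCT coefficients using SciPy's `dctn`/`idctn`. The embedding strength `alpha` and the mid-frequency block starting at coefficient (32, 32) are illustrative assumptions, not parameters from this work; extraction is non-blind, consistent with the requirement that the original image be available.

```python
# Minimal sketch of additive DCT-domain watermark embedding (non-blind).
import numpy as np
from scipy.fft import dctn, idctn

def embed_dct(cover: np.ndarray, watermark: np.ndarray,
              alpha: float = 10.0) -> np.ndarray:
    coeffs = dctn(cover.astype(float), norm="ortho")
    h, w = watermark.shape
    # Add the watermark into a mid-frequency block of coefficients
    # (offset 32 is an arbitrary illustrative choice).
    coeffs[32:32 + h, 32:32 + w] += alpha * watermark
    return idctn(coeffs, norm="ortho")

def extract_dct(watermarked: np.ndarray, original: np.ndarray,
                shape: tuple, alpha: float = 10.0) -> np.ndarray:
    # Non-blind extraction: subtract the original image's coefficients.
    diff = dctn(watermarked, norm="ortho") - dctn(original.astype(float), norm="ortho")
    h, w = shape
    return diff[32:32 + h, 32:32 + w] / alpha

rng = np.random.default_rng(0)
cover = rng.integers(0, 256, size=(128, 128)).astype(float)
mark = rng.standard_normal((32, 32))
wm_img = embed_dct(cover, mark)
assert np.allclose(extract_dct(wm_img, cover, mark.shape), mark)
```

Spreading the watermark energy across mid-frequency coefficients is what gives DCT schemes their robustness to JPEG-style compression relative to LSB.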
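Likewise, a minimal DWT sketch using PyWavelets is given below, assuming the watermark is added to the level-1 horizontal detail sub-band (cH) of a Haar decomposition; the wavelet, sub-band, and strength are assumptions for illustration, not this work's exact configuration.

```python
# Minimal sketch of DWT sub-band watermark embedding (non-blind).
import numpy as np
import pywt

def embed_dwt(cover: np.ndarray, watermark: np.ndarray,
              alpha: float = 0.1) -> np.ndarray:
    cA, (cH, cV, cD) = pywt.dwt2(cover.astype(float), "haar")
    cH = cH + alpha * watermark  # watermark must match cH's shape
    return pywt.idwt2((cA, (cH, cV, cD)), "haar")

def extract_dwt(watermarked: np.ndarray, original: np.ndarray,
                alpha: float = 0.1) -> np.ndarray:
    # Non-blind extraction: compare detail sub-bands against the original.
    _, (cH_w, _, _) = pywt.dwt2(watermarked, "haar")
    _, (cH_o, _, _) = pywt.dwt2(original.astype(float), "haar")
    return (cH_w - cH_o) / alpha

rng = np.random.default_rng(0)
cover = rng.integers(0, 256, size=(128, 128)).astype(float)
mark = rng.standard_normal((64, 64))  # level-1 cH of 128x128 is 64x64
wm_img = embed_dwt(cover, mark)
assert np.allclose(extract_dwt(wm_img, cover), mark)
```

The multi-resolution structure of the DWT lets the embedding strength be tuned per sub-band, which is one reason the DWT approach achieves the best trade-off between robustness and image quality in the comparison above.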