Cloud computing offers a new mode of service provision by rearranging various resources over the Internet. The most important and popular cloud service is data storage. To preserve the privacy of data holders, data are often stored in the cloud in encrypted form. However, encrypted data introduce new challenges for cloud data deduplication, which is crucial for big data storage and processing in the cloud, and traditional deduplication schemes cannot operate on encrypted data. Among such data, digital videos are particularly large in storage size and cost, so techniques that serve the legal interests of video owners, such as copyright protection, while reducing cloud storage cost are always desirable. This paper focuses on video copyright protection and deduplication, proposing a scheme for cloud storage environments that combines the H.264 compression algorithm with SHA-512 hashing. The scheme provides content-based copyright protection and deduplication to authenticate and verify the integrity of the compressed H.264 video. Its design consists of two modules. First, the H.264 compression algorithm is applied to the video supplied by the user. Second, the user generates a unique signature over different time intervals of the compressed video, in such a way that the cloud service provider (CSP) can use it to compare the user's video against other videos without compromising its security. To prevent an attacker from gaining access to the hash signature while it is uploaded to the cloud, the signature is encrypted with the user's password. Experimental results are provided, showing the effectiveness of the proposed copyright protection and deduplication system.
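The signature idea above can be sketched in a few lines. This is a minimal illustration, not the paper's exact pipeline: fixed-size byte segments stand in for the time intervals of the compressed stream, and HMAC-SHA-512 with a PBKDF2 password-derived key stands in for the password-encryption step; the segment size, salt, and iteration count are illustrative assumptions. Identical segments yield identical digests, which is what lets the CSP detect duplicates without seeing the plaintext video.

```python
import hashlib
import hmac

def segment_signatures(video_bytes, segment_size=1 << 20):
    """SHA-512 digest per fixed-size segment of the compressed stream."""
    return [hashlib.sha512(video_bytes[i:i + segment_size]).hexdigest()
            for i in range(0, len(video_bytes), segment_size)]

def protect_signatures(signatures, password, salt=b"demo-salt"):
    """Keyed (HMAC-SHA-512) form of each signature, so the plain digests
    are never exposed while uploading; the key is derived from the
    user's password via PBKDF2."""
    key = hashlib.pbkdf2_hmac("sha512", password.encode(), salt, 100_000)
    return [hmac.new(key, s.encode(), hashlib.sha512).hexdigest()
            for s in signatures]

video = b"\x00\x01" * 2_000_000   # stand-in for an H.264 bitstream
sigs = segment_signatures(video)  # repeated segments share a digest
protected = protect_signatures(sigs, "user-password")
```

Because the first three 1 MiB segments of this toy stream are byte-identical, their digests match, so a deduplicating store would keep only one copy of that segment.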
This work reports the development of an analytical method for the simultaneous analysis of three fluoroquinolones, ciprofloxacin (CIP), norfloxacin (NOR) and ofloxacin (OFL), in a soil matrix. The proposed method used microwave-assisted extraction (MAE), solid-phase extraction (SPE) for sample purification, and finally analysis of the pre-concentrated samples by HPLC. Various organic solvents were tested for extracting the test compounds, and the extraction performance was evaluated against several parameters, including extraction solvent, solvent volume, extraction time, temperature and number of extraction cycles. The method showed good linearity over the concentration ranging from
The present study aimed to identify the extent to which interactive thinking maps are included in the content of social and national studies courses across the educational stages in the Kingdom of Saudi Arabia. To achieve this aim, the researcher used the descriptive analytical approach, with a content analysis card as the study tool, which included a list of the types of thinking maps. The study sample consisted of all social and national studies textbooks at the elementary and intermediate levels, namely (12) student books in their first and second parts. After the validity and reliability of the tool were verified, it was applied to the study sample, and the study reached conclusions, inc
In this work, the normal approach between two bodies, a sphere and a rough flat surface, was studied and calculated with the aid of an image processing technique. Four metals with different work-hardening indices were used as surface specimens, and by capturing images at a resolution of 0.006565 mm/pixel a good estimate of the normal approach was obtained. The compression tests were carried out in the strength of materials laboratory of the mechanical engineering department, and a Monsanto tensometer was used to conduct the indentation tests. A light-section measuring microscope (BK 70x50) was used to calculate the surface texture parameters of the profile, such as the standard deviation of asperity peak heights and the centre lin
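One of the quoted texture parameters, the standard deviation of asperity peak heights, can be sketched from a sampled profile. This is a hedged illustration under assumed inputs, not the study's procedure: the toy profile, the simple local-maximum definition of a peak, and the use of the quoted 0.006565 mm/pixel factor to convert pixel heights to millimetres are all assumptions for demonstration.

```python
import statistics

MM_PER_PIXEL = 0.006565  # image resolution quoted in the study

def peak_heights(profile):
    """Local maxima of a sampled surface profile (heights in pixels)."""
    return [profile[i] for i in range(1, len(profile) - 1)
            if profile[i - 1] < profile[i] > profile[i + 1]]

def sigma_peaks_mm(profile):
    """Sample standard deviation of asperity peak heights, in mm."""
    return statistics.stdev(peak_heights(profile)) * MM_PER_PIXEL

profile = [2, 5, 3, 7, 4, 9, 1, 6, 2]  # toy profile, heights in pixels
sigma = sigma_peaks_mm(profile)
```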
Learning the vocabulary of a language has a great impact on acquiring that language. Many scholars in the field of language learning emphasize the importance of vocabulary as part of the learner's communicative competence, considering it the heart of language. One of the best methods of learning vocabulary is to focus on words of high frequency. The present article takes a corpus-based approach to the study of vocabulary, whereby the research data are analyzed quantitatively using the software program "AntWordprofiler". This program analyses new input data against already stored, reliable corpora. The aim of this article is to find out whether the vocabularies used in the English textbook for Intermediate Schools in Iraq are con
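The core of such a frequency-profiling analysis can be sketched simply: tokenize a text and measure what fraction of its running words falls inside a high-frequency word list. This is a minimal sketch of the general technique, not AntWordprofiler's implementation; the sample sentence and the tiny base list are invented for illustration.

```python
import re

def tokenize(text):
    """Lowercase word tokens, apostrophes kept inside words."""
    return re.findall(r"[a-z']+", text.lower())

def coverage(text, high_freq_words):
    """Fraction of running words that fall in the high-frequency list."""
    tokens = tokenize(text)
    hits = sum(1 for t in tokens if t in high_freq_words)
    return hits / len(tokens)

sample = "The students read the book and the teacher explains new words"
base_list = {"the", "and", "read", "book", "new", "students"}
ratio = coverage(sample, base_list)  # 8 of 11 running words are covered
```

A real profiler would use graded lists (e.g. the most frequent 1,000 and 2,000 word families) and report coverage per band; the principle is the same ratio computed here.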
This research examines issues in which Ibn Hisham erred in attributing views to the leading grammarians. In presenting grammatical issues, Ibn Hisham attributed grammatical opinions to a number of grammarians; after returning to the principal sources concerning those grammarians, we found that Ibn Hisham was mistaken in those attributions. The research also clarifies the terms "illusion" and "claim" in both their linguistic and terminological senses. Perhaps the most prominent contribution of this research is its investigation of these issues through their original sources, with an explanation of Ibn Hisham's errors in attributing these opinions.
The Normalized Difference Vegetation Index (NDVI) is commonly used as a measure of land surface greenness, based on the assumption that the NDVI value is positively proportional to the amount of green vegetation in an image pixel area. The NDVI data set derived from Landsat remote sensing imagery is used to estimate the area of plant cover in the region west of Baghdad during 1990-2001. The results show that between 1990 and 2001 the vegetated area in the region west of Baghdad increased from 44,760.25 to 75,410.67 hectares. The vegetation area thus increased over the period 1990-2001, while the exposed (bare) area decreased.
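The area estimate above rests on per-pixel band arithmetic, which can be sketched as follows. The NDVI formula, NDVI = (NIR - Red) / (NIR + Red), and the 30 m Landsat pixel size (0.09 ha per pixel) are standard; the reflectance values and the 0.3 greenness threshold below are illustrative assumptions, since the paper does not state its classification threshold.

```python
def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red) for one pixel."""
    return (nir - red) / (nir + red) if (nir + red) else 0.0

HA_PER_PIXEL = 0.09  # a 30 m x 30 m Landsat pixel is 900 m^2

def vegetated_area_ha(nir_band, red_band, threshold=0.3):
    """Hectares of pixels whose NDVI exceeds the greenness threshold."""
    count = sum(1 for n, r in zip(nir_band, red_band)
                if ndvi(n, r) > threshold)
    return count * HA_PER_PIXEL

# Toy 4-pixel scene: two clearly vegetated pixels, two bare ones.
nir = [0.45, 0.50, 0.20, 0.60]
red = [0.10, 0.30, 0.18, 0.12]
area = vegetated_area_ha(nir, red)
```

Summing qualifying pixels and multiplying by the per-pixel area is, in essence, how hectare figures such as those reported for 1990 and 2001 are obtained from classified imagery.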