Cloud storage provides scalable, low-cost resources that benefit from economies of scale through a cross-user architecture. As the volume of outsourced data grows explosively, data deduplication, a technique that eliminates data redundancy, becomes essential. To protect the privacy of data owners, data are stored in the cloud in encrypted form. However, encrypted data introduce new challenges for cloud data deduplication, which is crucial for efficient storage. Traditional deduplication schemes cannot operate on encrypted data, and existing solutions for encrypted-data deduplication suffer from security weaknesses. This paper proposes a combined compressive sensing and video deduplication scheme to maximize the deduplication ratio. Our approach uses data deduplication to remove identical copies of a video. Our experimental results show significant storage savings while providing a strong level of security.
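Deduplication of identical copies, as described above, is commonly implemented by content hashing: identical chunks hash to the same digest and are stored only once. A minimal sketch; the function name and chunk granularity are illustrative, not the paper's implementation:

```python
import hashlib

def dedup_store(chunks, store=None):
    """Store only unique chunks, keyed by SHA-256 digest.

    Returns the list of digests referencing each input chunk, plus the
    (possibly shared) store holding one copy per unique chunk.
    """
    store = {} if store is None else store
    refs = []
    for chunk in chunks:
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in store:  # first copy: keep it
            store[digest] = chunk
        refs.append(digest)      # later copies: reference only
    return refs, store
```

Duplicate chunks add a reference but no new storage, which is what drives the deduplication ratio.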
The energy dependence of the cross-sections of the coherent and incoherent scattering peaks in X-ray absorption experiments was investigated over the range 20-800 keV. The dependence of the cross-section on atomic number Z was compiled from published data for eight elements, ranging from carbon to silver (C-Ag). The proportionality constant K relating the ratio (σc/σi) to the atomic number Z was obtained for Z from 6 to 47. The results show that the value of K changes exponentially with energy.
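An exponential dependence of K on energy, as reported above, can be recovered from tabulated (E, K) pairs by a log-linear least-squares fit. A minimal sketch under the assumed functional form K(E) = A·exp(bE); the symbols and form are illustrative, not taken from the paper:

```python
import math

def fit_exponential(energies, k_values):
    """Least-squares fit of K(E) = A * exp(b * E) by linear
    regression on log(K) versus E."""
    n = len(energies)
    logk = [math.log(k) for k in k_values]
    mean_e = sum(energies) / n
    mean_l = sum(logk) / n
    # slope of log(K) vs E gives b; intercept gives log(A)
    b = sum((e - mean_e) * (l - mean_l)
            for e, l in zip(energies, logk)) / \
        sum((e - mean_e) ** 2 for e in energies)
    a = math.exp(mean_l - b * mean_e)
    return a, b
```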
A computational investigation has been carried out to describe a synthesis optimization procedure for magnetic lenses. The research concentrates on determining the inverse design of symmetrical double-polepiece magnetic lenses whose magnetic field distribution is already defined. A magnetic lens field model well known in electron optics has been used as the axial magnetic field distribution. This field has been studied with the halfwidth variable and the maximum magnetic flux density kept constant. The importance of this research lies in the possibility of using the present synthesis optimization procedure to find the polepiece design of symmetrical double-polepiece magnetic lenses which have the best proje
The main objective and primary concern of every investor is not only to achieve a greater return on his or her investments but also to create the largest possible value from these investments. Researchers and those interested in the field of investment and financial analysis therefore try to develop standards for performance valuation guided through the
Central to the concept of an optical telescope is the primary mirror design. The Next Generation Segmented Optical Telescope (NGST), with a hexagonally segmented spherical primary mirror, can provide a 3 arc-minute field of view. Extremely Large Telescopes (ELTs) in the 100 m class would have such unprecedented scientific effectiveness that their construction would constitute a milestone comparable to the invention of the telescope itself, and would provide truly revolutionary insight into the universe. Our interest is the scientific case for, and the conceptual feasibility of, giant filled-aperture telescopes, and we investigate the requirements these imply for possible technical options in the case of a 100 m telescope. For this telescope, of considerable interest is the correction of the optical aberrations of the incoming wavefront, th
This research aims at studying the relation between fair value and the quality of financial reports, in order to achieve a number of aims, such as:
1- Shedding light on the problems of measurement based on historic cost, as this paves the way toward the fair-value method of accounting measurement.
2- Giving a general definition of fair value in accounting by analyzing the theoretical aspects relating to the subject and the scientific bases on which the related accounting treatments depend.
3- Exhibiting the characteristics that fair value could add to accounting information.
The study problem is summarized in that the e
... Show MoreIn this study, the effect of pumping power on the conversion efficiency of nonlinear crystal (KTP) was investigated using laser pump-power technique. The results showed that the higher the pumping power values, the greater the conversion efficiency (η) and, as the crystal thickness increases within limitations, the energy conversion efficiency increases at delay time of (0.333 ns) and at room temperature. Efficiency of 80% at length of KTP crystal (L = 1.75 X 10-3 m) and Pin = 28MW, and also, compare the experimental results with numerical results by using MATLAB program.
In this paper, we introduce a DCT-based steganographic method for grayscale images. The embedding approach is designed to reach an efficient tradeoff among three conflicting goals: maximizing the amount of hidden message, minimizing distortion between the cover image and the stego-image, and maximizing the robustness of the embedding. The main idea of the method is to create a safe embedding area in the middle- and high-frequency region of the DCT domain using a magnitude modulation technique. The magnitude modulation is applied using uniform quantization with magnitude Adder/Subtractor modules. The conducted tests indicated that the proposed method achieves high capacity and high preservation of the perceptual and statistical properties of the stego-image.
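Magnitude modulation by uniform quantization can be illustrated with a QIM-style sketch: the coefficient magnitude is snapped to an even or odd multiple of a quantization step depending on the message bit. This is a generic illustration of the idea, not the paper's exact Adder/Subtractor modules, and the `step` value is an assumption:

```python
def embed_bit(coeff, bit, step=8.0):
    """Embed one bit in a DCT coefficient by quantizing its magnitude
    to an even (bit 0) or odd (bit 1) multiple of `step`."""
    sign = -1.0 if coeff < 0 else 1.0
    q = round(abs(coeff) / step)
    if q % 2 != bit:       # nudge to a multiple of the right parity
        q += 1
    return sign * q * step

def extract_bit(coeff, step=8.0):
    """Recover the bit from the parity of the quantized magnitude."""
    return int(round(abs(coeff) / step)) % 2
```

Because extraction depends only on parity, the hidden bit survives small perturbations of the coefficient up to roughly step/2, which is the robustness side of the tradeoff.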
Breast cancer is one of the most common causes of death among women worldwide. Limited awareness of the seriousness of this disease, a shortage of specialists in hospitals, and long waits for diagnosis may increase the probability that cases progress. Consequently, various machine learning techniques have been formulated to decrease the decision-making time for diagnosing breast cancer, which may reduce the mortality rate. The proposed system consists of two phases. First, data pre-processing (data cleaning and selection) from data mining is applied to the breast cancer dataset taken from the University of California, Irvine machine learning repository; in this stage we
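The data-cleaning and selection steps of the pre-processing phase can be sketched as follows; the `?` missing-value marker follows the UCI breast cancer dataset convention, and the helper names are illustrative rather than the paper's implementation:

```python
def clean_rows(rows, missing="?"):
    """Data cleaning: drop records containing missing-value markers."""
    return [r for r in rows if missing not in r]

def select_columns(rows, keep):
    """Data selection: keep only the chosen feature indices."""
    return [[r[i] for i in keep] for r in rows]
```

Cleaning before selection ensures the classifier in the second phase trains only on complete records with the chosen features.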