Cloud storage provides scalable, low-cost resources that achieve economies of scale through a cross-user architecture, and data storage is its most important service. As the volume of outsourced data grows explosively, data deduplication, a technique that eliminates redundant copies, becomes essential. To protect the privacy of data owners, data are stored in the cloud in encrypted form; however, encryption introduces new challenges for cloud data deduplication, since traditional deduplication schemes cannot operate on encrypted data, and existing solutions for encrypted-data deduplication suffer from security weaknesses. This paper proposes a scheme that combines compressive sensing with video deduplication to maximize deduplication ratios; our approach uses data deduplication to remove identical copies of the video. Our experimental results show significant storage savings while providing a strong level of security.
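The abstract does not specify the deduplication mechanism, so the sketch below shows only a common baseline such schemes build on, content-addressed chunk hashing, in which identical chunks are stored once; all names and sizes here are illustrative, and the paper's actual scheme additionally involves compressive sensing and encryption.

```python
import hashlib

CHUNK_SIZE = 4 * 1024 * 1024  # illustrative 4 MiB chunk size

def dedup_store(data: bytes, store: dict) -> list[str]:
    """Split data into fixed-size chunks, keep one copy per SHA-256 digest,
    and return the ordered digest list needed to reconstruct the input."""
    manifest = []
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)  # an identical chunk is never stored twice
        manifest.append(digest)
    return manifest

store: dict = {}
video = b"\x00" * (10 * CHUNK_SIZE)  # stand-in for real video bytes
m1 = dedup_store(video, store)
m2 = dedup_store(video, store)       # exact duplicate adds no new chunks
print(f"{len(store)} unique chunk(s) serve {len(m1) + len(m2)} chunk references")
```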
Random matrix theory is used to study the chaotic properties of the nuclear energy spectrum of the 24Mg nucleus. The excitation energies, which are the main object of this study, are obtained by performing shell-model calculations using the OXBASH computer code together with the effective interaction of Wildenthal (W) in the isospin formalism. The 24Mg nucleus is assumed to have an inert 16O core, with the remaining 8 nucleons (4 protons and 4 neutrons) moving in the 1d5/2, 2s1/2 and 1d3/2 orbitals. The spectral fluctuations are studied with two statistical measures, the first being the nearest-neighbour level spacing distribution ...
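For reference, the nearest-neighbour spacing distribution of an unfolded spectrum is conventionally compared against two limiting curves, quoted here as standard results rather than statements from this abstract: the Poisson form for regular dynamics and the Wigner surmise (GOE) for chaotic dynamics,

P_{\text{Poisson}}(s) = e^{-s}, \qquad P_{\text{GOE}}(s) = \frac{\pi}{2}\, s\, e^{-\pi s^{2}/4},

where s is the level spacing measured in units of the local mean spacing.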
This research presents a theoretical study of the stresses and strains induced in glass by exposure to laser-beam pulses. The problem is formulated on the three principal axes of a cylindrical coordinate system (r, θ, z) for three types of glass (silica glass, soda glass, and fused glass) irradiated by two types of lasers. The study shows that the thermal stresses and strains ...
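The abstract does not reproduce its governing equations; a standard starting point for such an analysis, assuming axisymmetric heating, is the heat-conduction equation in cylindrical coordinates,

\frac{\partial T}{\partial t} = \alpha\left(\frac{\partial^{2} T}{\partial r^{2}} + \frac{1}{r}\frac{\partial T}{\partial r} + \frac{\partial^{2} T}{\partial z^{2}}\right) + \frac{Q(r,z,t)}{\rho c_{p}},

where \alpha is the thermal diffusivity, Q the absorbed laser power density, \rho the density, and c_{p} the specific heat; the temperature field obtained from it drives the thermal stresses and strains.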
In this work, we carried out an experimental study of dusty plasma, taking Fe3O4 as the dust material with grain radii of 0.1 μm - 0.5 μm. In the experiment we used air in a vacuum-chamber system under different low pressures (0.1-1) Torr. The results illustrated that the presence of dust particles in the air plasma did not affect the Paschen minimum, which is 0.5 both without dust and with Fe3O4 dust grains.
The effect of the Fe3O4 dust particles on the plasma parameters can be noticed in the direct-current system in the glow-discharge region. The plasma parameters studied in this work are the plasma potential, floating potential, electron saturation current, and electron temperature ...
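The Paschen minimum referred to above follows from Paschen's law for the breakdown voltage of a gas gap; its standard form (quoted here for reference, not from the abstract) is

V_b(pd) = \frac{B\, pd}{\ln(A\, pd) - \ln\!\left[\ln\!\left(1 + 1/\gamma_{se}\right)\right]},

where p is the gas pressure, d the electrode separation, A and B are gas-dependent constants, and \gamma_{se} is the secondary-electron emission coefficient. Setting dV_b/d(pd) = 0 places the minimum at pd_{min} = (e/A)\,\ln(1 + 1/\gamma_{se}).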
Noor oil field is one of the smallest fields in Missan province. Twelve wells penetrate the Mishrif Formation in the Noor field, and eight of them were selected for this study. The Mishrif Formation is one of the most important reservoirs in the Noor field; it consists of one anticlinal dome and is bounded by the Khasib Formation at the top and the Rumaila Formation at the bottom. The reservoir was divided into eight units separated by isolating units, following the subdivision adopted in surrounding fields.
In this paper, frequency-distribution histograms of porosity, permeability, and water saturation were plotted for the MA unit of the Mishrif Formation in the Noor field and then transformed to a normal distribution by applying the Box-Cox transformation algorithm ...
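As a hedged illustration of that normalization step (on synthetic, right-skewed data, not the field measurements), SciPy's Box-Cox implementation fits the exponent λ in y = (x^λ − 1)/λ by maximum likelihood:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=0)
perm = rng.lognormal(mean=2.0, sigma=1.0, size=500)  # synthetic, skewed "permeability"

# Box-Cox requires strictly positive input; lambda is fitted by maximum likelihood.
perm_bc, lam = stats.boxcox(perm)

print(f"fitted lambda = {lam:.3f}")
print(f"skewness before = {stats.skew(perm):.2f}, after = {stats.skew(perm_bc):.2f}")
```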
Shear wave velocity is an important feature in seismic exploration that can be utilized in reservoir development strategy and characterization. Its applications in petrophysics, seismics, and geomechanics, predicting rock elastic and inelastic properties, are essential for assessing stability and fracturing orientation and for identifying matrix minerals and gas-bearing formations. However, shear wave velocity is usually obtained from core analysis, which is an expensive and time-consuming process, and the dipole sonic imager tool is not commonly available in all wells. In this study, a statistical method is presented to predict shear wave velocity from wireline log data. The model concentrates on predicting shear wave velocity from ...
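The abstract does not state the regression form; purely as a sketch of the kind of statistical prediction described, here is a least-squares fit of shear velocity against compressional velocity on synthetic logs (the mudrock-line-like coefficients are illustrative, not the study's model):

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Synthetic "training wells": compressional velocity Vp and measured Vs, in km/s.
vp = rng.uniform(2.5, 5.5, size=200)
vs = 0.862 * vp - 1.172 + rng.normal(0.0, 0.05, size=200)  # mudrock-line-like trend

# Fit Vs = a*Vp + b on wells where both logs exist.
a, b = np.polyfit(vp, vs, deg=1)
print(f"Vs ≈ {a:.3f} * Vp + {b:.3f} (km/s)")

# Predict Vs in a well that lacks a dipole sonic log.
vp_new = np.array([3.0, 4.0, 5.0])
print("predicted Vs:", a * vp_new + b)
```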
The development of information systems in recent years has contributed various methods of gathering information to evaluate IS performance. The most common approach used to collect information is the survey. This method, however, suffers one major drawback: decision makers consume considerable time transforming data from survey sheets into analytical programs. As such, this paper proposes a method called 'survey algorithm based on R programming language', or SABR, for transforming data from survey sheets inside R environments by treating the arrangement of the data as a relational format. R and the relational data format provide an excellent opportunity to manage and analyse the accumulated data. Moreover, a survey system ...
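SABR itself is written in R; as a language-neutral illustration of the underlying idea, treating each survey sheet as rows of a relational table, here is a hedged pandas analogue with invented column names:

```python
import pandas as pd

# A wide survey sheet: one row per respondent, one column per question (invented names).
sheet = pd.DataFrame({
    "respondent": [1, 2, 3],
    "q1_usability": [4, 5, 3],
    "q2_reliability": [5, 4, 4],
})

# Reshape to a relational (long) format: one row per (respondent, question, answer).
relational = sheet.melt(id_vars="respondent", var_name="question", value_name="answer")
print(relational)

# Once relational, analysis is a plain group-by, much as it would be in R or SQL.
print(relational.groupby("question")["answer"].mean())
```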
The current study presents a geometric proposal for the main parameters that must be used in a seismic reflection survey to produce a three-dimensional subsurface image. This image represents the Siba oil field located in Basra, southern Iraq. The results were based on two options for selecting the approved elements to create a three-dimensional image of the Mishrif, Zubair and Yamama formations as well as the Jurassic, the Permian Khuff, and the pre-Khuff reservoir area. The geometry in option-1 is 12 lines, 6 shots, and 216 channels; the receiver density is 66.67 receivers/km2, and the shot density is the same. Total shots are 21,000, which is the same as the number of receivers ...
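A back-of-the-envelope consistency check on those option-1 figures (my arithmetic, not part of the study):

```python
# Option-1 figures quoted in the abstract.
receiver_density = 66.67  # receivers per km^2 (shot density is stated to be the same)
total_receivers = 21_000  # equal to the stated total number of shots

# Implied survey area if the totals are spread uniformly at that density.
area_km2 = total_receivers / receiver_density
print(f"implied survey area ≈ {area_km2:.0f} km^2")  # ≈ 315 km^2
```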
Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance, but many applications have too little data to train DL frameworks. Usually, manual labeling is needed to provide labeled data, which typically involves human annotators with a vast background of knowledge. This annotation process is costly, time-consuming, and error-prone. Every DL framework must be fed a significant amount of labeled data to learn representations automatically; ultimately, more data generally yields a better DL model, though performance is also application-dependent. This issue is the main barrier for ...