Cloud computing provides a vast amount of storage space for data, but as the number of users and the size of their data grow, cloud storage environments face serious problems such as saving storage space, managing large volumes of data, and ensuring data security and privacy. One important method for saving space in cloud storage is data deduplication, a compression technique that keeps only one copy of each piece of data and eliminates the extra copies. To provide security and privacy for sensitive data while still supporting deduplication, this work identifies attacks that exploit hybrid-cloud deduplication, allowing an attacker to gain access to other users' files using only the very small hash signatures of those files. More specifically, an attacker who knows the hash signature of a file can convince the storage service that he or she owns that file, so the server lets the attacker download the entire file. To defeat such attacks, the hash signature is encrypted with the user's password. As a proof of concept, a prototype of the proposed authorized deduplication system was implemented and evaluated in test-bed experiments. Performance measurements indicate that the proposed deduplication system incurs minimal upload and bandwidth overhead compared to native deduplication.
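As an illustration of the defence described above, the following minimal Python sketch (all names and parameters are hypothetical, and a deployed system would use an authenticated cipher rather than XOR masking) hashes a file and then encrypts the digest with a password-derived key, so that knowing the bare hash alone is no longer enough to claim ownership:

```python
import hashlib
import os
import tempfile

def file_signature(path: str, chunk_size: int = 1 << 20) -> bytes:
    """SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.digest()

def encrypt_signature(signature: bytes, password: str, salt: bytes) -> bytes:
    """Mask the 32-byte digest with a PBKDF2-derived key (XOR).
    Illustration only -- a real system should use an authenticated cipher."""
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt,
                              100_000, dklen=len(signature))
    return bytes(a ^ b for a, b in zip(signature, key))

# Demo: the server stores (salt, encrypted signature); proving ownership now
# requires knowing both the file content and the password, so a leaked bare
# hash no longer lets an attacker download the file.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"example file contents")
    path = f.name
salt = os.urandom(16)
token = encrypt_signature(file_signature(path), "user-password", salt)
os.remove(path)
print(token.hex())
```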
Tight gas is one of the main types of unconventional gas. Tight gas reservoirs typically consist of highly heterogeneous, low-permeability rock. Economic evaluation of tight gas production is a challenging task because of the prevailing uncertainties associated with key reservoir properties such as porosity, permeability, and the drainage boundary. One of the important parameters required in this economic evaluation is the equivalent drainage area of the well, which relates to the actual volume of fluids (e.g., gas) produced or withdrawn from the reservoir at a given moment and changes with time. It is difficult to predict this equivalent drainage area.
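One standard volumetric relation that captures this idea (a sketch in common field units; the exact formulation used in the study may differ) back-calculates the equivalent drainage area from the gas volume contacted by the well at time t:

\[
A_e(t) = \frac{G(t)\, B_{gi}}{43560\, h\, \phi\, (1 - S_w)}
\]

where G(t) is the gas-in-place contacted at time t (scf), e.g. from a p/Z or flowing material-balance analysis, B_{gi} is the initial gas formation volume factor (rcf/scf), h is net pay thickness (ft), \phi is porosity, S_w is water saturation, and A_e is the equivalent drainage area in acres.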
In this study, we briefly review the ARIMA(p, d, q), EWMA, and DLM (dynamic linear modelling) procedures in order to accommodate the autocorrelation (AC) structure of the data. We consider recursive estimation and prediction algorithms based on Bayes and Kalman filtering (KF) techniques for correlated observations. We investigate the effect on the MSE of these procedures and compare them using generated data.
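As a brief illustration of the recursive KF estimation such a comparison rests on, the following minimal Python sketch (synthetic data, scalar local-level DLM; all parameter values are assumptions) runs the standard predict/update recursions and reports the prediction MSE:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a local-level DLM: theta_t = theta_{t-1} + w_t,  y_t = theta_t + v_t
n, W, V = 200, 0.1, 1.0
theta = np.cumsum(rng.normal(0, np.sqrt(W), n))   # latent state (random walk)
y = theta + rng.normal(0, np.sqrt(V), n)          # noisy observations

# Kalman filter recursions (scalar case)
m, C = 0.0, 1e6        # diffuse prior mean and variance
preds = []
for t in range(n):
    a, R = m, C + W            # predict: prior for theta_t
    preds.append(a)
    Q = R + V                  # one-step-ahead forecast variance of y_t
    K = R / Q                  # Kalman gain
    m = a + K * (y[t] - a)     # update posterior mean
    C = R - K * R              # update posterior variance

mse = np.mean((np.array(preds) - theta) ** 2)
print(f"MSE of one-step state predictions: {mse:.3f}")
```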
Data compression offers an attractive approach to reducing communication costs by using the available bandwidth effectively, so it makes sense to pursue research on algorithms that can use the available network most effectively. It is also important to consider security, since the data being transmitted is vulnerable to attacks. The basic aim of this work is to develop a module that combines compression and encryption on the same set of data, performing the two operations simultaneously. This is achieved by embedding encryption into compression algorithms, since cryptographic ciphers and entropy coders bear a certain resemblance in the sense of secrecy. First, in the secure compression module, the given text is p…
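The abstract is truncated here, but the general pairing of the two operations can be sketched. The following minimal Python example (illustrative only; it is a plain compress-then-mask composition, not the paper's embedded scheme, and the keystream is not secure cryptography) compresses the text with zlib and then masks the compressed stream with a password-derived keystream:

```python
import hashlib
import os
import zlib

def keystream(password: str, salt: bytes, n: int) -> bytes:
    """Derive n pseudo-random bytes from a password (illustration only)."""
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(salt + password.encode()
                              + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def compress_encrypt(text: bytes, password: str) -> bytes:
    """Compress first (entropy coding), then XOR-mask with a keyed stream."""
    salt = os.urandom(16)
    packed = zlib.compress(text, 9)
    ks = keystream(password, salt, len(packed))
    return salt + bytes(a ^ b for a, b in zip(packed, ks))

def decrypt_decompress(blob: bytes, password: str) -> bytes:
    salt, body = blob[:16], blob[16:]
    ks = keystream(password, salt, len(body))
    return zlib.decompress(bytes(a ^ b for a, b in zip(body, ks)))

msg = b"compression and encryption on the same data " * 20
blob = compress_encrypt(msg, "secret")
assert decrypt_decompress(blob, "secret") == msg
print(f"{len(msg)} bytes -> {len(blob)} bytes")
```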
The main problem addressed by this research is the delay in implementing the investment plan projects for the period 2013-2016 and the weak capability of ministry staff, who lack sufficient experience to carry out the implementation process.
Therefore, the research aims to evaluate the implementation of the programs and projects of the investment plan in a manner consistent with the objectives set for them, without wasting effort, time, or money, and then to identify the problems and obstacles and determine, for each sector, the deviations in implementation according to the evaluation criteria of cost, quality, and time of implementation.
This paper deals with how to estimate values at unmeasured spatial points when the number of spatial samples is small, which is unfavourable for the estimation process: it is well known that the larger the data set, the better the estimates at unmeasured points and the smaller the estimation variance. The idea of this paper is therefore to take advantage of secondary (auxiliary) data that have a strong correlation with the primary (basic) data in order to estimate the individual unmeasured points, as well as to measure the estimation variance. The co-kriging technique is used in this field to build the spatial predictions, and this idea is then applied to real data.
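A minimal simple co-kriging sketch follows, assuming synthetic data, known zero means, and an exponential covariance model with a proportional cross-covariance (all parameter values are illustrative; a real workflow would fit these models to the data). It returns both the estimate at an unmeasured point and the estimation variance:

```python
import numpy as np

rng = np.random.default_rng(1)

# Few primary samples, many correlated secondary (auxiliary) samples.
xy1 = rng.uniform(0, 10, (6, 2))    # primary locations
xy2 = rng.uniform(0, 10, (30, 2))   # secondary locations
z1 = rng.normal(0, 1.0, 6)          # primary values (zero mean assumed)
z2 = rng.normal(0, 1.0, 30)         # secondary values (stand-in data)

def dist(A, B):
    return np.linalg.norm(A[:, None, :] - B[None, :, :], axis=-1)

def cexp(h, sill, rng_):
    """Exponential covariance model."""
    return sill * np.exp(-h / rng_)

s1, s2, rho, r = 1.0, 1.0, 0.8, 3.0   # sills, cross-correlation, range

# Joint covariance matrix of all data (primary + secondary).
C11 = cexp(dist(xy1, xy1), s1 * s1, r)
C22 = cexp(dist(xy2, xy2), s2 * s2, r)
C12 = cexp(dist(xy1, xy2), rho * s1 * s2, r)
C = np.block([[C11, C12], [C12.T, C22]]) + 1e-8 * np.eye(36)

def simple_cokrige(target):
    """Simple co-kriging estimate and variance at one unmeasured point."""
    target = np.atleast_2d(target)
    c0 = np.concatenate([
        cexp(dist(xy1, target), s1 * s1, r).ravel(),
        cexp(dist(xy2, target), rho * s1 * s2, r).ravel(),
    ])
    lam = np.linalg.solve(C, c0)                 # co-kriging weights
    est = lam @ np.concatenate([z1, z2])         # zero-mean estimate
    var = s1 * s1 - lam @ c0                     # estimation variance
    return est, var

est, var = simple_cokrige([5.0, 5.0])
print(f"estimate: {est:.3f}, variance: {var:.3f}")
```

Note how the secondary data enter the system through the cross-covariance terms, which is exactly how the strongly correlated auxiliary variable reduces the estimation variance relative to kriging the sparse primary data alone.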
The hybrid solar cooling system uses a solar collector to convert solar energy into a heat source for heating the refrigerant after it leaves the compressor. This process helps the refrigerant change from gas to liquid in the top two-thirds of the condenser instead of the bottom two-thirds as in conventional cooling systems, which in turn reduces the energy needed to drive the cooling process. The hybrid cooling system has a capacity of 1 ton and uses refrigerant R22; the current drawn by the system is in the range 3.9-4.2 A, the same value of electric current measured for a conventional system under the Iraqi climate, and after taking different readings…
The TV medium derives its formal character from the technological development taking place in all scientific fields, creatively fused into the television image, which consists mainly of various visual levels and formations. By the new decade of the second millennium, however, the television medium, and mainly drama, began searching for the paradigm shift in the aesthetic, formal, and innovative fields and in advanced expressive performative means that would enable it to treat what had previously been impossible to visualize, while presenting what is new and innovative in unprecedented, and even familiar, objective and intellectual treatments. Thus, the TV medium has sought to work…
Feature selection (FS) is a series of processes used to decide which relevant features/attributes to include and which irrelevant features to exclude from predictive modeling. It is a crucial task that helps machine learning classifiers reduce error rates, computation time, and overfitting, and improve classification accuracy. It has demonstrated its efficacy in myriad domains, ranging from text classification (TC) and text mining to image recognition. While there are many traditional FS methods, recent research efforts have been devoted to applying metaheuristic algorithms as FS techniques for the TC task. However, there are few literature reviews concerning TC. Therefore, a comprehensive overview was systematically conducted in this study.
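As a small illustration of a metaheuristic wrapper FS method of the kind such a review covers, the following Python sketch (a (1+1) evolutionary search on synthetic data standing in for TF-IDF text features; scikit-learn is assumed available, and all parameter values are arbitrary) flips feature bits and keeps changes that improve cross-validated accuracy:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
X, y = make_classification(n_samples=300, n_features=40, n_informative=8,
                           n_redundant=4, random_state=0)

def fitness(mask):
    """Cross-validated accuracy of a feature subset; empty subsets score zero."""
    if not mask.any():
        return 0.0
    clf = LogisticRegression(max_iter=1000)
    return cross_val_score(clf, X[:, mask], y, cv=3).mean()

# (1+1) evolutionary wrapper: flip a couple of bits, keep the change if it helps.
mask = rng.random(X.shape[1]) < 0.5
best = fitness(mask)
for _ in range(100):
    cand = mask.copy()
    flips = rng.integers(0, X.shape[1], size=2)
    cand[flips] = ~cand[flips]
    score = fitness(cand)
    if score >= best:
        mask, best = cand, score

print(f"selected {mask.sum()} of {X.shape[1]} features, CV accuracy {best:.3f}")
```

The same loop structure underlies more elaborate metaheuristics (genetic algorithms, particle swarm, and so on); only the move-generation and acceptance rules change.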
The preparation of epoxy/MgO and epoxy/SiO2 nanocomposites is studied. The nanocomposites were processed with different nanofiller concentrations (0, 0.01, 0.02, 0.03, 0.04, 0.05, 0.07 and 0.1 wt%). Epoxy resin and nanocomposites containing nanofillers of different shapes (MgO:SiO2 composites) were shear-mixed at a ratio of 1:1, with different hybrid nanofiller concentrations (0.025, 0.05, 0.1, 0.15, 0.2 and 0.25 wt%), to prepare epoxy/(MgO-SiO2) hybrid nanocomposites. Experimental test results indicate that the composite materials have a significantly higher modulus of elasticity than the matrix material, but the hybrid nanocomposites have a lower modulus of elasticity. The wear rate was decreased in the nanocomposites.