Cloud computing provides a vast amount of space for data storage, but as the number of users and the size of their data grow, cloud storage environments face serious problems: saving storage space, managing large volumes of data, and ensuring data security and privacy. One important method for saving space in cloud storage is data deduplication, a compression technique that keeps only one copy of each piece of data and eliminates the redundant copies. To offer security and privacy for sensitive data while supporting deduplication, this work identifies attacks that exploit hybrid cloud deduplication, allowing an attacker to gain access to the files of other users based on very small hash signatures of those files. More specifically, an attacker who knows the hash signature of a file can convince the storage service that he or she owns that file, so the server lets the attacker download the entire file. To defeat such attacks, the hash signature is encrypted with the user's password. As a proof of concept, a prototype of the proposed authorized deduplication system was implemented and test-bed experiments were conducted using it. Performance measurements indicate that the proposed deduplication system incurs minimal upload and bandwidth overhead compared to native deduplication.
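The defense described above, binding the file's hash signature to the user's password, can be sketched as follows. This is an illustrative reconstruction, not the paper's actual implementation; the PBKDF2/HMAC construction, iteration count, and salt are assumptions chosen for the sketch:

```python
import hashlib
import hmac

def file_signature(data: bytes) -> bytes:
    """Plain content hash, as used by native deduplication."""
    return hashlib.sha256(data).digest()

def protected_signature(data: bytes, password: str, salt: bytes) -> bytes:
    """Bind the content hash to the user's password, so knowing the
    bare hash alone is no longer enough to claim ownership of a file.
    (PBKDF2 key derivation and HMAC are illustrative choices.)"""
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.new(key, file_signature(data), hashlib.sha256).digest()

# Usage: an attacker who knows only the raw hash (or guesses the wrong
# password) produces a different signature and fails the ownership check.
salt = b"per-user-salt"  # hypothetical per-user salt for illustration
data = b"example file contents"
sig_owner = protected_signature(data, "correct password", salt)
sig_attacker = protected_signature(data, "guessed password", salt)
assert sig_owner != sig_attacker
```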
Big data of different types, such as text and images, is rapidly generated by the internet and other applications. Dealing with this data using traditional methods is impractical because it varies widely in size, type, and processing-speed requirements. Data analytics has therefore become an essential tool for big data applications, since only the meaningful information is analyzed and extracted. This paper presents several innovative methods that use data analytics techniques to improve the analysis process and data management, and discusses the potential that data analytics based on artificial intelligence algorithms might provide.
The main problem addressed by this research is the delay in implementing the investment plan projects for the period 2013-2016, together with the weak capacity of ministry staff, who lack the experience needed to carry out the implementation process.
Therefore, the research aims to evaluate the implementation of the programs and projects of the investment plan in a manner consistent with the objectives set for them, without wasting effort, time, or money, and then to identify the problems and obstacles and determine the implementation deviations for each sector according to the evaluation criteria of cost, quality, time, and implementation.
This paper addresses how to estimate values at unmeasured spatial points when the number of sampled spatial locations is small, which is unfavourable for estimation: it is well known that the larger the data set, the better the estimates at unmeasured points and the smaller the estimation variance. The idea of this paper is therefore to take advantage of secondary (auxiliary) data that are strongly correlated with the primary (basic) data in order to estimate individual unmeasured points and to measure the estimation variance. The co-kriging technique is used to build the spatial predictions, and the idea is then applied to real data.
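For reference, the co-kriging estimator the abstract relies on has the following standard textbook form (the notation here is generic, not taken from the paper): the primary variable $Z_1$ at an unmeasured point $s_0$ is estimated from both the primary samples and the correlated secondary samples $Z_2$,

$$\hat{Z}_1(s_0) = \sum_{i=1}^{n_1} \lambda_i \, Z_1(s_i) + \sum_{j=1}^{n_2} \mu_j \, Z_2(s_j), \qquad \sum_{i=1}^{n_1} \lambda_i = 1, \quad \sum_{j=1}^{n_2} \mu_j = 0,$$

where the weights $\lambda_i$ and $\mu_j$ are chosen to minimize the estimation variance subject to these unbiasedness constraints.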
Six simply supported high-strength steel-fiber-reinforced concrete (HS-SFRC) beams reinforced with FRP (fiber-reinforced polymer) rebars were numerically tested by the finite element method using ABAQUS software to investigate their behavior under flexural failure. The beams were divided into two groups according to their cross-sectional shape. Group A consisted of four trapezoidal beams (height 200 mm, top width 250 mm, bottom width 125 mm), while group B consisted of two rectangular beams (125 × 200 mm). All specimens have the same total length of 1500 mm, and all are made of the same high-strength concrete mix with a 1% volume fraction of steel fiber.
Data compression offers an attractive approach to reducing communication costs by using the available bandwidth effectively, which makes it worthwhile to develop algorithms that exploit the network most efficiently. It is also important to consider security, since the data being transmitted is vulnerable to attacks. The basic aim of this work is to develop a module that combines compression and encryption of the same data, performing the two operations simultaneously. This is achieved by embedding encryption into the compression algorithm, since cryptographic ciphers and entropy coders bear a certain resemblance in the sense of secrecy.
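The combination the abstract describes can be sketched as compress-then-encrypt over the same buffer. This is a minimal illustration, not the paper's module: it uses zlib for the entropy coding and a toy SHA-256 counter-mode keystream as a stand-in cipher (not a vetted cryptographic construction):

```python
import hashlib
import zlib

def keystream(key: bytes, n: int) -> bytes:
    """Toy counter-mode keystream from SHA-256 (illustrative only)."""
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n]

def compress_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """Entropy-code first, then XOR with the keystream: compression must
    precede encryption, since encrypted data has no redundancy left."""
    comp = zlib.compress(plaintext, level=9)
    return bytes(a ^ b for a, b in zip(comp, keystream(key, len(comp))))

def decrypt_decompress(blob: bytes, key: bytes) -> bytes:
    comp = bytes(a ^ b for a, b in zip(blob, keystream(key, len(blob))))
    return zlib.decompress(comp)

# Usage: redundant text both shrinks and round-trips through the module.
msg = b"the quick brown fox jumps over the lazy dog " * 50
blob = compress_encrypt(msg, b"shared-secret")
assert decrypt_decompress(blob, b"shared-secret") == msg
assert len(blob) < len(msg)
```

Ordering matters here: applying the cipher first would destroy the statistical redundancy the entropy coder needs, which is why the embedding goes encryption-into-compression rather than the reverse.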
In this study, we briefly review the ARIMA(p, d, q), EWMA, and DLM (dynamic linear modelling) procedures for accommodating the autocorrelation structure of the data. We consider recursive estimation and prediction algorithms based on Bayes and Kalman filtering (KF) techniques for correlated observations, investigate the effect of these procedures on the MSE, and compare them using generated data.
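The comparison described above can be sketched on simulated data: generate an AR(1) series, produce one-step-ahead forecasts with an EWMA and with a scalar Kalman filter whose state equation matches the generating process, and compare the prediction MSEs. The parameters (phi = 0.8, EWMA lambda = 0.2, noise variances) are illustrative choices, not the study's settings:

```python
import random

random.seed(0)

# Generate an AR(1) series: y_t = 0.8 * y_{t-1} + e_t (illustrative parameters).
n, phi = 500, 0.8
y = [0.0]
for _ in range(n - 1):
    y.append(phi * y[-1] + random.gauss(0, 1))

def ewma_predictions(y, lam=0.2):
    """One-step-ahead EWMA forecasts."""
    s, preds = y[0], []
    for t in range(1, len(y)):
        preds.append(s)
        s = lam * y[t] + (1 - lam) * s
    return preds

def kalman_predictions(y, phi=0.8, q=1.0, r=1e-6):
    """One-step-ahead forecasts from a scalar Kalman filter whose state
    equation x_t = phi * x_{t-1} + w_t matches the generating process."""
    x, p, preds = y[0], 1.0, []
    for t in range(1, len(y)):
        x, p = phi * x, phi * phi * p + q      # predict
        preds.append(x)
        k = p / (p + r)                        # Kalman gain
        x, p = x + k * (y[t] - x), (1 - k) * p # update with y[t]
    return preds

def mse(preds, actual):
    return sum((a - b) ** 2 for a, b in zip(preds, actual)) / len(preds)

kf = mse(kalman_predictions(y), y[1:])
ew = mse(ewma_predictions(y), y[1:])
```

On this correctly specified model the Kalman forecasts should attain an MSE near the innovation variance, while the EWMA, which ignores the autocorrelation structure, does worse, which is the kind of effect the study measures.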