Cloud storage provides scalable, low-cost resources whose economies of scale rest on a cross-user architecture. As the volume of outsourced data grows explosively, data deduplication, a technique that eliminates data redundancy, becomes essential. Data storage is the most important cloud service. To protect the privacy of data owners, data are stored in the cloud in encrypted form. However, encrypted data introduce new challenges for cloud data deduplication: traditional deduplication schemes cannot work on encrypted data, and existing schemes for encrypted-data deduplication suffer from security weaknesses. This paper proposes a scheme that combines compressive sensing with video deduplication to maximize the deduplication ratio. Our approach uses data deduplication to remove identical copies of a video. Our experimental results show significant storage savings while providing a strong level of security.
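As an illustration of the deduplication step, the following is a minimal sketch (not the paper's implementation) of chunk-level deduplication: a byte stream is split into fixed-size chunks, each chunk is fingerprinted with SHA-256, and only one copy per fingerprint is kept. The 4 MiB chunk size and the in-memory store are illustrative assumptions, and the compressive-sensing stage is not shown.

```python
import hashlib

CHUNK_SIZE = 4 * 1024 * 1024  # 4 MiB chunks; an illustrative choice


def dedup_store(stream, store):
    """Split a byte stream into fixed-size chunks and keep one copy per
    unique SHA-256 fingerprint. Returns the recipe (list of fingerprints)
    needed to reassemble the stream later."""
    recipe = []
    while True:
        chunk = stream.read(CHUNK_SIZE)
        if not chunk:
            break
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in store:      # identical chunk already stored?
            store[digest] = chunk    # no: keep this first copy
        recipe.append(digest)
    return recipe


# Usage: identical videos produce identical recipes and occupy storage once.
# store = {}
# recipe = dedup_store(open("video.mp4", "rb"), store)
```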
Cryptography is the process of transforming a message to prevent unauthorized access to data. One of the main problems, and an important component, of secret-key cryptography is the key itself: the key plays an essential role in achieving a higher level of secure communication, and to increase the level of security, both parties must hold a copy of the secret key, which, unfortunately, is not easy to achieve. The Triple Data Encryption Standard (3DES) algorithm is weakened by its weak key generation, so the key must be reconfigured to make the algorithm more secure, effective, and strong; a well-generated encryption key strengthens the security of 3DES. This paper proposes a combination of two efficient encryption algorithms to …
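As a hedged sketch of the key-generation concern, the snippet below draws a three-key 3DES key from the operating system's CSPRNG and rejects degenerate keys whose 8-byte halves coincide (such keys collapse EDE encryption toward single DES). It assumes the third-party pyca/cryptography package, where TripleDES is retained for legacy use; it is not the combined scheme the paper proposes.

```python
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes


def generate_3des_key() -> bytes:
    """Draw a 24-byte (three-key) 3DES key from the OS CSPRNG, rejecting
    degenerate keys whose 8-byte halves coincide (those reduce the
    encrypt-decrypt-encrypt construction toward single DES)."""
    while True:
        key = os.urandom(24)
        k1, k2, k3 = key[:8], key[8:16], key[16:]
        if k1 != k2 and k2 != k3:
            return key


def encrypt_3des(key: bytes, plaintext: bytes) -> bytes:
    """CBC-mode 3DES encryption; naive zero padding, for the sketch only."""
    iv = os.urandom(8)  # 64-bit block size -> 8-byte IV
    padded = plaintext + b"\0" * (-len(plaintext) % 8)
    enc = Cipher(algorithms.TripleDES(key), modes.CBC(iv)).encryptor()
    return iv + enc.update(padded) + enc.finalize()
```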
Big data analysis has important applications in many areas, such as sensor networks and connected healthcare. The high volume and velocity of big data bring many challenges to data analysis. One possible solution is to summarize the data and provide a manageable data structure that holds a scalable summarization for efficient and effective analysis. This research extends our previous work on developing an effective technique to create, organize, access, and maintain summarizations of big data, and develops algorithms for Bayes classification and entropy discretization of large data sets using the multi-resolution data summarization structure. Bayes classification and data discretization play essential roles in many learning algorithms, such as …
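For concreteness, here is a minimal sketch of supervised entropy-based discretization: it scans candidate cut points for a numeric attribute and returns the cut that minimizes the weighted class entropy of the two resulting intervals. It operates on raw values rather than the paper's multi-resolution summarization structure, which is an assumption of the sketch.

```python
import math
from collections import Counter


def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())


def best_cut(values, labels):
    """Return the cut point on a numeric attribute that minimizes the
    weighted class entropy of the two resulting intervals."""
    pairs = sorted(zip(values, labels))
    best = (float("inf"), None)
    for i in range(1, len(pairs)):
        if pairs[i - 1][0] == pairs[i][0]:
            continue  # no valid cut between equal attribute values
        left = [lab for _, lab in pairs[:i]]
        right = [lab for _, lab in pairs[i:]]
        w = (len(left) * entropy(left) + len(right) * entropy(right)) / len(pairs)
        cut = (pairs[i - 1][0] + pairs[i][0]) / 2
        best = min(best, (w, cut))
    return best[1]


# Usage: best_cut([1.0, 2.0, 3.0, 4.0], ["a", "a", "b", "b"]) -> 2.5
```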
An analytical approach based on field data was used to determine the strength capacity of large-diameter bored piles. The deformations and settlements were also evaluated for both vertical and lateral loading. The analytical predictions were compared to field data obtained from a prototype test pile used at the Tharthar–Tigris canal bridge and were found to be in acceptable agreement, with a deviation of 12%.
Following ASTM standard D1143M-07e1 (2010), a test schedule of five loading cycles was proposed for vertical loads, together with a series of cyclic loads to simulate horizontal loading. The load-test results and analytical data of 1.95 …
A two-time-step stochastic multi-variable, multi-site hydrological data forecasting model was developed and verified using a case study. The philosophy of this model is to use the cross-variable correlations, cross-site correlations, and two-step time-lag correlations simultaneously to estimate the parameters of the model, which are then modified using the mutation process of a genetic-algorithm optimization model. The objective function to be minimized is the Akaike test value. The case study covers four variables and three sites: the variables are monthly air temperature, humidity, precipitation, and evaporation; the sites are Sulaimania, Chwarta, and Penjwin, located in northern Iraq. The model performance was …
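The parameter-refinement idea can be sketched as follows: starting from a fitted parameter vector, Gaussian mutations (the genetic algorithm's mutation operator, here reduced to a simple hill climb) are accepted whenever they lower the Akaike criterion. The `evaluate_rss` callback, step count, and mutation scale are illustrative assumptions, not the paper's actual model.

```python
import numpy as np


def aic(rss, n, k):
    """Akaike Information Criterion for a least-squares fit with n
    observations, k free parameters, and residual sum of squares rss."""
    return n * np.log(rss / n) + 2 * k


def mutate_until_improved(params, evaluate_rss, n, steps=1000, scale=0.05, seed=0):
    """GA-style Gaussian mutation as a hill climb: perturb the parameter
    vector and keep the mutant whenever it lowers the AIC.
    `params` is a NumPy array; `evaluate_rss(params)` is a user-supplied
    function returning the model's residual sum of squares."""
    rng = np.random.default_rng(seed)
    best = params.copy()
    best_aic = aic(evaluate_rss(best), n, len(best))
    for _ in range(steps):
        cand = best + rng.normal(0.0, scale, size=best.shape)
        cand_aic = aic(evaluate_rss(cand), n, len(cand))
        if cand_aic < best_aic:
            best, best_aic = cand, cand_aic
    return best, best_aic
```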
The Kirchhoff time-migration method was applied in pre- and post-stack time migration for post-processing of images collected from the Balad-Samarra (BS-92) survey line, which crosses the Ajeel anticline oilfield. The results showed that the Ajeel anticline structure was relocated to its correct position in the migrated stacked section. Both migration methods (pre- and post-stack) produced enhanced subsurface images and increased horizontal resolution, which was evident in the broadening of the syncline and the narrowing (compression) of the anticline. However, each method introduced migration noise, so a post-stack process was applied using dip-removal (DDMED) and band-pass filters to eliminate the artifact noise. The time-fr…
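As an illustration of the noise-suppression step, below is a minimal zero-phase Butterworth band-pass filter sketch using SciPy. The 10–60 Hz corner frequencies and filter order are assumptions chosen for typical reflection data, and the dip-removal (DDMED) filter is not reproduced here.

```python
import numpy as np
from scipy.signal import butter, filtfilt


def bandpass(trace, fs, low_hz=10.0, high_hz=60.0, order=4):
    """Apply a zero-phase Butterworth band-pass to a stacked trace,
    e.g. to suppress low-frequency migration smiles and high-frequency
    artifact noise. fs is the sampling rate in Hz."""
    nyq = 0.5 * fs
    b, a = butter(order, [low_hz / nyq, high_hz / nyq], btype="band")
    return filtfilt(b, a, trace)  # filtfilt -> no phase distortion


# Usage on a synthetic trace sampled at 500 Hz:
# filtered = bandpass(np.random.randn(2000), fs=500.0)
```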
... Show MoreThis paper is carried out to detect the subsurface structures that have geological
and economical importance by interpreting the available reflection seismic data of
an area estimated to be about (740) km2. The Khashim Al-Ahmer structure is partial
of series structures of (Injana – Khashim Al-Ahmer – Mannsorya) from the (NW to
the SE), it is located within for deep faulted area. The component of the one
elongated dome of asymmetrical of structure which has(SW) limb more steeper than
the (NE) limb.Twenty three seismic sections had been interpreted for two seismic
surveys and the total length of all seismic lines is about (414.7) Km. Interpretation
of seismic data was focused on two reflectors (Fatha and Jeribi)
Machine learning offers significant advantages for many difficult problems in the oil and gas industry, especially for resolving complex challenges in reservoir characterization. Permeability is one of the most difficult petrophysical parameters to predict using conventional logging techniques. This study presents the workflow methodology alongside comprehensive models. Its purpose is to provide a more robust technique for predicting permeability; previous studies on the Bazirgan field have attempted this, but their estimates were vague, and the methods they present are obsolete and poorly suited to solving the permeability computation. To …
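A minimal sketch of log-based permeability regression is shown below, assuming a random-forest regressor over conventional log features (GR, RHOB, NPHI, and RT are placeholder names) and synthetic placeholder data; the study's actual feature set and model are not specified in this excerpt.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

# Placeholder data standing in for well logs and core measurements:
# X rows hold conventional-log features (e.g. GR, RHOB, NPHI, RT);
# y holds core-measured permeability in millidarcies.
rng = np.random.default_rng(42)
X = rng.random((500, 4))
y = rng.random(500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=42)

model = RandomForestRegressor(n_estimators=200, random_state=42)
# Permeability spans orders of magnitude, so fitting log10(k) is common.
model.fit(X_tr, np.log10(y_tr + 1e-6))
pred = 10 ** model.predict(X_te) - 1e-6

print("R^2 on held-out wells:", r2_score(y_te, pred))
```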
Pseudomonas aeruginosa is an opportunistic pathogen. Quorum sensing (QS) is one of the processes responsible for biofilm formation. P. aeruginosa can live in different environments, some of which are pathogenic (clinical isolates) and some found outside the body (environmental isolates). The present study aimed to determine the presence of a number of genes responsible for QS in clinical and environmental isolates of P. aeruginosa. In the present study, total DNA was extracted from all environmental and clinical isolates, which contained seven genes (rhlA, rhlR, rhlI, lasR, lasI, lasB, phzA1) associated with QS. The tot…