Cloud computing provides a huge amount of storage space for data, but as the number of users and the size of their data grow, cloud storage environments face serious problems such as saving storage space, managing this large volume of data, and ensuring data security and privacy. One important method for saving space in cloud storage is data deduplication, a compression technique that keeps only one copy of each piece of data and eliminates the extra copies. To offer security and privacy for sensitive data while supporting deduplication, this work identifies attacks that exploit hybrid cloud deduplication, allowing an attacker to gain access to the files of other users based on very small hash signatures of those files. More specifically, an attacker who knows the hash signature of a file can convince the storage service that he or she owns that file, so the server lets the attacker download the entire file. To defeat such attacks, the hash signature is encrypted with the user's password. As a proof of concept, a prototype of the proposed authorized deduplication scheme is implemented and test-bed experiments are conducted using the prototype. Performance measurements indicate that the proposed deduplication system incurs minimal upload and bandwidth overhead compared to native deduplication.
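As a rough illustration of the countermeasure, the sketch below binds the deduplication hash to a password-derived key, assuming SHA-256 for the content hash, PBKDF2 for key derivation, and an HMAC as the stored ownership proof; this is a minimal sketch under those assumptions, not the paper's prototype.

```python
import hashlib
import hmac
import os

# Minimal sketch of the password-bound dedup signature; the PBKDF2
# iteration count and salt size are illustrative assumptions.
def protected_signature(data: bytes, password: str, salt: bytes) -> bytes:
    """Bind the dedup hash to the user's password so a leaked hash
    alone no longer proves ownership of the file."""
    content_hash = hashlib.sha256(data).digest()          # naive dedup signature
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.new(key, content_hash, hashlib.sha256).digest()

salt = os.urandom(16)
proof = protected_signature(b"file contents ...", "user-password", salt)
print(proof.hex())
```

An attacker who learns only the plain content hash can no longer reproduce the proof, since the HMAC key requires the user's password.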
A simple and highly sensitive cloud point extraction (CPE) process is suggested for the preconcentration of microgram amounts of isoxsuprine hydrochloride (ISX) in pure and pharmaceutical samples. After diazotization coupling of ISX with diazotized sulfadimidine in alkaline medium, the azo-dye product is quantitatively extracted into the Triton X-114-rich phase, dissolved in ethanol, and determined spectrophotometrically at 490 nm. The suggested reaction was studied with and without extraction, and a simple comparison between the batch and CPE methods was made. Analytical variables, including the concentrations of reagent, Triton X-114, and base, as well as incubation temperature and time, were carefully studied. Under the selected optimum conditions, …
In this study, the dynamic modeling and step-input tracking control of a single flexible link are studied. The Lagrange-assumed-modes approach is applied to obtain the dynamic model of a planar single-link manipulator. A step-input tracking controller is proposed using a hybrid control approach to overcome the tip-position vibration during motion that is characteristic of flexible-link systems. The first controller is a modified version of the proportional-derivative (PD) rigid controller to track the hub position, while sliding-mode (SM) control is used for vibration damping. A second controller, a fuzzy-logic-based proportional-integral plus derivative (PI+D) control scheme, is also developed for both vibra…
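The hybrid scheme described above can be sketched as a single control law. The snippet below is a minimal illustration assuming one flexible mode; the PD gains, sliding-mode gain, and surface slope are placeholders, not the paper's tuned values.

```python
import math

# Hedged sketch of the hybrid law: a modified PD term tracks the hub angle,
# and a smoothed sliding-mode term damps the flexible (tip) mode. All gains
# and the surface slope LAM are illustrative assumptions.
KP, KD = 12.0, 3.0        # PD gains for hub-position tracking
KSM, LAM = 2.0, 5.0       # sliding-mode gain and sliding-surface slope

def hybrid_control(theta, theta_dot, theta_ref, q_f, q_f_dot):
    """theta: hub angle; q_f: first assumed-mode (flexible) coordinate."""
    u_pd = KP * (theta_ref - theta) - KD * theta_dot   # rigid PD part
    s = LAM * q_f + q_f_dot                            # sliding surface
    u_sm = -KSM * math.tanh(10.0 * s)                  # smooth sign() to reduce chatter
    return u_pd + u_sm
```

The tanh term stands in for the discontinuous sign function usually written in SM control, a common smoothing choice to avoid chattering at the tip.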
A huge number of medical images is generated, requiring more storage capacity and more bandwidth for transfer over networks. A hybrid DWT-DCT compression algorithm is applied to compress medical images by exploiting the features of both techniques. Discrete Wavelet Transform (DWT) coding is applied to the image in the YCbCr color model, decomposing each image band into four subbands (LL, HL, LH, and HH). The LL subband is transformed into low- and high-frequency components using the Discrete Cosine Transform (DCT) and then quantized by the scalar quantization applied to all image bands; the quantization parameters are reduced by half for the luminance band and kept the same for the chrominance bands to preserve image quality, and the zig…
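As a rough illustration of the pipeline above, the following sketch applies one DWT level, a DCT to the LL subband, and scalar quantization, using the PyWavelets and SciPy libraries; the Haar wavelet and the step sizes are assumptions, with the luminance step halved as described.

```python
import numpy as np
import pywt                       # PyWavelets
from scipy.fft import dctn

# Hedged sketch of the hybrid stage on a single band: one DWT level,
# DCT on the LL subband only, then scalar quantization of everything.
def compress_band(band: np.ndarray, q_step: float):
    LL, (LH, HL, HH) = pywt.dwt2(band.astype(float), "haar")
    LL_q = np.round(dctn(LL, norm="ortho") / q_step)      # DCT + quantize LL
    details_q = [np.round(s / q_step) for s in (LH, HL, HH)]
    return LL_q, details_q

Y  = np.random.rand(256, 256) * 255    # stand-ins for YCbCr image bands
Cb = np.random.rand(256, 256) * 255
y_coeffs  = compress_band(Y,  q_step=4.0)   # luminance: half the step
cb_coeffs = compress_band(Cb, q_step=8.0)   # chrominance: full step
```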
Nowadays, cloud computing has attracted the attention of large companies due to its high potential, flexibility, and profitability in providing multiple sources of hardware and software to serve connected users. Given the scale of modern data centers and the dynamic nature of their resource provisioning, effective scheduling techniques are needed to manage these resources while satisfying the goals of both cloud providers and cloud users. Task scheduling in cloud computing is an NP-hard problem that cannot be easily solved by classical optimization methods, so both heuristic and meta-heuristic techniques have been utilized to provide optimal or near-optimal solutions within an acceptable time frame. In th…
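The abstract breaks off before naming its technique, so the sketch below is only a generic greedy heuristic (earliest finish time) of the kind such work commonly uses as a baseline; the task lengths and VM speeds are invented.

```python
# Assign each task to the VM that would finish it soonest; a simple
# heuristic baseline, not the paper's proposed scheduler.
def greedy_schedule(task_lengths, vm_speeds):
    finish = [0.0] * len(vm_speeds)                 # running makespan per VM
    plan = []
    for length in task_lengths:
        vm = min(range(len(vm_speeds)),
                 key=lambda v: finish[v] + length / vm_speeds[v])
        finish[vm] += length / vm_speeds[vm]
        plan.append(vm)                             # VM chosen for this task
    return plan, max(finish)                        # assignment and makespan

plan, makespan = greedy_schedule([40, 10, 25, 5, 30], vm_speeds=[1.0, 2.0])
print(plan, makespan)
```

Meta-heuristics such as genetic algorithms or particle swarm optimization typically start from (and are compared against) schedules of exactly this kind.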
The mechanism of managing religious difference
God is the Lord of the worlds, of their humans and their jinn, their Arabs and their non-Arabs; He is the Lord of Muslims and the Lord of non-Muslims alike. Just as He created them male and female, differing in tongues and colors, so too He created them diverse and distinct in beliefs and religions.
To prevent conflicts flaring up over these differences, the Lord of the worlds set limits that He has forbidden to be crossed and drew clear road maps as mechanisms for managing religious difference and lifting the psychological barriers between those who differ, so that they may coexist in peace and freedom, each adhering to his own faith and practicing the rituals of his own religion.
Consistent with tax thought, the unified tax is a natural evolution of taxation: its application brings all branches and sources of income under the tax and truncates part of that income through the application of ascending (progressive) rates, which is the essence of tax reform procedures. Taxes on total income are characterized by giving a clear picture of the taxpayer's total income, financial situation, and family burden, which allows granting exemptions and deductions and applying rates that fit each case. This requires reconsidering the structure of the tax system in force and moving from a system of specific taxes to a tax system on total income that integrates income from the rental of re…
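Since the abstract is cut off, the following worked example only illustrates the ascending-rate mechanism on aggregated income that it describes; the brackets, rates, and exemption amounts are invented for the example.

```python
# Invented brackets: (upper bound, rate); real schedules differ by country.
BRACKETS = [(250_000, 0.03), (500_000, 0.05), (float("inf"), 0.15)]

def unified_tax(total_income: float, exemptions: float) -> float:
    """Apply ascending rates to total income after family exemptions."""
    taxable = max(total_income - exemptions, 0.0)
    tax, lower = 0.0, 0.0
    for upper, rate in BRACKETS:
        if taxable <= lower:
            break
        tax += (min(taxable, upper) - lower) * rate   # slice taxed at this rate
        lower = upper
    return tax

print(unified_tax(total_income=900_000, exemptions=100_000))  # 65000.0
```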
Today, problems of spatial data integration have been further complicated by the rapid development of communication technologies and the increasing number of data sources available on the World Wide Web. Web-based geospatial data sources can be managed by different communities, and the data themselves can vary in quality, coverage, and purpose. Integrating such multiple geospatial datasets remains a challenge for geospatial data consumers. This paper concentrates on the integration of geometric and classification schemes for official data, such as Ordnance Survey (OS) national mapping data, with volunteered geographic information (VGI), such as data derived from the OpenStreetMap (OSM) project. Useful descriptions o…
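One concrete piece of geometric integration is matching corresponding features between the two datasets. The sketch below, using the Shapely library, scores OS-to-OSM polygon pairs by area overlap; the Jaccard measure and the 0.7 acceptance threshold are illustrative choices, not the paper's actual matching procedure.

```python
from shapely.geometry import Polygon

# Match an official (OS-style) footprint against OSM candidates by
# Jaccard area overlap; threshold is an assumed, illustrative value.
def best_match(os_feature: Polygon, osm_features, threshold: float = 0.7):
    best, best_score = None, 0.0
    for candidate in osm_features:
        union = os_feature.union(candidate).area
        score = os_feature.intersection(candidate).area / union if union else 0.0
        if score > best_score:
            best, best_score = candidate, score
    return (best, best_score) if best_score >= threshold else (None, best_score)

os_poly  = Polygon([(0, 0), (10, 0), (10, 10), (0, 10)])
osm_poly = Polygon([(0.5, 0.5), (10.5, 0.5), (10.5, 10.5), (0.5, 10.5)])
print(best_match(os_poly, [osm_poly]))   # overlap ratio ~0.82, accepted
```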
Big data analysis is essential for modern applications in areas such as healthcare, assistive technology, intelligent transportation, and environment and climate monitoring. Traditional data mining and machine learning algorithms do not scale well with data size; mining and learning from big data require time- and memory-efficient techniques, albeit at the cost of a possible loss in accuracy. We have developed a data aggregation structure to summarize data with a large number of instances and data generated from multiple sources. Data are aggregated at multiple resolutions, and the resolution provides a trade-off between efficiency and accuracy. The structure is built once, updated incrementally, and serves as a common data input for multiple mining an…
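A minimal sketch of such a structure is given below, assuming each resolution level keeps per-bucket (count, sum) aggregates that are updated incrementally; the bucket widths and the choice of statistics are illustrative, not the paper's actual design.

```python
from collections import defaultdict

# Each level keeps (count, sum) per bucket: coarser levels are cheaper to
# query, finer levels are more accurate; one insert updates every level.
class MultiResolutionAggregate:
    def __init__(self, widths=(1.0, 10.0, 100.0)):      # assumed bucket widths
        self.levels = [(w, defaultdict(lambda: [0, 0.0])) for w in widths]

    def insert(self, x: float) -> None:
        """Incremental update: the structure is never rebuilt from scratch."""
        for width, buckets in self.levels:
            cell = buckets[int(x // width)]
            cell[0] += 1          # count
            cell[1] += x          # running sum, so mean = sum / count

    def mean(self, level: int, bucket: int) -> float:
        count, total = self.levels[level][1][bucket]
        return total / count if count else float("nan")

agg = MultiResolutionAggregate()
for x in [3.2, 3.9, 47.0, 151.5]:
    agg.insert(x)
print(agg.mean(level=1, bucket=0))   # mean of values in [0, 10): 3.55
```

Multiple mining or learning tasks can then read the same aggregated levels instead of re-scanning the raw instances.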