Cloud computing provides a vast amount of storage space, but as the number of users and the size of their data grow, cloud storage environments face serious problems such as saving storage space, managing this large volume of data, and preserving its security and privacy. One important method of saving space in cloud storage is data deduplication, a compression technique that keeps only one copy of the data and eliminates redundant copies. To offer security and privacy for sensitive data while still supporting deduplication, this work identifies attacks that exploit hybrid cloud deduplication and allow an attacker to gain access to other users' files using only the very small hash signatures of those files. More specifically, an attacker who knows the hash signature of a file can convince the storage service that he or she owns that file, and the server then lets the attacker download the entire file. To counter such attacks, the hash signature is encrypted with the user's password. As a proof of concept, a prototype of the proposed authorized deduplication scheme is implemented and evaluated in test-bed experiments. Performance measurements indicate that the proposed deduplication system incurs minimal overhead in upload time and bandwidth compared to native deduplication.
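As a rough illustration of the idea above (not the paper's exact protocol; the salt, iteration count and function names are hypothetical), the following Python sketch contrasts a plain deduplication hash signature, which anyone who knows the hash could replay, with a signature keyed by a password-derived secret:

```python
import hashlib
import hmac

def file_signature(data: bytes) -> str:
    """Plain deduplication signature: anyone who learns this hash could
    claim ownership of the file (the attack described in the abstract)."""
    return hashlib.sha256(data).hexdigest()

def protected_signature(data: bytes, password: str) -> str:
    """Signature bound to the user's password: the file hash is keyed with a
    password-derived secret, so knowing the bare hash alone is not enough to
    convince the server.  Salt and iteration count are illustrative only."""
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), b"dedup-salt", 100_000)
    return hmac.new(key, file_signature(data).encode(), hashlib.sha256).hexdigest()

# Example: the plain signature is identical for identical files (enabling
# deduplication), while the protected signature differs per user password.
data = b"shared report contents"
print(file_signature(data) == file_signature(data))                      # True
print(protected_signature(data, "alice-pw") ==
      protected_signature(data, "bob-pw"))                               # False
```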
AA Abbass, HL Hussein, WA Shukur, J Kaabi, R Tornai, Webology, 2022
Eye recognition of individuals is an important issue in applications such as security systems, credit card control and identification of criminals. Using video images removes the limitations of fixed images and makes it possible to capture users' images under any condition and then perform eye recognition. These systems face several challenges: changes in individual gestures, changes in lighting, face coverage, low quality of video images and changes in personal characteristics from frame to frame. Eye recognition from images requires two phases, revelation and eye recognition, which are used in security systems to identify persons. The mai…
In this paper, a design for new DES-based block ciphers, namely DES64X and DES128X, is described. The goals of this design, apart from its security level, are large implementation flexibility across operating systems as well as high performance. The high-level structure is based on the principles of DES and the Feistel scheme, and an efficient key-schedule algorithm is proposed that outputs pseudorandom sequences of subkeys. The main goal is to reach the highest possible flexibility in terms of round number, key size, and block size. A comparison of the proposed systems on 32-bit and 64-bit operating systems, using 32-bit and 64-bit Java Virtual Machines (JVM), showed that the latter has much better performance than the former.
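The DES64X/DES128X specification is not reproduced here; the sketch below only illustrates the generic Feistel structure and configurable round count that the design is said to build on. The round function F is a placeholder, not the real DES F and not the proposed cipher:

```python
def feistel_encrypt(block: int, subkeys: list[int], half_bits: int = 32) -> int:
    """Generic Feistel network: the block is split into two halves and, in each
    round, the right half is fed through a round function F keyed by that
    round's subkey.  Round count and block size are fixed only by len(subkeys)
    and half_bits, mirroring the flexibility the abstract describes."""
    mask = (1 << half_bits) - 1
    left, right = (block >> half_bits) & mask, block & mask

    def F(half: int, k: int) -> int:                # placeholder round function
        return ((half * 0x9E3779B1) ^ k) & mask     # NOT the real DES F

    for k in subkeys:
        left, right = right, left ^ F(right, k)
    return (right << half_bits) | left              # final swap of the halves

def feistel_decrypt(block: int, subkeys: list[int], half_bits: int = 32) -> int:
    """Decryption reuses the same structure with the subkeys reversed."""
    return feistel_encrypt(block, list(reversed(subkeys)), half_bits)
```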
Target tracking is a significant application of wireless sensor networks (WSNs) that requires self-organizing and energy-efficient algorithms. Tracking accuracy increases as more sensor nodes are activated around the target, but more energy is consumed. In this study, we therefore focus on limiting the number of active sensors by forming an ad-hoc network that operates autonomously, which reduces energy consumption and prolongs the sensor network lifetime. We propose a fully distributed algorithm, an Endocrine-inspired Sensor Activation Mechanism for multi-target tracking (ESAM), which reflects the properties of real-life sensor activation based on the information-circulating principle of the endocrine…
Cloud computing is a mass platform serving high volumes of data from many devices and technologies. Cloud tenants demand fast access to their data without disruption, so cloud providers must ensure that every piece of data is secured and always accessible. An appropriate replication strategy capable of selecting essential data is therefore required in cloud replication environments. This paper proposes a Crucial File Selection Strategy (CFSS) to address poor response time in a cloud replication environment. A cloud simulator called CloudSim is used to conduct the necessary experiments, and results are presented as evidence of the improvement in replication performance. The obtained an…
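CFSS itself is not detailed in the excerpt above; purely as a hypothetical stand-in for "selecting essential data", the sketch below ranks files by request frequency and replicates only the most popular ones up to a replica budget:

```python
from collections import Counter

def select_crucial_files(access_log: list[str], capacity: int) -> list[str]:
    """Illustrative crucial-file selection step (not the CFSS algorithm):
    files that are requested most often are treated as crucial and are
    replicated first, up to the replica budget given in number of files."""
    popularity = Counter(access_log)                  # file_id -> request count
    ranked = [f for f, _ in popularity.most_common()]
    return ranked[:capacity]

# Example: with room for two replicas, the two hottest files are chosen.
log = ["f1", "f3", "f1", "f2", "f1", "f3"]
print(select_crucial_files(log, capacity=2))          # ['f1', 'f3']
```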
In recent years, the migration of computational workloads to computational clouds has attracted intruders who target and exploit cloud networks both internally and externally. Investigating such hazardous network attacks in the cloud requires comprehensive network forensics methods (NFMs) to identify the source of the attack. However, cloud computing lacks NFMs that can identify network attacks affecting various cloud resources as they spread through cloud networks. This paper is motivated by the need to assess the applicability of current cloud network forensics methods (C-NFMs) to cloud networks. The applicability is evaluated in terms of strengths, weaknesses, opportunities, and threats (SWOT) to provide an outlook on the cloud network. T…
The rapid and enormous growth of the Internet of Things, and its widespread adoption, has resulted in massive quantities of data that must be processed and sent to the cloud. The delay in processing these data and the time taken to send them to the cloud led to the emergence of fog computing, a new generation of cloud in which the fog serves as an extension of cloud services at the edge of the network, reducing latency and traffic. Distributing computational resources so as to minimize makespan and running costs is one of the main challenges of fog computing. This paper provides a new approach for improving the task-scheduling problem in a Cloud-Fog environment…
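The proposed scheduling approach is cut off above; for context only, a minimal greedy baseline for the same Cloud-Fog scheduling objective (assigning tasks to nodes so the makespan stays low, with all task lengths and node speeds hypothetical) could look like this:

```python
def greedy_schedule(task_lengths: list[float], node_speeds: list[float]) -> float:
    """Toy list-scheduling baseline for the Cloud-Fog task-scheduling problem:
    each task is placed on the node where it would complete earliest, and the
    returned value is the makespan (finish time of the busiest node)."""
    finish = [0.0] * len(node_speeds)                  # current load per node
    for length in sorted(task_lengths, reverse=True):  # longest tasks first
        # Pick the node on which this task would complete earliest.
        i = min(range(len(node_speeds)),
                key=lambda n: finish[n] + length / node_speeds[n])
        finish[i] += length / node_speeds[i]
    return max(finish)

# Example: four tasks on one fast cloud node and two slower fog nodes.
print(greedy_schedule([8, 6, 4, 2], node_speeds=[4.0, 1.0, 1.0]))  # 4.0
```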
In this paper, two new simple, fast and efficient block matching algorithms are introduced. Both methods begin the block matching process from the image center block and move across the blocks toward the image boundaries. For each block, the motion vector is initialized using linear prediction that depends on the motion vectors of neighbor blocks that have already been scanned and assessed. A hybrid mechanism is also introduced that mixes the two proposed predictive mechanisms with the Exhaustive Search (ES) mechanism in order to achieve matching accuracy close to that of ES but with a search time (ST) less than 80% of that of ES, while offering more control over search errors. The experimental tests…
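A compact sketch of the predictive search idea described above (not the paper's exact algorithms; the prediction rule, block size and window radius are illustrative): each block's search is seeded with a motion vector predicted from already-matched neighbor blocks, and only a small window around that prediction is examined instead of the exhaustive range.

```python
import numpy as np

def sad(block, ref, y, x):
    """Sum of absolute differences between a block and a reference patch."""
    h, w = block.shape
    return int(np.abs(block.astype(int) - ref[y:y + h, x:x + w].astype(int)).sum())

def predict_mv(neighbor_mvs):
    """Linear prediction seed: average of the motion vectors of neighbor
    blocks already matched in the center-outward scan order."""
    if not neighbor_mvs:
        return (0, 0)
    n = len(neighbor_mvs)
    return (round(sum(mv[0] for mv in neighbor_mvs) / n),
            round(sum(mv[1] for mv in neighbor_mvs) / n))

def match_block(cur, ref, by, bx, bsize, pred, radius=2):
    """Search only a small window around the predicted motion vector `pred`
    instead of the full exhaustive-search range."""
    block = cur[by:by + bsize, bx:bx + bsize]
    best_cost, best_mv = None, (0, 0)
    for dy in range(pred[0] - radius, pred[0] + radius + 1):
        for dx in range(pred[1] - radius, pred[1] + radius + 1):
            y, x = by + dy, bx + dx
            if 0 <= y <= ref.shape[0] - bsize and 0 <= x <= ref.shape[1] - bsize:
                cost = sad(block, ref, y, x)
                if best_cost is None or cost < best_cost:
                    best_cost, best_mv = cost, (dy, dx)
    return best_mv
```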