In recent years, owing to the economic benefits and technical advances of cloud computing, huge amounts of data have been outsourced to the cloud. To protect the privacy of their sensitive data, data owners have to encrypt their data prior to outsourcing it to untrusted cloud servers. Several approaches have been proposed to facilitate searching over encrypted data. However, the majority of these approaches support Boolean search but not ranked search, a widely accepted technique in current information retrieval (IR) systems for retrieving only the top-k relevant files. In this paper, we propose a distributed secure ranked search scheme over encrypted cloud servers. The scheme allows an authorized user to search the distributed documents and retrieve them in descending order of their relevance to the query. To do so, each data owner builds its own searchable index and associates with each document in that index a weight score, which facilitates document ranking. The privacy of these weight scores is protected while preserving the servers' capability to perform the ranking process. We conducted several empirical analyses on a real dataset; all programs needed for the proposed system were written in Matlab.
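The top-k ranking over per-document weight scores can be sketched in plaintext form. This is only an illustrative sketch, not the paper's Matlab implementation, and it ignores the encryption of the weights; the names `add_document` and `top_k` and the toy scores are assumptions:

```python
import heapq
from collections import defaultdict

# Hypothetical searchable index: term -> list of (doc_id, weight_score).
# In the actual scheme each data owner builds such an index and the
# weight scores are stored in protected (encrypted) form.
index = defaultdict(list)

def add_document(doc_id, term_weights):
    """Register a document's per-term weight scores in the index."""
    for term, w in term_weights.items():
        index[term].append((doc_id, w))

def top_k(query_terms, k):
    """Return the k most relevant doc_ids, in descending score order."""
    scores = defaultdict(float)
    for term in query_terms:
        for doc_id, w in index[term]:
            scores[doc_id] += w
    return heapq.nlargest(k, scores, key=scores.get)

add_document("d1", {"cloud": 0.9, "search": 0.4})
add_document("d2", {"cloud": 0.2, "search": 0.8})
add_document("d3", {"search": 0.1})
print(top_k(["cloud", "search"], 2))  # → ['d1', 'd2']
```

The `argsort`-style accumulation of scores is what makes the descending-order retrieval of the top-k files possible without scanning every document for every query.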
Nowadays, advances in information and communication technologies open the door wide to realizing the dream of a digital world. Given the clear scientific scope in all fields, especially medicine, it has become necessary to harness all available scientific capabilities to serve people, particularly in medical services. Medical images represent the basis of clinical diagnosis and the source of telehealth and teleconsultation processes. The exchange of these images is subject to several challenges, such as transmission bandwidth, delivery time, fraud, tampering, modification, privacy, and more. This paper introduces an algorithm consisting of a combination of compression and encryption techniques to meet such challenges.
Due to large-scale developments in satellite and network communication technologies, there is significant demand for preserving the secure storage and transmission of data over the internet and shared network environments. New challenges have appeared concerning the protection of critical and sensitive data from illegal usage and unauthorized access. In this paper, we address these issues and develop new techniques to eliminate the associated problems. To achieve this, we propose the design of a new sensor node for tracking the location of cars and collecting all information on the locations they visit, followed by encryption in the sensor node and storage in a database. A microcontroller of Arduino es
Background: Diabetes and hypertension are related to cardiovascular risk factors, and it is possible to detect the development of atherosclerosis in the cardiovascular system and to predict and measure their effect by ultrasound and Doppler study. These risk factors include increased intima-media thickness, resistive index (RI), and pulsatility index (PI) of the right common carotid artery. Method: We studied 20 patients with diabetes and hypertension and 20 patients with diabetes only, examining the right carotid arteries of both groups. In this sample we studied the lumen diameter of the right carotid arteries, the intima-media thickness (IMT), peak systolic velocity, end diastolic velocity, pulsatility index, and resistance index were
Many approaches of varying complexity already exist for edge detection in color images. Nevertheless, the question remains of how different the results are when computationally costly techniques are employed instead of simple ones. This paper presents a comparative study of two approaches to color edge detection aimed at reducing noise in images. The approaches are based on the Sobel operator and the Laplace operator. Furthermore, an efficient algorithm for implementing the two operators is presented. The operators have been applied to real images, and the results are presented in this paper. It is shown that the quality of the results increases when the second-derivative operator (the Laplace operator) is used, and that noise is reduced effectively.
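The two operators compared above are small convolution kernels. The following is a minimal sketch (not the paper's efficient algorithm) using the standard 3x3 Sobel and Laplace kernels on a single grayscale channel; the helper names are illustrative:

```python
import numpy as np

# Classic 3x3 kernels for the two operators compared in the paper.
SOBEL_X = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]])
SOBEL_Y = SOBEL_X.T
LAPLACE = np.array([[0,  1, 0],
                    [1, -4, 1],
                    [0,  1, 0]])

def convolve2d(img, kernel):
    """Naive valid-mode 2-D filtering (cross-correlation form)."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def sobel_magnitude(img):
    """First-derivative edge strength: combine the two Sobel responses."""
    gx = convolve2d(img, SOBEL_X)
    gy = convolve2d(img, SOBEL_Y)
    return np.hypot(gx, gy)

# A vertical step edge: the Sobel magnitude responds along the transition.
img = np.zeros((5, 5))
img[:, 3:] = 1.0
print(sobel_magnitude(img))
```

For a color image the same filtering would be applied per channel and the responses combined; the Laplace kernel is applied with the same `convolve2d` helper.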
Feature extraction is the most critical step in image discrimination; it makes the content representation of images as ideal as possible. A Gaussian blur filter is used to eliminate noise and add purity to images. The principal component analysis (PCA) algorithm is a straightforward and effective method for deriving a feature vector and minimizing the dimensionality of a data set. This paper proposes using the Gaussian blur filter to eliminate image noise and thereby improve PCA-based feature extraction. The traditional PCA yields total average recall and precision of 93% and 97%, while the improved PCA yields average recall and precision of 98% and 100%, showing that the improved PCA is more effective in terms of recall and precision.
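The blur-then-PCA pipeline can be sketched as follows. This is an assumed minimal implementation, not the paper's improved PCA: a separable Gaussian blur for denoising, then an SVD-based projection onto the top principal components; all function names and parameters are illustrative:

```python
import numpy as np

def gaussian_blur(img, sigma=1.0, radius=2):
    """Separable Gaussian blur: 1-D kernel applied to rows, then columns."""
    ax = np.arange(-radius, radius + 1)
    k = np.exp(-ax**2 / (2 * sigma**2))
    k /= k.sum()
    tmp = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, tmp)

def pca_features(images, n_components, sigma=1.0):
    """Blur each image to suppress noise, then project the flattened
    images onto their top principal components (computed via SVD)."""
    X = np.array([gaussian_blur(im, sigma).ravel() for im in images])
    X -= X.mean(axis=0)                       # center the data
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    return X @ vt[:n_components].T            # low-dimensional feature vectors

rng = np.random.default_rng(0)
images = [rng.random((8, 8)) for _ in range(10)]
feats = pca_features(images, n_components=3)
print(feats.shape)  # → (10, 3)
```

Sorting the singular vectors by singular value (which `np.linalg.svd` does implicitly) is what keeps the components that explain the most variance in the blurred images.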
Steganography is the art of hiding information such that an unsuspicious cover signal carries the secret information. A good steganography technique must satisfy the important criteria of robustness, security, imperceptibility, and capacity. Improving any one of these criteria affects the others, because they overlap with one another. In this work, a safe, high-capacity audio steganography method is proposed, based on randomly replacing the LSBs of an encrypted cover with encrypted message bits at random positions. The work also includes a capacity study of the audio file, speech or music, carried out in a safe manner for carrying secret images, so that it is difficult for unauthorized persons to suspect
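The core operation, replacing LSBs at key-driven random positions, can be sketched as below. This is a simplified illustration, omitting the encryption of cover and message that the proposed method applies; the sample values and the key are assumptions:

```python
import random

def embed_lsb(cover, message_bits, key):
    """Embed message bits in the LSBs of randomly chosen cover samples."""
    stego = list(cover)
    rng = random.Random(key)                  # the key seeds the positions
    positions = rng.sample(range(len(cover)), len(message_bits))
    for pos, bit in zip(positions, message_bits):
        stego[pos] = (stego[pos] & ~1) | bit  # replace only the LSB
    return stego

def extract_lsb(stego, n_bits, key):
    """Recover the bits using the same key-driven position sequence."""
    rng = random.Random(key)
    positions = rng.sample(range(len(stego)), n_bits)
    return [stego[pos] & 1 for pos in positions]

cover = [200, 13, 57, 98, 144, 7, 66, 231]    # e.g. 8-bit audio samples
bits = [1, 0, 1, 1]
stego = embed_lsb(cover, bits, key=42)
print(extract_lsb(stego, len(bits), key=42))  # → [1, 0, 1, 1]
```

Because each sample changes by at most one quantization level, the distortion stays imperceptible, while the random positions make the embedding hard to detect without the key.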
The algorithm proposed in this paper is based on the principle of translating texts from one language to another, extended here to encipher texts by using any electronic dictionary as a ciphering tool, based on the locations in the dictionary of the words the text contains. The text file is then converted into a picture file, such as BMP-24 format, and the picture file is transmitted to the receiver. The same algorithm is used for encryption and decryption, running in the forward direction at the sender and in the backward direction at the receiver. Visual Basic 6.0 is used to implement the proposed cryptography algorithm.
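The word-to-location idea can be sketched as follows. This is only an assumed illustration of the dictionary-lookup step (the original is in Visual Basic 6.0 and additionally converts the result to a BMP-24 picture file, which is omitted here); the tiny word list stands in for a real shared electronic dictionary:

```python
# Hypothetical word list standing in for an electronic dictionary;
# both sender and receiver must share the same dictionary.
DICTIONARY = ["apple", "cipher", "hello", "orange", "secret", "world"]
WORD_TO_INDEX = {w: i for i, w in enumerate(DICTIONARY)}

def encrypt(text):
    """Forward direction: replace each word with its dictionary position."""
    return [WORD_TO_INDEX[w] for w in text.lower().split()]

def decrypt(indices):
    """Backward direction: map positions back to dictionary words."""
    return " ".join(DICTIONARY[i] for i in indices)

codes = encrypt("hello secret world")
print(codes)           # → [2, 4, 5]
print(decrypt(codes))  # → hello secret world
```

Running the same table lookup forward at the sender and backward at the receiver mirrors the symmetric structure described in the abstract.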
Digital forensics is the part of forensic science that covers crimes related to computers and other digital devices. Academic studies have been interested in digital forensics for a while, with researchers aiming to establish a discipline based on scientific structures and to define a model reflecting their observations. This paper suggests a model to improve the whole investigation process, to obtain accurate and complete evidence, and to secure the digital evidence with cryptographic algorithms so that reliable evidence can be presented in a court of law. The paper also presents the main and basic concepts of the frameworks and models used in digital forensics investigation.
Currently, no one can deny the importance of data protection, especially with the proliferation of hackers and the theft of personal information in all parts of the world. For these reasons, encryption has become one of the important fields in the protection of digital information.
This paper adopts a new image encryption method to overcome the obstacles of previous image encryption methods. Our method uses the Duffing map to shuffle all image pixels; the resulting image is then divided into a group of blocks, on which the shuffling process is performed via a cross chaotic map.
Finally, an image called the key image is created by using quadratic number spirals, which will be used to generate nu
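The first stage, chaotic pixel shuffling, can be sketched with the Duffing map. This is an assumed illustration only: the parameter values (a = 2.75, b = 0.2) and initial conditions are common chaotic choices, not values taken from the paper, and the block-wise cross chaotic map stage is omitted:

```python
import numpy as np

def duffing_sequence(n, x=0.1, y=0.1, a=2.75, b=0.2):
    """Iterate the Duffing map: x' = y, y' = -b*x + a*y - y**3."""
    vals = np.empty(n)
    for i in range(n):
        x, y = y, -b * x + a * y - y**3
        vals[i] = y
    return vals

def shuffle_pixels(img, x0=0.1, y0=0.1):
    """Permute all pixels by sorting the chaotic sequence (argsort trick)."""
    flat = img.ravel()
    perm = np.argsort(duffing_sequence(flat.size, x0, y0))
    return flat[perm].reshape(img.shape), perm

def unshuffle_pixels(shuffled, perm):
    """Invert the permutation to restore the original pixel order."""
    flat = np.empty(shuffled.size, dtype=shuffled.dtype)
    flat[perm] = shuffled.ravel()
    return flat.reshape(shuffled.shape)

img = np.arange(16, dtype=np.uint8).reshape(4, 4)
scrambled, perm = shuffle_pixels(img)
restored = unshuffle_pixels(scrambled, perm)
print(np.array_equal(restored, img))  # → True
```

The initial conditions act as the secret key: a receiver with the same (x0, y0) regenerates the identical permutation and can undo the shuffle, while slightly different values yield an entirely different ordering.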
An engineering geological study of rock slope stability at two stations lying to the SW of Haibat Sultan mountain, along the Kalksmaq - Koisanjaq road, was carried out. At each station, rock slopes and discontinuities were comprehensively surveyed and their relationships with failures were determined. The limestone rock was described in engineering terms; the types of failure recorded during the field study were rock roll and toppling, while the probable failures were sliding, toppling, and rock roll. The study also revealed that the factors affecting slope stability in the study area were slope angle, height, dip of strata, and discontinuities (which are almost perpendicular to the bedding plane). The laboratory tests of the rock samples (point-