Content-based image retrieval (CBIR) is a technique for retrieving images from an image database. However, the CBIR process suffers from low accuracy when retrieving images from an extensive database and from weak privacy guarantees for the images. This paper addresses the accuracy issue using a deep-learning technique, the convolutional neural network (CNN), and provides the necessary privacy for images using the fully homomorphic encryption scheme of Cheon, Kim, Kim, and Song (CKKS). To achieve these aims, a system named RCNN_CKKS is proposed, consisting of two parts. The first part (offline processing) extracts automated high-level features from the flattening layer of a CNN and stores these features in a new dataset. In the second part (online processing), the client sends an encrypted image to the server, which uses the trained CNN model to extract the features of the received image. Next, the extracted features are compared with the stored features using the Hamming distance to retrieve all similar images. Finally, the server encrypts all retrieved images and sends them to the client. Deep-learning results on plain images were 97.94% for classification and 98.94% for image retrieval, while the NIST test suite was used to check the security of CKKS when applied to the Canadian Institute for Advanced Research (CIFAR-10) dataset. From these results, we conclude that deep learning is an effective method for image retrieval and that the CKKS scheme is appropriate for image privacy protection.
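The Hamming-distance matching step of the abstract above can be sketched as follows. This is an illustrative sketch only: the binarization threshold, feature vectors, and distance cutoff are assumptions for the example, not the paper's actual values.

```python
# Sketch: comparing binarized CNN features with the Hamming distance.
# Threshold and max_dist are illustrative assumptions.

def binarize(features, threshold=0.5):
    """Binarize a real-valued feature vector for Hamming comparison."""
    return [1 if f > threshold else 0 for f in features]

def hamming(a, b):
    """Number of positions where two equal-length bit vectors differ."""
    return sum(x != y for x, y in zip(a, b))

def retrieve(query_feats, database, max_dist=2):
    """Return ids of stored images whose features lie within max_dist bits."""
    q = binarize(query_feats)
    return [img_id for img_id, feats in database.items()
            if hamming(q, binarize(feats)) <= max_dist]

db = {"img1": [0.9, 0.1, 0.8, 0.2], "img2": [0.1, 0.9, 0.2, 0.8]}
print(retrieve([0.95, 0.05, 0.7, 0.3], db))  # → ['img1']
```

In the full system the query features would come from the flattening layer of the trained CNN rather than being hand-written lists.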
In this paper, a compression system with a hybrid architecture is introduced; it is based on the wavelet transform, polynomial representation, and quadtree coding. The biorthogonal (tap 9/7) wavelet transform is used to decompose the image signal, and a 2D polynomial representation is utilized to prune the high-scale variation of the image signal. Quantization with quadtree coding, followed by shift coding, is applied to compress the detail bands and the residue part of the approximation subband. The test results indicate that the introduced system is simple and fast, and that it leads to better compression gain than using a first-order polynomial approximation alone.
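The quadtree coding stage mentioned above can be illustrated with a minimal sketch. This is not the paper's exact coder: the uniformity test (value range against a threshold) and the example block are assumptions for illustration.

```python
# Illustrative quadtree splitting of an image block: regions are split
# recursively until nearly uniform, then stored as single leaves.

def quadtree(block, x, y, size, threshold, leaves):
    """Recursively split a size x size region until it is nearly uniform."""
    vals = [block[y + j][x + i] for j in range(size) for i in range(size)]
    if size == 1 or max(vals) - min(vals) <= threshold:
        leaves.append((x, y, size, sum(vals) // len(vals)))  # leaf: mean value
        return
    h = size // 2
    for dx, dy in ((0, 0), (h, 0), (0, h), (h, h)):
        quadtree(block, x + dx, y + dy, h, threshold, leaves)

img = [[10, 10, 200, 210],
       [10, 10, 205, 200],
       [10, 10, 10, 10],
       [10, 10, 10, 10]]
leaves = []
quadtree(img, 0, 0, 4, 5, leaves)
print(len(leaves))  # → 7: three uniform quadrants plus four 1x1 leaves
```

Uniform regions collapse to one leaf each, which is what makes the representation compact before the subsequent shift-coding stage.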
This research studies the processes of shape creation and the encryption of form and content in design using computer technology, treating creative work as definable and renewable through processes of change and transformation. Studying the encryption of shape, form, and content in textile designs reveals meaning or substance that may otherwise remain invisible; encryption in the digital design of fabrics employs modern, refined ideas and techniques to produce work that strikes audiences with novelty and innovation. The research comprises four chapters: Chapter I deals with the research problem and its framework (form and content encryption in digital designs for women's contemporary fabrics
Tissue joining is a growing concern in surgery as technology advances and more precise and difficult operations are performed. Laser tissue welding is a promising technique that might help advance surgical practice further. Objectives: To study the ability of lasers to join tissues and the optimum parameters for yielding good tissue welding. Methods: An in-vitro study conducted at the Institute of Laser, Baghdad University, during the period from October 2008 to February 2009. Diode and Nd:YAG lasers were applied, in different sessions, to sheep small intestine with or without solder to weld a 2-mm-long full-thickness incision. Different powers and energies were used to obtain the maximum effect.
Cryptographic applications demand much more of a pseudo-random-sequence generator than do most other applications. Cryptographic randomness does not mean just statistical randomness, although that is part of it. For a sequence to be cryptographically secure pseudo-random, it must be unpredictable.
Random sequences should satisfy the basic randomness postulates; one of them is the run postulate (a run being a sequence of identical bits). Such sequences should have about the same number of ones and zeros; about half the runs should be of length one, one quarter of length two, one eighth of length three, and so on. The distribution of run lengths for zeros and ones should be the same. These properties can be measured deterministically
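The run postulate described above can be checked with a few lines of code: count the runs of each length in a bit sequence and compare the counts with the expected halving pattern (half of length one, a quarter of length two, and so on). The example sequence is illustrative.

```python
# Count runs of each length in a bit string (runs of both zeros and ones).

from itertools import groupby

def run_length_counts(bits):
    """Map run length -> number of runs of that length."""
    counts = {}
    for _, grp in groupby(bits):
        n = len(list(grp))
        counts[n] = counts.get(n, 0) + 1
    return counts

seq = "1100100101110010"
print(run_length_counts(seq))  # → {2: 4, 1: 5, 3: 1}
```

For a long cryptographically plausible sequence, the count for length n should be roughly half the count for length n-1, and the distributions for zero-runs and one-runs should match.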
This study investigated the prevalence of quinolone resistance protein-encoding genes (qnr genes) and co-resistance to fluoroquinolones and β-lactams among clinical isolates of Klebsiella pneumoniae. Out of 150 clinical samples, 50 isolates of K. pneumoniae were identified according to morphological and biochemical properties. These isolates were collected from different clinical samples: 15 (30%) urine, 12 (24%) blood, 9 (18%) sputum, 9 (18%) wound, and 5 (10%) burn. The minimum inhibitory concentration (MIC) assay revealed that 15 (30%) of the isolates were resistant to ciprofloxacin (≥4 µg/ml), 11 (22%) were resistant to levofloxacin (≥8 µg/ml), and 21 (42%) were re
The physical sports sector in Iraq suffers from difficulty in achieving sporting success in individual and team games in various Asian and international competitions, for many reasons, including the failure to exploit modern, accurate, and flexible technologies, especially in the field of information technology and, in particular, artificial neural network technology. The main goal of this study is to build an intelligent mathematical model to predict sports achievement in pole vaulting for men. The research methodology included the use of five variables as inputs to the neural network, namely the average speed (m/sec) in the penultimate 5 metres and in the final 5 metres of the approach, and the maximum speed achieved in t
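A prediction model of the kind described above can be sketched as a small feed-forward network. Everything concrete here is an assumption for illustration: the weights are made up, and the real model's parameters would come from training on athletes' measured data.

```python
# Minimal sketch of a feed-forward network mapping five speed inputs to a
# predicted vault result. Weights are illustrative only, not trained values.

import math

def forward(inputs, w_hidden, b_hidden, w_out, b_out):
    """One hidden layer of sigmoid units followed by a linear output."""
    hidden = [1 / (1 + math.exp(-(sum(w * x for w, x in zip(ws, inputs)) + b)))
              for ws, b in zip(w_hidden, b_hidden)]
    return sum(w * h for w, h in zip(w_out, hidden)) + b_out

speeds = [8.9, 9.4, 9.1, 9.6, 9.8]       # five speed measurements (m/sec)
w_hidden = [[0.1] * 5, [-0.05] * 5]      # two hidden units, made-up weights
prediction = forward(speeds, w_hidden, [0.0, 0.0], [2.0, 3.0], 1.5)
print(round(prediction, 2))              # predicted height in metres
```

Training would adjust `w_hidden`, `b_hidden`, `w_out`, and `b_out` to minimize the error between predicted and recorded achievements.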
In this paper, a fast lossless image compression method for medical images is introduced. It is based on splitting the image into blocks according to their nature, using polynomial approximation to decompose the image signal, and then applying run-length coding to the residue part of the image, which represents the error caused by the polynomial approximation. Finally, Huffman coding is applied as a last stage to encode the polynomial coefficients and the run-length codes. The test results indicate that the suggested method can achieve promising performance.
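The run-length stage described above exploits the fact that the residue left after polynomial approximation is typically dominated by long runs of zeros. A minimal sketch, with an illustrative residue vector:

```python
# Run-length coding of a residue signal: each maximal run of equal values
# becomes a (value, count) pair; decoding reverses the mapping exactly.

def rle_encode(residue):
    out = []
    for v in residue:
        if out and out[-1][0] == v:
            out[-1][1] += 1
        else:
            out.append([v, 1])
    return out

def rle_decode(pairs):
    return [v for v, n in pairs for _ in range(n)]

residue = [0, 0, 0, 1, 0, 0, -1, 0, 0, 0, 0]
encoded = rle_encode(residue)
print(encoded)  # → [[0, 3], [1, 1], [0, 2], [-1, 1], [0, 4]]
assert rle_decode(encoded) == residue  # lossless round trip
```

In the full method, the (value, count) pairs and the polynomial coefficients would then be entropy-coded with Huffman coding.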
Compressing speech reduces data storage requirements, which in turn reduces the time needed to transmit digitized speech over long-haul links such as the Internet. To obtain the best performance in speech compression, wavelet transforms require filters that combine a number of desirable properties, such as orthogonality and symmetry. The MCT basis functions are derived from the GHM basis functions using 2D linear convolution. The fast computation algorithms introduced here add desirable features to the current transform. We further assess the performance of the MCT in a speech compression application. This paper discusses the effect of using the DWT and the MCT (one- and two-dimensional) on speech compression. DWT and MCT performances in terms of comp
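The band-splitting idea behind wavelet-based speech compression can be illustrated with the simplest possible wavelet. This sketch uses a one-level Haar transform rather than the tap 9/7 or GHM-derived MCT bases of the paper, but the split into an approximation band (kept precisely) and a detail band (quantized heavily) works the same way.

```python
# One-level Haar DWT on a short frame: pairwise averages form the
# approximation band, pairwise half-differences form the detail band.

def haar_dwt(signal):
    """Split an even-length frame into approximation and detail coefficients."""
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return approx, detail

frame = [4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0]
approx, detail = haar_dwt(frame)
print(approx)  # → [5.0, 11.0, 7.0, 5.0]  low-frequency band
print(detail)  # → [-1.0, -1.0, 1.0, 0.0] near-zero details, cheap to code
```

Compression gain comes from the detail coefficients clustering near zero, so they survive coarse quantization with little audible loss.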
An important device in a Wireless Sensor Network (WSN) is the Sink Node (SN), which is used to store, collect, and analyze data from every sensor node in the network. The central role of the SN in a WSN makes it a prime target for traffic-analysis attacks; therefore, securing the SN's position is a substantial issue. This study presents Security for Mobile Sink Node location using a Dynamic Routing Protocol, called SMSNDRP, in order to increase the complexity facing an adversary trying to discover the mobile SN's location; in addition, it minimizes network energy consumption. The proposed protocol was applied to a WSN framework consisting of 50 nodes with static and mobile SNs. The results have shown, in each round, a dynamic change in the route to reach the mobile SN
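The core idea, varying the route to the sink on every round so that traffic analysis cannot converge on one path, can be sketched as follows. The node names, route pool, and per-round selection rule are illustrative assumptions, not the protocol's actual mechanism.

```python
# Hedged sketch of per-round dynamic route selection toward the sink node.
# Route pool and node names are made up for illustration.

import random

routes = [["n3", "n7", "sink"],
          ["n1", "n5", "n9", "sink"],
          ["n2", "n8", "sink"]]

def route_for_round(rnd, seed=42):
    """Pick a route for this round; varying the choice per round stands in
    for the protocol's dynamic routing decision."""
    rng = random.Random(seed + rnd)
    return rng.choice(routes)

chosen = [route_for_round(r) for r in range(5)]
print(chosen)  # every round's route still terminates at the sink
```

A real implementation would also weigh residual node energy when choosing among candidate routes, which is how the protocol can reduce overall energy consumption.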
This paper presents a comparison between denoising techniques using a statistical approach, principal component analysis with local pixel grouping (PCA-LPG); the procedure is iterated a second time to further improve the denoising performance, and other enhancement filters are also used. These include an adaptive Wiener low-pass filter applied to a grayscale image degraded by constant-power additive noise, based on statistics estimated from a local neighborhood of each pixel; a median filter of the noisy input image, in which each output pixel contains the median value of the M-by-N neighborhood around the corresponding input pixel; a Gaussian low-pass filter; and an order-statistic filter. Experimental results show that the LPG-PCA method
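The median filter described above can be sketched in a few lines for a 3-by-3 neighborhood; the image representation (list of lists, borders left untouched) is an assumption for the example.

```python
# 3x3 median filter on a grayscale image stored as a list of lists.
# Border pixels are copied unchanged; interior pixels take the median
# of their 9-pixel neighborhood, which removes impulse noise.

def median_filter(img):
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = sorted(img[y + j][x + i]
                            for j in (-1, 0, 1) for i in (-1, 0, 1))
            out[y][x] = window[4]  # median of 9 sorted values
    return out

noisy = [[10, 10, 10],
         [10, 255, 10],   # single impulse-noise spike
         [10, 10, 10]]
print(median_filter(noisy)[1][1])  # → 10, the spike is removed
```

Unlike a Gaussian low-pass filter, the median filter removes isolated spikes without blurring edges, which is why the two behave so differently in the comparison.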