Content-based image retrieval (CBIR) is a technique for retrieving images from an image database. However, the CBIR process suffers from low accuracy when retrieving images from an extensive image database and from difficulty ensuring the privacy of images. This paper addresses the accuracy issue using a deep learning technique, the convolutional neural network (CNN), and provides the necessary privacy for images using the fully homomorphic encryption method of Cheon, Kim, Kim, and Song (CKKS). To achieve these aims, a system named RCNN_CKKS is proposed, consisting of two parts. The first part (offline processing) extracts automated high-level features from the flattening layer of a CNN and stores these features in a new dataset. In the second part (online processing), the client sends an encrypted image to the server, which relies on the trained CNN model to extract the features of the sent image. Next, the extracted features are compared with the stored features using the Hamming distance to retrieve all similar images. Finally, the server encrypts all retrieved images and sends them to the client. Deep learning results on plain images were 97.94% for classification and 98.94% for image retrieval, while the NIST test suite was used to check the security of CKKS when applied to the Canadian Institute for Advanced Research (CIFAR-10) dataset. From these results, the researchers conclude that deep learning is an effective method for image retrieval and that the CKKS method is appropriate for image privacy protection.
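A minimal sketch of the retrieval step, assuming flatten-layer features have already been extracted by the trained CNN: features are binarized and stored images are ranked by Hamming distance. The feature dimension, mean-threshold binarization, and random placeholder data below are assumptions, not the paper's exact design.

```python
import numpy as np

def binarize(features: np.ndarray) -> np.ndarray:
    """Threshold each feature vector at its own mean to get a binary code."""
    return (features > features.mean(axis=1, keepdims=True)).astype(np.uint8)

def hamming_retrieve(query_code: np.ndarray, stored_codes: np.ndarray, top_k: int = 5):
    """Return indices of the top_k stored codes closest to the query in Hamming distance."""
    distances = np.count_nonzero(stored_codes != query_code, axis=1)
    return np.argsort(distances)[:top_k]

# Hypothetical usage: `db_features` are flatten-layer outputs stored offline,
# `query_features` is the flatten-layer output of one decrypted query image.
db_features = np.random.rand(1000, 512)   # placeholder for the stored feature dataset
query_features = np.random.rand(1, 512)   # placeholder for the query image's features

db_codes = binarize(db_features)
query_code = binarize(query_features)
print(hamming_retrieve(query_code[0], db_codes, top_k=5))
```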
The historical center's landscape suffers from neglect, despite its importance and broad capability to enhance the cultural value of the historical center. The landscape includes many heterogeneous human and non-human components, material and immaterial, natural and manufactured, as well as different historical layers: ancient, modern, and contemporary. Because these components and layers differ so widely, it has become difficult for the designer to deal with them. The research therefore follows the methodology of actor-network theory, since it handles such complex systems and offers an advanced way to connect the various components, treating the landscape as a ground that can include diverse elements and deal wi…
Recently, the spread of fake news and misinformation in most fields has resonated widely across societies. Combating this phenomenon by detecting misleading information manually is tedious, time-consuming, and impractical, so it is necessary to rely on artificial intelligence to solve the problem. This study uses deep learning techniques to detect Arabic fake news based on an Arabic dataset called AraNews, which contains news articles covering fields such as politics, economy, culture, and sports. A hybrid deep neural network is proposed to improve accuracy; this network combines the properties of both the Text-Convolution Neural…
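One common hybrid layout consistent with this description combines a Text-CNN layer with a recurrent layer. The Keras sketch below is an illustration under stated assumptions (vocabulary size, sequence length, and layer sizes are placeholders), not the paper's exact architecture.

```python
from tensorflow import keras
from tensorflow.keras import layers

VOCAB_SIZE = 20000   # placeholder vocabulary size
SEQ_LEN = 200        # placeholder article length in tokens

model = keras.Sequential([
    keras.Input(shape=(SEQ_LEN,), dtype="int32"),
    layers.Embedding(VOCAB_SIZE, 128),
    layers.Conv1D(64, 5, activation="relu"),   # Text-CNN branch: local n-gram features
    layers.MaxPooling1D(2),
    layers.LSTM(64),                           # recurrent branch: sequence-level context
    layers.Dense(1, activation="sigmoid"),     # real vs. fake
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```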
... Show MoreIn this paper, first we refom1Ulated the finite element model
(FEM) into a neural network structure using a simple two - dimensional problem. The structure of this neural network is described
, followed by its application to solving the forward and inverse problems. This model is then extended to the general case and the advantages and di sadvantages of this approach are descri bed along with an analysis of the sensi tivity of
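As a toy illustration of the FEM-to-network mapping, the assembled stiffness relation K u = f can be treated as a fixed linear layer: the forward problem is a matrix product, and the inverse problem a least-squares solve. The 3-DOF stiffness matrix below is a placeholder, not the paper's two-dimensional problem.

```python
import numpy as np

# Toy stiffness matrix for a tiny assembled FE system (placeholder values);
# in the FEM-as-network view, K plays the role of a fixed linear layer's weights.
K = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])

def forward(u):
    """Forward problem: given nodal values u, the 'network' outputs loads f = K u."""
    return K @ u

def inverse(f):
    """Inverse problem: recover nodal values u from loads f (least-squares solve)."""
    u, *_ = np.linalg.lstsq(K, f, rcond=None)
    return u

u_true = np.array([1.0, 2.0, 3.0])
f = forward(u_true)
print(inverse(f))   # ~ [1. 2. 3.]
```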
With the widespread exchange of private information in various communication applications, securing it has become urgent. In this research, a new approach to encrypting text messages based on genetic algorithm operators is proposed. The approach follows a new algorithm that generates an 8-bit chromosome to encrypt the plain text after randomly selecting a crossover point; the resulting child code is then flipped by one bit using the mutation operation. Two simulations were conducted to evaluate the performance of the proposed approach, covering encryption/decryption execution time and throughput. The simulation results prove the robustness of the proposed approach, producing better performance for all evaluation metrics with res…
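A hedged sketch of these operators on one character: the 8-bit character code and an 8-bit key act as parent chromosomes, crossover happens at a random point, and mutation flips one bit. Keeping both children so the ciphertext remains decryptable is an assumption of this sketch, not a detail stated in the abstract.

```python
import random

def crossover(a: str, b: str, point: int):
    """Single-point crossover of two 8-bit chromosomes."""
    return a[:point] + b[point:], b[:point] + a[point:]

def mutate(bits: str, pos: int) -> str:
    """Flip one bit of the chromosome (mutation); applying it twice undoes it."""
    flipped = "1" if bits[pos] == "0" else "0"
    return bits[:pos] + flipped + bits[pos + 1:]

def encrypt_char(ch: str, key: str, point: int, pos: int):
    """Treat the character's 8-bit code and the key as parent chromosomes;
    return both mutated children so decryption can reassemble the plaintext."""
    c1, c2 = crossover(format(ord(ch), "08b"), key, point)
    return mutate(c1, pos), mutate(c2, pos)

def decrypt_char(c1: str, c2: str, point: int, pos: int) -> str:
    """Undo mutation, then reassemble the plaintext halves from both children."""
    c1, c2 = mutate(c1, pos), mutate(c2, pos)
    return chr(int(c1[:point] + c2[point:], 2))

key = format(random.randrange(256), "08b")   # hypothetical 8-bit key chromosome
point, pos = random.randrange(1, 8), random.randrange(8)
pairs = [encrypt_char(c, key, point, pos) for c in "secret"]
print("".join(decrypt_char(c1, c2, point, pos) for c1, c2 in pairs))  # -> "secret"
```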
The time spent drilling ahead is usually a significant portion of total well cost. Drilling is an expensive operation, including the cost of equipment and material used during rock penetration plus the crew's effort to finish the well without serious problems. Knowing the rate of penetration helps in estimating the cost and optimizing drilling expenditure. Ten wells in the Nasiriya oil field were selected based on data availability. Dynamic elastic properties of the Mishrif formation in the selected wells were determined using Interactive Petrophysics (IP V3.5) software, based on the LAS files and log records provided. The average rate of penetration and average dynamic elastic propert…
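The dynamic elastic properties mentioned here follow standard relations between sonic velocities and bulk density. The sketch below computes them from illustrative input values, not Mishrif log data, and the unit handling in Interactive Petrophysics may differ.

```python
import numpy as np

def dynamic_elastic_properties(vp, vs, rho):
    """Dynamic elastic moduli from sonic velocities (m/s) and bulk density (kg/m3)."""
    vp, vs, rho = map(np.asarray, (vp, vs, rho))
    g = rho * vs**2                                   # shear modulus (Pa)
    nu = (vp**2 - 2 * vs**2) / (2 * (vp**2 - vs**2))  # Poisson's ratio
    e = 2 * g * (1 + nu)                              # Young's modulus (Pa)
    k = rho * (vp**2 - (4 / 3) * vs**2)               # bulk modulus (Pa)
    return {"G_GPa": g / 1e9, "E_GPa": e / 1e9, "K_GPa": k / 1e9, "nu": nu}

# Illustrative carbonate-like values, not Mishrif measurements.
print(dynamic_elastic_properties(vp=4500.0, vs=2400.0, rho=2650.0))
```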
Fossil fuel consumption is increasing globally, as fossil fuels represent the main source of energy around the world, and heavy oil resources exceed light ones; different techniques are therefore used to reduce the viscosity and increase the mobility of heavy crude oil. This study focuses on experimental tests, and on modeling with a Back Feed Forward Artificial Neural Network (BFF-ANN), of the dilution technique for reducing the viscosity of heavy oil collected from south Iraq oil fields using organic solvents: organic diluents at different weight percentages (5, 10, and 20 wt.%) of n-heptane, toluene, and a mixture of different ratio…
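A minimal sketch of such a viscosity model, using scikit-learn's MLPRegressor in place of the paper's BFF-ANN; the input features (diluent wt.%, temperature) and all data values below are synthetic placeholders, not the study's measurements.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Synthetic samples: [diluent wt.%, temperature degC] -> viscosity (cP).
X = np.array([[5, 25], [10, 25], [20, 25], [5, 40], [10, 40], [20, 40]], dtype=float)
y = np.array([1800.0, 950.0, 310.0, 900.0, 480.0, 150.0])

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(8, 8), max_iter=5000, random_state=0),
)
model.fit(X, y)
print(model.predict([[15, 30]]))  # predicted viscosity for an unseen blend
```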
A database is an organized collection of data, arranged and distributed so that clients can access the stored data simply and conveniently. In the era of big data, however, traditional data analytics methods may not be able to manage and process such large amounts of data. To develop an efficient way of handling big data, this work studies the use of the MapReduce technique to handle big data distributed on the cloud. The approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear enhancement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r…
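A minimal MapReduce-style sketch, shown here as a single-process Python data flow rather than an actual Hadoop job: map emits per-channel partial sums, shuffle groups intermediate pairs by channel, and reduce computes a mean amplitude. Channel names and values are placeholders.

```python
from collections import defaultdict

# Records are (channel, sample) pairs; with Hadoop Streaming, map_fn and
# reduce_fn would run as separate processes reading stdin, but the data
# flow is the same.
records = [("C3", 1.2), ("C4", -0.4), ("C3", 0.8), ("C4", 0.6), ("Cz", 0.1)]

def map_fn(record):
    channel, value = record
    yield channel, (value, 1)          # emit partial sum and count

def reduce_fn(channel, values):
    total = sum(v for v, _ in values)
    count = sum(c for _, c in values)
    return channel, total / count      # mean amplitude for the channel

shuffled = defaultdict(list)           # shuffle: group intermediate pairs by key
for rec in records:
    for key, val in map_fn(rec):
        shuffled[key].append(val)

print(dict(reduce_fn(k, vs) for k, vs in shuffled.items()))
```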
Background/Objectives: The purpose of this study was to classify Alzheimer's disease (AD) patients and Normal Control (NC) subjects using Magnetic Resonance Imaging (MRI). Methods/Statistical analysis: The performance evaluation is carried out on 346 MR images from the Alzheimer's Disease Neuroimaging Initiative (ADNI) dataset. A Deep Belief Network (DBN) classifier performs the classification. The network is trained on a sample training set, and the resulting weights are then used to check the system's recognition capability. Findings: This paper presents a novel automated classification system for AD determination. The suggested method offers good performance; the experiments carried out show that the…
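A DBN-style classifier can be approximated with scikit-learn by stacking BernoulliRBM feature layers under a logistic classifier. This is a sketch with synthetic stand-in data; the layer sizes, learning rates, and preprocessing are assumptions, not the paper's DBN or its ADNI pipeline.

```python
import numpy as np
from sklearn.neural_network import BernoulliRBM
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

rng = np.random.default_rng(0)
X = rng.random((60, 256))     # synthetic stand-in for flattened, [0,1]-scaled MRI data
y = rng.integers(0, 2, 60)    # synthetic AD (1) vs. NC (0) labels

# Two RBM layers learn features layer by layer; a logistic layer classifies.
dbn = Pipeline([
    ("rbm1", BernoulliRBM(n_components=128, learning_rate=0.05, n_iter=20, random_state=0)),
    ("rbm2", BernoulliRBM(n_components=64, learning_rate=0.05, n_iter=20, random_state=0)),
    ("clf", LogisticRegression(max_iter=1000)),
])
dbn.fit(X, y)
print("training accuracy:", dbn.score(X, y))
```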
In this golden age of rapid development, surgeons have realized that AI can contribute to healthcare in all aspects, especially in surgery. The study incorporates Convolutional Neural Networks and Constrained Local Models (CNN-CLM), which can improve the assessment of Laparoscopic Cholecystectomy (LC) surgery; this cutting-edge technology not only brings opportunities for surgery but also brings challenges on the way forward. The problem with the current surgical method is the lack of safety and the specific complications associated with each laparoscopic cholecystectomy procedure. When CLM is utilized within CNN models, it is effective at predicting time-series tasks such as iden…
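One plausible way to couple a CNN with a time-series model for surgical video is a per-frame CNN encoder feeding an LSTM. The PyTorch sketch below shows only that coupling; the constrained-local-model (CLM) landmark-fitting step is not reproduced, and all layer sizes and the phase count are assumptions.

```python
import torch
import torch.nn as nn

class FramePhaseModel(nn.Module):
    """Hypothetical sketch: a tiny CNN encodes each frame, an LSTM models the
    sequence, and a linear head predicts a phase label per frame."""
    def __init__(self, n_phases: int = 7):
        super().__init__()
        self.cnn = nn.Sequential(                 # per-frame spatial encoder
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.lstm = nn.LSTM(32, 64, batch_first=True)  # temporal model
        self.head = nn.Linear(64, n_phases)

    def forward(self, clips):                     # clips: (batch, time, 3, H, W)
        b, t = clips.shape[:2]
        feats = self.cnn(clips.flatten(0, 1)).view(b, t, -1)
        out, _ = self.lstm(feats)
        return self.head(out)                     # (batch, time, n_phases)

model = FramePhaseModel()
print(model(torch.randn(2, 8, 3, 64, 64)).shape)  # torch.Size([2, 8, 7])
```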