Cloud computing provides a vast amount of storage space for data, but as the number of users and the size of their data grow, the cloud storage environment faces serious problems such as saving storage space, managing this large volume of data, and preserving the security and privacy of data. One important method for saving space in cloud storage is data deduplication, a compression technique that keeps only one copy of the data and eliminates the extra copies. To offer security and privacy for sensitive data while still supporting deduplication, this work identifies attacks that exploit hybrid-cloud deduplication, allowing an attacker to gain access to the files of other users based on very small hash signatures of these files. More specifically, an attacker who knows the hash signature of a file can convince the storage service that he or she owns that file, so the server lets the attacker download the entire file. To overcome such attacks, the hash signature is encrypted with the user's password. As a proof of concept, a prototype of the proposed authorized deduplication system was implemented and test-bed experiments were conducted using the prototype. Performance measurements indicate that the proposed deduplication system incurs minimal uploading and bandwidth overhead compared to native deduplication.
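The abstract does not name the cipher or key-derivation scheme, so the sketch below only illustrates the core idea under assumed choices: the file's SHA-256 signature is bound to the user's password via PBKDF2 and AES-GCM, so that knowing the bare hash is no longer enough to claim ownership.

```python
# Minimal sketch of a password-protected deduplication tag.
# PBKDF2 + AES-GCM are assumptions; the paper does not specify the cipher.
import hashlib
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def file_signature(path: str) -> bytes:
    """SHA-256 hash of the file: the short signature a dedup attacker would need."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.digest()

def encrypt_signature(signature: bytes, password: str, salt: bytes) -> bytes:
    """Bind the signature to the user's password so the raw hash alone
    can no longer convince the server of ownership."""
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    nonce = os.urandom(12)
    return nonce + AESGCM(key).encrypt(nonce, signature, None)

tag = encrypt_signature(file_signature("report.pdf"), "user-password", os.urandom(16))
```

The salt would need to be stored alongside the tag so the same tag can be recomputed when the user later proves ownership.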
Within the framework of big data, energy issues are highly significant. Despite this significance, theoretical studies focusing primarily on the issue of energy within big data analytics in relation to computational intelligence algorithms are scarce. The purpose of this study is to explore the theoretical aspects of energy issues in big data analytics in relation to computational intelligence algorithms, since this is critical for exploring the empirical aspects of big data. In this chapter, we present a theoretical study of energy issues related to applications of computational intelligence algorithms in big data analytics. This work highlights that big data analytics using computational intelligence algorithms generates a very high amount of energy
Survival analysis is widely applied to data describing the lifetime of an item until the occurrence of an event of interest, such as death or another event under study. The purpose of this paper is to use a dynamic approach within the deep learning neural network method, in which a dynamic neural network is built to suit the nature of discrete survival data with time-varying effects. This neural network is trained with the Levenberg-Marquardt (L-M) algorithm, and the method is called the Proposed Dynamic Artificial Neural Network (PDANN). A comparison was then made with another method that depends entirely on the Bayesian methodology, called the Maximum A Posteriori (MAP) method, which was carried out using numerical algorithms.
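As a rough illustration of the training idea only (not the authors' PDANN), the sketch below fits a small one-hidden-layer network to toy discrete survival data with the Levenberg-Marquardt algorithm via scipy.optimize.least_squares(method="lm"); the data shapes, network size, and squared-error residuals are all assumptions.

```python
# Hypothetical L-M training of a tiny hazard network on synthetic person-period data.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))        # covariates, incl. a discrete time index
y = rng.integers(0, 2, size=200)     # event indicator for each person-period

H = 5  # hidden units

def unpack(w):
    W1 = w[:3 * H].reshape(3, H)
    b1 = w[3 * H:4 * H]
    W2 = w[4 * H:5 * H]
    b2 = w[5 * H]
    return W1, b1, W2, b2

def residuals(w):
    W1, b1, W2, b2 = unpack(w)
    hidden = np.tanh(X @ W1 + b1)
    hazard = 1.0 / (1.0 + np.exp(-(hidden @ W2 + b2)))  # sigmoid discrete hazard
    return hazard - y  # L-M minimizes the sum of squared residuals

w0 = rng.normal(scale=0.1, size=5 * H + 1)
fit = least_squares(residuals, w0, method="lm")  # MINPACK Levenberg-Marquardt
print(fit.cost)
```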
This paper deals with the preparation and investigation of a number of new complexes of Cu(II), Zn(II), Hg(II), Ag(I), Pt(IV) and Pb(II). The complexes were formed by the reaction of the mentioned metal ions with the ligand derived from oxadiazole (OXB), 2-(2-butyl)thio-5-phenyl-1,3,4-oxadiazole, in the mole ratios (1:1), (1:2) and (1:3) (metal to ligand). The resulting complexes have the general formulae [M(OXB)Cl2] (M = Cu(II), Zn(II)); [M(OXB)X2]·H2O (M = Hg(II), Pb(II); X = Cl−); [M(OXB)2X2] (M = Cu(II), Zn(II), Hg(II), Pb(II); X = Cl−, NO3−, CH3COO−); [Pt(OXB)3]Cl4; and [Ag(OXB)]NO3.
The aim of the research is to diagnose and analyze the gap between actual practice in the National Insurance Company and the application of the eighth requirement (operation) of the international standard for quality management systems (ISO 9001:2015), which concerns the planning, implementation and control of operations and which would raise the level of employee performance and be reflected in the provision of an appropriate service for the insured. The state of compliance with the requirement was studied by identifying the strengths and weaknesses of the system in order to diagnose the gap and find ways to address it. A workshop was held with company officials, through which questions were raised
This paper presents a new algorithm in an important research field, the estimation of semantic word similarity. A new feature-based algorithm is proposed for measuring word semantic similarity for the Arabic language, a highly systematic language whose words exhibit elegant and rigorous logic. The score of semantic similarity between two Arabic words is calculated as a function of their common and total taxonomical features. An Arabic knowledge source is employed for extracting the taxonomical features as the set of all concepts that subsume the concepts containing the compared words. Previously developed Arabic word benchmark datasets are used for optimizing and evaluating the proposed algorithm.
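The abstract does not give the exact scoring function, so the snippet below only illustrates the general feature-overlap idea with a Dice-style ratio over toy taxonomical feature sets; the feature values are hypothetical.

```python
# Illustrative feature-overlap similarity: common features vs. total features.
def semantic_similarity(features_a: set, features_b: set) -> float:
    common = features_a & features_b
    total = len(features_a) + len(features_b)
    return 2 * len(common) / total if total else 0.0

# Hypothetical subsuming concepts extracted from a knowledge source for two words.
feats_word1 = {"entity", "living-thing", "animal", "bird"}
feats_word2 = {"entity", "living-thing", "animal", "mammal"}
print(semantic_similarity(feats_word1, feats_word2))  # 0.75
```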
In this paper, an efficient new procedure is proposed to modify the third-order iterative method obtained by Rostom and Fuad [Saeed, R. K. and Khthr, F. W. New third-order iterative method for solving nonlinear equations. J. Appl. Sci. 7 (2011): 916-921], using three steps based on the Newton equation, the finite difference method, and linear interpolation. An analysis of convergence is given to show the efficiency and performance of the new method for solving nonlinear equations. The efficiency of the new method is demonstrated by numerical examples.
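The paper's specific three-step scheme is not reproduced in the abstract; as a stand-in, the sketch below implements a standard third-order two-step Newton-type iteration (the Potra-Pták form) with a central finite-difference derivative, showing the kind of building blocks the abstract names.

```python
# Generic third-order Newton-type solver (Potra-Ptak form), not the paper's method.
def solve(f, x, tol=1e-12, max_iter=50, h=1e-7):
    for _ in range(max_iter):
        d = (f(x + h) - f(x - h)) / (2 * h)   # central finite-difference derivative
        y = x - f(x) / d                      # Newton predictor step
        x_new = x - (f(x) + f(y)) / d         # corrector giving cubic convergence
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Classic test equation: root of x^3 - 2x - 5 near x = 2.
print(solve(lambda x: x**3 - 2 * x - 5, 2.0))  # ~2.0945514815
```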
Face recognition is a crucial biometric technology used in various security and identification applications. Ensuring accuracy and reliability in facial recognition systems requires robust feature extraction and secure processing methods. This study presents an accurate facial recognition model using a feature extraction approach within a cloud environment. First, the facial images undergo preprocessing, including grayscale conversion, histogram equalization, Viola-Jones face detection, and resizing. Then, features are extracted using a hybrid approach that combines Linear Discriminant Analysis (LDA) and the Gray-Level Co-occurrence Matrix (GLCM). The extracted features are encrypted using the Data Encryption Standard (DES) for security.
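A condensed sketch of the described pipeline is given below (preprocessing, GLCM texture features, DES encryption); the LDA stage is omitted because it needs labeled training data, and the file name, resize dimensions, GLCM parameters, and the 8-byte demo key are all assumptions.

```python
# Hypothetical end-to-end sketch: preprocess -> GLCM features -> DES-encrypt.
import cv2
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from Crypto.Cipher import DES              # pycryptodome
from Crypto.Util.Padding import pad

img = cv2.imread("face.jpg")
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)   # grayscale conversion
gray = cv2.equalizeHist(gray)                  # histogram equalization

# Viola-Jones face detection with OpenCV's bundled Haar cascade
# (assumes at least one face is found).
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
x, y, w, h = cascade.detectMultiScale(gray, 1.1, 5)[0]
face = cv2.resize(gray[y:y + h, x:x + w], (128, 128))  # crop and resize

# GLCM texture features: contrast, homogeneity, energy, correlation.
glcm = graycomatrix(face, distances=[1], angles=[0], levels=256, symmetric=True)
feats = np.hstack([graycoprops(glcm, p).ravel()
                   for p in ("contrast", "homogeneity", "energy", "correlation")])

# Encrypt the feature vector with DES before sending it to the cloud.
cipher = DES.new(b"8bytekey", DES.MODE_ECB)    # demo key/mode only
ciphertext = cipher.encrypt(pad(feats.astype(np.float32).tobytes(), 8))
```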
Derivatives of Schiff bases possess great importance in pharmaceutical chemistry and can be used for synthesizing different types of bioactive compounds. In this paper, new Schiff-base derivatives have been synthesized through several sequential steps. The acid (I) was synthesized from the reaction of dichloroethanoic acid with 2 moles of p-aminoacetanilide. The new acid (I) was converted to its ester (II) via the reaction of (I) with dimethyl sulphate in the presence of anhydrous sodium carbonate and dry acetone. The acid hydrazide (III) was synthesized by adding 80% hydrazine hydrate to the new ester using ethanol as a solvent. The last step included the preparation of the new Schiff bases (IV-VIII) by the reaction of the acid hydrazide with the appropriate carbonyl compounds.