Big data applications typically rely on large-scale, centralized key management systems. However, centralized key management introduces problems such as a single point of failure, exchanging secret keys over insecure channels, third-party queries, and key escrow. To avoid these problems, we propose an improved certificate-based encryption scheme that ensures data confidentiality by combining symmetric and asymmetric cryptography. The combination is implemented using the Advanced Encryption Standard (AES) and Elliptic Curve Diffie-Hellman (ECDH). The proposed scheme is an enhanced version of the Certificate-Based Encryption (CBE) scheme and preserves all of its advantages, but key generation is carried out without any intervention from the certificate issuer, avoiding the risk of a compromised CA. The Elliptic Curve Digital Signature Algorithm (ECDSA) is used together with ECDH to authenticate the key exchange. The proposed scheme is demonstrated on a big social-network dataset and analyzed against security criteria, with a comparison to previous schemes to evaluate its performance.
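To make the AES/ECDH/ECDSA combination concrete, the following is a minimal Python sketch (using the `cryptography` package) of an ECDSA-authenticated ECDH exchange whose shared secret is expanded into an AES-256-GCM key. The key names, curve choice (SECP256R1), and HKDF parameters are illustrative assumptions, not the paper's implementation.

```python
# Illustrative sketch only: ECDH key agreement + ECDSA-authenticated exchange + AES encryption.
# Uses the Python `cryptography` package; names and parameters are assumptions, not the paper's code.
import os
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Long-term ECDSA identity keys (stand-ins for the certified keys in the scheme).
alice_sign = ec.generate_private_key(ec.SECP256R1())

# Ephemeral ECDH keys generated by each party -- no issuer involvement.
alice_ecdh = ec.generate_private_key(ec.SECP256R1())
bob_ecdh = ec.generate_private_key(ec.SECP256R1())

def pub_bytes(priv):
    """Serialize an EC public key as an uncompressed point for signing/transport."""
    return priv.public_key().public_bytes(
        serialization.Encoding.X962, serialization.PublicFormat.UncompressedPoint)

# Alice signs her ephemeral public key with ECDSA to authenticate the exchange.
alice_sig = alice_sign.sign(pub_bytes(alice_ecdh), ec.ECDSA(hashes.SHA256()))

# The receiver verifies the signature before trusting the ephemeral key
# (raises InvalidSignature on tampering).
alice_sign.public_key().verify(alice_sig, pub_bytes(alice_ecdh), ec.ECDSA(hashes.SHA256()))

# Both sides derive the same shared secret and expand it into a 256-bit AES key.
shared = alice_ecdh.exchange(ec.ECDH(), bob_ecdh.public_key())
aes_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
               info=b"hybrid-cbe-demo").derive(shared)

# Symmetric encryption of the (big) data payload with AES-GCM.
nonce = os.urandom(12)
ciphertext = AESGCM(aes_key).encrypt(nonce, b"social-network record ...", None)
```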
The purpose of this research is to define the main factors influencing management information systems decisions on sensitive data in the cloud. A framework is proposed to enhance these decisions in a cloud environment. Structured interviews were held with several security experts working on cloud computing security to investigate the main objective of the framework and the suitability of the instrument, and a pilot study was conducted to test the instrument. The validity and reliability test results show that the study can be expanded and lead to final framework validation. The framework uses multiple levels related to Authorization, Authentication, Classification and Identity Anonymity, and Save and Verify to enhance management information systems decisions on sensitive data in the cloud.
Realizing the full potential of wireless sensor networks (WSNs) highlights many design issues, particularly the trade-offs among multiple conflicting objectives such as maximizing route overlap for efficient data aggregation and minimizing the total link cost. While data aggregation routing protocols and link cost functions in WSNs have been comprehensively considered in the literature, a trade-off improvement between the two has not yet been addressed. In this paper, a comprehensive weight for trading off the different objectives is employed in the so-called weighted data aggregation routing strategy (WDARS), which aims to maximize route overlap for efficient data aggregation while minimizing the total link cost.
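As a toy illustration of the kind of weighted trade-off WDARS targets, the sketch below folds a normalized link cost (to be minimized) and a route-overlap term (to be maximized) into a single score through one weight w. The function name, the normalization, and the default weight are illustrative assumptions, not the paper's actual cost function.

```python
# Illustrative weighted scoring in the spirit of WDARS (not the paper's exact formulation):
# a single weight w trades off route overlap (maximize) against link cost (minimize).
def route_score(overlap, link_cost, w=0.5, overlap_max=1.0, cost_max=1.0):
    """Lower score is better: penalize cost, reward overlap, both normalized to [0, 1]."""
    return w * (link_cost / cost_max) - (1.0 - w) * (overlap / overlap_max)

# Pick the candidate next hop with the best (lowest) combined score.
candidates = [
    {"node": "A", "overlap": 0.8, "link_cost": 0.6},
    {"node": "B", "overlap": 0.3, "link_cost": 0.2},
]
best = min(candidates, key=lambda c: route_score(c["overlap"], c["link_cost"]))
print(best["node"])
```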
In the current study, 2D seismic data from west An-Najaf (line WN-36) were received after several processing steps from the Oil Exploration Company in 2018. Surface Consistent Amplitude Compensation (SCAC) was applied to the seismic data. The processing sequence in our study started by sorting the data into common mid-point (CMP) gathers in order to apply velocity analysis using the Interactive Velocity Analysis Application (INVA) with the Omega system. Velocity semblance was prepared to perform the normal move-out (NMO) correction versus time. An accurate root-mean-square velocity (Vrms) was selected, controlled by the flatness of the primary events. The resulting seismic velocity section for the study area shows that the velocity …
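As a concrete illustration of the NMO step, the sketch below applies the textbook hyperbolic moveout correction t(x) = sqrt(t0^2 + x^2 / Vrms^2) to a CMP gather with NumPy. It is a simplified stand-in for the INVA/Omega workflow; the array shapes and sampling conventions are assumptions.

```python
# Illustrative NMO correction for a CMP gather (textbook hyperbolic moveout,
# not the Omega/INVA implementation).
import numpy as np

def nmo_correct(gather, offsets, dt, v_rms):
    """gather: (n_samples, n_traces); offsets in m; dt in s; v_rms: RMS velocity per sample (m/s)."""
    n_samples, n_traces = gather.shape
    t0 = np.arange(n_samples) * dt                      # zero-offset two-way times
    corrected = np.zeros_like(gather)
    for j, x in enumerate(offsets):
        # Hyperbolic travel time: t(x) = sqrt(t0^2 + x^2 / v_rms^2)
        tx = np.sqrt(t0**2 + (x / v_rms) ** 2)
        # Resample each trace at the moveout times; primaries flatten when v_rms is picked well.
        corrected[:, j] = np.interp(tx, t0, gather[:, j], left=0.0, right=0.0)
    return corrected
```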
The δ mixing ratios of γ transitions from low- and high-spin states populated in the ⁵⁸Ni(Mg, pnγ)⁸⁰Y nuclear reaction are calculated using a new method which we call the Improved Analysis Method. Comparing the experimental values, the CST method, the LST method, and the adopted δ mixing ratios with the results of the present work confirms the validity of this method.
In this paper, the design of a hybrid retina matching algorithm for use in identification systems is considered. Retina-based recognition is regarded as one of the most secure identification methods used to differentiate persons.
The characteristics of the Speeded-Up Robust Features (SURF) and Binary Robust Invariant Scalable Keypoints (BRISK) algorithms are used to produce a faster matching algorithm than the classical ones; these characteristics are important for real-time applications, which usually require quick processing of a growing quantity of data. The algorithm is divided into three stages: retinal image processing and segmentation, extraction of the local features, …
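The sketch below shows one plausible SURF + BRISK combination with OpenCV: SURF keypoint detection, BRISK binary descriptors, and Hamming-distance matching. It assumes opencv-contrib-python built with non-free modules (SURF lives in xfeatures2d) and is not the authors' exact pipeline, which also includes retinal segmentation.

```python
# Illustrative SURF-detector / BRISK-descriptor matching sketch.
# Assumes opencv-contrib-python with non-free modules enabled (SURF is in xfeatures2d).
import cv2

def match_retina(img_path_a, img_path_b, hessian=400, max_distance=60):
    a = cv2.imread(img_path_a, cv2.IMREAD_GRAYSCALE)
    b = cv2.imread(img_path_b, cv2.IMREAD_GRAYSCALE)

    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=hessian)  # fast keypoint detection
    brisk = cv2.BRISK_create()                                    # binary descriptors

    kp_a = surf.detect(a, None)
    kp_b = surf.detect(b, None)
    kp_a, des_a = brisk.compute(a, kp_a)   # describe SURF keypoints with BRISK
    kp_b, des_b = brisk.compute(b, kp_b)

    # Hamming distance suits binary descriptors; cross-check keeps only mutual best matches.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_a, des_b)
    good = [m for m in matches if m.distance < max_distance]
    return len(good) / max(len(matches), 1)   # crude similarity score in [0, 1]
```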
The data presented in this paper are related to the research article entitled “Novel dichloro(bis{2-[1-(4-methylphenyl)-1H-1,2,3-triazol-4-yl-κN3]pyridine-κN})metal(II) coordination compounds of seven transition metals (Mn, Fe, Co, Ni, Cu, Zn and Cd)” (Conradie et al., 2018) [1]. This paper presents characterization and structural data of the 2-(1-(4-methylphenyl)-1H-1,2,3-triazol-1-yl)pyridine ligand (L2) (Tawfiq et al., 2014) [2] as well as seven dichloro(bis{2-[1-(4-methylphenyl)-1H-1,2,3-triazol-4-yl-κN3]pyridine-κN})metal(II) coordination compounds, [M(L2)2Cl2], all containing the same ligand but coordinated to different metal ions. The data illustrate the shift in IR, UV/VIS, and NMR (for diamagnetic complexes) peaks …
The smart city concept has attracted considerable research attention in recent years across diverse application domains, such as crime suspect identification, border security, transportation, aerospace, and so on. A specific focus has been on increased automation using data-driven approaches, while leveraging remote sensing and real-time streaming of heterogeneous data from various sources, including unmanned aerial vehicles, surveillance cameras, and low-earth-orbit satellites. One of the core challenges in exploiting such high-rate temporal data streams, specifically videos, is the trade-off between the quality of video streaming and the limited transmission bandwidth. An optimal compromise is needed between video quality and, subsequently, …
This paper uses Artificial Intelligence (AI)-based algorithm analysis to classify breast cancer deoxyribonucleic acid (DNA). The main idea is to focus on the application of machine learning and deep learning techniques. Furthermore, a genetic algorithm is used to diagnose gene expression and reduce the number of misclassified cancers. After the patients' genetic data are entered, processing operations that fill the missing values using different techniques are applied. The best data for the classification process are chosen by combining each technique with the genetic algorithm and comparing them in terms of accuracy.
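A minimal sketch of genetic-algorithm gene (feature) selection of the kind described above, assuming missing values have already been imputed: each chromosome is a binary mask over genes, and fitness is the cross-validated accuracy of a classifier on the selected subset. The classifier choice, GA operators, and parameters are illustrative assumptions, not the paper's configuration.

```python
# Illustrative genetic-algorithm gene (feature) selection -- not the paper's exact algorithm.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def ga_select(X, y, pop_size=20, generations=15, mut_rate=0.05, rng=np.random.default_rng(0)):
    n_genes = X.shape[1]
    pop = rng.random((pop_size, n_genes)) < 0.5              # random binary gene masks

    def fitness(mask):
        if not mask.any():
            return 0.0
        clf = RandomForestClassifier(n_estimators=50, random_state=0)
        return cross_val_score(clf, X[:, mask], y, cv=3).mean()

    for _ in range(generations):
        scores = np.array([fitness(m) for m in pop])
        parents = pop[np.argsort(scores)[-pop_size // 2:]]    # keep the fitter half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, n_genes)                    # single-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            child ^= rng.random(n_genes) < mut_rate           # bit-flip mutation
            children.append(child)
        pop = np.vstack([parents, children])

    scores = np.array([fitness(m) for m in pop])
    return pop[scores.argmax()]                               # best gene mask found
```

Usage would be along the lines of `mask = ga_select(X, y)` followed by training the final classifier on `X[:, mask]`.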