Big data usually runs in large-scale, centralized key management systems. However, centralized key management introduces problems such as a single point of failure, the exchange of secret keys over insecure channels, third-party queries, and the key escrow problem. To avoid these problems, we propose an improved certificate-based encryption scheme that ensures data confidentiality by combining symmetric and asymmetric cryptography. The combination is implemented using the Advanced Encryption Standard (AES) and Elliptic Curve Diffie-Hellman (ECDH). The proposed scheme is an enhanced version of the Certificate-Based Encryption (CBE) scheme and preserves all its advantages; however, key generation in our scheme is performed without any intervention from the certificate issuer, avoiding the risk of a compromised CA. The Elliptic Curve Digital Signature Algorithm (ECDSA) is used with ECDH to authenticate the key exchange. The proposed scheme is demonstrated on a big dataset from social networks and analyzed against security criteria, comparing it with previous schemes to evaluate its performance.
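The ECDH key-agreement step this abstract builds on can be sketched in a few lines. The curve below (y² = x³ + 2x + 2 over F₁₇, generator (5, 1)) is a small textbook example chosen for readability, not the paper's parameters, and the code is an illustration only, not a secure implementation:

```python
# Toy ECDH over a tiny prime curve (illustration only, NOT secure).
P = 17          # field prime
A, B = 2, 2     # curve: y^2 = x^3 + 2x + 2 over F_17
G = (5, 1)      # generator point; the group has order 19
ORDER = 19

def point_add(p1, p2):
    """Add two curve points (None represents the point at infinity)."""
    if p1 is None: return p2
    if p2 is None: return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None
    if p1 == p2:
        m = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, P) % P
    x3 = (m * m - x1 - x2) % P
    return (x3, (m * (x1 - x3) - y1) % P)

def scalar_mult(k, point):
    """Double-and-add scalar multiplication k * point."""
    result = None
    while k:
        if k & 1:
            result = point_add(result, point)
        point = point_add(point, point)
        k >>= 1
    return result

# Each party picks a private scalar and publishes priv * G.
alice_priv, bob_priv = 3, 7
alice_pub = scalar_mult(alice_priv, G)
bob_pub = scalar_mult(bob_priv, G)

# Both sides derive the same shared point, since a*(b*G) == b*(a*G);
# in the full scheme this shared secret would be hashed into an AES key.
shared_a = scalar_mult(alice_priv, bob_pub)
shared_b = scalar_mult(bob_priv, alice_pub)
```

The ECDSA authentication and AES encryption layers of the scheme sit on top of this exchange and are not reproduced here.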
The poverty phenomenon is a very substantial topic that shapes the future of societies and governments and the way they deal with education, health, and the economy. Poverty often takes multidimensional forms through education and health. This research studies multidimensional poverty in Iraq using penalized regression methods to analyze big data sets from demographic surveys collected by the Central Statistical Organization in Iraq. We choose a classical penalized regression method, represented by ridge regression, and a second penalized method, the Smooth Integration of Counting and Absolute Deviation (SICA), to analyze big data sets related to the different forms of poverty in Iraq. Euclidean Distance
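Of the two penalized methods named above, ridge regression has a closed form that fits in a few lines; SICA needs an iterative solver and is not shown. The synthetic data below is illustrative only, not the survey data used in the study:

```python
# Minimal sketch of ridge regression via its closed-form normal equations:
# beta = (X'X + lam*I)^{-1} X'y.
import numpy as np

def ridge_fit(X, y, lam):
    """Solve (X'X + lam*I) beta = X'y for the ridge coefficients."""
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ y)

# Small synthetic example: y = 2*x1 - 1*x2 plus a little noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = X @ np.array([2.0, -1.0]) + 0.01 * rng.normal(size=200)
beta = ridge_fit(X, y, lam=0.1)
```

The penalty `lam` shrinks coefficients toward zero, which stabilizes the estimate when predictors are many or collinear, as in high-dimensional survey data.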
Elliptic Curve Cryptography (ECC) is a public-key cryptosystem that works on algebraic models in the form of elliptic curves. In ECC, the data must usually be encoded onto the elliptic curve before encryption, as a preprocessing step; similarly, after decryption, a postprocessing step must map (decode) the corresponding point back to the exact data. Memory Mapping (MM) and Koblitz Encoding (KE) are the commonly used encoding models, but both have drawbacks: MM needs more memory for processing, and KE needs more computational resources. To overcome these issues, the proposed enhanced Koblitz encoding
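For context, the standard Koblitz encoding that the abstract's enhancement starts from can be sketched as follows. A message m is mapped to candidate x-coordinates x = m·k + j until x³ + ax + b is a quadratic residue; decoding is integer division by k. The curve parameters here are illustrative, not the paper's:

```python
# Sketch of classical Koblitz encoding/decoding on y^2 = x^3 + Ax + B mod P.
P = 10007           # prime with P % 4 == 3, so sqrt is a single pow()
A, B = 1, 6
K = 20              # expansion factor: tries j = 0..K-1 per message

def koblitz_encode(m):
    """Map integer m (m < P // K) to a point on the curve."""
    for j in range(K):
        x = m * K + j
        rhs = (x**3 + A * x + B) % P
        y = pow(rhs, (P + 1) // 4, P)   # candidate square root mod P
        if y * y % P == rhs:            # rhs is a quadratic residue
            return (x, y)
    raise ValueError("no curve point found for message")

def koblitz_decode(point):
    """Recover m from the x-coordinate by dropping the offset j."""
    return point[0] // K
```

Each trial succeeds with probability about 1/2, so K = 20 trials fail only with probability around 2⁻²⁰; this per-message search cost is the computational overhead the abstract attributes to KE.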
Among metaheuristic algorithms, population-based algorithms are explorative search algorithms superior to local search algorithms in exploring the search space for globally optimal solutions. However, the primary downside of such algorithms is their low exploitative capability, which prevents refining the neighborhood of the search space toward more optimal solutions. The firefly algorithm (FA) is a population-based algorithm that has been widely used in clustering problems. However, FA suffers from premature convergence when no neighborhood search strategies are employed to improve the quality of clustering solutions in the neighborhood region and to explore the global regions of the search space. On the
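A minimal sketch of the baseline firefly algorithm (Yang's formulation) shows the mechanics the abstract discusses: each firefly moves toward brighter (better) ones with attractiveness decaying over distance, plus a random step. The clustering objective and neighborhood-search enhancements of the paper are not reproduced; the sphere function stands in as a toy objective:

```python
# Basic firefly algorithm minimizing the sphere function (toy example).
import numpy as np

def firefly_minimize(f, dim=2, n=15, iters=100, beta0=1.0, gamma=1.0,
                     alpha=0.2, seed=0):
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-5, 5, size=(n, dim))
    light = np.array([f(p) for p in pos])
    for _ in range(iters):
        for i in range(n):
            for j in range(n):
                if light[j] < light[i]:          # j is brighter (lower cost)
                    r2 = np.sum((pos[i] - pos[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)   # attractiveness
                    pos[i] += beta * (pos[j] - pos[i]) + \
                              alpha * (rng.random(dim) - 0.5)
                    light[i] = f(pos[i])
        alpha *= 0.97                            # shrink the random step
    best = int(np.argmin(light))
    return pos[best], light[best]

sphere = lambda p: float(np.sum(p ** 2))
best_pos, best_val = firefly_minimize(sphere)
```

The shrinking `alpha` illustrates the exploration/exploitation trade-off: a large random step explores early, and its decay lets the swarm exploit the neighborhood of the incumbent best, which is exactly where premature convergence can set in.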
Accurate predictive tools for VLE calculation are always needed. A new method is introduced for VLE calculation that is very simple to apply and gives very good results compared with previously used methods. It needs no physical property data; each binary system needs only two constants. This method can be applied to calculate VLE data for any binary system of any polarity or from any group family, provided the binary system does not form an azeotrope. The new method is also extended to cover a range of temperatures; this extension requires nothing beyond applying the proposed form with the same two constants per system. The method, with this development, is applied to 56 binary mixtures with 1120 equilibrium
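The abstract does not give the proposed two-constant form, so as a stand-in the classical two-constant Margules activity-coefficient model with modified Raoult's law illustrates what a two-constant binary VLE calculation looks like. The constants A12, A21 and the vapour pressures below are illustrative values, not fitted data from the paper:

```python
# Two-constant Margules model + modified Raoult's law (stand-in example).
import math

def margules_gammas(x1, A12, A21):
    """Two-constant Margules activity coefficients for a binary mixture."""
    x2 = 1.0 - x1
    ln_g1 = x2**2 * (A12 + 2.0 * (A21 - A12) * x1)
    ln_g2 = x1**2 * (A21 + 2.0 * (A12 - A21) * x2)
    return math.exp(ln_g1), math.exp(ln_g2)

def bubble_point_y(x1, A12, A21, Psat1, Psat2):
    """Vapour mole fraction y1 and total pressure, modified Raoult's law."""
    g1, g2 = margules_gammas(x1, A12, A21)
    p1 = x1 * g1 * Psat1            # partial pressure of component 1
    p2 = (1.0 - x1) * g2 * Psat2    # partial pressure of component 2
    return p1 / (p1 + p2), p1 + p2

# One bubble-point calculation with illustrative constants (pressures in kPa).
y1, P_total = bubble_point_y(x1=0.4, A12=0.5, A21=0.8,
                             Psat1=60.0, Psat2=45.0)
```

Any two-constant correlation plugs into the same workflow: fit the pair of constants per binary system, then evaluate y and P across compositions.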
The current paper proposes a new estimator for the linear regression model parameters under big data circumstances. The diversity of big data variables brings many challenges that interest researchers who try their best to find new and novel methods to estimate the parameters of a linear regression model. The data were collected by the Central Statistical Organization of Iraq, and child labor in Iraq was chosen as the data set. Child labor is a vital phenomenon from which both society and education suffer, and it affects the future of our next generation. Two methods have been selected to estimate the parameter
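The abstract does not name its two estimators, so the sketch below shows only the generic big-data tactic such work relies on: fitting ordinary least squares by accumulating X′X and X′y chunk by chunk, so the full data set never has to sit in memory at once. The data here is synthetic:

```python
# Chunked (out-of-core) OLS: accumulate the normal-equation sufficient
# statistics X'X and X'y over batches, then solve once at the end.
import numpy as np

def chunked_ols(chunks, n_features):
    xtx = np.zeros((n_features, n_features))
    xty = np.zeros(n_features)
    for X, y in chunks:                 # each chunk is a small (X, y) batch
        xtx += X.T @ X
        xty += X.T @ y
    return np.linalg.solve(xtx, xty)

rng = np.random.default_rng(1)
true_beta = np.array([1.5, -0.5, 2.0])

def make_chunks(n_chunks=10, rows=100):
    """Yield synthetic batches; in practice these would be file reads."""
    for _ in range(n_chunks):
        X = rng.normal(size=(rows, 3))
        yield X, X @ true_beta + 0.01 * rng.normal(size=rows)

beta = chunked_ols(make_chunks(), n_features=3)
```

Because X′X is only p×p, the memory cost is independent of the number of survey records, which is what makes the approach viable at census scale.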
Secure storage of confidential medical information is critical for healthcare organizations seeking to protect patients' privacy and comply with regulatory requirements. This paper presents a new scheme for the secure storage of medical data using Chaskey cryptography and blockchain technology. The system uses Chaskey encryption to ensure the integrity and confidentiality of medical data, and blockchain technology to provide a scalable and decentralized storage solution. The system also uses Bflow segmentation and vertical segmentation technologies to enhance scalability and manage the stored data. In addition, the system uses smart contracts to enforce access control policies and other security measures. The description of the system detailing and p
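The blockchain side of such a scheme can be sketched independently of the cipher: each stored record carries the hash of the previous block, so any tampering breaks the chain. SHA-256 stands in here purely for illustration; it is not the Chaskey primitive the paper uses, and the record strings are hypothetical:

```python
# Minimal hash-chained storage sketch (SHA-256 stand-in, not Chaskey).
import hashlib, json

def make_block(prev_hash, record):
    payload = json.dumps({"prev": prev_hash, "record": record},
                         sort_keys=True).encode()
    return {"prev": prev_hash, "record": record,
            "hash": hashlib.sha256(payload).hexdigest()}

def chain_is_valid(chain):
    """Recompute every hash and check each block points at its predecessor."""
    for i, block in enumerate(chain):
        payload = json.dumps({"prev": block["prev"],
                              "record": block["record"]},
                             sort_keys=True).encode()
        if hashlib.sha256(payload).hexdigest() != block["hash"]:
            return False
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True

chain = [make_block("0" * 64, "patient-A: encrypted scan")]
chain.append(make_block(chain[-1]["hash"], "patient-B: encrypted notes"))
```

In the full scheme the records would hold ciphertext (with segmentation deciding what goes on-chain versus off-chain), and smart contracts rather than a local function would perform the validity and access-control checks.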
Social media and networks rely heavily on images, and those images should be distributed in a private manner. Image encryption is therefore one of the most crucial components of cyber security. In the present study, an effective image encryption technique is developed that combines the Rabbit Algorithm, a simple lightweight cipher, with the Aizawa attractor, a chaotic map. The lightweight encryption algorithm (Rabbit) is made more secure by the Aizawa attractor, which is a 3D dynamical system. The process separates color images into blocks by first dividing them into red, green, and blue (RGB) bands. The presented approach generates multiple keys, or sequences, based on the initial parameters and conditions, which are
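The chaotic half of the construction can be sketched on its own: integrate the Aizawa system, quantise its trajectory into bytes, and XOR those bytes over the data. The Rabbit cipher and the RGB block handling are not reproduced; the parameters below are the attractor's commonly used constants, and the quantisation rule is an assumption for illustration:

```python
# Keystream from the Aizawa attractor (Euler integration), XORed over bytes.
def aizawa_keystream(length, x=0.1, y=0.0, z=0.0,
                     a=0.95, b=0.7, c=0.6, d=3.5, e=0.25, f=0.1,
                     dt=0.01):
    stream = []
    # 1000 burn-in steps let the trajectory settle onto the attractor
    for step in range(1000 + length):
        dx = (z - b) * x - d * y
        dy = d * x + (z - b) * y
        dz = (c + a * z - z**3 / 3
              - (x * x + y * y) * (1 + e * z) + f * z * x**3)
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        if step >= 1000:
            stream.append(int(abs(x) * 1e6) % 256)  # quantise state to a byte
    return bytes(stream)

def xor_bytes(data, key):
    return bytes(p ^ k for p, k in zip(data, key))

plain = b"pixel data"                 # stand-in for one RGB block
key = aizawa_keystream(len(plain))
cipher = xor_bytes(plain, key)
```

Because XOR is its own inverse, decryption regenerates the same keystream from the same initial conditions, which is why those initial parameters act as the key.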
Speech encryption approaches are used to prevent eavesdropping, tracking, and other security concerns in speech communication. In this paper, a new cryptographic algorithm is proposed to encrypt digital speech files. Initially, the digital speech files are rearranged into a cubic model with six sides to scatter the speech data. Each side is then encrypted with random keys created by two chaotic maps (the Hénon and Gingerbread maps). Encryption of each side of the cube is driven by a map vector that is generated randomly using a simple random function. The map vector consists of six bits; each bit selects the specific chaotic map that generates the random key used to encrypt the corresponding face of the cube. R
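The two chaotic key generators named above can be sketched directly; the cube rearrangement and six-bit map vector are simplified here to a single flag choosing which map supplies the keystream for one face, and the seeds and byte-quantisation rule are illustrative assumptions:

```python
# Keystreams from the Henon and Gingerbreadman maps, one per cube face.
def henon_bytes(length, x=0.1, y=0.3, a=1.4, b=0.3):
    out = []
    for _ in range(length):
        x, y = 1 - a * x * x + y, b * x     # classic Henon iteration
        out.append(int(abs(x) * 1e6) % 256)
    return bytes(out)

def gingerbread_bytes(length, x=0.5, y=1.5):
    out = []
    for _ in range(length):
        x, y = 1 - y + abs(x), x            # Gingerbreadman iteration
        out.append(int(abs(x) * 1e6) % 256)
    return bytes(out)

def encrypt_face(face, use_henon=True):
    """XOR one cube face with the keystream chosen by its map-vector bit."""
    key = henon_bytes(len(face)) if use_henon else gingerbread_bytes(len(face))
    return bytes(f ^ k for f, k in zip(face, key))

face = b"speech frame bytes"        # stand-in for one face of the cube
cipher = encrypt_face(face, use_henon=True)
```

In the full scheme, each of the six bits of the randomly generated map vector would make this choice independently for its own face, so the receiver needs both the map seeds and the map vector to decrypt.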