Big data applications usually rely on large-scale, centralized key management systems. However, centralized key management introduces problems such as a single point of failure, the exchange of secret keys over insecure channels, third-party queries, and the key escrow problem. To avoid these problems, we propose an improved certificate-based encryption scheme that ensures data confidentiality by combining symmetric and asymmetric cryptography. The combination is implemented using the Advanced Encryption Standard (AES) and Elliptic Curve Diffie-Hellman (ECDH). The proposed scheme is an enhanced version of the Certificate-Based Encryption (CBE) scheme and preserves all its advantages; however, key generation in our scheme is carried out without any intervention from the certificate issuer, avoiding the risk of a compromised CA. The Elliptic Curve Digital Signature Algorithm (ECDSA) is used with ECDH to authenticate the key exchange. The proposed scheme is demonstrated on a big dataset of social networks, analyzed against security criteria, and compared with previous schemes to evaluate its performance.
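As a rough illustration of the hybrid construction described above (ECDH for key agreement, AES for bulk encryption, ECDSA for authenticating the exchange), the sketch below uses Python's `cryptography` package; the curve, key-derivation parameters, and message are assumptions for the example and this is not the paper's exact protocol:

```python
# Hypothetical sketch of the hybrid scheme: an ECDH shared secret is derived,
# hashed into an AES key, and used to encrypt the payload; ECDSA signs the
# ephemeral public key to authenticate the exchange. All parameters are
# illustrative, not taken from the paper.
import os
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat

# Each party generates its own EC key pair (no certificate issuer involved).
alice_priv = ec.generate_private_key(ec.SECP256R1())
bob_priv = ec.generate_private_key(ec.SECP256R1())

# ECDH: both sides derive the same shared secret from the peer's public key.
shared = alice_priv.exchange(ec.ECDH(), bob_priv.public_key())

# Derive a 256-bit AES key from the shared secret.
aes_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
               info=b"cbe-demo").derive(shared)

# Symmetric encryption of the actual data with AES-GCM.
nonce = os.urandom(12)
ciphertext = AESGCM(aes_key).encrypt(nonce, b"big data record", None)

# ECDSA signature over the ephemeral public key bytes authenticates the exchange.
pub_bytes = alice_priv.public_key().public_bytes(Encoding.X962,
                                                 PublicFormat.UncompressedPoint)
signature = alice_priv.sign(pub_bytes, ec.ECDSA(hashes.SHA256()))
```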
The reaction of LAs-Cl8: [(2,2-(1-(3,4-bis(carboxylicdichloromethoxy)-5-oxo-2,5-dihydrofuran-2-yl)ethane-1,2-diyl)bis(2,2-dichloroacetic acid)] with sodium azide in ethanol with drops of distilled water has been investigated. The new product L-AZ: (3Z,5Z,8Z)-2-azido-8-[azido(3Z,5Z)-2-azido-2,6-bis(azidocarbonyl)-8,9-dihydro-2H-1,7-dioxa-3,4,5-triazonine-9-yl]methyl]-9-[(1-azido-1-hydroxy)methyl]-2H-1,7-dioxa-3,4,5-triazonine-2,6-dicarbonylazide was isolated and characterized by elemental analysis (C.H.N), 1H-NMR, mass spectrometry, and Fourier transform infrared spectroscopy (FT-IR). The reaction of L-AZ with M+n: [VO(II), Cr(III), Mn(II), Co(II), Ni(II), Cu(II), Zn(II), Cd(II) and Hg(II)] has been investigated
A simple and novel membraneless paper-based microfluidic fuel cell is presented in this study. Laminar flow was exploited to ensure that the fuel and oxidant streams did not mix along the reaction path. Acidic wastewater was used as the fuel. It was an air-breathing cell, so air and tap water were used as oxidants. Both the fuel and the tap water flowed continuously under gravity. Whatman filter paper was used to prepare the fuel cell channel, and two carbon fibre electrodes were fixed on the edges of the cell. The performance of the cell was examined over three consecutive days. The results indicated that the present cell has the potential to generate electric power, but an extensive study is required to harv
The effects of using aqueous nanofluids containing graphene nanoplatelets covalently functionalized with triethanolamine (TEA-GNPs) as novel working fluids on the thermal performance of a flat-plate solar collector (FPSC) have been investigated. Water-based nanofluids with weight concentrations of 0.025%, 0.05%, 0.075%, and 0.1% of TEA-GNPs with specific surface areas of 300, 500, and 750 m²/g were prepared. An experimental setup was designed and built, and a simulation program was developed in MATLAB. Experimental tests were performed using inlet fluid temperatures of 30, 40, and 50 °C; flow rates of 0.6, 1.0, and 1.4 kg/min; and heat flux intensities of 600, 800, and 1000 W/m². The FPSC’s efficiency increased as the flow rate and hea
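For reference, the collector efficiency discussed above is conventionally computed from the measured quantities with the standard flat-plate relation below (a textbook definition, not a formula quoted from the study):

\eta = \frac{\dot{m}\, c_p\, (T_{out} - T_{in})}{A_c\, G_T}

where \dot{m} is the mass flow rate, c_p the specific heat of the working fluid, T_{in} and T_{out} the inlet and outlet temperatures, A_c the collector area, and G_T the incident heat flux.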
The estimation of the regular regression model requires several assumptions to be satisfied, such as linearity. One problem occurs when the regression curve is partitioned into two (or more) parts that are then joined at threshold point(s). This situation is regarded as a violation of the linearity of regression. Therefore, the multiphase regression model has received increasing attention as an alternative approach that describes the changing behavior of the phenomenon through threshold-point estimation. The maximum likelihood estimator (MLE) has been used for both model and threshold-point estimation. However, the MLE is not resistant to violations such as the existence of outliers or a heavy-tailed error distribution. The main goal of t
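To make the threshold-point idea concrete, the sketch below fits a two-phase linear model by grid-searching the changepoint that minimizes the residual sum of squares; it is a plain least-squares illustration (no continuity constraint and not the robust estimator the study is concerned with), and the variable names and synthetic data are assumptions:

```python
# Two-phase regression sketch: try each candidate threshold, fit a separate
# linear segment on each side, and keep the threshold with the smallest RSS.
import numpy as np

def fit_two_phase(x, y):
    x, y = np.asarray(x, float), np.asarray(y, float)
    best = (np.inf, None)
    for tau in np.unique(x)[2:-2]:            # candidate threshold points
        rss = 0.0
        for mask in (x <= tau, x > tau):
            X = np.column_stack([np.ones(mask.sum()), x[mask]])
            _coef, res, *_ = np.linalg.lstsq(X, y[mask], rcond=None)
            rss += res[0] if res.size else 0.0
        if rss < best[0]:
            best = (rss, tau)
    return best[1]                             # estimated threshold

# Example: a synthetic series whose slope changes at x = 5.
x = np.linspace(0, 10, 101)
y = np.where(x <= 5, 1.0 + 0.5 * x, 3.5 - 0.8 * (x - 5)) + np.random.normal(0, 0.1, x.size)
print("estimated threshold:", fit_two_phase(x, y))
```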
The Internet of Things (IoT) contributes to improving the quality of life as it supports many applications, especially healthcare systems. Data generated by IoT devices are sent to Cloud Computing (CC) for processing and storage, despite the latency caused by the distance. Because of the revolution in IoT devices, the volume of data sent to the CC has been increasing; as a result, growing congestion on the cloud network has been added to the latency problem. Fog Computing (FC) is used to solve these problems because of its proximity to IoT devices, while filtered data is sent on to the CC. FC is a middle layer located between the IoT devices and the CC layer. Due to the massive data generated by IoT devices on FC, Dynamic Weighted Round Robin (DWRR)
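For orientation, the sketch below shows a plain weighted round-robin dispatcher of the kind DWRR builds on; the node names and weights are invented, and the paper's dynamic weight-update policy is only hinted at by the update_weights hook:

```python
# Minimal weighted round-robin dispatcher as a reference point for DWRR.
# In the dynamic variant, weights would be recomputed from observed fog-node
# load; here they are fixed example values.
from itertools import cycle

class WeightedRoundRobin:
    def __init__(self, weights):
        self.weights = dict(weights)           # node -> integer weight
        self._rebuild()

    def _rebuild(self):
        # Expand each node into the dispatch cycle proportionally to its weight.
        order = [n for n, w in self.weights.items() for _ in range(w)]
        self._cycle = cycle(order)

    def update_weights(self, new_weights):
        # Hook where a DWRR scheduler would plug in load-based weights.
        self.weights.update(new_weights)
        self._rebuild()

    def next_node(self):
        return next(self._cycle)

rr = WeightedRoundRobin({"fog-1": 3, "fog-2": 1, "fog-3": 2})
print([rr.next_node() for _ in range(6)])      # fog-1 is chosen 3 times per round
```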
This research aims to choose the appropriate probability distribution for the reliability analysis of an item, using collected data on the operating and stoppage times of the case study.
An appropriate probability distribution is one for which the data lie on, or close to, the fitted line of the probability plot and pass the goodness-of-fit test.
Minitab 17 software was used for this purpose after arranging the collected data and entering it into the program.
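For illustration, an equivalent distribution-selection step can be scripted outside Minitab; the Python/SciPy sketch below fits several candidate lifetime distributions and compares them with a Kolmogorov-Smirnov goodness-of-fit statistic (the sample data are synthetic placeholders, not the case-study measurements):

```python
# Fit candidate lifetime distributions to operating-time data and compare
# their goodness of fit; the distribution with the smallest KS statistic
# (largest p-value) would be retained for the reliability analysis.
import numpy as np
from scipy import stats

times = np.random.weibull(1.8, size=200) * 120.0   # placeholder operating times (hours)

candidates = {
    "weibull":     stats.weibull_min,
    "lognormal":   stats.lognorm,
    "exponential": stats.expon,
    "gamma":       stats.gamma,
}

for name, dist in candidates.items():
    params = dist.fit(times)                        # maximum-likelihood fit
    ks_stat, p_value = stats.kstest(times, dist.cdf, args=params)
    print(f"{name:12s} KS={ks_stat:.3f} p={p_value:.3f}")
```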
Twitter's popularity has grown rapidly in the last few years, influencing the social, political, and business aspects of life. People leave tweets on social media about an event and, at the same time, look to other people's experiences to see whether they had a positive or negative opinion about that event. Sentiment analysis can be used to obtain this categorization. Product reviews, events, and other topics from all users, comprising unstructured text comments, are gathered and categorized as positive, negative, or neutral using sentiment analysis. Such problems are called polarity classification. This study aims to use Twitter data about OK cuisine reviews obtained from the Amazon website and compare the effectiveness
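As a minimal illustration of polarity classification, the sketch below vectorizes review text with TF-IDF and trains a linear classifier with scikit-learn; the tiny inline dataset is invented, and the study's actual data and classifiers are not reproduced:

```python
# Toy polarity classifier: TF-IDF features feeding a logistic-regression model.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

reviews = ["the food was excellent and fresh",
           "terrible service and cold meals",
           "really enjoyed the dessert",
           "worst dining experience ever"]
labels = ["positive", "negative", "positive", "negative"]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(reviews, labels)
print(model.predict(["the meal was enjoyable"]))
```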
With the freedom offered by the Deep Web, people have the opportunity to express themselves freely and discreetly, and sadly, this is one of the reasons why people carry out illicit activities there. In this work, a novel dataset of active Dark Web domains, known as crawler-DB, is presented. To build the crawler-DB, the Onion Routing network (Tor) was sampled, and a web crawler capable of following links was built. The link addresses gathered by the crawler were then classified automatically into five classes. The algorithm built in this study demonstrated good performance, achieving an accuracy of 85%. A popular text representation method was used with the proposed crawler-DB crossed by two different supervised
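A bare-bones version of the crawling step could look like the sketch below, which fetches pages through a local Tor SOCKS proxy and collects outgoing .onion links; the proxy address, seed URL, page limit, and the use of the requests/BeautifulSoup libraries are assumptions, and the five-class labeling stage is not shown:

```python
# Simple breadth-first .onion link crawler through Tor.
# Requires requests[socks] (PySocks) for the socks5h proxy scheme.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

TOR_PROXIES = {"http": "socks5h://127.0.0.1:9050",
               "https": "socks5h://127.0.0.1:9050"}

def crawl(seed_url, max_pages=20):
    seen, queue = set(), [seed_url]
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = requests.get(url, proxies=TOR_PROXIES, timeout=30).text
        except requests.RequestException:
            continue                       # unreachable hidden service, skip it
        for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            link = urljoin(url, a["href"])
            if ".onion" in link:
                queue.append(link)
    return seen                            # set of visited onion addresses
```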
Due to the increase in information existing on the World Wide Web (WWW), the question of how to extract new and useful knowledge from log files has gained great interest among researchers in data mining and knowledge discovery.
Web mining, a subset of data mining, is divided into three particular areas: web content mining, web structure mining, and web usage mining. This paper is interested in the server log file, which belongs to the third category (web usage mining). This file is analyzed according to the suggested algorithm to extract the behavior of the user. Knowing this behavior comes from knowing the complete path taken by a specific user.
Extracting these types of knowledge required many of KDD
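As an illustration of recovering a user's complete path from a server log, the sketch below parses entries in Common Log Format and groups requested pages by client IP; the log format, the sample lines, and the use of the IP address as a session key are assumptions rather than the paper's own algorithm:

```python
# Group requested pages by client IP to reconstruct each visitor's path.
import re
from collections import defaultdict

LOG_RE = re.compile(r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] "(?P<method>\S+) (?P<path>\S+) [^"]*"')

def user_paths(log_lines):
    paths = defaultdict(list)
    for line in log_lines:
        m = LOG_RE.match(line)
        if m:
            paths[m.group("ip")].append(m.group("path"))
    return dict(paths)

sample = ['192.168.0.7 - - [10/Oct/2023:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 2326',
          '192.168.0.7 - - [10/Oct/2023:13:56:02 +0000] "GET /products.html HTTP/1.1" 200 512']
print(user_paths(sample))   # {'192.168.0.7': ['/index.html', '/products.html']}
```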
Information security contributes directly to increasing the level of trust between government departments by providing assurance of the confidentiality, integrity, and availability of sensitive governmental information. Many threats, caused mainly by malicious acts, can shut down e-government services. Therefore, governments are urged to implement security in e-government projects.
Some modifications were proposed to the security assessment multi-layer model (the Sabri model) to make it more comprehensive and more convenient for the Iraqi government. The proposed model can be used as a tool to assess the level of security readiness of government departments, as a checklist for the required security measures, and as a commo