In today's digitized world, cloud computing has become a feasible solution for virtualizing computing resources. Although cloud computing offers many advantages for outsourcing an organization's information, strong security remains its central concern. Identity-authentication theft has become a vital threat to the protection of cloud data: intruders violate security protocols and attack organizations' or users' data. Incidents of cloud data disclosure leave cloud users feeling insecure while using the platform, and traditional cryptographic techniques are unable to stop such attacks. The BB84 protocol, developed by Bennett and Brassard in 1984, is the first quantum cryptography protocol. In the present work, a three-way BB84GA security system is demonstrated using trusted cryptographic techniques: an attribute-based authentication system, the BB84 protocol, and a genetic algorithm. First, attribute-based authentication is used for identity-based access control; thereafter, the BB84 protocol is used for quantum key distribution between the two parties; finally, a genetic algorithm is applied for encryption/decryption of sensitive information across private/public clouds. The proposed hybrid scheme is highly secure and technologically feasible, and it may be used to minimize security threats over clouds. The computed results are presented in tables and graphs.
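The key-distribution step described above can be illustrated with a minimal, idealized BB84 simulation (no channel noise and no eavesdropper; function and variable names are illustrative, not the paper's implementation):

```python
import random

def bb84_sift(n_bits, seed=0):
    """Toy BB84 sketch: Alice sends random bits in random bases, Bob
    measures in random bases, the bases are compared over a public
    channel, and only matching-basis bits are kept as the sifted key."""
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [rng.choice("XZ") for _ in range(n_bits)]
    bob_bases   = [rng.choice("XZ") for _ in range(n_bits)]
    # When bases match, Bob's measurement deterministically equals
    # Alice's bit (in this noise-free idealization).
    return [bit for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)
            if ab == bb]

key = bb84_sift(16)
print(key)
```

On average about half the transmitted bits survive sifting; the sifted key could then seed the genetic-algorithm encryption stage.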
The best design of subsurface trickle irrigation systems requires knowledge of the water and salt distribution patterns around the emitters that match root extraction and minimize water losses. The transient distribution of water and salt in a two-dimensional homogeneous Iraqi soil domain under subsurface trickle irrigation with different emitter settings is investigated numerically using the HYDRUS-2D software. Three types of Iraqi soil were selected. The effect of altering the water application rate and the initial soil water content was investigated in the developed model. The coefficient of correlation (R²) and the root-mean-square error (RMSE) were used to validate the predicted numerical res…
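The two validation statistics named above are standard and can be sketched directly (a minimal stdlib implementation; the variable names and sample values are illustrative, not the study's data):

```python
import math

def rmse(observed, predicted):
    """Root-mean-square error between observed and simulated values."""
    n = len(observed)
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(observed, predicted)) / n)

def r_squared(observed, predicted):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean_o = sum(observed) / len(observed)
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean_o) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

# Hypothetical soil-water-content pairs (observed vs. simulated):
obs  = [0.20, 0.25, 0.30, 0.28]
pred = [0.21, 0.24, 0.29, 0.30]
print(rmse(obs, pred), r_squared(obs, pred))
```

Values of R² near 1 and RMSE near 0 indicate good agreement between the simulated and measured profiles.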
In this paper, we propose a new class of weighted Rayleigh distributions based on two parameters, a scale and a shape parameter, introduced into the Rayleigh distribution. The main properties of this class are derived and investigated.
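The abstract does not give the class's density, but the general weighting construction can be sketched. Below, a generic length-biased weight w(x) = x^c is applied to the standard Rayleigh density and normalized numerically; this particular weight and the parameter names are illustrative assumptions, not the paper's definition:

```python
import math

def rayleigh_pdf(x, sigma):
    """Standard Rayleigh density: f(x; sigma) = (x / sigma^2) exp(-x^2 / (2 sigma^2))."""
    return (x / sigma ** 2) * math.exp(-x ** 2 / (2 * sigma ** 2))

def weighted_rayleigh_pdf(x, sigma, c, grid):
    """Generic weighted density f_w(x) = w(x) f(x) / E[w(X)] with the
    illustrative weight w(x) = x^c, normalized numerically over a uniform
    `grid` (sigma: scale parameter, c: shape parameter)."""
    step = grid[1] - grid[0]
    norm = sum((g ** c) * rayleigh_pdf(g, sigma) for g in grid) * step
    return (x ** c) * rayleigh_pdf(x, sigma) / norm
```

Varying c reshapes the density's mode and tail, which is how the extra shape parameter adds flexibility over the one-parameter Rayleigh.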
The accretion circumstellar disk of young stars and the Brown dwarf plays an essential role in the formation and evaluation of the planet. Our main work in this paper is to investigate the geometrical shape model for the protoplanetary disk around one of the Brown Dwarfs. The photometric measurements for the brown dwarf CFHT-BD-Tau 4 were extracted from the Vizier archive. We used a numerical simulation to build a model of the spectral energy distribution of our target CFHT-BD-Tau 4. The spectral energy distribution model was fitted with observational data for the brown dwarf CFHT-BD-Tau 4. A transitional disk has been assumed around CFHT-BD-Tau 4. We obtained physical properties of the two disks and the size of the gap between them
Estimating the regular regression model requires several assumptions to be satisfied, such as linearity. One problem occurs when the regression curve is partitioned into two (or more) parts joined by threshold point(s); this situation is a violation of regression linearity. The multiphase regression model has therefore received increasing attention as an alternative approach that describes the changing behavior of the phenomenon through threshold-point estimation. The maximum likelihood estimator (MLE) has been used for both model and threshold-point estimation. However, the MLE is not resistant to violations such as the existence of outliers or a heavy-tailed error distribution. The main goal of t…
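A common way to estimate a two-phase model under Gaussian errors is a profile search over candidate thresholds, fitting a separate line to each phase and keeping the split with the smallest total squared error. The sketch below illustrates that idea with ordinary least squares (a minimal illustration, not the paper's estimator or its robust alternative):

```python
def ols(xs, ys):
    """Least-squares fit of y = a + b x; returns (a, b, sse)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx if sxx else 0.0
    a = my - b * mx
    sse = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    return a, b, sse

def two_phase_fit(xs, ys, min_seg=3):
    """Grid search over candidate threshold points: fit one line per phase
    and keep the split minimizing total SSE.  Returns (threshold, sse),
    where the threshold is the first x of the second phase."""
    best = None
    for k in range(min_seg, len(xs) - min_seg + 1):
        sse = ols(xs[:k], ys[:k])[2] + ols(xs[k:], ys[k:])[2]
        if best is None or sse < best[1]:
            best = (xs[k], sse)
    return best
```

Because each candidate fit minimizes squared error, this search coincides with the MLE of the threshold under normal errors; outliers or heavy tails distort the SSE criterion, which is precisely the weakness the abstract points out.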
One of the costliest problems facing hydrocarbon production in unconsolidated sandstone reservoirs is sand production once hydrocarbon production starts. A sanding-onset prediction model is very important for deciding on future sand control, including whether or when sand control should be used. This research developed an easy-to-use computer program to determine where sanding begins in the drainage area. The model is based on estimating the critical pressure drop at which sand production begins. The outcomes are drawn as a function of free sand production with the critical flow rates for reservoir pressure decline. The results show that the pressure drawdown required to…
The Internet of Things (IoT) helps improve quality of life, as it supports many applications, especially healthcare systems. Data generated by IoT devices is sent to Cloud Computing (CC) for processing and storage, despite the latency caused by the distance. Because of the revolution in IoT devices, the volume of data sent to the CC has been increasing; as a result, a further problem added to the latency is growing congestion on the cloud network. Fog Computing (FC) is used to solve these problems because of its proximity to IoT devices, filtering the data that is passed on to the CC. FC is a middle layer located between the IoT devices and the CC layer. Due to the massive data generated by IoT devices at the FC, Dynamic Weighted Round Robin (DWRR)…
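The DWRR idea named above can be sketched as a dispatcher that hands each fog node up to `weight` requests per cycle and lets the weights be changed at run time (a minimal sketch; class, node names, and the re-weighting hook are illustrative assumptions, not the paper's scheduler):

```python
from collections import deque

class DynamicWeightedRoundRobin:
    """Toy DWRR dispatcher: each fog node has a weight; per cycle a node
    receives up to `weight` queued requests, and weights can be updated
    dynamically (e.g. from measured node load)."""

    def __init__(self, weights):
        self.weights = dict(weights)   # node -> per-cycle quantum
        self.queue = deque()           # pending IoT requests (FIFO)

    def submit(self, request):
        self.queue.append(request)

    def set_weight(self, node, weight):
        # Dynamic re-weighting hook: adjust a node's share between cycles.
        self.weights[node] = weight

    def dispatch_cycle(self):
        """Assign queued requests node by node, `weight` at a time."""
        assignment = {node: [] for node in self.weights}
        for node, w in self.weights.items():
            for _ in range(w):
                if not self.queue:
                    return assignment
                assignment[node].append(self.queue.popleft())
        return assignment
```

With weights {fog1: 2, fog2: 1}, fog1 receives twice as many requests per cycle as fog2 until the weights are changed.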
Steganography is a technique for hiding a secret message within a different multimedia carrier so that the secret message cannot be identified. The goals of steganography techniques include improvements in imperceptibility, information-hiding capacity, security, and robustness. Although numerous secure methodologies have been introduced, there are ongoing attempts to develop these techniques further to make them more secure and robust. This paper introduces a color-image steganographic method based on a secret map, namely the 3-D cat map. The proposed method aims to embed data using a secure structure of chaotic steganography, ensuring better security. Rather than using the complete image for data hiding, the selection of…
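A 3-D cat map iterates a unimodular integer matrix (determinant 1) modulo 1, and its chaotic orbit can serve as a key-dependent selector of embedding positions. The sketch below uses one illustrative matrix and key; the paper's exact matrix, parameters, and embedding rule are not given in the abstract:

```python
def cat3d_step(x, y, z, M=((2, 1, 1), (1, 1, 1), (1, 1, 2))):
    """One iteration of a 3-D cat map: multiply the state by a unimodular
    integer matrix (det = 1, so the map is invertible and area-preserving)
    and wrap each coordinate back into the unit cube.  M is an
    illustrative choice, not necessarily the paper's matrix."""
    nx = (M[0][0] * x + M[0][1] * y + M[0][2] * z) % 1.0
    ny = (M[1][0] * x + M[1][1] * y + M[1][2] * z) % 1.0
    nz = (M[2][0] * x + M[2][1] * y + M[2][2] * z) % 1.0
    return nx, ny, nz

def embed_positions(n_pixels, key=(0.123, 0.456, 0.789), skip=100):
    """Derive n distinct pixel positions from the chaotic orbit, so data is
    hidden only at secret, key-dependent locations rather than across the
    whole image.  The key triple here is a hypothetical example."""
    x, y, z = key
    for _ in range(skip):               # discard the transient
        x, y, z = cat3d_step(x, y, z)
    positions, seen = [], set()
    while len(positions) < n_pixels:
        x, y, z = cat3d_step(x, y, z)
        p = int(x * n_pixels)           # map orbit point to a pixel index
        if p not in seen:
            seen.add(p)
            positions.append(p)
    return positions
```

Only a receiver holding the same key (initial state and matrix) can regenerate the position sequence and extract the hidden bits.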
In this paper we present the theoretical foundation of forward error analysis of numerical algorithms under (i) approximations in built-in functions, (ii) rounding errors in arithmetic floating-point operations, and (iii) perturbations of data. The error analysis is based on the linearization method. The fundamental tools of forward error analysis are systems of linear absolute and relative a priori and a posteriori error equations, together with the associated condition numbers constituting optimal bounds on the possible cumulative round-off errors. The condition numbers enable simple, general, quantitative definitions of numerical stability. The theoretical results have been applied to Gaussian elimination and have proved to be a very effective means of both a priori…
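The relevance of round-off analysis to Gaussian elimination can be seen in the classic small-pivot example: without pivoting, one elimination step amplifies rounding error catastrophically, while partial pivoting keeps it bounded. This 2×2 demonstration is a standard textbook illustration, not the paper's analysis:

```python
def solve2(a11, a12, a21, a22, b1, b2, pivot):
    """Solve a 2x2 linear system by Gaussian elimination, optionally with
    partial pivoting, to show how round-off propagates through the
    elimination step."""
    if pivot and abs(a21) > abs(a11):
        # Swap rows so the larger entry becomes the pivot.
        a11, a12, b1, a21, a22, b2 = a21, a22, b2, a11, a12, b1
    m = a21 / a11              # elimination multiplier
    a22 -= m * a12             # huge m wipes out a22's true value
    b2 -= m * b1
    x2 = b2 / a22
    x1 = (b1 - a12 * x2) / a11 # back-substitution
    return x1, x2

# System: 1e-20*x1 + x2 = 1,  x1 + x2 = 2  (exact solution ~ (1, 1))
print(solve2(1e-20, 1.0, 1.0, 1.0, 1.0, 2.0, pivot=False))  # (0.0, 1.0): wrong
print(solve2(1e-20, 1.0, 1.0, 1.0, 1.0, 2.0, pivot=True))   # (1.0, 1.0): correct
```

The tiny pivot produces a multiplier of 1e20, so the subtraction `a22 - m*a12` loses all information in `a22`; the associated condition of the elimination step is exactly what the error equations and condition numbers in the paper quantify.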
Twitter's popularity has grown steadily in the last few years, influencing the social, political, and business aspects of life. People leave tweets on social media about an event and simultaneously inquire about other people's experiences and whether they had a positive or negative opinion of that event. Sentiment analysis can be used to obtain this categorization. Product reviews, events, and other topics comprising unstructured text comments from all users are gathered and categorized as positive, negative, or neutral using sentiment analysis; such problems are called polarity classification. This study aims to use Twitter data about OK cuisine reviews obtained from the Amazon website and compare the effectiveness…
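The polarity-classification task described above can be illustrated with a toy lexicon-based scorer: count positive versus negative words and map the sign of the score to a label. Real systems (and presumably the study) use trained classifiers; the word lists here are illustrative, not from the dataset:

```python
# Hypothetical sentiment lexicons for illustration only.
POSITIVE = {"good", "great", "delicious", "excellent", "love"}
NEGATIVE = {"bad", "awful", "bland", "terrible", "hate"}

def polarity(review):
    """Toy lexicon-based polarity classifier: score = (# positive words)
    minus (# negative words); the sign decides the label."""
    words = review.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"
```

The same three-way labeling (positive/negative/neutral) is what a trained classifier would produce from the review text, only with learned features in place of the fixed lexicons.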
With the freedom offered by the Deep Web, people have the opportunity to express themselves freely and discreetly; sadly, this is one of the reasons why people carry out illicit activities there. In this work, a novel dataset of active Dark Web domains, known as crawler-DB, is presented. To build crawler-DB, the Onion Routing network (Tor) was sampled, and a web crawler capable of following links was built. The link addresses gathered by the crawler are then classified automatically into five classes. The algorithm built in this study demonstrated good performance, achieving an accuracy of 85%. A popular text-representation method was used with the proposed crawler-DB crossed by two different supervise…