In today's digitalized world, cloud computing has become a feasible solution for the virtualization of computing resources. Although cloud computing offers many advantages for outsourcing an organization's information, strong security remains its central concern. Identity and authentication theft has become a vital issue in the protection of cloud data: intruders violate security protocols and attack organizations' or users' data, and such disclosures leave cloud users feeling insecure on the platform. Traditional cryptographic techniques alone are unable to stop these attacks. The BB84 protocol is the first quantum cryptography protocol, developed by Bennett and Brassard in 1984. In the present work, a three-way BB84GA security system is demonstrated using trusted cryptographic techniques: an attribute-based authentication system, the BB84 protocol, and a genetic algorithm. First, attribute-based authentication provides identity-based access control; the BB84 protocol then performs quantum key distribution between the two parties; finally, a genetic algorithm is applied for encryption/decryption of sensitive information across private/public clouds. The proposed hybrid of these algorithms is highly secure and technologically feasible, and it may be used to minimize security threats over the clouds. The computed results are presented in the form of tables and graphs.
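As a companion to the pipeline described above, here is a minimal Python sketch of the BB84 sifting step on an ideal channel; the function name and the use of the `secrets` module are illustrative choices, not part of the paper.

```python
import secrets

def bb84_sift(n_bits: int):
    """Simulate the BB84 sifting step on an ideal channel (no eavesdropper).

    Alice sends random bits in random bases; Bob measures in random bases.
    Both parties keep only the positions where their bases agree.
    """
    alice_bits  = [secrets.randbelow(2) for _ in range(n_bits)]
    alice_bases = [secrets.randbelow(2) for _ in range(n_bits)]  # 0 = rectilinear, 1 = diagonal
    bob_bases   = [secrets.randbelow(2) for _ in range(n_bits)]

    # When the bases match, Bob's measurement reproduces Alice's bit;
    # otherwise his outcome is random and the position is discarded.
    bob_bits = [bit if ab == bb else secrets.randbelow(2)
                for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

    return [bit for bit, ab, bb in zip(bob_bits, alice_bases, bob_bases)
            if ab == bb]  # on average about n_bits / 2 shared key bits

print("sifted key:", bb84_sift(32))
```

The sifted key would then seed the genetic-algorithm encryption stage; error estimation and privacy amplification are omitted from this sketch.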
The problem of missing data is a major obstacle for researchers in data analysis, since it recurs across all fields of study, including social, medical, astronomical, and clinical experiments. Missing values in the data under study can negatively affect the analysis and lead to misleading conclusions, as these conclusions carry a large bias caused by the missingness. Wavelet methods, despite their efficiency, are also affected by missing data, in addition to the resulting loss of estimation accuracy.
In Australia, most existing buildings were designed before the release of the Australian standard for earthquake actions in 2007. Many existing buildings in Australia therefore lack adequate seismic design, and their seismic performance must be assessed. The recent earthquake that struck Mansfield, Victoria, near Melbourne heightened the need to produce fragility curves for existing reinforced concrete (RC) buildings in Australia. Fragility curves are frequently used to assess buildings' seismic performance; a fragility curve gives the probability that demand exceeds capacity at a given intensity level. Numerous factors can influence the results of a fragility assessment of RC buildings. Among the most important factors that can affe…
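A common way to make the exceedance-probability definition above concrete is the lognormal fragility model. The sketch below is a generic illustration; the median θ and dispersion β values are invented, not taken from the study.

```python
import math

def fragility(im: float, theta: float, beta: float) -> float:
    """P(demand >= capacity | IM = im) under the common lognormal model.

    theta: median intensity at which the damage state is reached
    beta:  logarithmic standard deviation (dispersion)
    """
    z = (math.log(im) - math.log(theta)) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # standard normal CDF

# Illustrative values only: median PGA of 0.3 g, dispersion of 0.4.
for pga in (0.1, 0.2, 0.3, 0.5):
    print(f"PGA {pga:.1f} g -> P(exceedance) = {fragility(pga, 0.3, 0.4):.2f}")
```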
This study covers the preparation of the ferrite nanoparticles CuxCe0.3−xNi0.7Fe2O4 (where x = 0, 0.05, 0.1, 0.15, 0.2, 0.25, 0.3) using the sol-gel (auto-combustion) method, with citric acid as the combustion fuel. Tests by X-ray diffraction (XRD), field-emission scanning electron microscopy (FE-SEM), energy-dispersive X-ray analysis (EDX), and vibrating sample magnetometry (VSM) showed that the compound has a face-centered cubic structure and that the lattice constant increases with increasing Cu ion content. The compound also exhibits apparent porosity and spherical particles, and t…
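For context, the lattice constant reported from XRD for a cubic cell is typically obtained from Bragg's law. A minimal sketch follows; the peak position and the choice of the (311) reflection are illustrative values, not the study's data.

```python
import math

def lattice_constant(two_theta_deg: float, hkl: tuple, wavelength: float = 1.5406) -> float:
    """Cubic lattice constant from one XRD reflection via Bragg's law.

    two_theta_deg: measured 2-theta peak position, in degrees
    hkl:           Miller indices of the reflection
    wavelength:    X-ray wavelength in angstroms (Cu K-alpha by default)
    """
    theta = math.radians(two_theta_deg / 2.0)
    d = wavelength / (2.0 * math.sin(theta))       # interplanar spacing
    h, k, l = hkl
    return d * math.sqrt(h * h + k * k + l * l)    # a = d * sqrt(h^2 + k^2 + l^2)

# Illustrative peak only: a spinel (311) reflection near 2-theta = 35.5 deg.
print(f"a = {lattice_constant(35.5, (3, 1, 1)):.4f} angstroms")
```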
In this research we study the inverse Gompertz distribution (IG) and estimate its survival function. The survival function was estimated by three methods (maximum-likelihood, least-squares, and percentile estimators), and the best method was selected: the least-squares method proved best for estimating the survival function, since it has the lowest IMSE for all sample sizes.
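A minimal sketch of the IMSE criterion described above follows. It assumes one parameterization of the inverse Gompertz survival function found in the literature; the parameter values and the integration grid are illustrative.

```python
import math

def ig_survival(x: float, theta: float, gamma: float) -> float:
    """Survival function of the inverse Gompertz under one parameterization
    from the literature (an assumption, not necessarily the paper's):
        F(x) = exp(-(theta / gamma) * (exp(gamma / x) - 1)),  x > 0.
    """
    return 1.0 - math.exp(-(theta / gamma) * (math.exp(gamma / x) - 1.0))

def imse(est_params, true_params, grid) -> float:
    """Integrated mean squared error of an estimated survival curve,
    approximated by averaging squared errors over a grid of x values."""
    errs = [(ig_survival(x, *est_params) - ig_survival(x, *true_params)) ** 2
            for x in grid]
    return sum(errs) / len(errs)

grid = [0.1 * i for i in range(1, 101)]   # x in (0, 10]
true = (1.0, 0.5)                         # illustrative theta, gamma
print(f"IMSE of a slightly-off fit: {imse((1.1, 0.45), true, grid):.5f}")
```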
In this paper, the packing problem for complete (n, 4)-arcs in PG(2, 13) is partially solved. The minimum and the maximum sizes of complete (n, 4)-arcs in PG(2, 13) are obtained. The classification is based on the algorithm introduced in Section 3 of this paper. This paper also establishes the connection between projective geometry, in terms of a complete (n, 4)-arc in PG(2, 13), and the algebraic characteristics of a plane quartic curve over the field of order 13, represented by the number of its rational points and inflexion points. In addition, some sizes of complete (n, 6)-arcs in the projective plane of order thirteen are established, namely for n = 53, 54, 55, 56.
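To illustrate the algebraic side, the rational points of a plane quartic over GF(13) can be counted by brute force over one representative per projective point. The Fermat quartic used below is an arbitrary example, not a curve from the paper.

```python
# Brute-force count of the rational points of a plane quartic over GF(13).
q = 13

def quartic(x: int, y: int, z: int) -> int:
    """The Fermat quartic x^4 + y^4 + z^4 over GF(q) (illustrative choice)."""
    return (x**4 + y**4 + z**4) % q

# One representative per point of PG(2, q): (1:y:z), (0:1:z), (0:0:1),
# giving q^2 + q + 1 = 183 points in total.
reps  = [(1, y, z) for y in range(q) for z in range(q)]
reps += [(0, 1, z) for z in range(q)]
reps += [(0, 0, 1)]

count = sum(1 for p in reps if quartic(*p) == 0)
print(f"rational points of the Fermat quartic over GF({q}): {count}")
```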
Many of the key stream generators used in practice are LFSR-based, in the sense that they produce the key stream according to a rule y = C(L(x)), where L(x) denotes an internal linear bit stream produced by a small number of parallel linear feedback shift registers (LFSRs), and C denotes some nonlinear compression function. In this paper we combine the output sequences of the linear feedback shift registers with the sequences produced by a nonlinear key generator to obtain a final, very strong key sequence.
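A minimal sketch of the y = C(L(x)) structure: three Fibonacci-style LFSRs feed a Geffe-style nonlinear combiner. The register lengths, taps, and combiner are illustrative only; the classic Geffe combiner is known to be weak against correlation attacks, so this shows the structure, not the paper's generator.

```python
def lfsr(seed: int, taps: tuple, nbits: int):
    """Fibonacci-style LFSR: yields one output bit per clock.

    seed:  nonzero initial register state
    taps:  bit positions (0-based) XORed to form the feedback bit
    nbits: register length
    """
    state = seed
    while True:
        out = state & 1
        fb = 0
        for t in taps:
            fb ^= (state >> t) & 1
        state = (state >> 1) | (fb << (nbits - 1))
        yield out

def geffe(n: int):
    """Geffe-style combiner C(x1, x2, x3) = (x1 & x2) ^ (~x1 & x3):
    three LFSRs feed a nonlinear compression function, as in y = C(L(x))."""
    a = lfsr(0b10011, (0, 2), 5)        # illustrative seed and taps
    b = lfsr(0b1001001, (0, 1), 7)      # illustrative seed and taps
    c = lfsr(0b101010101, (0, 4), 9)    # illustrative seed and taps
    for _ in range(n):
        x1, x2, x3 = next(a), next(b), next(c)
        yield (x1 & x2) ^ ((x1 ^ 1) & x3)

print("keystream:", "".join(str(bit) for bit in geffe(32)))
```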
Most companies use social media data for business. Sentiment analysis automatically gathers, analyzes, and summarizes this type of data. Managing unstructured social media data is difficult, and noisy data is a particular challenge for sentiment analysis. Since over 50% of the sentiment analysis process is data pre-processing, processing big social media data is challenging too; if pre-processing is carried out correctly, data accuracy may improve. The sentiment analysis workflow also depends heavily on its pre-processing stage. Because no pre-processing technique works well in all situations or with all data sources, choosing the most important ones is crucial, and prioritization is an excellent technique for doing so. As one of many Multi-Criteria Decision Making…
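A minimal sketch of the kind of pre-processing pipeline whose steps such a study would prioritize; the steps and their order here are illustrative choices, not the paper's ranking.

```python
import re

# Illustrative social-media pre-processing steps, applied in sequence.
STEPS = [
    ("lowercase",   str.lower),
    ("strip urls",  lambda t: re.sub(r"https?://\S+", " ", t)),
    ("strip users", lambda t: re.sub(r"@\w+", " ", t)),
    ("letters only", lambda t: re.sub(r"[^a-z\s]", " ", t)),
    ("squeeze ws",  lambda t: re.sub(r"\s+", " ", t).strip()),
]

def preprocess(text: str) -> str:
    """Run every pre-processing step in order and return the clean text."""
    for _, step in STEPS:
        text = step(text)
    return text

print(preprocess("LOVED it!!! see http://example.com @anna #best"))
```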
The 3-parameter Weibull distribution is used as a failure model, since it is appropriate when the failure rate is somewhat high at the start of operation and decreases with increasing time.
On the practical side, a comparison was made between shrinkage and maximum-likelihood estimators of the parameters and the reliability function using simulation. We conclude that the shrinkage estimators of the parameters are better than the maximum-likelihood estimators, but the maximum-likelihood estimator of the reliability function is better, using the statistical measures MAPE and MSE for different sample sizes.
Note: ns = small sample size; nm = median sample size.
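For reference, a minimal sketch of the 3-parameter Weibull reliability function that the comparison above targets; the parameter values are illustrative only.

```python
import math

def weibull3_reliability(t: float, beta: float, eta: float, gamma: float) -> float:
    """Reliability function of the 3-parameter Weibull,
        R(t) = exp(-((t - gamma) / eta) ** beta),  t >= gamma,
    with shape beta, scale eta, and location (guarantee time) gamma."""
    if t < gamma:
        return 1.0  # no failures before the location parameter
    return math.exp(-(((t - gamma) / eta) ** beta))

# Illustrative parameters only: beta < 1 gives the early, decreasing
# failure rate described above.
for t in (1.0, 2.0, 5.0, 10.0):
    print(f"R({t}) = {weibull3_reliability(t, beta=0.8, eta=4.0, gamma=0.5):.3f}")
```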
This work presents industrial characteristic calculations focused on the physical properties of the breakdown voltage in SF6 and CF4 gases and in their mixtures at different concentrations. The calculations are carried out using an improved modern code running under Windows. Our results agree well with other published experimental data.
Multiple linear regression is concerned with studying and analyzing the relationship between a dependent variable and a set of explanatory variables, from which the values of the variables are predicted. In this paper, a multiple linear regression model with three covariates was studied in the presence of autocorrelated errors whose random errors follow an exponential distribution. Three methods were compared (generalized least squares, M-robust, and the Laplace robust method). We ran simulation studies and calculated the statistical measure of mean squared error with sample sizes (15, 30, 60, 100). We then applied the best method to real experimental data representing the varieties of…
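A minimal sketch of the kind of simulation comparison described above, contrasting OLS with a Cochrane-Orcutt style GLS fit under AR(1) errors with centered exponential innovations. Only the GLS branch is sketched (the robust estimators are omitted), and the design matrix, coefficients, and autocorrelation value are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(n: int, rho: float = 0.6):
    """One replication: y = X @ beta + u, where u is an AR(1) process whose
    innovations are centered exponential. All parameter values are illustrative."""
    X = np.column_stack([np.ones(n)] + [rng.normal(size=n) for _ in range(3)])
    beta = np.array([1.0, 2.0, -1.0, 0.5])
    e = rng.exponential(1.0, size=n) - 1.0          # centered exponential innovations
    u = np.zeros(n)
    for t in range(1, n):
        u[t] = rho * u[t - 1] + e[t]
    return X, X @ beta + u, beta

def mse(reps: int = 500, n: int = 30):
    """Mean squared error of OLS vs. a Cochrane-Orcutt style GLS fit."""
    errs_ols, errs_gls = [], []
    for _ in range(reps):
        X, y, beta = simulate(n)
        b_ols = np.linalg.lstsq(X, y, rcond=None)[0]
        r = y - X @ b_ols
        rho_hat = (r[1:] @ r[:-1]) / (r[:-1] @ r[:-1])   # estimated autocorrelation
        Xs, ys = X[1:] - rho_hat * X[:-1], y[1:] - rho_hat * y[:-1]
        b_gls = np.linalg.lstsq(Xs, ys, rcond=None)[0]
        errs_ols.append(np.mean((b_ols - beta) ** 2))
        errs_gls.append(np.mean((b_gls - beta) ** 2))
    return np.mean(errs_ols), np.mean(errs_gls)

print("MSE (OLS, GLS):", mse())
```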