In today's digitalized world, cloud computing has become a feasible solution for virtualizing computing resources. Although cloud computing offers many advantages for outsourcing an organization's information, strong security remains its central concern. Identity and authentication theft has become a vital part of the cloud data-protection problem: intruders violate security protocols and attack organizations' or users' data. Incidents of cloud data disclosure leave cloud users feeling insecure on the platform, and traditional cryptographic techniques are unable to stop such attacks. The BB84 protocol, developed by Bennett and Brassard in 1984, is the first quantum cryptography protocol. In the present work, a three-way BB84GA security system is demonstrated using trusted cryptographic techniques: an attribute-based authentication system, the BB84 protocol, and a genetic algorithm. First, attribute-based authentication is used for identity-based access control; the BB84 protocol is then used for quantum key distribution between the two parties; finally, a genetic algorithm is applied for encryption/decryption of sensitive information across private/public clouds. The proposed hybrid scheme is highly secure and technologically feasible, and may be used to minimize security threats over the clouds. The computed results are presented in tables and graphs.
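As an illustrative sketch only (not the authors' implementation), the key-sifting step of BB84 can be simulated classically: Alice encodes random bits in random bases, Bob measures in random bases, and both keep only the positions where their bases agree. All names and parameters below are hypothetical.

```python
import random

random.seed(42)
n = 32  # number of transmitted qubits (simulated)

# Alice chooses random bits and random bases (0 = rectilinear, 1 = diagonal)
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.randint(0, 1) for _ in range(n)]

# Bob measures each qubit in a randomly chosen basis
bob_bases = [random.randint(0, 1) for _ in range(n)]
# When bases match, Bob recovers Alice's bit; otherwise his outcome is random
bob_bits = [b if ab == bb else random.randint(0, 1)
            for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Sifting: bases are compared publicly and only matching positions are kept
sifted_key = [b for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)
              if ab == bb]
print("sifted key length:", len(sifted_key))
```

On average about half of the positions survive sifting; the sifted key would then feed the genetic-algorithm encryption stage described above.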
The study aims to diagnose the levels of total costs borne by the Diyala State Company, and then to estimate and analyze the quantitative relationship between the different items of these costs, in addition to the impact of productive activity on them. This was done by choosing the different variables affecting the costs and their items for the company under study, relying on data issued by the company during the period 2002-2021, and using a methodology that combines descriptive and econometric methods to estimate and analyze the cost function in the concerned company. According to the estimated cost function of the company under study, the study concluded that the value of production affects the total…
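Cost functions of this kind are often specified as polynomials in output and estimated by ordinary least squares; a minimal sketch with synthetic illustrative figures (not the company's data):

```python
import numpy as np

# synthetic (illustrative) data: production value q and total cost tc
q  = np.array([10., 20., 30., 40., 50., 60.])
tc = 100 + 4.0 * q + 0.05 * q**2          # quadratic total-cost relation

# OLS fit of a quadratic cost function  TC = b0 + b1*q + b2*q^2
b2, b1, b0 = np.polyfit(q, tc, deg=2)     # highest-degree coefficient first

# marginal cost at q = 30:  dTC/dq = b1 + 2*b2*q
mc_30 = b1 + 2 * b2 * 30
print(b0, b1, b2, mc_30)
```

With real data, the fitted coefficients would be read as fixed, variable, and increasing-cost components, and the marginal-cost expression would show how the productive activity drives total costs.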
The properties of peristaltic flow of a chemically reacting couple stress fluid through an inclined asymmetric channel with variable viscosity and various boundaries are investigated. We address the impacts of variable viscosity, different wave forms, a porous medium, and heat and mass transfer on the peristaltic transport of a hydromagnetic couple stress liquid in an inclined asymmetric channel with different boundaries. Moreover, the fluid viscosity is assumed to vary as an exponential function of temperature. The effects of most flow parameters are studied analytically and computed. A rise in the temperature and concentration profiles is attributed to the heat and mass transfer Biot numbers. Notably, the Soret and Dufour number effects resul…
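The exponential temperature dependence of viscosity mentioned above is commonly taken in the peristaltic-flow literature as the Reynolds model; in generic dimensionless notation (the symbols here are illustrative and may not match the paper's own):

```latex
\mu(\theta) = \mu_0\, e^{-\alpha\theta} \approx \mu_0\,(1 - \alpha\theta), \qquad \alpha \ll 1,
```

where \(\theta\) is the dimensionless temperature, \(\mu_0\) the reference viscosity, and \(\alpha\) the viscosity-variation parameter; the linearized form is the small-\(\alpha\) approximation typically used in perturbation solutions.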
Abstract
There has been heated controversy over the role financial policy plays and how sufficient it is in bearing the financial burden. This burden comprises the operational current expenses that the governments of various countries mainly bear, despite differences in each government's economic policy. After the deterioration of and deficits in state budgets in all countries nowadays, it was necessary to find an appropri…
Estimating the unknown parameters of a two-dimensional sinusoidal signal model is an important and difficult problem; the model is significant for modeling symmetric gray-scale texture images. In this paper, we propose employing the Differential Evolution algorithm together with a sequential approach to estimate the unknown frequencies and amplitudes of the 2-D sinusoidal components when the signal is affected by noise. Numerical simulations are performed for different sample sizes and various levels of standard deviation to observe the performance of this method in estimating the parameters of the 2-D sinusoidal signal model. This model was used for modeling symmetric gray-scale texture images and estimating by using…
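As an illustrative sketch of the idea (simplified to one dimension and one component, not the paper's 2-D sequential procedure), Differential Evolution can minimize the residual sum of squares of a noisy sinusoid over amplitude and frequency; all values below are synthetic:

```python
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(0)
t = np.arange(100)
true_amp, true_freq = 2.0, 0.7
signal = true_amp * np.cos(true_freq * t) + rng.normal(0, 0.3, t.size)

# residual sum of squares as the DE objective over (amplitude, frequency)
def rss(params):
    amp, freq = params
    return np.sum((signal - amp * np.cos(freq * t)) ** 2)

# global search within plausible bounds; seed fixed for reproducibility
result = differential_evolution(rss, bounds=[(0.1, 5.0), (0.5, 0.9)], seed=1)
amp_hat, freq_hat = result.x
print(amp_hat, freq_hat)
```

The RSS surface is highly multimodal in frequency, which is why a global optimizer such as DE (rather than plain gradient descent) is attractive here; the sequential approach in the paper would repeat such a fit component by component.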
This study was carried out to measure the degree of heavy-metal pollution in the water of the Diyala river and the degree of contamination by these elements in the leafy vegetables grown on both sides of the river, which are irrigated with the contaminated river water (celery, radish, lepidium, green onions, Beta vulgaris subsp., and malva). Laboratory analysis measured the levels of heavy-metal contamination (Pb, Fe, Ni, Cd, Zn and Cr) using a flame atomic absorption spectrophotometer during the summer months of July and August 2017. The study showed that zinc, chromium, nickel and cadmium were present at high concentrations and exceeded the limits. The maximum concentration of these…
In this study, we focus on random coefficient estimation for the general regression and Swamy models of panel data; this type of data offers a better chance of obtaining better methods and indicators. Entropy methods have been used to estimate the random coefficients of the general regression and Swamy panel-data models in two ways: the first is the maximum dual entropy and the second is the general maximum entropy; a comparison between them has been made using simulation to choose the optimal method.
The results have been compared using mean squared error and mean absolute percentage error for different cases in terms of correlation valu…
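The two comparison criteria named above have simple standard definitions; a minimal sketch with made-up numbers (not the study's simulation output):

```python
# mean squared error: average of squared deviations
def mse(actual, predicted):
    return sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)

# mean absolute percentage error: average |error| relative to the actual value
def mape(actual, predicted):
    return 100 * sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / len(actual)

actual    = [100.0, 200.0, 400.0]
predicted = [110.0, 190.0, 420.0]
print(mse(actual, predicted), mape(actual, predicted))
```

MSE penalizes large deviations quadratically, while MAPE is scale-free, which is why studies of this kind typically report both when ranking estimators.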
Background: Appreciation of the crucial role of risk factors in the development of coronary artery disease (CAD) is one of the most significant advances in the understanding of this important disease. Extensive epidemiological research has established cigarette smoking, diabetes, hyperlipidemia, and hypertension as independent risk factors for CAD. Objective: To determine the prevalence of the four conventional risk factors (cigarette smoking, diabetes, hyperlipidemia, and hypertension) among patients with CAD, and to determine the correlation of the Thrombolysis in Myocardial Infarction (TIMI) risk score with the extent of CAD in patients with unstable angina / non-ST-elevation myocardial infarction (UA/NSTEMI). Methods: We…
The objective of the study is to determine which has the better predictive ability, the logistic regression model or the linear discriminant function, first using the original data and then after reducing the dimensions of the variables. The data come from the socio-economic survey of families in the province of Baghdad in 2012 and comprise a sample of 615 observations with 13 variables, 12 of them explanatory; the dependent variable is the number of workers and the unemployed.
A comparison of the two methods above was conducted, and it became clear that the logistic regression model outperforms the linear discriminant function…
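As an illustrative sketch of such a comparison (on synthetic two-class data, not the Baghdad survey, and with textbook from-scratch estimators rather than the study's software), the two classifiers can be fitted and scored side by side:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
# synthetic two-class data with shifted means
X0 = rng.normal([0.0, 0.0], 1.0, size=(n, 2))
X1 = rng.normal([2.0, 2.0], 1.0, size=(n, 2))
X = np.vstack([X0, X1])
y = np.r_[np.zeros(n), np.ones(n)]

# --- Fisher linear discriminant ---
m0, m1 = X0.mean(0), X1.mean(0)
Sw = np.cov(X0.T) + np.cov(X1.T)            # pooled within-class scatter
w = np.linalg.solve(Sw, m1 - m0)            # discriminant direction
thr = w @ (m0 + m1) / 2                     # midpoint decision threshold
lda_pred = (X @ w > thr).astype(float)

# --- logistic regression by gradient ascent on the log-likelihood ---
Xb = np.hstack([X, np.ones((2 * n, 1))])    # add intercept column
beta = np.zeros(3)
for _ in range(2000):
    p = 1 / (1 + np.exp(-Xb @ beta))
    beta += 0.1 * Xb.T @ (y - p) / len(y)
log_pred = (Xb @ beta > 0).astype(float)

print("LDA accuracy:", (lda_pred == y).mean())
print("logit accuracy:", (log_pred == y).mean())
```

On real survey data the comparison would additionally use held-out observations, since in-sample accuracy favors the more flexible model.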
In this research we analyze some indicators, and their classifications, related to the teaching process and the scientific level of graduate studies in the university, using analysis of variance for ranked data with repeated measurements instead of the ordinary analysis of variance. We reach several conclusions about the important classifications for each indicator that affect the teaching process.
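The rank-based analysis of variance for repeated measurements described above corresponds to the Friedman test (an assumption on our part; the study does not name its exact procedure). A minimal sketch with made-up ratings, not the study's survey data:

```python
from scipy.stats import friedmanchisquare

# ratings by the same 6 respondents under three conditions (illustrative)
cond_a = [4, 5, 3, 4, 5, 4]
cond_b = [2, 3, 2, 3, 2, 3]
cond_c = [3, 4, 3, 3, 4, 3]

# Friedman test: ranks within each respondent, then a chi-square statistic
stat, p_value = friedmanchisquare(cond_a, cond_b, cond_c)
print(f"Friedman chi-square = {stat:.2f}, p = {p_value:.4f}")
```

Because it operates on within-subject ranks, the test avoids the normality assumption of ordinary repeated-measures ANOVA, which is the stated motivation for using ranked data here.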
Multiple elimination (de-multiple) is one of the seismic processing steps used to remove the effects of multiples and delineate the correct primary reflections. Applying normal moveout to flatten the primaries is the way to eliminate multiples after transforming the data to the frequency-wavenumber domain. The flattened primaries align with the zero axis of the frequency-wavenumber domain, while all other reflection types (multiples and random noise) are distributed elsewhere. A dip filter is then applied to pass the aligned data and reject the rest, separating primaries from multiples once the data are transformed back from the frequency-wavenumber domain to the time-distance domain. For that, a suggested name for this technique is normal moveout frequency-wavenumber domain…
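The steps above can be sketched numerically: after NMO correction a primary is flat in the time-offset gather, so its energy maps onto the zero-wavenumber column of the 2-D Fourier transform, while a residual dipping multiple spreads across wavenumbers and is rejected by a narrow pass band around k = 0. The gather below is synthetic and the filter width is an illustrative choice, not the paper's parameters:

```python
import numpy as np

# synthetic t-x gather: a flat (NMO-corrected) primary plus a dipping multiple
nt, nx, dx = 128, 64, 10.0
data = np.zeros((nt, nx))
data[32, :] = 1.0                          # flat primary event
for ix in range(nx):                       # dipping event: time shifts with offset
    data[min(nt - 1, 48 + ix // 2), ix] = 1.0

# f-k transform: flat events concentrate in the k = 0 column
fk = np.fft.fft2(data)
k = np.fft.fftfreq(nx, d=dx)

# dip filter: keep only a narrow band of wavenumbers around zero
mask = np.zeros(nx)
mask[np.abs(k) < 0.002] = 1.0
filtered = np.real(np.fft.ifft2(fk * mask[None, :]))
```

After the inverse transform the flat primary is preserved almost exactly, while the dipping event's amplitude is strongly attenuated, which is the essence of the suggested NMO-plus-f-k de-multiple workflow.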