This paper proposes a new encryption method that combines two cipher algorithms, DES and AES, to generate hybrid keys. The combination strengthens the proposed W-method by generating highly randomized keys. Two aspects determine the reliability of any encryption technique. The first is key generation: our approach merges 64 bits of DES with 64 bits of AES to produce a 128-bit root key from which the remaining 15 keys are derived. This complexity raises the level of the ciphering process; moreover, the shift operation moves only one bit to the right. The second is the nature of the encryption process itself: it uses two keys and mixes one round of DES with one round of AES to reduce the processing time. The W-method handles Arabic and English texts with the same efficiency. The results show that the proposed method is faster and more secure than the standard DES and AES algorithms.
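A minimal sketch of the key-merging idea described above, assuming the two 64-bit halves are simply concatenated and each subsequent key is obtained by a one-bit right rotation; function names and derivation details are illustrative assumptions, not the published W-method:

```python
# Illustrative sketch of the key-merging idea (not the published W-method):
# concatenate a 64-bit DES-style key with 64 bits of an AES key to form a
# 128-bit root key, then derive 15 round keys by rotating one bit right.

def rotate_right_128(value: int, bits: int = 1) -> int:
    """Rotate a 128-bit integer right by the given number of bits."""
    bits %= 128
    return ((value >> bits) | (value << (128 - bits))) & ((1 << 128) - 1)

def derive_round_keys(des_key_64: int, aes_key_64: int, rounds: int = 15) -> list[int]:
    """Merge the two 64-bit halves and derive the remaining round keys."""
    root_key = (des_key_64 << 64) | aes_key_64    # 128-bit root key
    keys, current = [], root_key
    for _ in range(rounds):
        current = rotate_right_128(current, 1)    # one-bit right shift per round
        keys.append(current)
    return keys

if __name__ == "__main__":
    demo = derive_round_keys(0x0123456789ABCDEF, 0xFEDCBA9876543210)
    print(f"derived keys: {len(demo)}, first = {demo[0]:032x}")
```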
Background: Radiopacity is one of the prerequisites for dental materials, especially for composite restorations. It is essential for easy detection of secondary dental caries as well as for observing the radiographic interface between the material and the tooth structure. The aim of this study was to assess the difference in radiopacity of different resin composites using a digital x-ray system. Materials and methods: Ten specimens (6 mm diameter and 1 mm thickness) of three types of composite resins (Evetric, Estelite Sigma Quick, and G-aenial) were fabricated using a Teflon mold. The radiopacity was assessed using dental radiography equipment in combination with a phosphor plate digital system and a grey-scale-value aluminum step wedge with thickness
In this study, a mathematical model is presented to study the chemisorption of two interacting atoms on a solid surface in the presence of a laser field. Our mathematical model is based on the occupation-number formula that depends on the laser field, which we derived according to the Anderson model for a single atom adsorbed on a solid surface. Occupation-number and chemisorption-energy formulas are derived for two interacting atoms (as a diatomic molecule) as they approach the surface, taking into account the correlation effects on each atom and between the atoms. The model is characterized by an explicit dependence of all relations on the system variables and the laser-field characteristics, which gives a precise description of the molecule –
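For context, the single-atom Anderson (Newns–Anderson) occupation number that this abstract builds on is commonly written as the integral below; this is the standard field-free textbook form, not the laser-field-dependent expression derived in the paper:

```latex
% Standard Newns--Anderson occupation number for a single adsorbed atom
% (field-free background form; the paper's laser-dependent version is not reproduced here)
\langle n_{a\sigma} \rangle = \frac{1}{\pi}
\int_{-\infty}^{E_F}
\frac{\Delta(\varepsilon)}
     {\left[\varepsilon - \varepsilon_a - \Lambda(\varepsilon)\right]^{2} + \Delta(\varepsilon)^{2}}
\, d\varepsilon
```

where $\varepsilon_a$ is the adatom level, $\Delta$ the chemisorption broadening function, and $\Lambda$ its Hilbert transform.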
In this paper, a microcontroller-based electronic circuit has been designed and implemented for a dental curing system using an 8-bit MCS-51 microcontroller. A new control card was also designed; taking advantage of the microcontroller, the curing time is controlled automatically by preset values entered through a push-button switch. An ignition stage based on the PWM technique was used to reduce the high starting current drawn by the halogen lamp. The test results show good performance of the proposed system.
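The published firmware targets an 8-bit MCS-51 and is not reproduced here; the short Python sketch below only illustrates the soft-start idea behind the PWM ignition stage, ramping the duty cycle so the lamp voltage rises gradually and limits inrush current. Supply voltage, step count, and timings are assumed values:

```python
# Conceptual illustration of a PWM soft-start ramp for a halogen lamp
# (duty-cycle ramp idea only; supply voltage, step count and timings are assumptions).

SUPPLY_VOLTAGE = 12.0      # nominal lamp supply (assumed)
RAMP_STEPS = 20            # number of soft-start steps (assumed)
STEP_TIME_MS = 50          # time spent at each duty level (assumed)

def soft_start_profile(steps: int = RAMP_STEPS):
    """Yield (time_ms, duty, approx_rms_voltage) while ramping duty from 0 to 100 %."""
    for i in range(1, steps + 1):
        duty = i / steps                          # linear duty-cycle ramp
        rms = SUPPLY_VOLTAGE * duty ** 0.5        # RMS of a chopped DC waveform
        yield i * STEP_TIME_MS, duty, rms

if __name__ == "__main__":
    for t, duty, v in soft_start_profile():
        print(f"t={t:4d} ms  duty={duty:5.0%}  ~Vrms={v:4.1f} V")
```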
Abstract
The classical normal linear regression model is based on several assumptions, one of which concerns heteroscedasticity. It is known that, when such problems are present, the ordinary least squares (OLS) estimators lose their desirable properties and the resulting statistical inference becomes unacceptable. Accordingly, we consider two alternatives: the first is generalized least squares, denoted (GLS), and the second is robust covariance matrix estimation for the parameters estimated by OLS. The GLS method is the appropriate and reliable choice when the estimators are efficient and the statistical inference built on them rests on an acceptable
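A brief sketch of the two alternatives, assuming a simple heteroscedastic data-generating process; the variance model and the simulated data are illustrative only:

```python
# Illustrative comparison of the two alternatives discussed above:
# (1) GLS with a known error covariance and (2) OLS with a
# heteroscedasticity-robust (HC1) covariance matrix.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
x = rng.uniform(1, 10, n)
X = sm.add_constant(x)
sigma2 = 0.5 * x ** 2                             # assumed heteroscedastic variances
y = 2.0 + 1.5 * x + rng.normal(0, np.sqrt(sigma2))

ols_robust = sm.OLS(y, X).fit(cov_type="HC1")     # robust covariance for OLS
gls = sm.GLS(y, X, sigma=np.diag(sigma2)).fit()   # GLS with known covariance

print("OLS + HC1 std. errors:", ols_robust.bse)
print("GLS       std. errors:", gls.bse)
```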
The achievements of the art we know today arose from motives that differ from those of earlier art, including the dramatic artistic transformations that came to be called modern art.
Given the enormity, ramifications, and complexity of such a topic, it was necessary to confine the subject to the origins of the motives behind the transformations of its first pioneers, and then to examine what resulted from that in terms of the vision expressed in composition and drawing exclusively; through this exploration, we came to recognize the vitality of change relative to the art of its time.
By examining the prevailing contemporary philosophical concepts, their new standards, and their epistemological role in contemporary life, since they includ
Dust is a common cause of health risks and also a driver of climate change, one of the most threatening problems facing humans. In the recent decade, climate change in Iraq, typified by increased droughts and desertification, has generated numerous environmental issues. This study forecasts dust in five central Iraqi districts using a supervised machine-learning framework with five regression algorithms. It was assessed using a dataset from the Iraqi Meteorological Organization and Seismology (IMOS). Simulation results show that the gradient boosting regressor (GBR) has a mean square error of 8.345 and a total accuracy ratio of 91.65%. Moreover, the results show that the decision tree (DT), where the mean square error is 8.965, c
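A sketch of the model comparison described above, assuming a tabular meteorological dataset with a numeric dust-level target; the synthetic data below stands in for the IMOS dataset, which is not reproduced here:

```python
# Compare a gradient boosting regressor and a decision tree regressor by MSE,
# mirroring the comparison reported in the abstract (synthetic data only).
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

X, y = make_regression(n_samples=1000, n_features=8, noise=10.0, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

models = {
    "GradientBoosting": GradientBoostingRegressor(random_state=42),
    "DecisionTree": DecisionTreeRegressor(random_state=42),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    mse = mean_squared_error(y_test, model.predict(X_test))
    print(f"{name:>17s}: MSE = {mse:.3f}")
```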
Abstract— The growing use of digital technologies across various sectors and daily activities has made handwriting recognition a popular research topic. Despite the continued relevance of handwriting, people still require the conversion of handwritten copies into digital versions that can be stored and shared digitally. Handwriting recognition involves the computer's ability to identify and understand legible handwritten input from various sources, including documents, photographs, and others. Handwriting recognition poses a complex challenge due to the diversity of handwriting styles among individuals, especially in real-time applications. In this paper, an automatic system was designed for handwriting recognition
In this paper, we propose a method to estimate missing values of the explanatory variables in a nonparametric multiple regression model and compare it with the arithmetic-mean imputation method. The idea of the method is to exploit the causal relationship between the variables to obtain an efficient estimate of the missing value. We rely on the kernel estimate given by the Nadaraya–Watson estimator and on least squares cross-validation (LSCV) to select the bandwidth, and we use a simulation study to compare the two methods.
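A minimal sketch of Nadaraya–Watson estimation with an LSCV-selected bandwidth, assuming the variable with a missing entry is predicted from a fully observed covariate; the data, kernel choice, and bandwidth grid are illustrative assumptions, not the paper's exact setup:

```python
# Nadaraya-Watson estimate with a Gaussian kernel and a leave-one-out
# (LSCV-style) bandwidth choice, used here to impute a missing value.
import numpy as np

def nw_estimate(x0, x, y, h):
    """Gaussian-kernel Nadaraya-Watson estimate of E[y | x = x0]."""
    w = np.exp(-0.5 * ((x0 - x) / h) ** 2)
    return np.sum(w * y) / np.sum(w)

def lscv_bandwidth(x, y, grid):
    """Pick the bandwidth minimizing the leave-one-out squared prediction error."""
    best_h, best_score = grid[0], np.inf
    for h in grid:
        errs = [(y[i] - nw_estimate(x[i], np.delete(x, i), np.delete(y, i), h)) ** 2
                for i in range(len(x))]
        score = np.mean(errs)
        if score < best_score:
            best_h, best_score = h, score
    return best_h

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 100)                 # fully observed covariate
y = np.sin(x) + rng.normal(0, 0.2, 100)     # variable containing the value to impute
h = lscv_bandwidth(x, y, np.linspace(0.1, 2.0, 20))
print("bandwidth:", h, " imputed value at x = 5:", nw_estimate(5.0, x, y, h))
```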
In this paper, estimates are obtained for the parameters and the reliability function of the transmuted power function (TPF) distribution using several estimation methods: a proposed new technique for the White, percentile, least squares, weighted least squares, and modified moments methods. A simulation was used to generate random data following the TPF distribution in three experiments (E1, E2, E3) of real parameter values, with sample sizes (n = 10, 25, 50, and 100), N = 1000 replicated samples, and reliability times (0 < t < 0). Comparisons were made between the results obtained from the estimators using the mean square error (MSE). The results showed the
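For reference, the commonly used form of the transmuted power function distribution, obtained by quadratic rank transmutation of the power function CDF, is given below together with the reliability function; the paper's exact parameterization may differ:

```latex
% Commonly used transmuted power function (TPF) distribution and reliability function
G(x) = \left(\frac{x}{\beta}\right)^{\alpha}, \qquad 0 < x < \beta,\; \alpha > 0
\qquad
F(x) = (1+\lambda)\,G(x) - \lambda\,G(x)^{2}, \qquad |\lambda| \le 1
\qquad
R(t) = 1 - F(t) = 1 - (1+\lambda)\left(\frac{t}{\beta}\right)^{\alpha}
       + \lambda\left(\frac{t}{\beta}\right)^{2\alpha}
```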