The volume of sensitive and important data has increased rapidly over the last decades, driven by the tremendous growth of networking and communications infrastructure. Securing this data becomes necessary as its volume grows, and different cipher techniques and methods are used to ensure the goals of security: integrity, confidentiality, and availability. This paper presents a proposed hybrid text cryptography method that encrypts sensitive data using several encryption algorithms: Caesar, Vigenère, Affine, and Multiplicative. The aim of this hybrid method is to make the encryption process more secure and effective. The method depends on a circular queue: each character of the plaintext is encrypted by one of the cipher methods, with a control key governing the selection, so the scheduling of the cipher methods is determined by the control key. The experimental results show that the proposed hybrid text cryptography method can cipher and decipher important data effectively and efficiently. Utilizing the properties of the different cipher methods scheduled in the circular queue (Caesar, Vigenère, Affine, and Multiplicative) consumes less time than using those methods alone, so the important requirements of a cryptographic process, such as complexity, integrity, execution time, and security, are met by the proposed method in an effective manner.
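As an illustration of the circular-queue idea described above, the sketch below rotates through the four classical ciphers and picks one per plaintext character according to a control key. The control-key digits, the alphabet handling, and the specific cipher parameters are assumptions made for this example, not details taken from the paper.

```python
# Minimal sketch of a hybrid cipher driven by a circular queue of methods.
# Cipher parameters and the control-key scheme are illustrative assumptions.
from collections import deque

ALPHA = "abcdefghijklmnopqrstuvwxyz"

def caesar(c, shift=3):
    return ALPHA[(ALPHA.index(c) + shift) % 26]

def vigenere(c, key_char="k"):            # single key letter, for brevity
    return ALPHA[(ALPHA.index(c) + ALPHA.index(key_char)) % 26]

def affine(c, a=5, b=8):                  # gcd(a, 26) = 1 so the map is invertible
    return ALPHA[(a * ALPHA.index(c) + b) % 26]

def multiplicative(c, a=7):               # gcd(a, 26) = 1
    return ALPHA[(a * ALPHA.index(c)) % 26]

def hybrid_encrypt(plaintext, control_key):
    """Encrypt each character with the cipher currently at the queue head.

    control_key is a sequence of small integers; the circular queue is
    rotated by each key value before the front cipher is applied.
    """
    queue = deque([caesar, vigenere, affine, multiplicative])
    out = []
    for i, ch in enumerate(plaintext.lower()):
        if ch not in ALPHA:                # pass non-letters through unchanged
            out.append(ch)
            continue
        queue.rotate(control_key[i % len(control_key)])
        out.append(queue[0](ch))           # apply the cipher at the queue head
    return "".join(out)

print(hybrid_encrypt("attack at dawn", [1, 0, 2, 3]))
```

Decryption would follow the same schedule in reverse, applying the inverse of whichever cipher the control key selected for each position.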
A database is characterized as an arrangement of data that is organized and distributed in a way that allows the client to access the stored data easily and conveniently. However, in the era of big data, traditional data-analytics methods may not be able to manage and process such large amounts of data. In order to develop an efficient way of handling big data, this work studies the use of the MapReduce technique to handle big data distributed on the cloud. The approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear enhancement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r
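To make the MapReduce idea concrete, here is a minimal mapper/reducer pair in Python that computes a per-channel mean over EEG-like records, with a small local driver simulating the map, shuffle, and reduce phases. The "channel,value" record format and the per-channel-mean task are assumptions for illustration, not the processing actually performed in the paper; on Hadoop Streaming the same mapper and reducer would read stdin and write stdout.

```python
# Minimal MapReduce-style sketch, simulated locally.
# The "channel,value" record format is an illustrative assumption.
from itertools import groupby
from operator import itemgetter

def mapper(line):
    """Emit a (channel, value) pair from one raw record."""
    channel, value = line.strip().split(",")
    yield channel, float(value)

def reducer(channel, values):
    """Aggregate all values seen for one channel into its mean."""
    values = list(values)
    return channel, sum(values) / len(values)

def run_local(lines):
    # Map phase
    pairs = [kv for line in lines for kv in mapper(line)]
    # Shuffle phase: group intermediate pairs by key
    pairs.sort(key=itemgetter(0))
    # Reduce phase
    return [reducer(k, (v for _, v in g))
            for k, g in groupby(pairs, key=itemgetter(0))]

records = ["Fp1,12.5", "Fp1,13.1", "O2,7.4", "O2,8.0"]
print(run_local(records))   # per-channel means for Fp1 and O2
```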
This paper describes the problem of online autonomous mobile robot path planning, which consists of finding optimal paths or trajectories for an autonomous mobile robot from a starting point to a destination across a flat terrain represented by a 2-D workspace. An enhanced algorithm for solving the path-planning problem using the Bacterial Foraging Optimization algorithm is presented. This nature-inspired metaheuristic algorithm, which imitates the foraging behavior of E. coli bacteria, was used to find the optimal path from a starting point to a target point. The proposed algorithm was demonstrated by simulations in different static and dynamic environments. A comparative study was conducted between the developed algori
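A simplified, chemotaxis-only sketch of Bacterial Foraging Optimization applied to 2-D path planning is shown below. The cost function (distance to the goal plus a penalty for one circular obstacle), the parameter values, and the omission of the reproduction and elimination-dispersal steps are all simplifying assumptions for illustration, not the enhanced algorithm of the paper.

```python
# Chemotaxis-only BFO sketch steering a point robot toward a goal in 2-D.
# Obstacle layout, step sizes, and the reduced algorithm are illustrative.
import numpy as np

GOAL = np.array([9.0, 9.0])
OBSTACLE, OBSTACLE_R = np.array([5.0, 5.0]), 1.5

def cost(p):
    """Distance to the goal plus a penalty for entering the obstacle."""
    penalty = 50.0 if np.linalg.norm(p - OBSTACLE) < OBSTACLE_R else 0.0
    return np.linalg.norm(p - GOAL) + penalty

def bfo_path(start, n_bacteria=20, n_chemo=60, n_swim=4, step=0.3, seed=0):
    rng = np.random.default_rng(seed)
    bacteria = np.tile(np.asarray(start, float), (n_bacteria, 1))
    path = [np.asarray(start, float)]
    for _ in range(n_chemo):
        for i in range(n_bacteria):
            direction = rng.normal(size=2)
            direction /= np.linalg.norm(direction)      # tumble: new random heading
            for _ in range(n_swim):                     # swim while cost improves
                candidate = bacteria[i] + step * direction
                if cost(candidate) < cost(bacteria[i]):
                    bacteria[i] = candidate
                else:
                    break
        best = bacteria[np.argmin([cost(b) for b in bacteria])]
        path.append(best.copy())                        # record the best position
    return np.array(path)

waypoints = bfo_path(start=[0.0, 0.0])
print(waypoints[-1])   # should end near the goal at (9, 9)
```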
Objective: the aim of this study is to investigate and determine the effect of using two packing techniques (conventional and a new tension technique) on the hardness of two types of heat-cure acrylic resin (Ivoclar and Qual-dental type).
Methodology: this study used two types of heat-cure acrylic resin (Ivoclar and Qual-dental type), which are used in the construction of complete dentures, packed with two different packing techniques (conventional and the new tension technique). A total of 40 specimens were prepared with dimensions of 2 mm thickness, 2 cm length, and 1 cm width. These specimens were sectioned and subdivided into four groups of 10 specimens each, then designated as (A, A1, B
An analytical approach based on field data was used to determine the strength capacity of large-diameter bored piles. The deformations and settlements were also evaluated for both vertical and lateral loadings. The analytical predictions were compared to field data obtained from a prototype test pile used at the Tharthar-Tigris canal bridge, and they were found to be in acceptable agreement, with a deviation of 12%.
Following ASTM standard D1143M-07e1 (2010), a test schedule of five loading cycles was proposed for vertical loads, together with a series of cyclic loads to simulate horizontal loading. The load test results and analytical data of 1.95
The catalytic activity of faujasite-type NaY catalysts prepared from local clay (kaolin) with different Si/Al ratios was studied using cumene cracking as a model for the catalytic cracking process, in the temperature range of 450-525 °C, at a weight hourly space velocity (WHSV) of 5-20 h-1, particle size ≤ 75 μm, and atmospheric pressure. The catalytic activity was investigated using an experimental laboratory-scale fluidized bed reactor.
It was found that the cumene conversion increases with increasing temperature and decreasing WHSV. At 525 °C and a WHSV of 5 h-1, the conversion was 42.36 and 35.43 mol% for the catalyst with a 3.54 Si/Al ratio and the catalyst with a 5.75 Si/Al ratio, respectively, while at 450 °C and at the same WHSV, the conversion w
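For reference, the weight hourly space velocity used above is the standard ratio of mass feed rate to catalyst mass; the definition below is quoted as general background rather than a quantity derived in this paper:

\[ \mathrm{WHSV} \;=\; \frac{\dot{m}_{\text{feed}}}{m_{\text{catalyst}}} \quad [\mathrm{h^{-1}}], \]

so that, for example, WHSV = 5 h-1 corresponds to feeding 5 g of cumene per gram of catalyst per hour.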
In this work, the elemental constituents of human teeth samples from smokers and nonsmokers were analyzed by the laser-induced breakdown spectroscopy (LIBS) method. Many elements were detected in the healthy teeth samples, the most important being Ca, P, Mg, Fe, Pb, and Na. Many differences were found between female and male teeth in their Ca, P, Mg, Na, and Pb contents. The concentrations of most toxic elements were significantly higher in the smoker group. The maximum concentrations of toxic elements such as Pb, Cd, and Co were found in older males aged above 60 years. It was also found that the minimum concentrations of trace elements such as Ca, P, and Na occur in this age group. From these results it is clear that the
To ascertain the stationarity or non-stationarity of time series, three versions of the Dickey-Fuller test were used in this paper. The aim of this study is to explain the extent of the impact of some economic variables, such as the money supply, gross domestic product, and national income, after establishing the stationarity of these variables. The results show that the money supply variable, the GDP variable, and the exchange rate variable were all stationary at the first difference of the time series, which means that these series are integrated of order one. Hence, the gross fixed capital formation variable, the national income variable, and the interest rate variable
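As an illustration of this kind of unit-root testing, the snippet below applies the augmented Dickey-Fuller test from statsmodels to a series in levels and in first differences. The simulated random-walk series and the 5% decision rule are assumptions made for the example, not the paper's data.

```python
# Augmented Dickey-Fuller test on a simulated random walk (illustrative data).
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(42)
series = np.cumsum(rng.normal(size=200))   # random walk: non-stationary in levels

for name, data in [("level", series), ("first difference", np.diff(series))]:
    stat, pvalue, *_ = adfuller(data)
    verdict = "stationary" if pvalue < 0.05 else "non-stationary"
    print(f"{name}: ADF statistic = {stat:.2f}, p-value = {pvalue:.3f} -> {verdict}")
```

A series that is non-stationary in levels but stationary after one differencing is integrated of order one, I(1), which is the behavior reported above for the money supply, GDP, and exchange rate variables.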
Both the double-differenced and zero-differenced GNSS positioning strategies have been widely used by geodesists for different geodetic applications that demand reliable and precise positions. On closer inspection of the requirements of these two GNSS positioning techniques, zero-differenced positioning, known as Precise Point Positioning (PPP), has gained special importance for three main reasons. Firstly, the effective application of PPP for geodetic purposes and precise applications depends entirely on the availability of precise satellite products, which consist of precise satellite orbital elements, precise satellite clock corrections, and Earth orientation parameters. Secondly, th
Abstract: Stars whose initial masses are between (0.89 - 8.0) M☉ go through an Asymptotic Giant Branch (AGB) phase at the end of their lives, having evolved from the main-sequence phase to the AGB. The calculations, carried out with an adopted synthetic model, showed the following results: 1- mass loss on the AGB phase consists of two regimes, one for periods P < 500 days and one for P > 500 days; 2- the mass-loss rate increases exponentially with the pulsation period; 3- the expansion velocity VAGB of our stars is calculated according to three assumptions; 4- the terminal velocity depends on several factors such as metallicity and luminosity. The calculations indicated that a superwind phase (SW) developed on the A
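For context, a widely used synthetic prescription that reproduces this two-regime behavior is that of Vassiliadis & Wood (1993); the relations below are quoted only as a standard example of an exponential period dependence followed by a radiation-pressure-limited superwind, and are not necessarily the exact expressions adopted in this paper:

\[ \log_{10}\dot{M} = -11.4 + 0.0123\,P \quad (P < 500\ \mathrm{days}), \qquad \dot{M}_{\mathrm{SW}} = \frac{L}{c\,v_{\exp}} \quad (P > 500\ \mathrm{days}), \]
\[ v_{\exp} = -13.5 + 0.056\,P \ \ \mathrm{km\,s^{-1}}, \]

with the mass-loss rate in M☉ yr⁻¹ and the pulsation period P in days.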
Big data analysis has important applications in many areas, such as sensor networks and connected healthcare. The high volume and velocity of big data bring many challenges to data analysis. One possible solution is to summarize the data and provide a manageable data structure that holds a scalable summarization of the data for efficient and effective analysis. This research extends our previous work on developing an effective technique to create, organize, access, and maintain a summarization of big data, and develops algorithms for Bayes classification and entropy discretization of large data sets using the multi-resolution data summarization structure. Bayes classification and data discretization play essential roles in many learning algorithms such a
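To illustrate what entropy discretization does in this setting, the sketch below finds a single binary cut point for a numeric feature by maximizing the information gain of the class labels. The single-split scheme and the toy data are assumptions made for the example; the paper's summarization-based algorithm is not reproduced here.

```python
# Single-split entropy discretization: choose the cut point that maximizes
# the information gain of the class labels. Toy data are illustrative.
import math
from collections import Counter

def entropy(labels):
    total = len(labels)
    return -sum((n / total) * math.log2(n / total) for n in Counter(labels).values())

def best_cut(values, labels):
    """Return the threshold on `values` giving the highest information gain."""
    pairs = sorted(zip(values, labels))
    base = entropy(labels)
    best_gain, best_threshold = 0.0, None
    for i in range(1, len(pairs)):
        left, right = pairs[:i], pairs[i:]
        threshold = (left[-1][0] + right[0][0]) / 2     # midpoint between neighbors
        split_entropy = (len(left) * entropy([l for _, l in left]) +
                         len(right) * entropy([l for _, l in right])) / len(pairs)
        gain = base - split_entropy
        if gain > best_gain:
            best_gain, best_threshold = gain, threshold
    return best_threshold, best_gain

ages   = [22, 25, 30, 35, 46, 52, 60, 64]
labels = ["no", "no", "no", "no", "yes", "yes", "yes", "yes"]
print(best_cut(ages, labels))   # cut at 40.5 separates the two classes cleanly
```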