Cryptography algorithms play a critical role in information technology, defending against the various attacks witnessed in the digital era. Many studies and algorithms have been proposed to address the security requirements of information systems. Traditional cryptography algorithms are characterized by high computational complexity; lightweight algorithms, on the other hand, resolve most of the security issues that arise when applying traditional cryptography in constrained devices, where symmetric ciphers are widely applied to secure data communication. In this study, we propose a hybrid algorithm based on two ciphers, PRESENT and Salsa20. A chaotic 2D logistic map is applied to generate pseudo-random keys, adding complexity to the proposed cipher. The goal is to enhance the complexity of the existing PRESENT algorithm while keeping the computational cost minimal. The proposed algorithm proved to work efficiently with fast execution time, and the generated key sequences passed the randomness tests of the NIST suite.
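By way of illustration, the sketch below generates key bytes from a coupled 2D logistic map. The map form (a variant common in the chaos-based encryption literature), the control parameter r, the seed values, and the quantization step are all assumptions for this sketch; the abstract does not specify the paper's own parameters.

```python
def logistic2d_keystream(x0, y0, r=1.19, n_bytes=10, burn_in=1000):
    """Generate pseudo-random key bytes from a coupled 2D logistic map:
        x' = r(3y + 1) x (1 - x),   y' = r(3x + 1) y (1 - y),
    a form often used in chaos-based encryption (illustrative here).
    """
    x, y = x0, y0

    def step(x, y):
        x = (r * (3 * y + 1) * x * (1 - x)) % 1.0  # mod keeps the orbit
        y = (r * (3 * x + 1) * y * (1 - y)) % 1.0  # in [0, 1) as a safeguard
        return x, y

    # Discard transient iterations so the orbit settles onto the attractor.
    for _ in range(burn_in):
        x, y = step(x, y)

    out = bytearray()
    while len(out) < n_bytes:
        x, y = step(x, y)
        out.append(int((x + y) * 10**6) % 256)  # quantize the state to a byte
    return bytes(out)

# Example: derive an 80-bit key for PRESENT (10 bytes); seeds are arbitrary.
print(logistic2d_keystream(0.3, 0.7).hex())
```

In the actual scheme, such a keystream would presumably feed the PRESENT and Salsa20 key material; here it only demonstrates the key-generation idea.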
In this paper, we apply error analysis using the linearization method, together with new condition numbers that constitute optimal bounds when appraising the possible errors. The analysis covers evaluations of finite continued fractions, computations of determinants of tridiagonal systems and of second-order determinants, and a "fast" complex multiplication. As with Horner's scheme, we present a rounding error analysis of product and summation algorithms. The error estimates are tested by numerical examples. The calculations were carried out in MATLAB 7 (Mathworks.com).
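As a concrete instance of such bounds, the sketch below evaluates a polynomial by Horner's scheme alongside the standard a priori rounding-error bound |fl(p(x)) − p(x)| ≤ γ₂ₙ Σ|aᵢ||x|ⁱ, with γ₂ₙ = 2nu/(1 − 2nu) and unit roundoff u = 2⁻⁵³. This is the textbook bound, not necessarily the paper's new condition numbers.

```python
def horner_with_bound(coeffs, x):
    """Evaluate p(x) = sum a_i x^i by Horner's scheme and return the
    computed value with the a priori bound
        |fl(p(x)) - p(x)| <= gamma_{2n} * sum |a_i| |x|^i.

    coeffs: [a_0, a_1, ..., a_n] in increasing powers.
    """
    u = 2.0 ** -53                # unit roundoff for IEEE double precision
    n = len(coeffs) - 1
    p = coeffs[-1]
    ptilde = abs(coeffs[-1])      # Horner on |a_i| evaluated at |x|
    for a in reversed(coeffs[:-1]):
        p = p * x + a
        ptilde = ptilde * abs(x) + abs(a)
    gamma = 2 * n * u / (1 - 2 * n * u)
    return p, gamma * ptilde

# Example: p(x) = 1 + 2x + 3x^2 at x = 0.5 evaluates to 2.75.
val, err_bound = horner_with_bound([1.0, 2.0, 3.0], 0.5)
print(val, err_bound)
```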
The expanding use of multi-processor supercomputers has made a significant impact on the speed and size of many problems. The adoption of the standard Message Passing Interface (MPI) protocol has enabled programmers to write portable and efficient codes across a wide variety of parallel architectures. Sorting is one of the most common operations performed by a computer. Because sorted data are easier to manipulate than randomly ordered data, many algorithms require sorted data. Sorting is of additional importance to parallel computing because of its close relation to the task of routing data among processes, which is an essential part of many parallel algorithms. In this paper, sequential sorting algorithms and the parallel implementation of many …
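As a minimal illustration of MPI-based sorting (not the paper's specific algorithms), the mpi4py sketch below has each process sort a local chunk, after which rank 0 gathers the chunks and k-way merges them; data sizes are arbitrary.

```python
# Run with e.g.: mpiexec -n 4 python parallel_sort.py
from heapq import merge
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

# Each process generates and sorts its own chunk of the data.
rng = np.random.default_rng(seed=rank)
local = np.sort(rng.integers(0, 10_000, size=1_000))

# Gather the sorted chunks at rank 0 and k-way merge them.
chunks = comm.gather(local, root=0)
if rank == 0:
    result = list(merge(*chunks))
    print("sorted:", result[:5], "...", result[-5:])
```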
Recurrent strokes can be devastating, often resulting in severe disability or death. However, nearly 90% of the causes of recurrent stroke are modifiable, which means recurrent strokes can be averted by controlling risk factors that are mainly behavioral and metabolic in nature. Previous work thus suggests that a recurrent-stroke prediction model could help minimize the possibility of recurrent stroke. Previous works have shown promising results in predicting first-time stroke cases with machine learning approaches; however, there are limited works on recurrent stroke prediction using machine learning methods. Hence, this work is proposed to perform an empirical analysis and to investigate machine learning algorithms …
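The kind of empirical comparison described might look like the scikit-learn sketch below; the synthetic, class-imbalanced data stands in for clinical features, and the chosen models and F1 metric are illustrative assumptions, not the paper's setup.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Placeholder data standing in for behavioral/metabolic risk factors;
# the real study would use clinical records of stroke patients.
X, y = make_classification(n_samples=500, n_features=10, weights=[0.8],
                           random_state=0)

models = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(random_state=0),
    "svm": SVC(),
}
for name, model in models.items():
    # 5-fold cross-validated F1 handles the class imbalance better
    # than plain accuracy.
    scores = cross_val_score(model, X, y, cv=5, scoring="f1")
    print(f"{name}: mean F1 = {scores.mean():.3f}")
```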
The present work aims to improve forward-osmosis flux with a thin-film composite (TFC) membrane by reducing the effect of polarization on the draw solution (brine solution) side. The study was conducted in two parts. In the first, the flux and the water permeability coefficient (A) were calculated under the effect of polarization. In the second, the experiments were repeated using a circulating pump at various speeds to create turbulence and reduce the polarization effect on the brine solution side.
A model capable of predicting the water permeability coefficient has been derived; it is given by the following equation:
Z = Z₀ + (C·R·T / 9.8)(d²/D² + 1) · exp[−9.8(d…
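The derived model itself is truncated above, but the standard way A is estimated in forward osmosis follows the basic flux relation J_w = A·Δπ, with the van 't Hoff osmotic pressure π = i·C·R·T (consistent with the C·R·T term in the model). The sketch below uses illustrative values, not the paper's data.

```python
R = 8.314        # J/(mol*K), gas constant
T = 298.15       # K, assumed operating temperature
C_draw = 1000.0  # mol/m^3 (1 M brine), illustrative
C_feed = 0.0     # mol/m^3, pure-water feed

# van 't Hoff osmotic pressure difference (ideal dilute-solution
# assumption); the factor 2 is the van 't Hoff factor for NaCl.
delta_pi = 2 * R * T * (C_draw - C_feed)

J_w = 5.0e-6     # m/s, measured water flux (illustrative)

# Water permeability coefficient from the basic FO flux relation
# J_w = A * delta_pi (no applied hydraulic pressure in FO mode).
A = J_w / delta_pi
print(f"A = {A:.3e} m/(s*Pa)")
```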
The development of low-profile gamma-ray detectors has encouraged the production of small field of view (SFOV) hand-held imaging devices for use at the patient bedside and in operating theatres. Early development of these SFOV cameras focussed on a single modality, gamma-ray imaging. Recently, a hybrid system combining gamma and optical imaging has been developed. This combination of optical and gamma cameras enables high-spatial-resolution multi-modal imaging, giving a superimposed scintigraphic and optical image. Hybrid imaging offers new possibilities for assisting clinicians and surgeons in localising the site of uptake in procedures such as sentinel node detection. The hybrid camera concept can be extended to a multimodal detec…
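How such a superimposed image can be produced is sketched below with plain NumPy: a gamma count map, assumed already registered and resampled to the optical grid, is alpha-blended onto the optical image. The array shapes, uptake threshold, and red-channel colour mapping are illustrative, not the device's actual processing.

```python
import numpy as np

def overlay(optical_rgb, gamma_counts, alpha=0.5):
    """Alpha-blend a gamma-camera count map onto an optical RGB image.

    optical_rgb: (H, W, 3) float array in [0, 1].
    gamma_counts: (H, W) array already resampled to the optical grid.
    """
    # Normalize counts and map them to a red "heat" channel.
    g = gamma_counts.astype(float)
    g = (g - g.min()) / (np.ptp(g) + 1e-12)
    heat = np.zeros_like(optical_rgb)
    heat[..., 0] = g  # red channel encodes activity
    # Blend only where there is significant uptake.
    mask = (g > 0.1)[..., None]
    return np.where(mask,
                    (1 - alpha) * optical_rgb + alpha * heat,
                    optical_rgb)

# Example with synthetic data: a hot spot at the image centre.
opt = np.full((64, 64, 3), 0.6)
counts = np.zeros((64, 64))
counts[28:36, 28:36] = 100.0
fused = overlay(opt, counts)
```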
In this paper, three techniques for image compression are implemented: a three-dimensional (3-D) two-level discrete wavelet transform (DWT), a 3-D two-level discrete multi-wavelet transform (DMWT), and a 3-D two-level hybrid (wavelet-multiwavelet) transform. Daubechies and Haar wavelets are used in the discrete wavelet transform, and critically sampled preprocessing is used in the discrete multi-wavelet transform. The aim is to increase the compression ratio (CR) as the transform level increases in the 3-D case, so the compression ratio is measured at each level. To obtain good compression, image data properties were measured, such as image entropy (He) and percent root-…
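A minimal sketch of one of the three techniques, the two-level 3-D DWT, using PyWavelets on a synthetic volume; the entropy and compression-ratio estimates below are simple stand-ins for the paper's measurements, and the coefficient-thresholding rule is an assumption.

```python
import numpy as np
import pywt

# Synthetic 3-D volume standing in for the image data (illustrative).
rng = np.random.default_rng(0)
volume = rng.normal(size=(32, 32, 32))

def shannon_entropy(x, bins=256):
    """Image entropy He = -sum p log2 p over the histogram of x."""
    hist, _ = np.histogram(x, bins=bins)
    p = hist[hist > 0] / hist.sum()
    return -np.sum(p * np.log2(p))

# Two-level 3-D DWT with a Daubechies wavelet (Haar would be "haar").
coeffs = pywt.wavedecn(volume, wavelet="db2", level=2)
flat, _ = pywt.coeffs_to_array(coeffs)

# Crude compression-ratio estimate: discard coefficients below a
# threshold and compare nonzero counts before and after.
thresh = 0.5 * np.std(flat)
kept = np.count_nonzero(np.abs(flat) > thresh)
cr = flat.size / kept
print(f"He = {shannon_entropy(volume):.2f} bits, CR ~ {cr:.1f}")
```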