Advances in Information and Communication Technology (ICT) over the past decades have significantly changed how people transmit and store information over the Internet and other networks, so one of the main challenges is to keep that information safe from attack. Many researchers and institutions have recognized the importance and benefits of cryptography in achieving efficient and effective secure communication. This work adopts a novel chaos-based technique for a secure data cryptosystem. The proposed algorithm generates a 2-dimensional key matrix with the same dimensions as the original image, filled with random numbers obtained from the 1-dimensional logistic chaotic map for given control parameters; the fractional parts of these numbers are then converted, through a function, into a set of non-repeating numbers, which yields a vast number of unpredictable possibilities (the factorial of rows times columns). Double layers of row and column permutations are applied to these values for a specified number of stages. Finally, an XOR is performed between the key matrix and the original image, providing an effective means of encrypting any type of file (text, image, audio, video, etc.). Tested on more than 500 image samples, the proposed encryption technique proved very promising according to standard security measures: the histograms of the cipher images are much flatter than those of the original images, the average Mean Square Error is very high (10115.4), the average Peak Signal-to-Noise Ratio is very low (8.17), the correlation is near zero, and the entropy is close to 8 (7.9975).
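The core idea, chaotic values from the logistic map turned into key bytes that are XORed with the data, can be sketched minimally as follows. The control parameter r and seed x0 below are illustrative assumptions, not the paper's secret parameters, and the permutation stages are omitted for brevity.

```python
def logistic_keystream(length, r=3.99, x0=0.6180339887):
    """Generate `length` pseudo-random key bytes from the 1-D logistic map."""
    x = x0
    key = []
    for _ in range(length):
        x = r * x * (1.0 - x)            # logistic map iteration
        key.append(int(x * 256) % 256)   # scale the fractional part to a byte
    return key

def xor_cipher(data: bytes, r=3.99, x0=0.6180339887) -> bytes:
    """XOR data with the chaotic keystream; calling it again decrypts."""
    key = logistic_keystream(len(data), r, x0)
    return bytes(b ^ k for b, k in zip(data, key))

plain = b"secret message"
cipher = xor_cipher(plain)
assert xor_cipher(cipher) == plain       # XOR is its own inverse
```

Because XOR is self-inverse, encryption and decryption share one function, which is why the paper can apply the same key matrix to any file type.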
Heart disease is a significant and impactful health condition that ranks as the leading cause of death in many countries. Clinical datasets are available to aid physicians in diagnosing cardiovascular diseases. However, with the rise of big data and large medical datasets, accurately predicting heart disease has become increasingly challenging for medical practitioners, because an abundance of irrelevant and redundant features increases computational complexity and degrades accuracy. This study therefore aims to identify the most discriminative features within high-dimensional datasets, minimizing complexity and improving accuracy, through an Extra Tree based feature selection technique. The study assesses the efficacy …
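The abstract does not detail the Extra Tree selector itself; as a rough, stdlib-only illustration of the general idea of importance-based filtering, the sketch below ranks categorical features by information gain and keeps the top k. This is a simple filter method, not the Extra Tree algorithm, and the toy data is invented.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label list, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(feature_col, labels):
    """Entropy reduction from splitting the labels on one feature."""
    gain, n = entropy(labels), len(labels)
    for v in set(feature_col):
        subset = [y for x, y in zip(feature_col, labels) if x == v]
        gain -= (len(subset) / n) * entropy(subset)
    return gain

def select_top_k(X, y, k):
    """Return the indices of the k most discriminative features."""
    gains = [information_gain([row[j] for row in X], y) for j in range(len(X[0]))]
    return sorted(range(len(gains)), key=lambda j: gains[j], reverse=True)[:k]

# Toy data: feature 0 perfectly predicts y, feature 1 is pure noise.
X = [[0, 1], [0, 0], [1, 1], [1, 0]]
y = [0, 0, 1, 1]
print(select_top_k(X, y, 1))  # → [0]
```

Tree-ensemble selectors such as Extra Trees refine this idea by averaging impurity reductions over many randomized trees instead of scoring each feature in isolation.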
In data mining, classification is a form of data analysis that can be used to extract models describing important data classes. Two well-known algorithms used in data mining classification are the Backpropagation Neural Network (BNN) and Naïve Bayes (NB). This paper investigates the performance of these two classification methods on the Car Evaluation dataset. A model was built for each algorithm and the results were compared. Our experimental results indicate that the BNN classifier yields higher accuracy than the NB classifier, but it is less efficient, being time-consuming and difficult to analyze due to its black-box nature.
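The NB side of the comparison is simple enough to sketch in a few lines: estimate class priors and Laplace-smoothed conditional frequencies, then pick the class with the highest posterior. The toy rows below merely stand in for the Car Evaluation attributes, which the abstract does not reproduce.

```python
from collections import Counter, defaultdict

def train_nb(X, y):
    """Class priors plus per-(feature, class) value counts."""
    priors = Counter(y)
    cond = defaultdict(Counter)
    for row, label in zip(X, y):
        for j, v in enumerate(row):
            cond[(j, label)][v] += 1
    return priors, cond

def predict_nb(priors, cond, row):
    """Return the class maximizing prior × smoothed likelihoods."""
    n = sum(priors.values())
    best, best_p = None, -1.0
    for label, count in priors.items():
        p = count / n
        for j, v in enumerate(row):
            c = cond[(j, label)]
            p *= (c[v] + 1) / (sum(c.values()) + len(c) + 1)  # Laplace smoothing
        if p > best_p:
            best, best_p = label, p
    return best

# Toy stand-in for Car Evaluation style categorical records.
X = [["high", "2"], ["high", "4"], ["low", "2"], ["low", "4"]]
y = ["unacc", "acc", "unacc", "acc"]
priors, cond = train_nb(X, y)
print(predict_nb(priors, cond, ["low", "4"]))  # → acc
```

The contrast with BNN noted in the abstract follows directly: NB training is a single counting pass, whereas backpropagation requires many iterative weight updates.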
Enhancing the quality of image fusion was proposed using new auto-focus image fusion algorithms. The first algorithm combines two images based on the standard deviation. The second algorithm uses the contrast at edge points and a correlation method as the criteria for the quality of the resulting image. This algorithm considers three blocks of different sizes in a homogeneous region and moves each block 10 pixels within the same region; it examines the statistical properties of each block and automatically decides the next step. The resulting combined image is better in contrast …
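A heavily simplified version of the first algorithm's standard-deviation criterion can be sketched as follows: for each block, keep the pixels from whichever source image has the higher local standard deviation, i.e., is more in focus there. The block size and the tiny test images are illustrative assumptions, not the paper's settings.

```python
import math

def block_std(img, r0, c0, size):
    """Standard deviation of one size x size block of a 2-D list image."""
    vals = [img[r][c] for r in range(r0, r0 + size) for c in range(c0, c0 + size)]
    mean = sum(vals) / len(vals)
    return math.sqrt(sum((v - mean) ** 2 for v in vals) / len(vals))

def fuse_by_std(img_a, img_b, size=2):
    """Per-block fusion: copy each block from the sharper (higher-std) image."""
    rows, cols = len(img_a), len(img_a[0])
    out = [[0] * cols for _ in range(rows)]
    for r0 in range(0, rows, size):
        for c0 in range(0, cols, size):
            src = img_a if block_std(img_a, r0, c0, size) >= block_std(img_b, r0, c0, size) else img_b
            for r in range(r0, r0 + size):
                for c in range(c0, c0 + size):
                    out[r][c] = src[r][c]
    return out

# A is sharp (high variance) on the left, B is sharp on the right.
A = [[0, 255, 10, 10], [255, 0, 10, 10], [0, 255, 10, 10], [255, 0, 10, 10]]
B = [[10, 10, 0, 255], [10, 10, 255, 0], [10, 10, 0, 255], [10, 10, 255, 0]]
fused = fuse_by_std(A, B)   # left half from A, right half from B
```

The sketch assumes image dimensions divisible by the block size; the paper's second algorithm additionally adapts block sizes and positions rather than using a fixed grid.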
Background: Glass ionomer restorations are widely employed in pediatric dentistry, and there is a constant demand for a durable restoration that remains functional until exfoliation. This study aimed to measure and compare the effect of a novel coating material (EQUIA Forte Coat) on the microleakage of a glass hybrid restoration (EQUIA Forte HT) in primary teeth. Materials and methods: Thirty cavitated (class II) primary molars were randomly allocated into two groups based on coat application: an uncoated (control) group and a coated (experimental) group. Cavities were prepared using a ceramic bur (CeraBur) and restored with EQUIA Forte HT with or without application of the protective coat (EQUIA Forte Coat). The samples then went through the therm…
In this work, a novel technique for obtaining accurate solutions to nonlinear problems, the multi-step combined Laplace-variational iteration method (MSLVIM), is introduced. Compared with the traditional variational iteration approach, it overcomes the usual difficulties and provides more accurate solutions with an extended convergence region, covering larger intervals and giving a continuous representation of the approximate analytic solution with better information about the solution over the whole time interval. The technique also makes it easier to obtain the general Lagrange multiplier, reducing time and calculations. It converges rapidly to the exact solution with simply computable terms …
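The abstract does not reproduce the method's formulas; for orientation, the standard correction functional of the variational iteration method, on which MSLVIM builds, can be written (in the usual notation, assumed here rather than quoted from this work) as:

```latex
u_{n+1}(t) = u_n(t) + \int_0^{t} \lambda(s)\,\bigl[\mathcal{L}u_n(s) + \mathcal{N}\tilde{u}_n(s) - g(s)\bigr]\,ds
```

where \mathcal{L} and \mathcal{N} are the linear and nonlinear operators of the equation \mathcal{L}u + \mathcal{N}u = g, \lambda(s) is the general Lagrange multiplier, and \tilde{u}_n denotes a restricted variation. A multi-step Laplace variant of this scheme applies the correction over successive subintervals, with the multiplier identified via the Laplace transform.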
ABSTRACT
In this paper, some semi-parametric spatial models are estimated: the semi-parametric spatial error model (SPSEM), which suffers from spatial error dependence, and the semi-parametric spatial autoregressive model (SPSAR). The maximum likelihood method was used to estimate the spatial error parameter (λ) in the SPSEM model and the spatial dependence parameter (ρ) in the SPSAR model, while non-parametric methods were used to estimate the smoothing function m(x) for both models; these non-parametric methods include the local linear estimator (LLE), which requires finding the smoothing …
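For orientation, in standard spatial-econometric notation (assumed here, since the abstract does not write the models out), the two semi-parametric specifications typically take the forms:

```latex
\text{SPSEM:}\quad y = m(x) + u, \qquad u = \lambda W u + \varepsilon
\qquad\qquad
\text{SPSAR:}\quad y = \rho W y + m(x) + \varepsilon
```

where W is the spatial weights matrix, \lambda and \rho are the spatial error and spatial dependence parameters estimated by maximum likelihood, and m(\cdot) is the smoothing function estimated non-parametrically, e.g., by the local linear estimator.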
Continuous turbidimetric analysis (CTA) for a distinctive analytical application, employing a homemade analyser (NAG Dual & Solo 0-180°) that contains two consecutive detection zones (measuring cells 1 and 2), is described. The analyser works with light-emitting diodes as the light source and a set of solar cells as the light detector for turbidity measurements, without needing additional fibres or lenses. The developed method is based on the formation of a turbid, yellow-coloured precipitated product from the reaction between warfarin and the precipitating reagent (potassium dichromate). The CTA method was applied to determine warfarin in pure form and in pharmaceutical …