One of the most difficult issues in the history of communication technology is the secure transmission of images. On the internet, photos are used and shared by millions of individuals for both private and business reasons. One way to achieve safe image transfer over the network is to use encryption methods that change the original image into an unintelligible or scrambled version. Cryptographic approaches based on the chaotic logistic map provide several new and promising options for developing secure image encryption methods. The main aim of this paper is to build a secure system for encrypting gray and color images. The proposed system consists of two stages: the first stage is the encryption process, in which the keys are generated from the chaotic logistic map together with the image density to encrypt the gray and color images; the second stage is the decryption, which reverses the encryption process to recover the original image. The proposed method has been tested on two publicly available standard gray and color images. The test results show that the highest values of peak signal-to-noise ratio (PSNR), unified average changing intensity (UACI), and number of pixel change rate (NPCR) are 7.7268, 50.2011, and 100, respectively, while encryption and decryption take up to 0.6319 and 0.5305 seconds, respectively.
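The abstract does not give the authors' exact key-generation scheme, so the following is only a minimal sketch of the general idea: iterating the logistic map to produce a keystream and XOR-ing it with the pixel bytes. The parameters `x0` and `r` and the byte quantization are illustrative assumptions, not the paper's values.

```python
import numpy as np

def logistic_keystream(length, x0=0.3141, r=3.9999):
    """Byte stream from the chaotic logistic map x_{n+1} = r*x_n*(1-x_n);
    r close to 4 keeps the map in its chaotic regime (assumed values)."""
    x = x0
    stream = np.empty(length, dtype=np.uint8)
    for i in range(length):
        x = r * x * (1.0 - x)
        stream[i] = int(x * 256) % 256  # quantize the chaotic state to a byte
    return stream

def xor_cipher(image, key):
    """Encrypt (or decrypt -- XOR is its own inverse) a uint8 image."""
    flat = image.reshape(-1)
    return (flat ^ key[:flat.size]).reshape(image.shape)

img = np.random.randint(0, 256, (64, 64), dtype=np.uint8)  # stand-in image
key = logistic_keystream(img.size)
assert np.array_equal(img, xor_cipher(xor_cipher(img, key), key))  # round trip
```

Because XOR is self-inverse, the same keystream implements both stages; a scheme like the paper's would additionally mix an image-dependent quantity (the "image density") into the map's seed.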
In this research, we study multiple linear regression models with two variables in the presence of autocorrelation among the error-term observations, where the errors follow the general logistic distribution. The autoregressive model is used in studying and analyzing the relationship between the variables, and through this relationship forecasts of the variables' values are obtained. A simulation technique is used to compare the methods depending
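As a hedged illustration of the setting described above (the estimation methods being compared are not listed in this excerpt), the sketch below simulates a two-regressor linear model whose errors follow an AR(1) process with logistic-distributed innovations; all parameter values are made up for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_autocorrelated_logistic(n, beta=(1.0, 2.0, -0.5), rho=0.6, scale=1.0):
    """Simulate y = b0 + b1*x1 + b2*x2 + u, where the errors follow
    u_t = rho * u_{t-1} + e_t with logistic-distributed innovations e_t."""
    x1 = rng.normal(size=n)
    x2 = rng.normal(size=n)
    e = rng.logistic(loc=0.0, scale=scale, size=n)
    u = np.empty(n)
    u[0] = e[0]
    for t in range(1, n):
        u[t] = rho * u[t - 1] + e[t]
    y = beta[0] + beta[1] * x1 + beta[2] * x2 + u
    return y, np.column_stack([np.ones(n), x1, x2])

y, X = simulate_autocorrelated_logistic(200)
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]  # naive OLS fit for comparison
print(beta_ols)
```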
Uncompressed digital images need a very large amount of storage capacity and consequently require large communication bandwidth for data transmission over the network. Image compression techniques not only minimize the image storage space but also preserve image quality. This paper presents an image compression technique that uses a distinct image coding scheme based on the wavelet transform, combining effective compression algorithms for further compression. EZW and SPIHT are significant techniques available for lossy image compression. EZW coding is a worthwhile, simple, and efficient algorithm. SPIHT is a most powerful technique used for image
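EZW and SPIHT share the same front end: a multi-level 2-D wavelet decomposition followed by significance tests on coefficient magnitudes. The sketch below, using the PyWavelets library, shows only that shared front end with a simple hard threshold; the wavelet name, level count, and keep fraction are illustrative, and real EZW/SPIHT coders replace the one-shot threshold with bit-plane zerotree coding.

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_threshold_compress(image, wavelet="bior4.4", levels=3, keep=0.05):
    """Multi-level 2-D wavelet transform followed by hard thresholding:
    only the largest `keep` fraction of coefficients is retained, which
    mimics the significance test at the heart of EZW/SPIHT coders."""
    coeffs = pywt.wavedec2(image, wavelet, level=levels)
    arr, slices = pywt.coeffs_to_array(coeffs)
    thresh = np.quantile(np.abs(arr), 1.0 - keep)  # keep top 5% by magnitude
    arr[np.abs(arr) < thresh] = 0.0
    coeffs_q = pywt.array_to_coeffs(arr, slices, output_format="wavedec2")
    return pywt.waverec2(coeffs_q, wavelet)

img = np.random.rand(128, 128)  # stand-in for a real grayscale image
recon = wavelet_threshold_compress(img)
```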
A new blind restoration algorithm is presented and shows high-quality restoration. This is achieved by enforcing a Wiener filtering approach in the Fourier domains of both the image and the point spread function (PSF).
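The abstract does not detail the blind PSF-estimation step, so the sketch below shows only the non-blind core it builds on: classical Wiener deconvolution in the Fourier domain. The noise-to-signal ratio `k` and the box PSF are illustrative assumptions.

```python
import numpy as np

def wiener_deconvolve(blurred, psf, k=0.01):
    """Classical Wiener deconvolution in the Fourier domain:
    X = conj(H) / (|H|^2 + k) * Y, where k approximates the
    noise-to-signal power ratio."""
    H = np.fft.fft2(psf, s=blurred.shape)
    Y = np.fft.fft2(blurred)
    X = np.conj(H) / (np.abs(H) ** 2 + k) * Y
    return np.real(np.fft.ifft2(X))

# toy example: blur with a small box PSF, then restore
img = np.random.rand(64, 64)
psf = np.ones((5, 5)) / 25.0
blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(psf, s=img.shape)))
restored = wiener_deconvolve(blurred, psf)
```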
Researchers have used methods such as image processing and machine learning techniques, in addition to medical instruments such as the Placido disc, keratoscopy, and Pentacam, to help diagnose a variety of diseases that affect the eye. This paper aims to detect one of the diseases that affect the cornea, Keratoconus, using image processing techniques and pattern classification methods. The Pentacam is the device used to assess the cornea's health; it provides four maps that can distinguish changes on the surface of the cornea, which can be used for Keratoconus detection. In this study, sixteen features were extracted from the four refractive maps along with five readings from the Pentacam software. The
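The excerpt stops before naming the classifier, so the pipeline below is only a generic stand-in for the pattern-classification step: 21 features per eye (the 16 map features plus the 5 Pentacam readings) fed to a cross-validated SVM. The classifier choice and the random placeholder data are assumptions.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# X: one row per eye, 21 columns (16 map features + 5 Pentacam readings);
# y: 1 = Keratoconus, 0 = healthy. Random data stands in for the real set.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 21))
y = rng.integers(0, 2, size=100)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_score(clf, X, y, cv=5)  # 5-fold cross-validated accuracy
print(scores.mean())
```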
The purpose of this paper is to study the properties of the partial level density $g_l(\varepsilon)$ and the total level density $g(\varepsilon)$, numerically obtained as a sum of $g_l(\varepsilon)$ over $l$ up to $l_{\max} = 34$, for a harmonic-oscillator potential well. The method applies the quantum-mechanical phase-shift technique and concentrates on the continuum region. Peculiarities of the quantal calculation of the single-particle level density for an energy-dependent potential are also discussed.
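Restating the quantities in standard notation: the total level density is the partial-wave sum quoted above, and in the continuum each partial term is conventionally obtained from the scattering phase shift. The second line is the textbook phase-shift expression (Beth's formula) and is an assumption about the paper's method, not a quotation from it.

```latex
% Total single-particle level density as a partial-wave sum:
g(\varepsilon) = \sum_{l=0}^{l_{\max}} g_l(\varepsilon), \qquad l_{\max} = 34 .
% Continuum contribution of each partial wave from the phase shift
% \delta_l(\varepsilon) (Beth's formula; 2l+1 is the m-degeneracy):
g_l(\varepsilon) = \frac{2l+1}{\pi}\,\frac{d\delta_l(\varepsilon)}{d\varepsilon} .
```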
Variable selection is an essential and necessary task in the field of statistical modeling. Several studies have tried to develop and standardize the process of variable selection, but it is difficult to do so. The first question researchers need to ask themselves is: what are the most significant variables that should be used to describe a given dataset's response? In this paper, a new method for variable selection using Gibbs sampler techniques has been developed. First, the model is defined, and the posterior distributions for all the parameters are derived. The new variable selection method is tested using four simulation datasets. The new approach is compared with some existing techniques: Ordinary Least Squares (OLS), Least Absolute Shrinkage
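The excerpt does not specify the paper's prior or the form of its Gibbs updates, so the following is a minimal sketch of one standard construction: Gibbs sampling over variable-inclusion indicators under Zellner's g-prior with a uniform prior over models. The value of g, the sampler length, and the simulated data are all illustrative assumptions.

```python
import numpy as np
from numpy.linalg import lstsq

rng = np.random.default_rng(2)

def log_marginal(y, X, gamma, g=100.0):
    """Log marginal likelihood of the model indexed by the boolean
    inclusion vector `gamma`, under Zellner's g-prior."""
    n = len(y)
    yty = y @ y
    if gamma.any():
        Xg = X[:, gamma]
        beta_hat = lstsq(Xg, y, rcond=None)[0]
        ssr = y @ Xg @ beta_hat           # y' X (X'X)^{-1} X' y
        rss = yty - (g / (1 + g)) * ssr
        p = gamma.sum()
    else:
        rss, p = yty, 0
    return -0.5 * p * np.log(1 + g) - 0.5 * n * np.log(rss)

def gibbs_select(y, X, n_iter=1000):
    """Gibbs sampler over the inclusion indicators: each sweep resamples
    every indicator from its full conditional."""
    k = X.shape[1]
    gamma = np.zeros(k, dtype=bool)
    counts = np.zeros(k)
    for _ in range(n_iter):
        for j in range(k):
            g1, g0 = gamma.copy(), gamma.copy()
            g1[j], g0[j] = True, False
            l1, l0 = log_marginal(y, X, g1), log_marginal(y, X, g0)
            p1 = 1.0 / (1.0 + np.exp(np.clip(l0 - l1, -50, 50)))
            gamma[j] = rng.random() < p1
        counts += gamma
    return counts / n_iter  # posterior inclusion probabilities

# simulated data: only the first two of six predictors matter
X = rng.normal(size=(120, 6))
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(size=120)
print(gibbs_select(y, X).round(2))
```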
Background: Mini-implant stability is primarily related to local bone density; no studies have evaluated bone density related to mini-implant placement for orthodontic anchorage between different age groups in the maxilla and the mandible. The present research aims to evaluate side, gender, age, and regional differences in the density of the alveolar bone at various orthodontic implant sites. Materials and method: Fifty-three individuals, divided into two age groups (group I, ages 16-20 years, and group II, ages 21-29 years), were subjected to clinical examination; then 64-multislice computed tomography scan data were evaluated and bone density was measured in Hounsfield units at 102 points (51 in the maxilla
The ground-state properties of the exotic nuclei 18N and 20F, including the neutron, proton, and matter densities and the related radii, are investigated using a two-body model within Gaussian (GS) and Woods-Saxon (WS) wave functions. The long tail is evident in the computed neutron and matter densities of these nuclei. The plane-wave Born approximation (PWBA) is used to calculate the elastic form factors of these exotic nuclei. The variation in the proton density distributions due to the presence of the extra neutrons in 18N and 20F leads to a major difference between the elastic form factors of these exotic nuclei and those of their stable isotopes 14N and 19F. The reaction c
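For reference, the PWBA elastic (charge) form factor mentioned above is the standard Fourier-Bessel transform of the ground-state charge density; this is the textbook expression, included here as context rather than the paper's exact formula.

```latex
% PWBA elastic charge form factor as the Fourier-Bessel transform
% of the ground-state charge density, normalized so that F(0) = 1:
F(q) = \frac{4\pi}{Z}\int_0^{\infty} \rho_{ch}(r)\, j_0(qr)\, r^2\, dr ,
\qquad j_0(x) = \frac{\sin x}{x} .
```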
Audio classification is the process of classifying different audio types according to content. It is applied in a large variety of real-world problems; all classification applications allow the target subjects to be viewed as a specific type of audio, and hence there is a variety of audio types, each of which has to be treated carefully according to its significant properties. Feature extraction is an important process for audio classification. This work introduces several sets of features according to the type; two types of audio (datasets) were studied. Two different feature sets are proposed: (i) a first-order gradient feature vector, and (ii) a local roughness feature vector. The experiments showed that the results are competitive to
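The excerpt does not define the two proposed feature vectors, so the sketch below is only one plausible reading of the first, the first-order gradient feature vector: frame the waveform, take first differences, and summarize each frame with simple statistics. The frame length and the chosen statistics are assumptions.

```python
import numpy as np

def gradient_features(signal, frame=1024):
    """One plausible reading of a 'first-order gradient feature vector':
    frame the signal, take first differences within each frame, and
    summarize them with simple per-frame statistics."""
    n_frames = len(signal) // frame
    feats = []
    for i in range(n_frames):
        x = signal[i * frame:(i + 1) * frame]
        g = np.diff(x)  # first-order gradient of the frame
        feats.append([g.mean(), g.std(), np.abs(g).mean(), g.max() - g.min()])
    return np.asarray(feats)

sig = np.random.randn(16000)  # stand-in for one second of 16 kHz audio
print(gradient_features(sig).shape)
```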