Denoising a natural image corrupted by Gaussian noise is a classical problem in signal and image processing. Much work has been done on wavelet thresholding, but most of it has focused on statistical modeling of wavelet coefficients and the optimal choice of thresholds. This paper describes a new method for suppressing noise in images by fusing the stationary wavelet denoising technique with an adaptive Wiener filter. The Wiener filter is applied to the approximation coefficients of the transform only, while the thresholding technique is applied to the detail coefficients; the final denoised image is obtained by combining the two results. The proposed method was implemented in MATLAB R2010a on color images contaminated by white Gaussian noise. Compared with the stationary wavelet and Wiener filter algorithms alone, the experimental results show that the proposed method provides better subjective and objective quality, achieving up to 3.5 dB of PSNR improvement.
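The abstract gives no implementation details, so the following is only a minimal sketch of the fusion idea: Wiener-filter the approximation band and soft-threshold the detail bands of an undecimated (stationary) transform, then recombine. A one-level undecimated Haar transform, a 3x3 Wiener window, and the threshold value are all assumptions here, not the paper's settings.

```python
import numpy as np
from scipy.signal import wiener

def haar_swt_denoise(noisy, threshold):
    """One-level undecimated Haar transform: Wiener-filter the
    approximation band, soft-threshold the detail bands, recombine."""
    x1 = np.roll(noisy, -1, axis=1)        # shift columns
    x2 = np.roll(noisy, -1, axis=0)        # shift rows
    x3 = np.roll(x1, -1, axis=0)           # shift both
    cA = (noisy + x1 + x2 + x3) / 4.0      # approximation band
    cH = (noisy - x1 + x2 - x3) / 4.0      # horizontal detail
    cV = (noisy + x1 - x2 - x3) / 4.0      # vertical detail
    cD = (noisy - x1 - x2 + x3) / 4.0      # diagonal detail
    soft = lambda c: np.sign(c) * np.maximum(np.abs(c) - threshold, 0.0)
    # adaptive Wiener filter on the approximation, thresholding on the details;
    # the four bands sum back to the input, so this is a valid reconstruction
    return wiener(cA, mysize=3) + soft(cH) + soft(cV) + soft(cD)
```

Because cA + cH + cV + cD reproduces the input exactly, the only changes to the image come from the Wiener filtering and the thresholding, mirroring the split described in the abstract.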
Background This study aimed to evaluate the efficacy of once-daily liraglutide as an add-on to oral antidiabetics (OADs) on glycemic control and body weight in obese patients with inadequately controlled type 2 diabetes (T2D). Methods A total of 27 obese T2D patients who received 7 months of liraglutide treatment (0.6 mg/day for the first month, 1.2 mg/day for 3 months, and 1.8 mg/day for 3 months) as an add-on to OADs were included. Data on body weight (kg), fasting plasma glucose (FPG, mg/dL), postprandial glucose (PPG, mg/dL), and HbA1c (%) were recorded. Results Liraglutide doses of 1.2 mg/day and 1.8 mg/day were associated with significant decreases in body weight (by 8.0% and 11.9%, respectively, p < 0.01 for each) and HbA1c (by 20.0
Restoration is a central process in many applications. Recovering an original image from a degraded one is the foundation of the restoration operation, whether blind or non-blind. One of the main challenges in the restoration process is estimating the degradation parameters, which include the blurring function (point spread function, PSF) and the noise function. The most common causes of image degradation are errors in transmission channels, defects in the optical system, an inhomogeneous medium, relative motion between object and camera, etc. In our research, a novel algorithm based on the circular Hough transform was adopted to estimate the width (radius, sigma) of the point spread function. This algorithm is based o
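The abstract does not specify how the circular Hough transform is applied to the PSF, so here is a generic sketch of the underlying voting scheme: every edge pixel votes for the circle centres it could lie on at each candidate radius, and the radius whose accumulator peaks highest is taken as the estimate. The image size, radius range, and angular sampling below are illustrative choices.

```python
import numpy as np

def estimate_radius(edges, radii, n_angles=72):
    """Circular Hough voting: return the candidate radius whose
    centre-accumulator attains the highest peak."""
    h, w = edges.shape
    thetas = np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False)
    ys, xs = np.nonzero(edges)
    scores = []
    for r in radii:
        acc = np.zeros((h, w))
        # each edge pixel votes for the centres lying at distance r from it
        cy = np.rint(ys[:, None] - r * np.sin(thetas)).astype(int)
        cx = np.rint(xs[:, None] - r * np.cos(thetas)).astype(int)
        ok = (cy >= 0) & (cy < h) & (cx >= 0) & (cx < w)
        np.add.at(acc, (cy[ok], cx[ok]), 1)
        scores.append(acc.max())
    return radii[int(np.argmax(scores))]
```

At the true radius all votes from the circle's edge pixels coincide at the true centre, so its accumulator peak dominates the neighbouring radii.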
In this article, we aim to define a universal set consisting of the subscripts of the fuzzy differential equation (5), except the two elements and ; subsets of that universal set are then defined according to certain conditions. We use the constructed universal set, together with its subsets, to suggest an analytical method that facilitates solving fuzzy initial value problems of any order using strongly generalized H-differentiability. Valid sets, with graphs, for solutions of fuzzy initial value problems of higher order are also found.
The use of remote sensing techniques for managing and monitoring environmental areas is increasing, owing to improvements in the sensors carried by Earth-observation satellites. The resolution merge process combines a high-resolution single-band image with a low-resolution multi-band image to produce one image with both high spatial and high spectral resolution. In this work, different merging methods were tested to evaluate their enhancement capabilities for extracting different environmental areas; principal component analysis (PCA), Brovey, modified intensity-hue-saturation (IHS), and high-pass filter methods were tested and subjected to visual and statistical comparison for evaluation. Both visu
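Of the listed methods, the Brovey transform is the simplest to state: each multispectral band is rescaled by the ratio of the panchromatic band to the sum of the multispectral bands. The sketch below assumes the multispectral image has already been upsampled to the panchromatic grid; it is illustrative and not the paper's implementation.

```python
import numpy as np

def brovey_merge(ms, pan):
    """Brovey transform: scale each band of ms (H, W, bands) by
    pan / sum-of-bands so the fused image inherits pan's detail."""
    total = ms.sum(axis=2, keepdims=True)
    total = np.where(total == 0, 1e-12, total)   # avoid division by zero
    return ms * (pan[..., None] / total)
```

A useful property for checking an implementation: the fused bands always sum to the panchromatic band, and if pan already equals the band sum the multispectral image passes through unchanged.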
In this research we present the signature as a key for a biometric authentication technique. Moment invariants are used as the tool for deciding whether a given signature belongs to a certain person or not. Eighteen volunteers provided 108 signatures as samples to test the proposed system, six samples from each person. Moment invariants are used to build the feature vectors stored in the system. The Euclidean distance measure is used to compute the distance between the signatures of persons saved in the system and newly acquired samples from the same persons, in order to make a decision about each new signature. Each signature is acquired by scanner in JPG format at 300 DPI. MATLAB was used to implement the system.
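The paper used MATLAB; as a language-neutral illustration of the same pipeline, the sketch below computes the first three of Hu's seven moment invariants from a binary signature image and compares feature vectors by Euclidean distance. Which invariants and how many the paper actually used is not stated in the abstract, so this is an assumption.

```python
import numpy as np

def central_moment(img, p, q):
    """Central image moment mu_pq about the intensity centroid."""
    ys, xs = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    m00 = img.sum()
    ybar, xbar = (ys * img).sum() / m00, (xs * img).sum() / m00
    return (((ys - ybar) ** p) * ((xs - xbar) ** q) * img).sum()

def hu_features(img):
    """First three Hu moment invariants (translation and scale invariant)."""
    mu00 = central_moment(img, 0, 0)
    eta = lambda p, q: central_moment(img, p, q) / mu00 ** (1 + (p + q) / 2)
    phi1 = eta(2, 0) + eta(0, 2)
    phi2 = (eta(2, 0) - eta(0, 2)) ** 2 + 4 * eta(1, 1) ** 2
    phi3 = (eta(3, 0) - 3 * eta(1, 2)) ** 2 + (3 * eta(2, 1) - eta(0, 3)) ** 2
    return np.array([phi1, phi2, phi3])

def signature_distance(a, b):
    """Euclidean distance between feature vectors, as in the abstract."""
    return np.linalg.norm(hu_features(a) - hu_features(b))
```

A decision is then made by thresholding this distance against the stored samples of the claimed person.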
The reaction of LAs-Cl8: [(2,2-(1-(3,4-bis(carboxylicdichloromethoxy)-5-oxo-2,5-dihydrofuran-2-yl)ethane-1,2-diyl)bis(2,2-dichloroacetic acid)] with sodium azide in ethanol with drops of distilled water has been investigated. The new product L-AZ: (3Z,5Z,8Z)-2-azido-8-[azido(3Z,5Z)-2-azido-2,6-bis(azidocarbonyl)-8,9-dihydro-2H-1,7-dioxa-3,4,5-triazonine-9-yl]methyl]-9-[(1-azido-1-hydroxy)methyl]-2H-1,7-dioxa-3,4,5-triazonine-2,6-dicarbonylazide was isolated and characterized by elemental analysis (C.H.N), 1H-NMR, mass spectrum, and Fourier-transform infrared spectrophotometry (FT-IR). The reaction of L-AZ with M+n: [VO(II), Cr(III), Mn(II), Co(II), Ni(II), Cu(II), Zn(II), Cd(II) and Hg(II)] has been i
Examination of skewness makes researchers more aware of the importance of accurate statistical analysis. Undoubtedly, most phenomena contain a certain degree of skewness, which gives rise to what is called "asymmetry" and, consequently, to the importance of the skew-normal family. The epsilon skew normal distribution ESN(μ, σ, ε) is one of the probability distributions that provides a more flexible model, because the skewness parameter allows it to range from the normal to a skewed distribution. Theoretically, estimating the parameters of a linear regression model whose error term has a nonzero mean is a major challenge, since there is no explicit formula to calcula
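For concreteness, here is a sketch of the ESN(μ, σ, ε) density in the commonly used form: two half-normal pieces with scales σ(1 + ε) and σ(1 − ε) joined at the mode μ, which reduces to the ordinary normal density at ε = 0. Sign conventions for ε vary in the literature, so treat the branch assignment below as an assumption.

```python
import numpy as np

def esn_pdf(x, mu=0.0, sigma=1.0, eps=0.0):
    """Epsilon-skew-normal density: half-normal pieces with scales
    sigma*(1+eps) (left of mu) and sigma*(1-eps) (right of mu)."""
    phi = lambda z: np.exp(-0.5 * z * z) / np.sqrt(2.0 * np.pi)
    left = phi((x - mu) / (sigma * (1 + eps))) / sigma
    right = phi((x - mu) / (sigma * (1 - eps))) / sigma
    return np.where(x < mu, left, right)
```

The two pieces carry probability masses (1 + ε)/2 and (1 − ε)/2, so the density integrates to one for any ε in (−1, 1).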
Data compression offers an attractive approach to reducing communication costs by using the available bandwidth effectively, which makes it worthwhile to pursue research on algorithms that use the network most efficiently. It is also important to consider security, since the transmitted data are vulnerable to attacks. The basic aim of this work is to develop a module that combines compression and encryption on the same set of data, performing the two operations simultaneously. This is achieved by embedding encryption into compression algorithms, since cryptographic ciphers and entropy coders bear a certain resemblance in the sense of secrecy. First, in the secure compression module, the given text is p
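The abstract's module embeds encryption inside the compressor itself; as a simpler stand-in that still shows the compress-then-encrypt ordering (encrypting first would destroy the redundancy compression needs), here is a toy sketch using zlib and a SHA-256 counter-mode keystream. The keystream construction is illustrative only and not a vetted cipher.

```python
import hashlib
import zlib

def _keystream(key, n):
    """Toy keystream: SHA-256 of key || counter, concatenated (not a
    production cipher; illustrative only)."""
    out = bytearray()
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:n])

def compress_then_encrypt(data, key):
    compressed = zlib.compress(data, 9)          # remove redundancy first
    ks = _keystream(key, len(compressed))
    return bytes(a ^ b for a, b in zip(compressed, ks))  # XOR stream cipher

def decrypt_then_decompress(blob, key):
    ks = _keystream(key, len(blob))
    return zlib.decompress(bytes(a ^ b for a, b in zip(blob, ks)))
```

The XOR layer preserves length, so the bandwidth saving of the compressor is retained end to end.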
This paper presents a numerical scheme for solving nonlinear time-fractional differential equations in the Caputo sense. The method relies on the Laplace transform together with the modified Adomian decomposition method (LMADM), compared against the Laplace transform combined with the standard Adomian decomposition method (LADM). For the purpose of comparison, we applied LMADM and LADM to nonlinear time-fractional differential equations to identify their differences and similarities. Finally, we provide two examples of nonlinear time-fractional differential equations, which show that the current scheme converges with high accuracy and a small number of terms for this type of equation.
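For readers unfamiliar with the scheme, the standard LADM recursion can be summarized as follows (this is the textbook form for a first-order Caputo problem, not reproduced from the paper; the modified variant differs mainly in how the initial term is split):

```latex
% Caputo problem, 0 < \alpha \le 1, with nonlinearity N
D^{\alpha} u(t) = f(t) + N u(t), \qquad u(0) = u_0 .
% Laplace transform of the Caputo derivative
\mathcal{L}\{D^{\alpha} u\}(s) = s^{\alpha} U(s) - s^{\alpha-1} u(0)
\;\;\Longrightarrow\;\;
U(s) = \frac{u_0}{s} + \frac{1}{s^{\alpha}}\,\mathcal{L}\{f + N u\}(s).
% Adomian decomposition of the solution and of the nonlinearity
u = \sum_{n=0}^{\infty} u_n, \qquad
N u = \sum_{n=0}^{\infty} A_n, \qquad
A_n = \frac{1}{n!}\,\frac{d^{\,n}}{d\lambda^{n}}
      \Bigl[\, N\Bigl(\textstyle\sum_{k=0}^{\infty} \lambda^{k} u_k\Bigr) \Bigr]_{\lambda=0}.
% Resulting recursion
u_0 = u(0) + \mathcal{L}^{-1}\!\Bigl[\tfrac{\mathcal{L}\{f\}}{s^{\alpha}}\Bigr],
\qquad
u_{n+1} = \mathcal{L}^{-1}\!\Bigl[\tfrac{\mathcal{L}\{A_n\}}{s^{\alpha}}\Bigr].
```

Truncating the series after a few terms gives the approximate solution, which is why a scheme of this type can reach high accuracy with few terms.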