Image compression is very important for reducing the costs of data storage and transmission over relatively slow channels. The wavelet transform has received significant attention because its multiresolution decomposition allows efficient image analysis. This paper attempts to give an understanding of the wavelet transform using two of the more popular wavelet techniques, Haar and Daubechies, and to compare their effects on image compression.
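As a rough illustration of the comparison described above, the sketch below decomposes an image with the Haar and Daubechies (db4) wavelets, keeps only the largest 5% of coefficients, and reconstructs. The use of PyWavelets, the db4 choice, the 5% retention rate, and the random test image are assumptions for illustration, not the paper's actual setup.

```python
import numpy as np
import pywt  # PyWavelets; an assumed library choice, not the paper's implementation


def compress_and_reconstruct(image, wavelet, keep_fraction=0.05, level=3):
    """Decompose `image`, keep only the largest `keep_fraction` of
    coefficients (hard threshold), and reconstruct."""
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    # Flatten the coefficient pyramid to pick a single global threshold.
    arr, slices = pywt.coeffs_to_array(coeffs)
    flat = np.abs(arr).ravel()
    thresh = np.sort(flat)[int((1 - keep_fraction) * flat.size)]
    arr_t = pywt.threshold(arr, thresh, mode='hard')
    coeffs_t = pywt.array_to_coeffs(arr_t, slices, output_format='wavedec2')
    return pywt.waverec2(coeffs_t, wavelet)


# Compare Haar and Daubechies (db4) on a synthetic test image.
rng = np.random.default_rng(0)
image = rng.random((256, 256))
for w in ('haar', 'db4'):
    rec = compress_and_reconstruct(image, w)
    mse = np.mean((image - rec[:256, :256]) ** 2)
    print(f"{w}: MSE after keeping 5% of coefficients = {mse:.6f}")
```

Comparing the reconstruction error of the two wavelets at the same coefficient-retention rate mirrors the kind of comparison the abstract describes.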
JPEG is the most popular image compression and encoding technique and is widely used in many applications (images, videos, and 3D animations). Researchers are therefore keen to extend this widely used technique to compress images at higher compression ratios while preserving image quality as much as possible. For this reason, this paper introduces a modified JPEG scheme based on a fast DCT that removes most of the zeros in a transformed block while keeping their positions. Additionally, arithmetic coding is applied instead of Huffman coding. The results show that the proposed modified JPEG algorithm yields better image quality than the traditional JPEG technique.
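The sketch below shows the core idea of keeping the nonzero quantized DCT coefficients of an 8x8 block together with their positions. The helper names, the use of SciPy's dctn/idctn, and the standard JPEG luminance quantization table are illustrative assumptions; the arithmetic-coding stage is only noted in a comment rather than implemented.

```python
import numpy as np
from scipy.fft import dctn, idctn

# Standard JPEG luminance quantization table (roughly quality 50).
Q = np.array([
    [16, 11, 10, 16,  24,  40,  51,  61],
    [12, 12, 14, 19,  26,  58,  60,  55],
    [14, 13, 16, 24,  40,  57,  69,  56],
    [14, 17, 22, 29,  51,  87,  80,  62],
    [18, 22, 37, 56,  68, 109, 103,  77],
    [24, 35, 55, 64,  81, 104, 113,  92],
    [49, 64, 78, 87, 103, 121, 120, 101],
    [72, 92, 95, 98, 112, 100, 103,  99],
], dtype=np.float64)


def encode_block(block):
    """DCT an 8x8 block, quantize, then keep only the nonzero coefficients
    together with their positions (instead of JPEG's run-length coding)."""
    coeffs = np.round(dctn(block - 128.0, norm='ortho') / Q).astype(np.int32)
    positions = np.flatnonzero(coeffs)   # indices of surviving coefficients
    values = coeffs.ravel()[positions]   # their quantized values
    # In the full scheme both streams would then be entropy-coded with an
    # arithmetic coder rather than Huffman coding.
    return positions, values


def decode_block(positions, values):
    """Rebuild the block from the kept coefficients and their positions."""
    coeffs = np.zeros(64, dtype=np.float64)
    coeffs[positions] = values
    return idctn(coeffs.reshape(8, 8) * Q, norm='ortho') + 128.0


rng = np.random.default_rng(0)
block = rng.integers(0, 256, size=(8, 8)).astype(np.float64)
positions, values = encode_block(block)
restored = decode_block(positions, values)
print("coefficients kept:", len(values), "of 64")
print("max reconstruction error:", np.max(np.abs(block - restored)))
```

A full codec would entropy-code the position and value streams, which is where the abstract's replacement of Huffman coding by arithmetic coding would apply.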

In this paper we present a method to analyze five wavelet types, covering fifteen wavelet families, for eighteen different EMG signals. A comparative study is also given to show the performance of the various families after refining the results with a backpropagation neural network. This will help researchers with the first step of EMG analysis. Large sets of results (more than 100 sets) are presented and then classified so that they can be discussed and final conclusions reached.
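As a pointer to what such an analysis pipeline can look like, the sketch below extracts wavelet sub-band energies from each signal and trains a small backpropagation (MLP) classifier. The synthetic signals and labels, the db4 wavelet, the energy features, and the use of PyWavelets and scikit-learn are all assumptions for illustration, not the paper's data or settings.

```python
import numpy as np
import pywt
from sklearn.neural_network import MLPClassifier


def wavelet_features(signal, wavelet='db4', level=4):
    """Energy of each wavelet sub-band: a compact feature vector per signal."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    return np.array([np.sum(c ** 2) for c in coeffs])


# Placeholder data: 18 synthetic "EMG" signals with two artificial classes.
rng = np.random.default_rng(1)
signals = rng.standard_normal((18, 1024))
labels = rng.integers(0, 2, size=18)

# Feature extraction followed by a backpropagation-trained MLP classifier.
X = np.array([wavelet_features(s) for s in signals])
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```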
It is well known that sonography is not the first choice for detecting early breast tumors. Improving the resolution of breast sonographic images is the goal of many researchers, since sonography is a safe, easy, and cost-effective procedure that could become a first-choice examination. In this study, infrared light exposure of the breast prior to ultrasound examination was implemented to assess its effect on the resolution of the sonographic image. Results showed that a significant improvement was obtained in 60% of the cases.
The Dirichlet process is a fundamental object in nonparametric Bayesian modelling, applied to a wide range of problems in machine learning, statistics, and bioinformatics, among other fields. This flexible stochastic process models rich data structures with an unknown or evolving number of clusters, making it a valuable tool for encoding the true complexity of real-world data in computer models. Our results show that the Dirichlet process improves, both in distribution density and in signal-to-noise ratio, with larger sample sizes; exhibits a slow decay rate to its base distribution; has improved convergence and stability; and performs markedly better with a Gaussian base distribution than with a Gamma base distribution. The performance depen
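A minimal sketch of how a Dirichlet process with a Gaussian base distribution can be sampled via truncated stick-breaking is given below. The concentration parameter, truncation level, and standard-normal base are illustrative assumptions, not the study's experimental configuration.

```python
import numpy as np


def sample_dirichlet_process(alpha, base_sampler, n_atoms=500, rng=None):
    """Truncated stick-breaking draw from DP(alpha, G0): returns atom
    locations drawn from the base distribution and their weights."""
    rng = rng or np.random.default_rng()
    betas = rng.beta(1.0, alpha, size=n_atoms)
    # Weight of atom k is beta_k times the stick length remaining before it.
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - betas[:-1])))
    weights = betas * remaining
    atoms = base_sampler(n_atoms, rng)
    return atoms, weights


# Gaussian base distribution G0 = N(0, 1), as favoured in the abstract.
gaussian_base = lambda n, rng: rng.normal(0.0, 1.0, size=n)
atoms, weights = sample_dirichlet_process(alpha=5.0,
                                          base_sampler=gaussian_base,
                                          rng=np.random.default_rng(42))
print("weight captured by the 10 largest atoms:", np.sort(weights)[-10:].sum())
```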