Image compression is important for reducing the costs of data storage and of transmission over relatively slow channels. The wavelet transform has received significant attention because its multiresolution decomposition allows efficient image analysis. This paper attempts to give an understanding of the wavelet transform using two of the more popular wavelet families, the Haar and Daubechies techniques, and to compare their effects on image compression.
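As an illustration of such a comparison, the following is a minimal Python sketch, assuming PyWavelets and NumPy, an 8-bit grayscale array img whose sides are divisible by 4, and an illustrative hard threshold of 20; the db2 wavelet stands in for "a Daubechies wavelet", and none of these settings are taken from the paper itself. Each image is decomposed, small coefficients are zeroed (the lossy step), the image is reconstructed, and the resulting PSNR is compared.

    import numpy as np
    import pywt

    def compress_psnr(img, wavelet, threshold=20.0, level=2):
        """Decompose, zero small detail coefficients (the lossy step),
        reconstruct, and report the degradation as PSNR in dB."""
        coeffs = pywt.wavedec2(img.astype(np.float64), wavelet, level=level)
        kept = [coeffs[0]] + [
            tuple(pywt.threshold(d, threshold, mode="hard") for d in details)
            for details in coeffs[1:]
        ]
        rec = np.clip(pywt.waverec2(kept, wavelet), 0, 255)
        mse = np.mean((img - rec) ** 2)
        return 10 * np.log10(255.0 ** 2 / mse)

    # for w in ("haar", "db2"):          # Haar vs. a Daubechies wavelet
    #     print(w, compress_psnr(img, w))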
Thin-walled members are increasingly used in structural applications, especially in light structures such as buildings and aircraft, because of their high strength-to-weight ratio. Perforations are often made in these members to reduce weight and to facilitate service and maintenance work, as in aircraft wing ribs. Because of their dimensions, such structures are prone to buckling, and this susceptibility increases with the presence of holes. This study investigated, experimentally and numerically, the buckling behavior of an aluminum alloy 6061-O thin-walled lipped channel beam with specific holes subjected to a compression load. A nonlinear finite element analysis was used to obtain the ...
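For background, the elastic buckling stress of the thin plate elements that make up such a channel section is classically given by the standard plate-buckling relation below; this is a textbook result quoted here for context, not a formula taken from the study itself:

    \sigma_{cr} = \frac{k\,\pi^{2}E}{12\,(1-\nu^{2})}\left(\frac{t}{b}\right)^{2}

where E is Young's modulus, \nu is Poisson's ratio, t and b are the plate thickness and width, and k is a coefficient that depends on the boundary and loading conditions; perforations generally lower the load at which buckling occurs.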
Multi-focus image fusion can fuse more than one focused image to generate a single image with a more accurate description. The purpose of image fusion is to generate one image by combining information from many source images of the same scene. In this paper, a multi-focus image fusion method is proposed at a hybrid pixel level, operating in both the spatial and transform domains. The proposed method is applied to multi-focus source images in the YCbCr color space. First, a two-level stationary wavelet transform is applied to the Y channel of the two source images, and the fused Y channel is obtained by applying several fusion rules. The Cb and Cr channels of the source images are fused using principal component analysis (PCA).
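A minimal Python sketch of this hybrid idea follows, assuming PyWavelets and NumPy and two pre-registered source images already split into YCbCr channels whose sides are divisible by 4; the average/max-abs fusion rules shown are common choices and are assumptions, not necessarily the exact rules used in the paper.

    import numpy as np
    import pywt

    def fuse_y_swt(y1, y2, wavelet="haar", level=2):
        """Fuse luminance channels with a 2-level stationary wavelet transform:
        average the approximation bands, keep the larger-magnitude details."""
        c1 = pywt.swt2(y1, wavelet, level=level)
        c2 = pywt.swt2(y2, wavelet, level=level)
        fused = []
        for (a1, d1), (a2, d2) in zip(c1, c2):
            a = (a1 + a2) / 2.0                        # average approximations
            d = tuple(np.where(np.abs(x) >= np.abs(y), x, y)
                      for x, y in zip(d1, d2))         # max-abs detail rule
            fused.append((a, d))
        return pywt.iswt2(fused, wavelet)

    def fuse_chroma_pca(c1, c2):
        """PCA-based weighting: weight the two chroma channels by the
        dominant eigenvector of their covariance matrix."""
        data = np.stack([c1.ravel(), c2.ravel()])
        w = np.linalg.eigh(np.cov(data))[1][:, -1]     # dominant eigenvector
        w = np.abs(w) / np.abs(w).sum()                # normalized weights
        return w[0] * c1 + w[1] * c2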
In this study, an analysis of the effect of repeatedly applying the lossy JPEG algorithm on satellite image quality is presented. The standard JPEG compression algorithm is applied using the IrfanView program, with JPEG quality factors in the range 50-100. Based on the calculated variation in satellite image quality, the maximum number of re-applications of the lossy JPEG algorithm adopted in this study is 50. The degradation of image quality as a function of the JPEG quality factor and of the number of times the JPEG algorithm is re-applied to store the satellite image is analyzed.
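The experiment can be sketched in a few lines of Python, assuming Pillow and NumPy; "scene.jpg" and the fixed quality factor of 75 are placeholders, whereas the study sweeps quality over 50-100 and re-applies the codec up to 50 times.

    import io
    import numpy as np
    from PIL import Image

    def psnr(ref, test):
        mse = np.mean((ref.astype(np.float64) - test.astype(np.float64)) ** 2)
        return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

    original = Image.open("scene.jpg").convert("RGB")   # placeholder file
    ref = np.asarray(original)
    img = original
    for i in range(1, 51):                   # re-apply JPEG up to 50 times
        buf = io.BytesIO()
        img.save(buf, format="JPEG", quality=75)  # a value in the 50-100 range
        buf.seek(0)
        img = Image.open(buf).convert("RGB")
        print(i, round(psnr(ref, np.asarray(img)), 2))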
A watermark is a pattern or image impressed in paper that appears as varying shades of lightness and darkness when viewed by transmitted light, and it is used to improve robustness and security. There are many ways to apply a watermark, including adding an image or text to the original image; in this paper, another type of watermark is proposed: curves, lines, or shapes drawn by interpolation, which produces a watermark that is difficult to forge or manipulate. Our work suggests a new image watermarking technique that embeds a cubic-spline interpolation curve inside the image using bit-plane slicing. The peak signal-to-noise ratio (PSNR) and mean square error (MSE) are calculated so that the quality of the original image ...
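A minimal sketch of the embedding step follows, assuming NumPy/SciPy and an 8-bit grayscale array img; the control points and the choice of the least significant bit plane are illustrative assumptions, not the paper's exact parameters.

    import numpy as np
    from scipy.interpolate import CubicSpline

    def embed_spline_watermark(img, xs, ys, plane=0):
        """Rasterize a cubic-spline curve through (xs, ys) and write it into
        the chosen bit plane of a copy of the image (xs must be increasing
        and lie within the image width)."""
        out = img.copy()
        cs = CubicSpline(xs, ys)
        x = np.arange(xs[0], xs[-1])               # sample the curve per column
        y = np.clip(cs(x).round().astype(int), 0, img.shape[0] - 1)
        mask = np.uint8(1 << plane)
        out[y, x] = (out[y, x] & ~mask) | mask     # set the chosen bit to 1
        return out

    # Example with four hypothetical control points inside a 256-wide image:
    # wm = embed_spline_watermark(img, np.array([0, 80, 160, 255]),
    #                             np.array([30, 120, 60, 200]))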
The advent of UNHCR reports has given rise to a distinctive way of representing images and using semiotic features. Although much research has investigated UNHCR reports, no study has examined the images in UNHCR reports of displaced Iraqis from a multimodal discourse perspective. The present study suggests that images are, like language, rich in potential meanings and are governed by clear visual grammar structures that can be employed to decode these multiple meanings. Seven images are examined in terms of their representational, interactional, and compositional aspects. Based on the results, this study concludes that the findings support visual grammar theory and highlight the va...
A database is characterized as a collection of data that is organized and distributed in a manner that allows the user to access the stored data in a simple and convenient way. However, in the era of big data, traditional data analytics methods may not be able to manage and process such large amounts of data. In order to develop an efficient way of handling big data, this work studies the use of the Map-Reduce technique to handle big data distributed on the cloud. The approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r...
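As a minimal illustration of the Map-Reduce pattern (not the paper's actual pipeline), the following Hadoop Streaming pair in Python averages EEG samples per channel, assuming records are stored as "channel,value" text lines; the schema, file names, and paths are assumptions.

    # mapper.py -- emit (channel, value) pairs from "channel,value" lines
    import sys

    for line in sys.stdin:
        channel, value = line.strip().split(",")
        print(f"{channel}\t{value}")

    # reducer.py -- average the values for each channel
    # (Hadoop delivers mapper output grouped and sorted by key)
    import sys

    current, total, count = None, 0.0, 0
    for line in sys.stdin:
        key, value = line.strip().split("\t")
        if key != current:
            if current is not None:
                print(f"{current}\t{total / count:.4f}")
            current, total, count = key, 0.0, 0
        total += float(value)
        count += 1
    if current is not None:
        print(f"{current}\t{total / count:.4f}")

Such a pair would typically be launched with the Hadoop Streaming jar, along the lines of hadoop jar hadoop-streaming.jar -input eeg/ -output out/ -mapper mapper.py -reducer reducer.py, where the input and output paths are placeholders.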