Fractal image compression depends on representing an image using affine transformations. The main concern for researchers in the discipline of fractal image compression (FIC) is to decrease the encoding time needed to compress image data. The basic premise is that each portion of the image is similar to other portions of the same image. Many models have been developed around this process. The presence of fractals was initially noticed and handled using the Iterated Function System (IFS), which is used for encoding images. In this paper, a review of fractal image compression and its variants is presented alongside other techniques. A summarized review of contributions is provided to assess the achievements of fractal image compression, specifically for block indexing methods based on the moment descriptor. The block indexing method classifies the domain and range blocks using moments to generate an invariant descriptor that reduces the long encoding time. A comparison is performed between the block indexing technique and other fractal image techniques to determine the importance of block indexing in saving encoding time and achieving a better compression ratio while maintaining image quality on the Lena image.
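To make the block indexing idea concrete, the sketch below is a hypothetical Python illustration, not the surveyed papers' exact algorithm: it builds a descriptor from second-order central moments for each block and bins blocks into classes, so a range block is matched only against domain blocks in the same class instead of the whole pool. The moment orders and bin count are assumptions.

import numpy as np

def moment_descriptor(block):
    # Position-tolerant (and approximately rotation-tolerant) descriptor from
    # second-order central moments (the moment orders are an assumption).
    h, w = block.shape
    y, x = np.mgrid[0:h, 0:w]
    m00 = block.sum() + 1e-9
    cx, cy = (x * block).sum() / m00, (y * block).sum() / m00
    mu20 = (((x - cx) ** 2) * block).sum() / m00
    mu02 = (((y - cy) ** 2) * block).sum() / m00
    mu11 = ((x - cx) * (y - cy) * block).sum() / m00
    return np.array([mu20 + mu02, (mu20 - mu02) ** 2 + 4 * mu11 ** 2])

def build_index(blocks, n_bins=16):
    # Quantize descriptors into bins: matching a range block then only
    # searches the domain blocks that fell into the same bin.
    descs = np.array([moment_descriptor(b) for b in blocks])
    lo, hi = descs.min(axis=0), descs.max(axis=0) + 1e-9
    keys = ((descs - lo) / (hi - lo) * (n_bins - 1)).astype(int)
    index = {}
    for i, key in enumerate(map(tuple, keys)):
        index.setdefault(key, []).append(i)
    return index

Because the descriptor is computed from central moments, it does not change when a block is shifted, which is what lets the index prune the domain search instead of comparing every pair of blocks.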
In this paper, a compact multiband printed dipole antenna is presented as a candidate for use in wireless communication applications. The proposed fractal antenna design is based on the second-level tent transformation. The space-filling property of this fractal geometry permits fitting longer electrical lengths into a more compact size. The theoretical performance of this antenna has been calculated using the commercially available software IE3D from Zeland Software Inc. This electromagnetic simulator is based on the method of moments (MoM). The proposed dipole antenna has been found to possess a considerable size reduction compared with a conventional printed or wire dipole antenna designed at the same design frequency and on the same substrate.
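The second-level tent transformation can be visualized in a few lines of code: iterating the tent map twice yields a four-segment space-filling polyline that could serve as the dipole arm profile. This is a hypothetical reconstruction of the geometry; the paper's exact scaling and feed arrangement are not reproduced here.

import numpy as np

def tent(x):
    # Classic tent map on [0, 1]
    return np.where(x < 0.5, 2 * x, 2 * (1 - x))

# Second-level transformation: the second iterate is a four-segment
# piecewise-linear profile with breakpoints at k/4; sampling at k/8
# captures every breakpoint and peak.
x = np.linspace(0.0, 1.0, 9)
y = tent(tent(x))
arm = np.column_stack([x, 0.2 * y])  # height scaling is an arbitrary choice
print(arm)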
Several problems need to be solved in image compression to make the process viable and more efficient. Much work has been done in the field of lossy image compression based on wavelets and the Discrete Cosine Transform (DCT). In this paper, an efficient image compression scheme is proposed, based on a combined encoding transform scheme. It consists of the following steps: 1) a bi-orthogonal (tap 9/7) wavelet transform to split the image data into sub-bands, 2) DCT to de-correlate the data, 3) scalar quantization of the combined transform stage's output, followed by mapping to positive values, and 4) LZW encoding to produce the compressed data. The peak signal-to-noise ratio (PSNR), compression ratio (CR), and compression gain (CG) measures were used to evaluate the proposed scheme.
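A minimal sketch of this four-step pipeline is given below, assuming PyWavelets' 'bior4.4' filter as the 9/7 bi-orthogonal wavelet, SciPy's DCT, a uniform scalar quantizer with step q, and a toy LZW coder; the quantization step and LZW details are illustrative assumptions, not the paper's exact parameters.

import numpy as np
import pywt
from scipy.fftpack import dct

def lzw_encode(seq):
    # Toy LZW over a symbol sequence (illustrative, not optimized).
    table = {(s,): i for i, s in enumerate(set(seq))}
    w, out = (), []
    for s in seq:
        if w + (s,) in table:
            w = w + (s,)
        else:
            out.append(table[w])
            table[w + (s,)] = len(table)
            w = (s,)
    out.append(table[w])
    return out

def compress(img, q=16):
    # 1) bi-orthogonal 9/7 wavelet ('bior4.4' in PyWavelets) -> sub-bands
    LL, (LH, HL, HH) = pywt.dwt2(img.astype(float), 'bior4.4')
    # 2) DCT to de-correlate each sub-band (applied separably along both axes)
    bands = [dct(dct(b, axis=0, norm='ortho'), axis=1, norm='ortho')
             for b in (LL, LH, HL, HH)]
    # 3) scalar quantization with step q, then shift so all symbols are positive
    coeffs = np.concatenate([np.round(b / q).ravel() for b in bands]).astype(int)
    coeffs -= coeffs.min()
    # 4) LZW coding of the symbol stream
    return lzw_encode(coeffs.tolist())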
The electrocardiogram (ECG) is an important physiological signal for cardiac disease diagnosis. Modern ECG monitoring devices are increasingly used and generate vast amounts of data requiring huge storage capacity. In order to decrease storage costs or make ECG signals suitable and ready for transmission through common communication channels, the ECG data volume must be reduced, so an effective data compression method is required. This paper presents an efficient technique for the compression of ECG signals, in which different transforms are used. First, the 1-D ECG data was segmented and aligned into a 2-D data array; then a 2-D mixed transform was implemented to compress the 2-D array.
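The 1-D-to-2-D folding step can be sketched as follows; fixed-length segmentation and a plain 2-D DCT stand in for the paper's beat alignment and "mixed transform", both of which are assumptions here.

import numpy as np
from scipy.fft import dctn, idctn

def compress_ecg(ecg, seg_len=256, keep=0.10):
    # Fold the 1-D signal into a 2-D array, one segment per row.
    n = len(ecg) // seg_len
    grid = np.reshape(ecg[:n * seg_len], (n, seg_len))
    # 2-D transform of the array (a plain DCT here), then keep only the
    # largest fraction `keep` of coefficients.
    C = dctn(grid, norm='ortho')
    thr = np.quantile(np.abs(C), 1.0 - keep)
    C[np.abs(C) < thr] = 0.0
    return C

def reconstruct(C):
    # Inverse transform and unfold back to 1-D.
    return idctn(C, norm='ortho').ravel()

Folding the signal into rows exposes beat-to-beat redundancy as vertical correlation, which the 2-D transform can then exploit in addition to the within-beat correlation.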
Data compression is a very important process for reducing the size of large data to be stored or transmitted; parametric curves such as the Bezier curve are a suitable method for representing the gradual change and variability of such data. The Ridgelet transform solves problems found in the wavelet transform and can compress an image well, but when it is used with the Bezier curve, the quality of the compressed image becomes much better. In this paper, a new compression method is proposed using the Bezier curve with the Ridgelet transform on RGB images. The results showed that the proposed method gives good performance in both subjective and objective experiments, with PSNR values of (34.2365, 33.4323 and 33.0987) that were increased in the proposed method.
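A rough sketch of the ridgelet side of such a scheme is shown below: Radon projections of a channel followed by a 1-D wavelet along each projection, plus a cubic Bezier evaluator of the kind that could model the gradual change the authors mention. The angle count, wavelet choice, and how the Bezier curve is coupled to the coefficients are all assumptions.

import numpy as np
import pywt
from skimage.transform import radon

def ridgelet(channel, n_angles=60, wavelet='db4'):
    # Ridgelet sketch: Radon projections, then a 1-D wavelet along each one.
    theta = np.linspace(0.0, 180.0, n_angles, endpoint=False)
    sino = radon(channel.astype(float), theta=theta, circle=False)
    return [pywt.wavedec(sino[:, k], wavelet, level=3)
            for k in range(sino.shape[1])]

def bezier(p0, p1, p2, p3, t):
    # Cubic Bezier point, for smoothly modelling gradual coefficient change.
    u = 1.0 - t
    return u**3 * p0 + 3 * u**2 * t * p1 + 3 * u * t**2 * p2 + t**3 * p3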
In this study, an analysis of repeatedly applying the lossy JPEG algorithm to satellite imagery, and its effect on quality, is presented. The standard JPEG compression algorithm is adopted and applied using the IrfanView program, with JPEG quality factors ranging from 50 to 100. Based on the calculated variation in satellite image quality, the maximum number of re-uses of the lossy JPEG algorithm adopted in this study is 50 times. The degradation of image quality with respect to the JPEG quality factor and the number of re-uses of the JPEG algorithm to store the satellite image is analyzed.
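The re-use experiment is easy to reproduce in outline; the sketch below uses Pillow rather than IrfanView (an assumption, since the study used the latter) to re-save a grayscale image N times at a fixed quality factor and record PSNR against the original after each generation.

import io
import numpy as np
from PIL import Image

def psnr(a, b):
    mse = np.mean((a.astype(float) - b.astype(float)) ** 2)
    return float('inf') if mse == 0 else 10.0 * np.log10(255.0 ** 2 / mse)

def recompress(path, quality=75, times=50):
    # Re-apply JPEG `times` times, tracking PSNR against the original.
    ref = np.asarray(Image.open(path).convert('L'))
    img = Image.fromarray(ref)
    curve = []
    for _ in range(times):
        buf = io.BytesIO()
        img.save(buf, format='JPEG', quality=quality)
        buf.seek(0)
        img = Image.open(buf).convert('L')
        curve.append(psnr(ref, np.asarray(img)))
    return curve  # PSNR after each generation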
Acute appendicitis is the most common surgical abdominal emergency. Its clinical diagnosis remains a challenge to surgeons, so different imaging options have been introduced to improve diagnostic accuracy. Among these imaging modality choices, diagnostic medical sonography (DMS) is a simple, easily available, and cost-effective clinical tool. The purpose of this study was to assess the accuracy of DMS in the diagnosis of acute appendicitis, compared with the histopathology report as a gold standard. Between May 2015 and May 2016, 215 patients with suspected appendicitis were examined with DMS. The DMS findings were recorded as positive or negative for acute appendicitis and compared with the histopathological results as a gold standard.
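For reference, diagnostic accuracy against a gold standard is computed from the standard 2x2 table; the function below shows the calculation, with the example counts being placeholders rather than the study's actual per-cell figures.

def diagnostic_accuracy(tp, fp, fn, tn):
    # Standard 2x2 measures against a gold standard.
    return {
        'sensitivity': tp / (tp + fn),
        'specificity': tn / (tn + fp),
        'ppv': tp / (tp + fp),
        'npv': tn / (tn + fn),
        'accuracy': (tp + tn) / (tp + fp + fn + tn),
    }

# Hypothetical counts only (the abstract does not give the 2x2 cells):
print(diagnostic_accuracy(tp=120, fp=12, fn=8, tn=35))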
The main goal of this study was the production of load-bearing masonry units, adopting the ACI 211.1 mix design with (1:3.2:2.5) proportions (cement : fine aggregate : coarse aggregate) and a slump range of 25-50 mm, that conform to the dimension, absorption, and compressive strength requirements of IQS 1077/1987 type A. The ability to use a low cement content (300 kg/m3), to keep our products competitive in the market, was encouraging, since most consumption is in wall construction for low-cost buildings. The use of 10% and 20% of LECA as a partial volume replacement of coarse aggregate, to reduce the large weight of masonry blocks, can also be recommended. The types of production of the load-bearing masonry units were A and B.
In this paper, the goal of the proposed method is to protect data against different types of attacks by unauthorized parties. The basic idea is to generate a private key from specific features of a digital color image, namely its colors (Red, Green, and Blue). The private key is generated by computing the frequencies of the blue-color values of the image, finding the maximum frequency of blue, multiplying it by its value, and performing an addition to produce the generated key. After the private key is generated, it must be converted into binary representation. The generated key is extracted from the blue color of the keyed image.
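A sketch of the described derivation is given below; since the abstract's "adding process" is ambiguous, the combination step shown (frequency times value, plus both terms) is an explicit assumption.

import numpy as np
from PIL import Image

def generate_key(path):
    # Histogram of the blue channel of the keyed image.
    blue = np.asarray(Image.open(path).convert('RGB'))[:, :, 2]
    hist = np.bincount(blue.ravel(), minlength=256)
    value = int(hist.argmax())   # most frequent blue intensity
    freq = int(hist[value])      # its frequency
    # Combination step: an assumption, since the abstract's "adding
    # process" is not spelled out.
    key = freq * value + freq + value
    return format(key, 'b')      # binary representation of the key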
The main challenge is to protect the environment from future deterioration due to pollution and the lack of natural resources. Therefore, one of the most important things to pay attention to, and to eliminate the negative impact of, is solid waste. Solid waste is a double-edged sword depending on how it is dealt with: neglecting it causes a serious environmental risk of water, air, and soil pollution, while dealing with it in the right way makes it an important resource for preserving the environment. Accordingly, the proper management of solid waste and its reuse or recycling is the most important factor. Therefore, attention has been drawn to the use of solid waste in different ways, and the most common way is to use it as an alternative
A new method is presented in this work to detect the existence of hidden data, embedded as a secret message, in images. This method should be applied only to images that have the same visible properties (visually similar), where human eyes cannot detect the difference between them. The method is based on an image quality metric (the Structural Content metric), that is, a comparison between the original images and the stego images, and it determines the size of the hidden data. We applied the method to four different images; it detected the hidden data and found exactly the size of the hidden data.
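The Structural Content metric itself is simple to state in code; the sketch below computes SC as the ratio of pixel energies and flags a stego image when SC departs from 1. The tolerance is an assumed decision rule; the paper's exact threshold and size-estimation step are not reproduced here.

import numpy as np

def structural_content(original, stego):
    # Structural Content (SC): ratio of pixel energies; SC == 1 for
    # identical images, so deviation from 1 signals embedded data.
    o = original.astype(float)
    s = stego.astype(float)
    return (o ** 2).sum() / (s ** 2).sum()

def looks_stego(original, stego, tol=1e-6):
    # Tolerance is an assumed decision rule, not the paper's exact one.
    return abs(structural_content(original, stego) - 1.0) > tol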