JPEG is the most popular image compression and encoding technique and is widely used in many applications (images, video, and 3D animation). Researchers are therefore keen to extend this widespread technique to compress images at higher compression ratios while preserving image quality as much as possible. For this reason, this paper introduces an improved JPEG method based on a fast DCT that removes most of the zero coefficients while keeping their positions within each transformed block. Additionally, arithmetic coding is applied instead of Huffman coding. The results show that the proposed algorithm achieves better image quality than the traditional JPEG technique.
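As a rough illustration of the transform-and-pack stage described above, the sketch below applies a naive 2-D DCT to an 8×8 block, quantizes it, and stores only the nonzero coefficients together with their positions. This is a minimal stand-in for the zero-removal idea; the function names, the quantization step `q`, and the flat position list are assumptions, not the paper's actual format:

```python
import math

def dct_1d(v):
    """Naive orthonormal 1-D DCT-II of a sequence."""
    n = len(v)
    out = []
    for k in range(n):
        s = sum(v[i] * math.cos(math.pi * (2 * i + 1) * k / (2 * n)) for i in range(n))
        scale = math.sqrt(1 / n) if k == 0 else math.sqrt(2 / n)
        out.append(scale * s)
    return out

def dct_2d(block):
    """Separable 2-D DCT: transform rows, then columns."""
    rows = [dct_1d(r) for r in block]
    cols = [dct_1d(list(c)) for c in zip(*rows)]
    return [list(r) for r in zip(*cols)]

def pack_nonzeros(block, q=10.0):
    """Quantize, then keep only nonzero coefficients plus their positions."""
    flat = [round(x / q) for row in block for x in row]
    positions = [i for i, c in enumerate(flat) if c != 0]
    values = [flat[i] for i in positions]
    return positions, values, len(flat)

def unpack(positions, values, n):
    """Rebuild the flat coefficient array from positions and values."""
    flat = [0] * n
    for p, v in zip(positions, values):
        flat[p] = v
    return flat
```

For a flat (constant) block, only the DC coefficient survives quantization, so the packed representation shrinks to a single position/value pair.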
In this paper, a fast lossless compression method for medical images is introduced. It splits the image into blocks according to their nature, uses polynomial approximation to decompose the image signal, and then applies run-length coding to the residue part of the image, which represents the error introduced by the polynomial approximation. Finally, Huffman coding is applied as a last stage to encode the polynomial coefficients and the run-length codes. The test results indicate that the suggested method achieves promising performance.
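The decompose-then-encode pipeline can be sketched as follows, using a degree-1 least-squares fit as the polynomial approximation and a simple run-length encoder on the rounded residue. The block length, polynomial degree, and function names are illustrative assumptions, and the final Huffman stage is omitted:

```python
def fit_line(ys):
    """Least-squares linear fit y ≈ a*x + b over x = 0..n-1."""
    n = len(ys)
    xs = range(n)
    sx = sum(xs); sy = sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

def residue(ys, a, b):
    """Prediction error after subtracting the rounded linear model."""
    return [y - round(a * x + b) for x, y in enumerate(ys)]

def rle(seq):
    """Run-length encode a sequence as [value, count] pairs."""
    out = []
    for v in seq:
        if out and out[-1][0] == v:
            out[-1][1] += 1
        else:
            out.append([v, 1])
    return out
```

When a block is well approximated by the polynomial, the residue is a long run of zeros, which run-length coding compresses to a single pair.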
A new algorithm is proposed to compress speech signals using the wavelet transform and linear predictive coding (LPC). Compression is based on selecting a small number of approximation coefficients produced by wavelet decomposition (Haar and db4) at a suitably chosen level while discarding the detail coefficients; the approximation coefficients are then windowed with a rectangular window and fed to the linear predictor. The Levinson-Durbin algorithm is used to compute the LP coefficients, reflection coefficients, and prediction error. The compressed files contain the LP coefficients and the previous sample; these files are very small compared to the original signals. The compression ratio is calculated from the size of the compressed file relative to the original signal.
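The Levinson-Durbin recursion mentioned above can be sketched as below, computing the LP coefficients, reflection coefficients, and final prediction error from an autocorrelation sequence. This is a textbook formulation, not the paper's exact implementation:

```python
def levinson_durbin(r, order):
    """Levinson-Durbin recursion.

    r     : autocorrelation sequence r[0..order]
    returns (a, refl, e): LP polynomial a (a[0] = 1), reflection
    coefficients, and the final prediction error.
    """
    a = [0.0] * (order + 1)
    a[0] = 1.0
    e = r[0]
    refl = []
    for i in range(1, order + 1):
        # correlation of the current predictor with the next lag
        acc = r[i]
        for j in range(1, i):
            acc += a[j] * r[i - j]
        k = -acc / e
        refl.append(k)
        # symmetric update of the predictor coefficients
        a_new = a[:]
        for j in range(1, i):
            a_new[j] = a[j] + k * a[i - j]
        a_new[i] = k
        a = a_new
        # prediction error shrinks by the factor (1 - k^2)
        e *= (1.0 - k * k)
    return a, refl, e
```

For an AR(1)-like autocorrelation r = [1, 0.5, 0.25], the recursion recovers a first-order predictor (a1 = -0.5) and a zero second reflection coefficient, as expected.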
The fields of the image and its kinetic signs constitute a semiotic presence for sign-based communication and a widening of the dialectical bond between signifiers and their signifieds, a bond enacted by the directorial vision to produce concealed meanings whose transitional essence passes through ideas as the givens of the performance. Visual encoding seeks to project a duality of meaning across the multiple fields of the theatrical performance, and to understand the meaning that emerges from these visual encodings, the need arose to study how these encodings are formed.
WA Shukur, FA Abdullatif, Ibn Al-Haitham Journal For Pure and Applied Sciences, 2011. With the widespread use of the internet and the increasing value of information, steganography has become very important to communication. Over many years, different types of digital cover have been used to hide information as a covert channel. Images are among the most important digital covers used in steganography because they are widely exchanged on the internet without arousing suspicion.
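A minimal sketch of the image-as-cover idea, assuming the common least-significant-bit (LSB) embedding technique; the abstract does not specify the embedding method, so LSB here is purely illustrative:

```python
def embed_lsb(pixels, message_bits):
    """Hide bits in the least-significant bit of successive pixel values."""
    out = list(pixels)
    for i, bit in enumerate(message_bits):
        out[i] = (out[i] & ~1) | bit  # clear LSB, then set it to the bit
    return out

def extract_lsb(pixels, n_bits):
    """Read the first n_bits back out of the pixel LSBs."""
    return [p & 1 for p in pixels[:n_bits]]
```

Because only the lowest bit of each pixel changes (a difference of at most 1 in intensity), the stego image is visually indistinguishable from the cover.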
The study discusses the marketing profile of electoral candidates and politicians. In the technological era, the image that takes root in the minds of voters has become more important than ideologies or party affiliations: voters no longer pay attention to concepts such as liberal, conservative, right-wing, or secular, while their interest in the candidates themselves has increased. Consultants and image experts are able to make a dramatic shift in candidates' electoral roles; as specialists in the electoral arena, they dominate the roles of political parties.
The importance of the study stems from the fact that, in our contemporary world, the image exceeds its normal framework to become political and cultural.
Convergence speed is the most important feature of the Back-Propagation (BP) algorithm, and many improvements have been proposed since its introduction to speed up the convergence phase. In this paper, a new modified BP algorithm called Speeding Up Back-Propagation Learning (SUBPL) is proposed and compared to standard BP. Different data sets were implemented and experimented on to verify the improvement achieved by SUBPL.
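Standard BP, the baseline that SUBPL is compared against, can be sketched as a tiny 2-2-1 sigmoid network trained by squared-error gradient descent. The network size, learning rate, and class name are illustrative assumptions; the SUBPL modification itself is not reproduced here:

```python
import math, random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class TinyNet:
    """2-2-1 sigmoid network trained with standard back-propagation."""

    def __init__(self, seed=1):
        rnd = random.Random(seed)
        self.w1 = [[rnd.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
        self.b1 = [0.0, 0.0]
        self.w2 = [rnd.uniform(-1, 1) for _ in range(2)]
        self.b2 = 0.0

    def forward(self, x):
        h = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
             for row, b in zip(self.w1, self.b1)]
        y = sigmoid(sum(w, ) if False else sum(w * hi for w, hi in zip(self.w2, h)) + self.b2)
        return h, sigmoid(sum(w * hi for w, hi in zip(self.w2, h)) + self.b2)

    def train_step(self, x, t, lr=0.5):
        h, y = self.forward(x)
        # output-layer delta: squared-error loss times sigmoid derivative
        dy = (y - t) * y * (1 - y)
        # hidden-layer deltas propagated back through w2
        dh = [dy * w * hi * (1 - hi) for w, hi in zip(self.w2, h)]
        for i in range(2):
            self.w2[i] -= lr * dy * h[i]
            for j in range(2):
                self.w1[i][j] -= lr * dh[i] * x[j]
            self.b1[i] -= lr * dh[i]
        self.b2 -= lr * dy

def total_loss(net, data):
    return sum(0.5 * (net.forward(x)[1] - t) ** 2 for x, t in data)
```

Training a few hundred epochs on a simple logical function (e.g. AND) drives the loss down; measuring how many epochs a given accuracy takes is exactly the convergence-speed comparison the paper performs.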
The variation of the compression index Cc and the swelling index Cs with the degree of saturation S was studied on unsaturated and fully saturated soils at several degrees of saturation (100%, 91%, 85%, 75%, and 60%). Several mathematical equations were found to describe these relationships; these equations can be used to predict settlement during the consolidation process in unsaturated and fully saturated soils.
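Once Cc is known, it enters the standard one-dimensional consolidation settlement relation; a minimal sketch follows (this is the textbook formula, and the study's fitted Cc(S) and Cs(S) equations are not reproduced here):

```python
import math

def consolidation_settlement(cc, h, e0, sigma0, sigma_f):
    """Primary consolidation settlement of a normally consolidated layer:

        S = Cc * H / (1 + e0) * log10(sigma_f / sigma_0)

    cc      : compression index
    h       : layer thickness (same unit as the returned settlement)
    e0      : initial void ratio
    sigma0  : initial effective vertical stress
    sigma_f : final effective vertical stress
    """
    return cc * h / (1.0 + e0) * math.log10(sigma_f / sigma0)
```

For example, a 2 m clay layer with Cc = 0.3, e0 = 1.0, and a doubling of effective stress settles by roughly 0.09 m.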
In this paper, we investigate the automatic recognition of emotion in text. We perform experiments with a new method of classification based on the PPM character-based text compression scheme. These experiments involve both coarse-grained classification (whether a text is emotional or not) and fine-grained classification such as recognising Ekman's six basic emotions (Anger, Disgust, Fear, Happiness, Sadness, Surprise). Experimental results with three datasets show that the new method significantly outperforms traditional word-based text classification methods. The results show that the PPM compression-based classification method is able to distinguish between emotional and non-emotional text with high accuracy, and between texts involving different emotions.
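The compression-based classification idea can be sketched as follows: a text is assigned to the class whose training corpus compresses it most cheaply, a proxy for minimum cross-entropy. PPM itself is not in the Python standard library, so `zlib` stands in as the compressor here, and the corpus contents and function names are illustrative:

```python
import zlib

def extra_bytes(corpus: str, text: str) -> int:
    """Cross-entropy proxy: extra compressed bytes needed to append
    `text` to a class corpus (zlib stands in for PPM here)."""
    base = len(zlib.compress(corpus.encode(), 9))
    both = len(zlib.compress((corpus + text).encode(), 9))
    return both - base

def classify(text, corpora):
    """Assign `text` to the class whose corpus predicts it best,
    i.e. whose corpus+text compresses most cheaply."""
    return min(corpora, key=lambda label: extra_bytes(corpora[label], text))
```

A text that reuses the vocabulary of one class's corpus costs almost nothing to append there, so that class wins; a real PPM coder sharpens this effect with adaptive context modelling.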
Generally, direct measurement of the soil compression index (Cc) is expensive and time-consuming. To save time and effort, indirect methods of obtaining Cc can be an inexpensive option. These indirect methods are usually based on correlations with more easily measured descriptive variables such as the liquid limit, soil density, and natural water content. This study used ANFIS and regression methods to obtain Cc indirectly. To achieve this aim, 177 undisturbed samples were collected from cohesive soil in Sulaymaniyah Governorate, Iraq. The results indicate that the ANFIS models outperformed the regression method in estimating Cc, with R2 values of 0.66 and 0.48 for the ANFIS and regression methods, respectively.
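For context, a widely used baseline correlation for estimating Cc from the liquid limit is the classic Terzaghi and Peck relation sketched below. This is the textbook empirical formula, not the ANFIS or regression model fitted in the study:

```python
def cc_terzaghi_peck(liquid_limit):
    """Terzaghi & Peck empirical correlation for normally consolidated
    clays: Cc = 0.009 * (LL - 10), with LL the liquid limit in percent."""
    return 0.009 * (liquid_limit - 10.0)
```

Correlations like this are the "indirect methods" the abstract refers to; the study's contribution is replacing a single fixed formula with models fitted to local data.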
There are many methods of searching a large amount of data to find one particular piece of information, such as finding a person's name in a mobile phone's records. Certain methods of organizing data make the search process more efficient; the objective of these methods is to find the element at the least cost (least time). The binary search algorithm is faster than sequential search and other commonly used search algorithms. This research develops the binary search algorithm by using a new structure called Triple, in which data are represented as triples consisting of three locations (1-Top, 2-Left, and 3-Right). The binary search algorithm divides the search interval in half; this process bounds the maximum number of comparisons.
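As a baseline for the Triple-based variant, classic binary search with a comparison counter can be sketched as below; the Triple structure itself is the paper's contribution and is not reproduced here:

```python
def binary_search(sorted_list, target):
    """Classic iterative binary search over a sorted list.

    Returns (index, comparisons), with index = -1 if target is absent.
    Halving the interval bounds comparisons by about log2(n) + 1.
    """
    lo, hi = 0, len(sorted_list) - 1
    comparisons = 0
    while lo <= hi:
        mid = (lo + hi) // 2
        comparisons += 1
        if sorted_list[mid] == target:
            return mid, comparisons
        if sorted_list[mid] < target:
            lo = mid + 1   # target is in the right half
        else:
            hi = mid - 1   # target is in the left half
    return -1, comparisons
```

On 100 sorted elements the search never needs more than 7-8 comparisons, which is the logarithmic behaviour any restructuring (such as the Triple layout) is trying to improve in practice.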