Data hiding is the process of encoding extra information in an image by making small modifications to its pixels. To be practical, the hidden data must be perceptually invisible yet robust to common signal processing operations. This paper introduces a scheme for hiding a signature image that can be as large as 25% of the host image data, so it can serve both digital watermarking and image/data hiding. The proposed algorithm applies an orthogonal discrete wavelet transform with two zero moments and improved time localization, called the discrete slantlet transform, to both the host and signature images. A scaling factor α in the frequency domain controls the quality of the watermarked images. Experimental results of signature image recovery after applying JPEG coding to the watermarked image are included.
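The embed-with-a-scaling-factor idea can be sketched in a few lines. The sketch below is a minimal illustration, not the paper's method: a one-level orthonormal Haar transform stands in for the slantlet transform, α-scaled signature coefficients are added to the host's coefficients, and recovery is non-blind (the original host is assumed available); all function names are illustrative.

```python
import numpy as np

def haar_matrix(n):
    """Orthonormal one-level Haar analysis matrix (n must be even)."""
    H = np.zeros((n, n))
    for k in range(n // 2):
        H[k, 2 * k] = H[k, 2 * k + 1] = 1 / np.sqrt(2)   # averaging rows
        H[n // 2 + k, 2 * k] = 1 / np.sqrt(2)            # differencing rows
        H[n // 2 + k, 2 * k + 1] = -1 / np.sqrt(2)
    return H

def embed(host, signature, alpha=0.1):
    """Add alpha-scaled signature coefficients to the host coefficients."""
    H = haar_matrix(host.shape[0])
    coeffs = H @ host @ H.T + alpha * (H @ signature @ H.T)
    return H.T @ coeffs @ H          # inverse transform (H is orthogonal)

def recover(watermarked, host, alpha=0.1):
    """Non-blind recovery: subtract the known host in the transform domain."""
    H = haar_matrix(host.shape[0])
    diff = H @ watermarked @ H.T - H @ host @ H.T
    return H.T @ (diff / alpha) @ H
```

Because the transform is linear and orthogonal, recovery is exact up to floating-point error when no attack (e.g. JPEG coding) is applied.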
This work defines new generalized gamma and beta functions involving the recently suggested seven-parameter Mittag-Leffler function, followed by a review of all related special cases. In addition, the necessary investigations are carried out for the new generalized beta function, including the Mellin transform, differential formulas, integral representations, and essential summation relations. Furthermore, a statistical application of the new generalized beta function is presented.
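For background, such generalizations typically insert a special-function weight into the Euler beta integral. The sketch below is only illustrative of that construction pattern, using the classical two-parameter Mittag-Leffler function as the weight; it is not the paper's seven-parameter definition.

```latex
B(x,y) = \int_0^1 t^{x-1}(1-t)^{y-1}\,dt, \qquad \Re(x),\,\Re(y) > 0,
\qquad
E_{\alpha,\beta}(z) = \sum_{k=0}^{\infty} \frac{z^k}{\Gamma(\alpha k + \beta)},
```
```latex
B_{E}(x,y;p) = \int_0^1 t^{x-1}(1-t)^{y-1}\,
E_{\alpha,\beta}\!\left(\frac{-p}{t(1-t)}\right) dt .
```

Here $B_E$ reduces to Chaudhry's extended beta function when $\alpha=\beta=1$ (since $E_{1,1}(z)=e^{z}$) and to the classical $B(x,y)$ when $p=0$.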
Time series analysis is a statistical approach for analyzing a series of data. It is the most popular statistical method for forecasting and is widely used in statistical and economic applications. The wavelet transform is a powerful mathematical technique that converts a signal into a time-frequency representation, providing signal information in both the time domain and the frequency domain. The aims of this study are to propose a wavelet function derived as a quotient of two different Fibonacci-coefficient polynomials, and to compare ARIMA with wavelet-ARIMA. Daily wind-speed time series data are used for this study. From the obtained results, the
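A wavelet-ARIMA setup typically decomposes the series and models the smooth (approximation) and rough (detail) parts separately. The sketch below shows a one-level decomposition with the Haar wavelet standing in for the Fibonacci-polynomial wavelet proposed here, which is not reproduced.

```python
import numpy as np

def haar_step(x):
    """One level of the Haar DWT: approximation and detail coefficients."""
    x = np.asarray(x, dtype=float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # smooth part (trend)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail part (fluctuations)
    return a, d

def haar_inverse(a, d):
    """Exact reconstruction from one level of coefficients."""
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x
```

In a wavelet-ARIMA pipeline, separate ARIMA models would be fitted to `a` and `d`, and their forecasts recombined through the inverse transform.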
Grabisch and Labreuche have recently proposed a generalization of capacities called bi-capacities. More recently, the author proposed a new approach for studying bi-capacities by introducing a notion of ternary-element sets. In this paper, we present several results based on this approach, including the bipolar Möbius transform, the importance index, and the interaction index of bi-capacities.
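The bipolar Möbius transform generalizes the classical Möbius transform of a capacity (set function). As background, a minimal sketch of the classical, unipolar transform follows; the capacity values and set names are illustrative, and the bipolar version on ternary-element sets is not reproduced here.

```python
from itertools import combinations

def subsets(s):
    """All subsets of a set, as frozensets."""
    s = list(s)
    for r in range(len(s) + 1):
        for c in combinations(s, r):
            yield frozenset(c)

def mobius(v, universe):
    """Classical Mobius transform: m(A) = sum_{B subset A} (-1)^{|A\\B|} v(B)."""
    return {A: sum((-1) ** len(A - B) * v[B] for B in subsets(A))
            for A in subsets(universe)}
```

The inverse (zeta) transform recovers the capacity: v(A) = Σ_{B⊆A} m(B), which is a quick sanity check on any implementation.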
This paper introduces a hybrid technique for lossless compression of natural and medical images. It is based on integrating bit-plane slicing and the wavelet transform with a mixed polynomial of linear and nonlinear bases. The experiments showed high compression performance with fully guaranteed reconstruction.
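Bit-plane slicing, the first ingredient of the hybrid scheme, splits an 8-bit image into eight binary images that can be coded independently. A minimal sketch (function names are illustrative):

```python
import numpy as np

def bit_planes(img):
    """Slice an 8-bit image into 8 binary planes, LSB first."""
    img = np.asarray(img, dtype=np.uint8)
    return [(img >> b) & 1 for b in range(8)]

def from_planes(planes):
    """Losslessly reassemble the original image from its bit planes."""
    out = np.zeros_like(planes[0], dtype=np.uint8)
    for b, p in enumerate(planes):
        out |= p.astype(np.uint8) << b
    return out
```

Reassembly is exact, which is what makes the slicing suitable as a stage in a lossless pipeline.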
This work aims to design a system able to diagnose two types of human brain tumors (benign and malignant) using the curvelet transform and a probabilistic neural network. The proposed method comprises preprocessing with a Gaussian filter, segmentation with fuzzy c-means, and feature extraction with the curvelet transform applied to MRI images. The extracted features are used to train and test the probabilistic neural network. The proposed screening technique successfully detected brain cancer from MRI images with almost 100% recognition accuracy.
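The classification stage, a probabilistic neural network (PNN), is essentially a Gaussian Parzen-window density estimate per class followed by a maximum-score decision. A minimal sketch assuming generic feature vectors (the curvelet features themselves are not reproduced, and `sigma` is an illustrative smoothing parameter):

```python
import numpy as np

def pnn_classify(x, train_X, train_y, sigma=0.5):
    """PNN decision: sum a Gaussian kernel over each class's training
    patterns and return the class with the largest average response."""
    classes = np.unique(train_y)
    scores = []
    for c in classes:
        Xc = train_X[train_y == c]
        d2 = np.sum((Xc - x) ** 2, axis=1)            # squared distances
        scores.append(np.exp(-d2 / (2 * sigma**2)).sum() / len(Xc))
    return classes[int(np.argmax(scores))]
```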
This work presents a rotation-invariant plant recognition system based on plant leaves. Wavelet energy features are extracted from sub-images (blocks), alongside three leaf shape features: area, perimeter, and circularity ratio. Eight species of leaves of different sizes and colors are used, with 15 samples per leaf. Leaf images are rotated at angles of 90°, 180°, and 270° (counterclockwise and clockwise). Matching uses the Euclidean distance; the recognition rate was 98.2% with and without rotation.
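The matching step, block-wise energy features compared by Euclidean distance, can be sketched as follows. For brevity the sketch uses the raw energy (sum of squares) of each block as a stand-in for wavelet sub-band energies; names and block size are illustrative.

```python
import numpy as np

def block_energies(img, bs=4):
    """Energy (sum of squared values) of each bs-by-bs block,
    flattened into one feature vector per image."""
    h, w = (d - d % bs for d in img.shape)           # crop to block grid
    blocks = img[:h, :w].reshape(h // bs, bs, w // bs, bs)
    return (blocks ** 2).sum(axis=(1, 3)).ravel()

def nearest_label(query, db_feats, db_labels):
    """1-NN matching with Euclidean distance."""
    d = np.linalg.norm(db_feats - query, axis=1)
    return db_labels[int(np.argmin(d))]
```

In the full system the shape features (area, perimeter, circularity ratio) would simply be appended to each energy vector before matching.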
The past years have seen rapid development of image compression techniques, mainly due to the need for fast and efficient methods for storing and transmitting data. Compression is the process of representing data in a compact form rather than in its original, uncompressed form. In this paper, integer implementations of Arithmetic Coding (AC) and the Discrete Cosine Transform (DCT) were applied to color images. The DCT was applied using the YCbCr color model. The transformed image was then quantized with the standard quantization tables for luminance and chrominance, the quantized coefficients were scanned in zigzag order, and the output was encoded using AC. The results showed a decent compression ratio
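The transform-quantize-zigzag front end of this pipeline can be sketched directly. The sketch below uses a uniform quantizer `q` in place of the standard luminance/chrominance tables, and omits the arithmetic-coding back end; function names are illustrative.

```python
import numpy as np

def dct_matrix(n=8):
    """Orthonormal DCT-II matrix; C @ X @ C.T is the 2-D DCT of a block."""
    k, j = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    C = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * j + 1) * k / (2 * n))
    C[0, :] = np.sqrt(1.0 / n)
    return C

def zigzag_order(n=8):
    """JPEG-style zigzag: walk anti-diagonals, alternating direction."""
    return sorted(((i, j) for i in range(n) for j in range(n)),
                  key=lambda p: (p[0] + p[1],
                                 p[0] if (p[0] + p[1]) % 2 else p[1]))

def encode_block(block, q=16):
    """Transform, uniformly quantize, then zigzag-scan one block."""
    C = dct_matrix(block.shape[0])
    coeffs = np.round((C @ block @ C.T) / q).astype(int)
    return [coeffs[i, j] for i, j in zigzag_order(block.shape[0])]
```

The zigzag order front-loads the low-frequency coefficients, so the long runs of quantized zeros at the end compress well under the entropy coder.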
Recent advances in wireless communication systems have made use of the OFDM technique to achieve high-data-rate transmission. Sensitivity to the frequency offset between the transmitter and receiver carrier frequencies is one of the major problems in OFDM systems: the offset introduces inter-carrier interference in the OFDM symbol, degrading BER performance. In this paper, a Multi-Orthogonal-Band (MOB-OFDM) system based on the Discrete Hartley Transform (DHT) is proposed to improve BER performance. The OFDM spectrum is divided into equal sub-bands, and the data are divided among these bands to form a local OFDM symbol in each sub-band using the DHT. The global OFDM symbol is formed from all sub-bands together using
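The DHT at the core of the proposal is defined by the cas kernel, cas(θ) = cos θ + sin θ, and unlike the DFT it maps real signals to real signals. A minimal sketch (the MOB-OFDM sub-band framing itself is not reproduced):

```python
import numpy as np

def dht(x):
    """Discrete Hartley transform via the cas kernel.
    The DHT is an involution up to a factor of N: DHT(DHT(x)) = N * x."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    k = np.arange(n)
    angles = 2 * np.pi * np.outer(k, k) / n
    cas = np.cos(angles) + np.sin(angles)
    return cas @ x
```

The self-inverse property means the same block can serve as both modulator and demodulator, one reason the DHT is attractive for OFDM-style systems.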
The analysis of the hyperlink structure of the web has led to significant improvements in web information retrieval. This survey evaluates and analyzes relevant research publications on link analysis in web information retrieval that utilize diverse methods, considering the publication year, the aims of each article, the algorithms used, and the findings obtained after applying those algorithms. The findings reveal that PageRank, Weighted PageRank, and Weighted Page Content Rank are the algorithms most widely employed by researchers to analyze hyperlinks in web information retrieval. Finally, the paper summarizes these previous studies.
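PageRank, the baseline for the weighted variants surveyed, is a power iteration on the damped link matrix. A minimal sketch assuming a dense adjacency matrix (parameter names are illustrative):

```python
import numpy as np

def pagerank(adj, d=0.85, tol=1e-10):
    """Power iteration: r = (1-d)/n + d * P^T r, where P is the
    row-stochastic link matrix and adj[i][j] = 1 for a link i -> j."""
    A = np.asarray(adj, dtype=float)
    n = A.shape[0]
    A[A.sum(axis=1) == 0] = 1.0          # dangling nodes link to everyone
    P = A / A.sum(axis=1, keepdims=True)
    r = np.full(n, 1.0 / n)
    while True:
        r_new = (1 - d) / n + d * (P.T @ r)
        if np.abs(r_new - r).sum() < tol:
            return r_new
        r = r_new
```

Weighted PageRank replaces the uniform split `A / A.sum(...)` with weights based on in- and out-link counts, and Weighted Page Content Rank additionally folds in content scores.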
Information pollution is regarded as a major problem facing journalists working in the editing section, where journalistic materials are exposed to such pollution on their way through the editing pyramid. This research attempts to define the concept of journalistic information pollution and to identify its causes and sources. The descriptive research method was applied to achieve the research objectives, and a questionnaire was used to collect data. The findings indicate that journalists are aware of the existence of information pollution in journalism and that this pollution has identifiable causes and sources.