Akaike’s Information Criterion (AIC) is a popular method for estimating the number of sources impinging on an array of sensors, a problem of great interest in several applications. The performance of AIC degrades under low Signal-to-Noise Ratio (SNR). This paper is concerned with the development and application of quadrature mirror filters (QMF) for improving the performance of AIC. A new system is proposed that estimates the number of sources by applying AIC to the outputs of a filter bank consisting of QMFs. The proposed system can estimate the number of sources under low SNR.
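As a point of reference, the classical eigenvalue-based AIC estimator of the number of sources (which such a system would apply to the filter-bank outputs) can be sketched as follows; the function name and interface are illustrative assumptions, and the QMF decomposition itself is not shown.

```python
import numpy as np

def aic_num_sources(X):
    """Estimate the number of sources from array snapshots X (p sensors x N
    snapshots) using the classical eigenvalue-based AIC criterion.
    Generic illustration only, not the paper's QMF filter-bank system."""
    p, N = X.shape
    R = (X @ X.conj().T) / N                      # sample covariance matrix
    lam = np.sort(np.linalg.eigvalsh(R))[::-1]    # eigenvalues, descending
    aic = []
    for k in range(p):                            # candidate number of sources
        tail = lam[k:]                            # the p-k smallest eigenvalues
        g = np.exp(np.mean(np.log(tail)))         # geometric mean
        a = np.mean(tail)                         # arithmetic mean
        aic.append(-2.0 * N * (p - k) * np.log(g / a) + 2.0 * k * (2 * p - k))
    return int(np.argmin(aic))                    # k minimising AIC(k)
```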
Producing pseudo-random numbers (PRN) with high performance is one of the important issues attracting many researchers today. This paper proposes pseudo-random number generator models that integrate a Hopfield Neural Network (HNN) with a fuzzy logic system to improve the randomness of the Hopfield pseudo-random generator. The fuzzy logic system is introduced to control the update of the HNN parameters. The proposed model is compared with three state-of-the-art baselines; analysis of the results using the National Institute of Standards and Technology (NIST) statistical tests and the ENT test shows that the proposed model is statistically significant in comparison to the baselines, which demonstrates the competency of the neuro-fuzzy based model to produce …
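The abstract does not give the generator's internals, so the following is only a minimal sketch of the general idea: a small Hopfield-style network updated asynchronously, with bits read from the neuron states, and a simple placeholder perturbation standing in for the fuzzy control of the HNN parameters. All names, sizes and update rules here are assumptions.

```python
import numpy as np

def hopfield_prng_bits(n_bits, n_neurons=16, seed=1):
    """Illustrative Hopfield-network-driven bit generator (not the paper's model).
    The fuzzy controller is replaced by a placeholder weight perturbation."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((n_neurons, n_neurons))
    W = (W + W.T) / 2.0                           # symmetric Hopfield-style weights
    np.fill_diagonal(W, 0.0)
    s = np.sign(rng.standard_normal(n_neurons))   # bipolar state in {-1, +1}
    bits = []
    while len(bits) < n_bits:
        i = rng.integers(n_neurons)               # asynchronous update of one neuron
        s[i] = 1.0 if W[i] @ s >= 0 else -1.0
        bits.append(1 if s[i] > 0 else 0)
        if len(bits) % 64 == 0:                   # placeholder for the fuzzy-driven
            W += 0.01 * rng.standard_normal(W.shape)  # update of HNN parameters
    return bits
```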
In this paper, a wireless network is planned; the network is based on the IEEE 802.16e (WiMAX) standard. The targets of this paper are maximized coverage and service with low operational fees. The WiMAX network is planned through three approaches. In the first approach, the WiMAX network coverage is maximized by extending cell coverage and selecting the best sites (with a Band Width (BW) of 5 MHz and 20 MHz per sector and four sectors per cell). In the second approach, interference analysis is performed in CNIR mode. In the third approach of the planning, Quality of Service (QoS) is tested and evaluated. The ATDI ICS (Interference Cancellation System) software is used to perform the planning. The results show that the planned area covers 90.49% of Baghdad City and uses 1000 mob…
Sending information at the present time requires both speed and protection, so data compression is used to provide speed and encryption is used to provide protection. In this paper, a method is proposed to provide compression and security for secret information before sending it. The proposed method is based on special keys with the MTF transform to provide compression, and on RNA coding with MTF encoding to provide security. The proposed method relies on multiple secret keys, each designed in a special way. The main reason for designing these keys in a special way is to protect them from prediction by unauthorized users.
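Assuming MTF here refers to the standard move-to-front transform used in compression, a plain (key-free) version of that building block looks like the sketch below; the paper's key-based variant and the RNA coding step are not reproduced.

```python
def mtf_encode(data: bytes) -> list:
    """Move-to-front encoding: each byte is replaced by its current index in a
    recency list, and that byte is then moved to the front of the list."""
    table = list(range(256))
    out = []
    for b in data:
        idx = table.index(b)
        out.append(idx)
        table.pop(idx)
        table.insert(0, b)
    return out

def mtf_decode(indices: list) -> bytes:
    """Inverse move-to-front: indices are looked up in the same evolving list."""
    table = list(range(256))
    out = bytearray()
    for idx in indices:
        b = table.pop(idx)
        out.append(b)
        table.insert(0, b)
    return bytes(out)
```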
This work aims to develop a secure lightweight cipher algorithm for constrained devices. Secure communication among constrained devices is a critical issue during data transmission from client to server devices. Lightweight cipher algorithms are a security solution for constrained devices that require low computational cost and small memory. However, most lightweight algorithms suffer from a trade-off between complexity and speed when producing a robust cipher. The PRESENT cipher has been successfully adopted as a lightweight cryptographic algorithm that surpasses other ciphers in terms of computational processing, requiring only low-complexity operations. The mathematical model of …
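For orientation, one round of the published PRESENT cipher consists of a round-key XOR, a 4-bit S-box layer and a bit permutation; a minimal sketch of that round structure is given below (the 31-round loop, the key schedule and any modifications made in this work are omitted).

```python
# Sketch of one PRESENT round, with the 64-bit state held as a Python int.
SBOX = [0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD,
        0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2]

def sbox_layer(state: int) -> int:
    """Apply the 4-bit PRESENT S-box to each of the 16 nibbles of the state."""
    out = 0
    for i in range(16):
        nibble = (state >> (4 * i)) & 0xF
        out |= SBOX[nibble] << (4 * i)
    return out

def p_layer(state: int) -> int:
    """PRESENT bit permutation: bit i moves to 16*i mod 63 (bit 63 is fixed)."""
    out = 0
    for i in range(64):
        bit = (state >> i) & 1
        dest = 63 if i == 63 else (16 * i) % 63
        out |= bit << dest
    return out

def present_round(state: int, round_key: int) -> int:
    """One round: addRoundKey, then sBoxLayer, then pLayer."""
    return p_layer(sbox_layer(state ^ round_key))
```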
In recent years, the iris biometric has received wide interest in biometric-based systems, because it is one of the most accurate biometrics for proving users' identities and thus provides high security for the systems concerned. This research article presents an efficient method to detect the outer boundary of the iris, using a new form of leading edge detection technique. This technique is very useful for isolating two regions that have convergent intensity levels in gray-scale images, which is the main difficulty of iris isolation, because it is hard to find the border that separates the lighter gray background (sclera) from the light gray foreground (iris texture). The proposed met…
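The leading-edge technique itself is not specified in this excerpt, so the sketch below only illustrates the generic idea of locating the outer (limbic) boundary as the radius with the strongest intensity jump from iris texture to sclera; the function names, sampling scheme and fixed-centre assumption are all illustrative.

```python
import numpy as np

def outer_iris_boundary(gray, center, r_min, r_max):
    """Generic outer-iris boundary localisation (not the paper's leading-edge
    detector): for a fixed centre, pick the radius with the largest average
    radial intensity jump between the darker iris and the brighter sclera."""
    cy, cx = center
    angles = np.linspace(0, 2 * np.pi, 360, endpoint=False)
    best_r, best_score = r_min, -np.inf
    for r in range(r_min, r_max):
        inner = _ring_mean(gray, cx, cy, r, angles)
        outer = _ring_mean(gray, cx, cy, r + 2, angles)
        score = outer - inner                # sclera is brighter than iris texture
        if score > best_score:
            best_r, best_score = r, score
    return best_r

def _ring_mean(gray, cx, cy, r, angles):
    """Mean grey level sampled on a circle of radius r around (cx, cy)."""
    ys = np.clip((cy + r * np.sin(angles)).astype(int), 0, gray.shape[0] - 1)
    xs = np.clip((cx + r * np.cos(angles)).astype(int), 0, gray.shape[1] - 1)
    return float(gray[ys, xs].mean())
```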
Image fusion is one of the most important techniques in digital image processing; it involves developing software to integrate multiple sets of data for the same location. It is one of the newer fields adopted to solve digital-image problems and to produce high-quality images containing more information for the purposes of interpretation, classification, segmentation, compression, etc. In this research, the problems faced by different digital images, such as multi-focus images, are addressed through a simulation process using a camera to fuse various digital images based on previously adopted fusion techniques such as arithmetic techniques (BT, CNT and MLT) and statistical techniques (LMM,…
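As a minimal illustration of multi-focus fusion (not one of the specific arithmetic or statistical techniques named above), the sketch below keeps, at each pixel, the source image that is locally sharper according to a Laplacian focus measure.

```python
import numpy as np
from scipy.ndimage import laplace, uniform_filter

def fuse_multifocus(img_a, img_b, win=9):
    """Simple multi-focus fusion: at each pixel, keep the source whose local
    focus measure (absolute Laplacian averaged over a window) is larger.
    Illustrative sketch only; inputs are same-sized grayscale arrays."""
    focus_a = uniform_filter(np.abs(laplace(img_a.astype(float))), size=win)
    focus_b = uniform_filter(np.abs(laplace(img_b.astype(float))), size=win)
    mask = focus_a >= focus_b                 # True where img_a is sharper
    return np.where(mask, img_a, img_b)
```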
Like the digital watermark, which has been highlighted in previous studies, the quantum watermark aims to protect the copyright of any image and to validate its ownership using visible or invisible logos embedded in the cover image. In this paper, we propose a method to embed a logo image in a cover image based on quantum fields, where a certain amount of texture is encapsulated to encode the logo image before it is embedded in the cover image. The method also involves wavelet transforms such as the Haar basis transform, as well as geometric transformations. This combination of methods achieves a high degree of security and robustness for the watermarking technique. The results obtained from the experiment show that the values of Peak Sig…
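The Haar step mentioned above can be illustrated with a single-level 2D Haar wavelet transform as sketched below; the quantum-field encoding and the geometric transformations of the proposed method are not reproduced.

```python
import numpy as np

def haar_dwt2(img):
    """Single-level 2D Haar wavelet transform (unnormalised averages and
    differences). Returns the approximation (LL) and detail (LH, HL, HH)
    sub-bands of an even-sized grayscale image."""
    x = img.astype(float)
    x = x[: x.shape[0] - x.shape[0] % 2, : x.shape[1] - x.shape[1] % 2]
    # Row transform: pairwise averages (low-pass) and differences (high-pass).
    lo = (x[:, 0::2] + x[:, 1::2]) / 2.0
    hi = (x[:, 0::2] - x[:, 1::2]) / 2.0
    # Column transform on both halves.
    ll = (lo[0::2, :] + lo[1::2, :]) / 2.0
    lh = (lo[0::2, :] - lo[1::2, :]) / 2.0
    hl = (hi[0::2, :] + hi[1::2, :]) / 2.0
    hh = (hi[0::2, :] - hi[1::2, :]) / 2.0
    return ll, lh, hl, hh
```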
In this paper, an algorithm is introduced through which we can embed more data than regular spatial-domain methods. We compress the secret data using Huffman coding, and this compressed data is then embedded using the Laplacian sharpening method. We use Laplace filters to determine the effective hiding places; then, based on a threshold value, we find the places with the highest values obtained from these filters for embedding the watermark. In this work, our aim is to increase the capacity of the embedded information by using Huffman coding and, at the same time, to increase the security of the algorithm by hiding data in the places that have the highest edge values and are less noticeable. The perform…
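A hedged sketch of the embedding idea described above is shown below: pixels with the strongest Laplacian response are selected as hiding places and the (already compressed) payload bits are written into their least significant bits. The threshold parameter, scanning order and exact embedding rule are assumptions, not the paper's specification.

```python
import numpy as np
from scipy.ndimage import laplace

def embed_bits_in_edges(cover, bits, threshold):
    """Embed payload bits (0/1 ints) into the LSBs of the pixels of an 8-bit
    grayscale cover image whose Laplacian magnitude exceeds a threshold,
    scanning candidates in row-major order. Illustrative sketch only."""
    stego = cover.copy()
    edge_strength = np.abs(laplace(cover.astype(float)))
    ys, xs = np.where(edge_strength > threshold)     # candidate hiding places
    if len(bits) > len(ys):
        raise ValueError("payload too large for the selected edge region")
    for bit, y, x in zip(bits, ys, xs):
        stego[y, x] = (stego[y, x] & 0xFE) | bit     # replace the LSB
    return stego
```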
Data hiding (steganography) is a method used for data security and to protect data during transmission. Steganography hides the communication between two parties by embedding a secret message inside another cover (audio, text, image or video). In this paper, a new text steganography method is proposed that is based on a parser and the ASCII codes of non-printing characters to hide the secret information in an English cover text, after coding the secret message and compressing it using a modified Run-Length Encoding (RLE) method. The proposed method achieves a high capacity ratio for steganography (five times the cover text length) compared with other methods, and provides a transparency of 1.0 by depending on som…
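Only the compression step is concrete enough to sketch here: a plain run-length encoder/decoder is shown below, whereas the paper's modified RLE, the parser and the non-printing-character embedding are not reproduced.

```python
def rle_encode(text: str) -> list:
    """Plain run-length encoding: collapse each run of identical characters
    into a (symbol, run length) pair."""
    runs = []
    i = 0
    while i < len(text):
        j = i
        while j < len(text) and text[j] == text[i]:
            j += 1
        runs.append((text[i], j - i))
        i = j
    return runs

def rle_decode(runs: list) -> str:
    """Inverse of rle_encode: expand each (symbol, count) pair."""
    return "".join(ch * count for ch, count in runs)
```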
Because of threats and attacks against databases during transmission from sender to receiver, which is one of the most significant security concerns of network users, a lightweight cryptosystem using the Rivest Cipher 4 (RC4) algorithm is proposed. This cryptosystem maintains data privacy by encrypting the data into cipher form, transferring it over the network, and decrypting it back to the original data. Hence, ciphers represent an encapsulating system for database tables.
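Since the abstract names RC4 explicitly, the standard algorithm (key scheduling followed by the pseudo-random generation stream XORed with the data) is sketched below for reference; any adaptations made in the proposed cryptosystem are not reflected here.

```python
def rc4(key: bytes, data: bytes) -> bytes:
    """Standard RC4 stream cipher; the same call both encrypts and decrypts."""
    # Key-scheduling algorithm (KSA)
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    # Pseudo-random generation algorithm (PRGA), keystream XORed with the data
    out = bytearray()
    i = j = 0
    for byte in data:
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(byte ^ S[(S[i] + S[j]) % 256])
    return bytes(out)
```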