A security system can be defined as a method of providing protection to any type of data. Most security systems perform a sequential process in order to achieve good protection. Authentication is one step in such a process, used to verify a user's permission to access and use the system. Several kinds of methods are utilized, including knowledge-based and biometric features. The electroencephalograph (EEG) signal is one of the most widely used signals in the bioinformatics field. EEG has five major wave patterns: Delta, Theta, Alpha, Beta, and Gamma. Each wave has five features: amplitude, wavelength, period, speed, and frequency. The linear feedback shift register (LFSR) is a widely used tool for generating cryptographic sequences, with common configurations such as Galois and Fibonacci. In this paper, an authentication system is designed that uses features of the Gamma and Beta waves to generate the user's authentication sequence with Galois LFSRs.
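The sequence-generation step can be illustrated with a minimal Galois LFSR sketch. The register width, tap mask, and the idea of seeding the register from quantized Gamma/Beta wave features are assumptions for illustration, not the paper's exact design.

```python
# Illustrative Galois LFSR keystream generator. Seeding it from
# EEG-derived features (as the paper proposes) is assumed here;
# the width and taps below are standard textbook choices.

def galois_lfsr(seed: int, taps: int, nbits: int):
    """Yield one pseudo-random bit per step from a Galois LFSR."""
    state = seed & ((1 << nbits) - 1)
    while True:
        out = state & 1          # output bit is the LSB
        state >>= 1
        if out:                  # feedback XORs the tap mask
            state ^= taps
        yield out

# Hypothetical example: a 16-bit register whose seed would come from
# quantized Gamma/Beta features; taps 0xB400 give a maximal-length sequence.
gen = galois_lfsr(seed=0xACE1, taps=0xB400, nbits=16)
keystream = [next(gen) for _ in range(32)]
```

In a real system the 32 bits above would serve as the user's authentication sequence, regenerated on demand from the same EEG features.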
Photonic crystal fiber interferometers are used in many sensing applications. In this work, we present an in-reflection photonic crystal fiber (PCF) Mach-Zehnder (MZ) interferometer based on micro-hole collapsing, which exhibits high sensitivity to different volatile organic compounds (VOCs) without the need for any permeable material. The interferometer is robust and compact, and consists of a stub of large-mode-area photonic crystal fiber spliced to standard single-mode fiber (SMF, Corning SMF-28), with an optimized splice loss of 0.19 dB. In the splice regions the voids of the holey fiber are completely collapsed, which allows the excitation and recombination of core and cladding modes. The device reflection
Extractive multi-document text summarization – summarization that removes redundant information from a document collection while preserving its salient sentences – has recently attracted large interest in automatic models. This paper proposes an extractive multi-document text summarization model based on a genetic algorithm (GA). First, the problem is modeled as a discrete optimization problem and a specific fitness function is designed to effectively cope with the proposed model. Then, a binary-encoded representation together with a heuristic mutation operator and a local repair operator are proposed to characterize the adopted GA. Experiments are applied to ten topics from the Document Understanding Conference DUC2002 dataset
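The GA components described above can be sketched as follows. The paper's exact fitness function is not reproduced; this version scores coverage minus redundancy over bag-of-words vectors, a common stand-in, and the `repair` length bound is an illustrative assumption.

```python
# Sketch of a binary-encoded GA for extractive summarization:
# fitness rewards coverage of the document centroid and penalizes
# redundancy between selected sentences; repair enforces a size bound.
import random

def cosine(a, b):
    num = sum(x * y for x, y in zip(a, b))
    da = sum(x * x for x in a) ** 0.5
    db = sum(y * y for y in b) ** 0.5
    return num / (da * db) if da and db else 0.0

def fitness(chromosome, sent_vecs, doc_vec, redundancy_w=0.5):
    """Binary chromosome: 1 = sentence selected for the summary."""
    chosen = [v for bit, v in zip(chromosome, sent_vecs) if bit]
    if not chosen:
        return 0.0
    coverage = sum(cosine(v, doc_vec) for v in chosen)
    redundancy = sum(cosine(a, b) for i, a in enumerate(chosen)
                     for b in chosen[i + 1:])
    return coverage - redundancy_w * redundancy

def repair(chromosome, max_sents):
    """Local repair: drop random selections until the length bound holds."""
    c = list(chromosome)
    ones = [i for i, b in enumerate(c) if b]
    while len(ones) > max_sents:
        c[ones.pop(random.randrange(len(ones)))] = 0
    return c
```

A heuristic mutation would flip bits with a bias toward sentences similar to the centroid; the repair operator above then restores feasibility after mutation or crossover.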
In this work we present a detailed study of an anisotype nGe-pSi heterojunction (HJ) used as a photodetector in the wavelength range 500-1100 nm. I-V characteristics in the dark and under illumination, C-V characteristics, minority carrier lifetime (MCLT), spectral responsivity, field of view, and linearity were investigated at 300 K. The results showed that the detector has maximum spectral responsivity at λ = 950 nm. The photo-induced open-circuit voltage decay results revealed that the MCLT of the HJ was around 14.4 μs.
Neuroimaging is a description, whether in two dimensions (2D) or three dimensions (3D), of the structure and functions of the brain. Neuroimaging provides a valuable diagnostic tool with which medical professionals create images of the central nervous system. For the clinical diagnosis of patients with Alzheimer's Disease (AD) or Mild Cognitive Impairment (MCI), accurately distinguishing patients from normal controls (NCs) is critical. Recently, numerous studies have been undertaken on the identification of AD based on neuroimaging data, including radiographic images and machine learning algorithms. In the previous decade, these techniques were also gradually used to differentiate AD a
In the present work, an image compression method has been developed by combining the Absolute Moment Block Truncation Coding (AMBTC) algorithm with VQ-based image coding. First, the AMBTC algorithm with a Weber's-law condition is used to distinguish low- and high-detail blocks in the original image. The coder transmits only the mean of a low-detail block (i.e., uniform blocks such as background) on the channel, instead of transmitting the two reconstruction mean values and the bit map for that block. The high-detail blocks are coded by the proposed fast encoding algorithm for vector quantization based on the Triangle Inequality Theorem (TIE); the coder then transmits the two reconstruction mean values (i.e., H and L)
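The per-block AMBTC step and the Weber's-law uniformity test can be sketched as below. The 0.02 contrast threshold in `is_uniform` is an illustrative assumption, not the paper's value.

```python
# AMBTC for one block: pixels at or above the block mean map to the
# high reconstruction value H, the rest to the low value L, plus a bitmap.

def ambtc_encode(block):
    """Return (low_mean, high_mean, bitmap) for one image block."""
    flat = [p for row in block for p in row]
    mean = sum(flat) / len(flat)
    bitmap = [1 if p >= mean else 0 for p in flat]
    highs = [p for p in flat if p >= mean]
    lows = [p for p in flat if p < mean]
    hi = round(sum(highs) / len(highs)) if highs else round(mean)
    lo = round(sum(lows) / len(lows)) if lows else round(mean)
    return lo, hi, bitmap

def is_uniform(block, weber_ratio=0.02):
    """Weber-style test: block contrast relative to its mean intensity.
    The 0.02 threshold is an assumed value for illustration."""
    flat = [p for row in block for p in row]
    mean = sum(flat) / len(flat)
    return mean > 0 and (max(flat) - min(flat)) / mean < weber_ratio
```

Blocks passing `is_uniform` would be sent as a single mean; the rest would go through AMBTC and the TIE-accelerated VQ search.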
Most tubes are made from butyl rubbers, but certain types, such as giant tubes, are based on natural rubber because very high green strength is required when handling the uncured compound. By using blends of natural rubber (NR) and brominated butyl rubber (BIIR), it is possible to maintain high green strength in the uncured compound while improving the impermeability and heat resistance of the cured tube. The best formulations are obtained with 50 phr of BIIR to achieve the desired mechanical properties. Improved impermeability was obtained using 50 and 75 phr of BIIR in the compounds. Blending BIIR with NR enhances air retention with acceptable sacrifices in green strength.
In this paper, an adaptive polynomial compression technique based on hard and soft thresholding of the transformed residual image is introduced, which efficiently exploits both the spatial and frequency domains. The technique starts by applying polynomial coding in the spatial domain, followed in the frequency domain by a discrete wavelet transform (DWT) that is used to decompose the residual image on a hard- and soft-thresholding basis. The results showed the improvement of the adaptive techniques compared to the traditional polynomial coding technique.
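The two thresholding operators applied to the transformed residual coefficients can be sketched as follows; the threshold value used in any call is illustrative, not taken from the paper.

```python
# Hard thresholding keeps surviving coefficients unchanged; soft
# thresholding additionally shrinks them toward zero by the threshold.

def hard_threshold(coeffs, t):
    """Zero out coefficients whose magnitude falls below t."""
    return [c if abs(c) >= t else 0.0 for c in coeffs]

def soft_threshold(coeffs, t):
    """Zero small coefficients and shrink the rest toward zero by t."""
    return [(abs(c) - t) * (1 if c > 0 else -1) if abs(c) >= t else 0.0
            for c in coeffs]
```

Hard thresholding preserves the amplitude of large detail coefficients, while soft thresholding trades a small bias for smoother reconstructions; an adaptive scheme can choose between them per sub-band.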
A principal problem for any internet user is the increasing amount of spam, which has become a great problem today. Therefore, spam filtering has become a research focus that attracts the attention of several security researchers and practitioners. Spam filtering can be viewed as a two-class classification problem. To this end, this paper proposes a spam filtering approach based on the Possibilistic c-Means (PCM) algorithm with a weighted distance, coined WFCM, that can efficiently distinguish between spam and legitimate email messages. The objective of the formulated fuzzy problem is to construct two fuzzy clusters: a spam cluster and a legitimate-email cluster. The weight assignment is set by an information gain algorithm. Experimental results on spam based benchmark
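The core of the approach can be sketched with the standard PCM typicality formula over a feature-weighted distance. The decision rule, the bandwidth parameter `eta`, and the fuzzifier `m=2` are illustrative assumptions; the paper's exact WFCM formulation may differ.

```python
# PCM typicality of a message toward a cluster, using a squared
# distance weighted per feature (weights would come from information gain).

def weighted_sq_dist(x, v, w):
    """Feature-weighted squared Euclidean distance."""
    return sum(wk * (xk - vk) ** 2 for xk, vk, wk in zip(x, v, w))

def pcm_typicality(x, center, weights, eta, m=2.0):
    """Typicality in [0, 1]; eta is the cluster bandwidth, m the fuzzifier."""
    d2 = weighted_sq_dist(x, center, weights)
    return 1.0 / (1.0 + (d2 / eta) ** (1.0 / (m - 1.0)))

def classify(x, spam_c, ham_c, weights, eta=1.0):
    """Hypothetical decision rule: pick the cluster with higher typicality."""
    ts = pcm_typicality(x, spam_c, weights, eta)
    th = pcm_typicality(x, ham_c, weights, eta)
    return "spam" if ts >= th else "legitimate"
```

Unlike fuzzy c-means, PCM typicalities are not forced to sum to one across clusters, so an outlier message can score low toward both clusters.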
Secure data transmission over the internet can be achieved using steganography: the art and science of concealing information in unremarkable cover media so as not to arouse an observer's suspicion. In this paper, the color cover image is divided equally into four parts; for each part, one channel (Red, Green, or Blue) is selected, choosing the channel with the highest color ratio in that part. The chosen part is decomposed into four sub-bands {LL, HL, LH, HH} using the discrete wavelet transform. The secret image is divided into four n×n parts, and the DCT is applied to each part. Finally, the four DCT coefficient parts are embedded in the four high-frequency sub-bands {HH} in
Abstract
The objective of image fusion is to merge multiple source images in such a way that the final representation contains a higher amount of useful information than any single input. In this paper, a weighted-average fusion method is proposed. It depends on weights that are extracted from the source images using the contourlet transform. The extraction is done by setting the approximation coefficients to zero, then taking the inverse contourlet transform to obtain the details of the images to be fused. The performance of the proposed algorithm has been verified on several grayscale and color test images, and compared with some existing methods.
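The weight-extraction idea above can be sketched with a one-level 1-D Haar transform standing in for the contourlet transform (an assumption made purely to keep the example self-contained; the contourlet transform itself is directional and multi-scale).

```python
# Zero the approximation band, invert the transform, and use the
# resulting detail magnitudes as per-pixel fusion weights.

def haar_details(signal):
    """One-level Haar analysis, then synthesis with the approximation zeroed."""
    detail = [(a - b) / 2 for a, b in zip(signal[::2], signal[1::2])]
    out = []
    for d in detail:            # inverse Haar with approximation = 0
        out.extend([d, -d])
    return out

def fuse(img_a, img_b):
    """Weighted-average fusion: weights proportional to |detail| energy."""
    wa = [abs(d) for d in haar_details(img_a)]
    wb = [abs(d) for d in haar_details(img_b)]
    return [(a * x + b * y) / (a + b) if a + b else (x + y) / 2
            for a, b, x, y in zip(wa, wb, img_a, img_b)]
```

A flat (detail-free) region in one source thus contributes nothing where the other source carries edge information, which is the intent of detail-derived weights.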