In recent years, the iris has attracted wide interest in biometric-based systems because it is one of the most accurate biometrics for proving a user's identity, and it therefore provides high security for the systems concerned. This research article presents an efficient method for detecting the outer boundary of the iris, using a new form of the leading edge detection technique. The technique is well suited to separating two regions whose intensity levels are close in grayscale images, which is the main difficulty in iris isolation: it is hard to find the border between the lighter gray background (sclera) and the light gray foreground (iris texture). The proposed method estimates the iris radius by searching the two iris halves (right and left) circularly, over a certain angular interval for each half, so as to avoid the upper and lower eyelids and eyelashes. After the two radii (one per half) have been determined, the final iris radius is taken as the minimum of the two. The method was tested on all samples of the CASIAv4-Interval dataset, which consists of 2639 samples captured from 249 individuals and distributed over 395 classes; the outer boundary detection accuracy was 100%.
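To make the two-half circular search concrete, the sketch below (Python with NumPy) shows one way such a procedure could be implemented. It assumes the pupil centre and a plausible radius range are already known, and the specific angle ranges (roughly -45°..45° for the right half and 135°..225° for the left half), the sampling step, and the edge response (mean intensity just outside a candidate radius minus the mean just inside it) are illustrative assumptions, not the paper's exact parameters.

```python
import numpy as np

def half_radius(gray, cx, cy, angles_deg, r_min, r_max, band=3):
    """Estimate the outer iris radius within one half of the eye image.

    For each candidate radius r, the image is sampled along the given angles
    and the mean intensity just inside r is compared with the mean just
    outside r (a leading-edge style response); the radius with the strongest
    iris/sclera contrast wins.  gray is a 2-D grayscale array and (cx, cy)
    is the pupil centre.  Band width and contrast measure are assumptions.
    """
    angles = np.deg2rad(np.asarray(angles_deg, dtype=float))
    best_r, best_score = r_min, -np.inf
    h, w = gray.shape
    for r in range(r_min, r_max):
        inner, outer = [], []
        for a in angles:
            dx, dy = np.cos(a), np.sin(a)
            for d, bucket in ((r - band, inner), (r + band, outer)):
                x, y = int(round(cx + d * dx)), int(round(cy + d * dy))
                if 0 <= x < w and 0 <= y < h:
                    bucket.append(float(gray[y, x]))
        if inner and outer:
            # Sclera (outside the boundary) is brighter than the iris texture.
            score = np.mean(outer) - np.mean(inner)
            if score > best_score:
                best_score, best_r = score, r
    return best_r

def iris_outer_radius(gray, cx, cy, r_min, r_max):
    """Search the right and left halves over eyelid-free angle ranges and
    keep the smaller of the two radii, as described in the abstract."""
    right = half_radius(gray, cx, cy, range(-45, 46, 5), r_min, r_max)
    left = half_radius(gray, cx, cy, range(135, 226, 5), r_min, r_max)
    return min(right, left)
```

Restricting each half to near-horizontal rays keeps the search away from the upper and lower eyelids and eyelashes, and taking the minimum of the two half-radii guards against one half being biased outward by occlusions or specular highlights.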