This paper presents a combination of enhancement techniques for fingerprint images affected by different types of noise. These techniques were applied to improve image quality and achieve acceptable image contrast. The proposed method comprises five enhancement techniques: Normalization, Histogram Equalization, Binarization, Skeletonization, and Fusion. The Normalization step standardized pixel intensities, which facilitated the subsequent enhancement stages. The Histogram Equalization technique then increased the contrast of the images. Next, Binarization and Skeletonization were applied to distinguish ridge structures from valley structures and to reduce the ridges to one-pixel-wide lines. Finally, Fusion merged the result of the Histogram Equalization process with that of the Skeletonization process to obtain new high-contrast images. The proposed method was tested on images of varying quality from the National Institute of Standards and Technology (NIST) Special Database 14. The experimental results are very encouraging, and the enhancement method proved effective in improving images of different quality.
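As a rough illustration of the five-stage pipeline described above, the sketch below chains normalization, histogram equalization, Otsu binarization, skeletonization, and a fusion step using NumPy and scikit-image. The target mean/variance values, the Otsu threshold choice, and the weighted-average fusion rule are assumptions made for illustration; the abstract does not specify the exact operators used.

```python
import numpy as np
from skimage import exposure, filters, morphology

def enhance_fingerprint(img, desired_mean=128.0, desired_var=100.0, alpha=0.5):
    """Sketch of the five stages: normalization, histogram equalization,
    binarization, skeletonization and fusion (parameters are illustrative)."""
    img = img.astype(np.float64)

    # 1) Normalization: map pixel intensities to a target mean and variance.
    mean, var = img.mean(), img.var()
    norm = desired_mean + np.sign(img - mean) * np.sqrt(
        desired_var * (img - mean) ** 2 / max(var, 1e-9))

    # 2) Histogram equalization to raise global contrast (output in [0, 1]).
    eq = exposure.equalize_hist(norm)

    # 3) Binarization with Otsu's threshold: dark ridges vs. bright valleys.
    binary = eq < filters.threshold_otsu(eq)

    # 4) Skeletonization: thin ridges down to one-pixel-wide lines.
    skeleton = morphology.skeletonize(binary)

    # 5) Fusion: a simple weighted blend of the equalized image and the
    #    skeleton; the paper's actual fusion rule is not given in the abstract.
    fused = alpha * eq + (1 - alpha) * skeleton.astype(np.float64)
    return fused
```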
Free zones, or free-economy cities, are cities with a distinct classification and functional specificity. Although the concept of these areas dates back to distant eras, its intellectual and philosophical construction, supported by intellectual approaches of which globalization is the most important, contributed to its rapid global spread and to the variety of forms and models it has taken. Given the diversity of their formulas and objectives, countries have competed in establishing such zones, while related trends have influenced the contemporary formation of these sites. The research therefore focuses on the importance of adopting a set of common indicators (collection…
Fetal heart rate (FHR) signal processing based on Artificial Neural Networks (ANN), Fuzzy Logic (FL), and the frequency-domain Discrete Wavelet Transform (DWT) was analyzed in order to perform automatic analysis on personal computers. Cardiotocography (CTG) is a primary biophysical method of fetal monitoring. The assessment of printed CTG traces was based on visual analysis of the patterns describing the variability of the fetal heart rate signal. Fetal heart rate data from pregnant women between 38 and 40 weeks of gestation were studied. The first stage of the system converted the cardiotocography (CTG) tracing into a digital series so that it could be analyzed; in the second stage, the FHR time series was…
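As a rough illustration of the DWT stage mentioned above, the sketch below extracts simple per-band statistics from an FHR time series using PyWavelets. The wavelet family ('db4'), the decomposition level, and the chosen statistics are assumptions, since the abstract does not specify them.

```python
import numpy as np
import pywt  # PyWavelets

def dwt_features(fhr_series, wavelet="db4", level=4):
    """Decompose an FHR time series with the Discrete Wavelet Transform and
    return per-band statistics that could feed an ANN / fuzzy classifier.
    Wavelet family and level are illustrative assumptions."""
    coeffs = pywt.wavedec(np.asarray(fhr_series, dtype=float), wavelet, level=level)
    features = []
    for band in coeffs:  # approximation band followed by detail bands
        features.extend([band.mean(), band.std(), np.abs(band).max()])
    return np.array(features)

# Example: 10 minutes of FHR sampled at 4 Hz (synthetic values for illustration).
fhr = 140 + 5 * np.sin(np.linspace(0, 20 * np.pi, 2400))
print(dwt_features(fhr).shape)
```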
Agriculture improvement is a national economic issue that depends heavily on productivity. Detecting disease in plants plays a significant role in the agricultural field. Accurate prediction of plant disease can help treat the leaf as early as possible, which limits the economic loss. This paper uses image processing techniques with a Convolutional Neural Network (CNN), one of the deep learning techniques, to classify and detect plant leaf diseases. The publicly available PlantVillage dataset was used, which consists of 15 classes: 12 disease classes and 3 healthy classes. Data augmentation techniques were used, in addition to dropout and weight regularization…
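The sketch below shows one way the named ingredients (a small CNN with data augmentation, dropout, and weight regularization for 15 classes) could be wired together in Keras. The architecture, input size, and hyperparameters are assumptions for illustration, not the paper's actual model.

```python
import tensorflow as tf
from tensorflow.keras import layers, models, regularizers

NUM_CLASSES = 15          # 12 disease classes + 3 healthy classes (PlantVillage)
IMG_SIZE = (128, 128)     # assumed input size; the abstract does not state one

def build_model():
    """Small CNN with augmentation, dropout and L2 weight regularization,
    mirroring the techniques named in the abstract (architecture is assumed)."""
    augment = tf.keras.Sequential([
        layers.RandomFlip("horizontal"),
        layers.RandomRotation(0.1),
        layers.RandomZoom(0.1),
    ])
    model = models.Sequential([
        layers.Input(shape=IMG_SIZE + (3,)),
        augment,
        layers.Rescaling(1.0 / 255),
        layers.Conv2D(32, 3, activation="relu",
                      kernel_regularizer=regularizers.l2(1e-4)),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu",
                      kernel_regularizer=regularizers.l2(1e-4)),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```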
Audio classification is the process of classifying different audio types according to their content. It is applied in a wide variety of real-world problems; every classification application treats its target subjects as a specific type of audio, so the audio types vary and each type has to be treated carefully according to its significant properties. Feature extraction is an important step in audio classification. This work introduces several feature sets according to the audio type; two types of audio (datasets) were studied. Two different feature sets are proposed: (i) a first-order gradient feature vector, and (ii) a local roughness feature vector. The experiments showed that the results are competitive…
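The sketch below illustrates what a first-order gradient feature vector for framed audio might look like: per-frame statistics of the first difference of the waveform. The frame length, hop size, and the specific statistics are assumptions; the abstract does not define the features precisely.

```python
import numpy as np

def first_order_gradient_features(signal, frame_len=1024, hop=512):
    """Illustrative first-order gradient features: per-frame statistics of the
    first difference of the samples (the paper's exact definition may differ)."""
    signal = np.asarray(signal, dtype=float)
    grad = np.diff(signal)  # first-order gradient of the waveform
    frames = [grad[i:i + frame_len]
              for i in range(0, len(grad) - frame_len + 1, hop)]
    feats = [(f.mean(), f.std(), np.abs(f).mean()) for f in frames]
    return np.array(feats)

# Example on a short synthetic tone.
t = np.linspace(0, 1, 8000)
print(first_order_gradient_features(np.sin(2 * np.pi * 440 * t)).shape)
```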
In this paper we generalize some of the results due to Bell and Mason on a near-ring N admitting a derivation D, and we show that the body of evidence on prime near-rings with derivations points to ring-like behavior. Our purpose in this work is to explore this ring-like behavior further. We also show that, under appropriate additional hypotheses, such a near-ring must be a commutative ring.
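For reference, the object studied above is usually defined as follows; the one-sided convention shown is the common one and may differ from the paper's notation.

```latex
% A derivation on a near-ring N is an additive endomorphism D : N -> N
% satisfying the product rule below; in the near-ring setting the two
% one-sided versions of the rule are known to coincide.
\[
  D(xy) = x\,D(y) + D(x)\,y \qquad \text{for all } x, y \in N .
\]
```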
Let G be a graph in which each edge e is given a weight w(e). The shortest path problem asks for a path of minimum total weight connecting two specified vertices a and b, and from it we obtain a pre-topology. Furthermore, we study restriction and separators in the pre-topology generated by shortest path problems. Finally, we study the rate of liaison in the pre-topology between two subgraphs. It is formally shown that the new distance measure is a metric.
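As a concrete reference point for the distance underlying the abstract above, the sketch below computes the shortest-path distance d(a, b) on a weighted graph with Dijkstra's algorithm; with positive edge weights this distance is symmetric on undirected graphs and satisfies the triangle inequality, hence defines a metric on each connected component. The pre-topology construction itself is not reproduced here.

```python
import heapq

def shortest_path_distance(adj, a, b):
    """Dijkstra's algorithm on an undirected weighted graph given as
    adj[u] = [(v, w(u, v)), ...]; returns d(a, b), the minimum total weight.
    With positive weights, d is a metric on each connected component."""
    dist = {a: 0.0}
    heap = [(0.0, a)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == b:
            return d
        if d > dist.get(u, float("inf")):
            continue
        for v, w in adj.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return float("inf")

# Example: d(a, b) on a small weighted graph.
adj = {"a": [("c", 1.0)], "c": [("a", 1.0), ("b", 2.0)], "b": [("c", 2.0)]}
print(shortest_path_distance(adj, "a", "b"))  # 3.0
```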
Many of the key stream generators used in practice are LFSR-based, in the sense that they produce the key stream according to a rule y = C(L(x)), where L(x) denotes an internal linear bit stream produced by a small number of parallel linear feedback shift registers (LFSRs), and C denotes some nonlinear compression function. In this paper we combine the output sequences of the linear feedback shift registers with the sequences produced by a nonlinear key generator to obtain a final, very strong key sequence.
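The sketch below illustrates the general y = C(L(x)) shape described above: several short LFSRs feed a nonlinear combining function. The register lengths, tap positions, seeds, and the combiner C shown are illustrative assumptions, not the paper's construction.

```python
def lfsr(seed_bits, taps):
    """Fibonacci-style LFSR: yields one output bit per step.
    `taps` are 0-based state positions XORed together to form the feedback bit."""
    state = list(seed_bits)
    while True:
        out = state[-1]
        fb = 0
        for t in taps:
            fb ^= state[t]
        state = [fb] + state[:-1]
        yield out

def keystream(n):
    """y = C(L(x)): three short LFSRs feed a nonlinear combiner C.
    Lengths, taps and C(a, b, c) = (a & b) ^ (b & c) ^ c are illustrative only."""
    r1 = lfsr([1, 0, 1, 1, 0], taps=[0, 2])
    r2 = lfsr([1, 1, 0, 0, 1, 0, 1], taps=[0, 3])
    r3 = lfsr([0, 1, 1, 1, 0, 1, 0, 0, 1], taps=[0, 4])
    out = []
    for _ in range(n):
        a, b, c = next(r1), next(r2), next(r3)
        out.append((a & b) ^ (b & c) ^ c)  # nonlinear combiner C
    return out

print(keystream(16))
```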
The present paper stresses the direct effect of the situational dimension termed "reality" on authors' thoughts and attitudes. Every text is placed within a particular situation which has to be correctly identified by the translator as the first and most important step toward a good translation. Hence, the content of any verbal production reflects some part of reality. Comprehending any text includes comprehending the different dimensions of reality as reflected in the text, thus illuminating the connections among the features of that reality.
Abstract
The study, entitled "Understanding Reality", a means of fully…
Burnishing improves fatigue strength and surface hardness and decreases the surface roughness of metal, because the process transforms tensile residual stresses into compressive residual stresses. A roller burnishing tool is used in the present work on low carbon steel (AISI 1008) specimens. Different experiments were carried out to study the influence of the feed and speed parameters of the burnishing process on the fatigue strength, surface roughness, and surface hardness of the low carbon steel (AISI 1008) specimens. The first parameter studied was feed, at values of (0.6, 0.8, and 1) mm at a constant speed of 370 rpm, while the second was speed, at values of (540, 800, and 1200) rpm at a constant feed of 1 mm. The results of the fatigue…
In this paper, the concept of soft closed groups is presented using soft ideal pre-generalized open and soft pre-open sets, which are -ᶅ- -closed sets ("-closed"), and several characteristics of these groups are illustrated. We also use some games and the -Separation Axiom, such as (Ʈ0, Ӽ, ᶅ), with many tables and charts to illustrate this. In addition, we put forward some proposals to study the relationships between these games and give some examples.