This paper presents a combination of enhancement techniques for fingerprint images affected by different types of noise. These techniques were applied to improve image quality and achieve acceptable image contrast. The proposed method comprises five enhancement stages: normalization, histogram equalization, binarization, skeletonization, and fusion. The normalization process standardized pixel intensities, which facilitated the subsequent enhancement stages. Histogram equalization then increased the contrast of the images. Binarization and skeletonization were applied to separate the ridge and valley structures and to reduce the ridges to one-pixel-wide lines. Finally, fusion merged the result of histogram equalization with the result of skeletonization to obtain the new high-contrast images. The proposed method was tested on images of varying quality from the National Institute of Standards and Technology (NIST) Special Database 14. The experimental results are very encouraging, and the enhancement method proved effective at improving images of different quality.
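As a rough illustration of the first stages of such a pipeline, the sketch below implements normalization, histogram equalization, and mean-threshold binarization with NumPy. The target mean/variance values and the use of the image mean as a binarization threshold are illustrative assumptions, not the paper's exact parameters.

```python
import numpy as np

def normalize(img, target_mean=128.0, target_var=2000.0):
    """Map pixel intensities to a prescribed mean and variance (illustrative values)."""
    mean, var = img.mean(), img.var()
    diff = np.sqrt(target_var * (img - mean) ** 2 / max(var, 1e-9))
    out = np.where(img > mean, target_mean + diff, target_mean - diff)
    return np.clip(out, 0, 255)

def histogram_equalize(img):
    """Spread the intensity histogram over the full 0-255 range via the CDF."""
    hist, _ = np.histogram(img.astype(np.uint8), bins=256, range=(0, 256))
    cdf = hist.cumsum()
    cdf = (cdf - cdf.min()) / max(cdf.max() - cdf.min(), 1) * 255.0
    return cdf[img.astype(np.uint8)]

def binarize(img):
    """Global-threshold binarization at the image mean (ridge=1, valley=0)."""
    return (img > img.mean()).astype(np.uint8)
```

Skeletonization and fusion would follow these stages; they are omitted here because they depend on morphological details the abstract does not specify.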
Let G be a graph in which each edge e is given a weight w(e). The shortest path problem asks for a path of minimum weight connecting two specified vertices a and b, and from its solution we obtain a pre-topology. Furthermore, we study restriction and separators in the pre-topology generated by shortest path problems. Finally, we study the rate of liaison in this pre-topology between two subgraphs. It is formally shown that the new distance measure is a metric.
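A minimum-weight a-b path of this kind can be computed with Dijkstra's algorithm. The sketch below is generic (the adjacency-dict encoding is a hypothetical layout, not the paper's notation) and returns both the distance and the path itself.

```python
import heapq

def shortest_path(graph, a, b):
    """Dijkstra's algorithm on a weighted graph given as
    {vertex: [(neighbor, weight), ...]}; returns (distance, path)."""
    dist = {a: 0}
    prev = {}
    pq = [(0, a)]
    seen = set()
    while pq:
        d, u = heapq.heappop(pq)
        if u in seen:
            continue
        seen.add(u)
        if u == b:
            break
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(pq, (nd, v))
    # Walk the predecessor links back from b to recover the path.
    path, node = [], b
    while node != a:
        path.append(node)
        node = prev[node]
    path.append(a)
    return dist[b], path[::-1]
```

The nonnegative-weight assumption is what makes the greedy extraction from the priority queue safe.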
The estimation of the parameters of a linear regression model is usually based on the Least Squares method, which rests on several basic assumptions; the accuracy of the parameter estimates therefore depends on the validity of these assumptions. Among these assumptions are the uniformity of the variance and the normal distribution of the errors, and when they fail the use of the model becomes unrealistic. These assumptions are not achievable in the case of studying a specific problem that may include complex data from more than one model. The most successful technique in this setting has been the robust estimation method known as the MM-estimator (minimizing maximum likelihood estimator), which has proved its efficiency for this purpose. To
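As a sketch of the robust-estimation idea, the following implements M-estimation via iteratively reweighted least squares with Huber weights, which is the core refinement step an MM-estimator builds on; the tuning constant c = 1.345 and the MAD scale estimate are common textbook choices, not necessarily the paper's.

```python
import numpy as np

def huber_irls(X, y, c=1.345, iters=50):
    """Robust regression via iteratively reweighted least squares with
    Huber weights; an illustrative M-estimation step, not a full MM fit."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]  # ordinary LS start
    for _ in range(iters):
        r = y - X @ beta
        # Robust scale via the median absolute deviation (MAD).
        s = np.median(np.abs(r - np.median(r))) / 0.6745 + 1e-12
        u = r / s
        w = np.where(np.abs(u) <= c, 1.0, c / np.abs(u))  # Huber weights
        beta = np.linalg.lstsq(np.sqrt(w)[:, None] * X,
                               np.sqrt(w) * y, rcond=None)[0]
    return beta
```

On data with a gross outlier, the reweighting progressively discounts the outlying observation, whereas ordinary least squares would be pulled toward it.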
In this paper, the concept of soft closed sets is presented using the soft ideal pre-generalized open and soft pre-open sets, which are ᶅ-closed ("-closed") sets, and several characteristics of these sets are illustrated. We also use some games and the separation axiom (Ʈ0, Ӽ, ᶅ), employing many tables and charts to illustrate this. We also put forward some propositions to study the relationship between these games and give some examples.
Many of the key stream generators used in practice are LFSR-based in the sense that they produce the key stream according to a rule y = C(L(x)), where L(x) denotes an internal linear bit stream produced by a small number of parallel linear feedback shift registers (LFSRs), and C denotes some nonlinear compression function. In this paper we combine the output sequences of the linear feedback shift registers with the sequences produced by a nonlinear key generator to obtain a final, very strong key sequence.
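One classical way to pass LFSR outputs through a nonlinear function C is the Geffe generator, sketched below; the register lengths and tap positions here are arbitrary illustrations, not the paper's actual construction.

```python
def lfsr(state, taps):
    """Infinite bit stream from a Fibonacci LFSR, given an initial
    state (list of bits) and tap positions (indices into the state)."""
    state = list(state)
    while True:
        out = state[-1]
        fb = 0
        for t in taps:
            fb ^= state[t]
        yield out
        state = [fb] + state[:-1]

def geffe(x1, x2, x3):
    """Nonlinear compression C over three bit streams:
    z = (x1 AND x2) XOR (NOT x1 AND x3)."""
    for a, b, c in zip(x1, x2, x3):
        yield (a & b) ^ ((1 - a) & c)
```

Using three registers of pairwise coprime lengths gives the combined stream a long period; the nonlinearity of C is what defeats the linear-algebraic attacks that break a bare LFSR.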
In this paper, we build a fuzzy classification system for classifying the nutritional status of children under 5 years old in Iraq using the Mamdani method, based on input variables such as weight and height. Classifying nutritional status is a difficult challenge in the medical field because of the uncertainty and ambiguity in the variables and attributes that define the nutritional-status categories of children. These categories are relied upon in medical diagnosis to determine the types of malnutrition problems, to identify the groups suffering from malnutrition, and to determine the risks faced by each group or category of children. Malnutrition in children is one of the most
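A minimal Mamdani-style sketch of the idea follows, using a single illustrative input (a weight-for-age score) with made-up triangular membership functions and singleton consequents; these are not WHO cut-offs or the paper's actual rule base.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def classify_nutrition(weight_z):
    """Toy Mamdani-style inference on one weight-for-age score.
    Fuzzy sets and consequents are illustrative assumptions."""
    low = tri(weight_z, -5, -3, -1)    # "underweight" membership
    normal = tri(weight_z, -2, 0, 2)   # "normal" membership
    high = tri(weight_z, 1, 3, 5)      # "overweight" membership
    # Centroid defuzzification over singleton consequents 0, 50, 100.
    num = low * 0.0 + normal * 50.0 + high * 100.0
    den = low + normal + high
    return num / den if den else None
```

A full system would fuzzify both weight and height, combine them through a rule base with min/max operators, and defuzzify the aggregated output set.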
Burnishing improves fatigue strength and surface hardness and decreases the surface roughness of metal, because the process transforms tensile residual stresses into compressive residual stresses. In the present work, a roller burnishing tool is used on low carbon steel (AISI 1008) specimens. Different experiments were carried out to study the influence of the feed and speed parameters of the burnishing process on the fatigue strength, surface roughness, and surface hardness of the low carbon steel (AISI 1008) specimens. The first parameter studied was feed, at values of (0.6, 0.8, and 1) mm with a constant speed of (370) rpm, while the second was speed, at values of (540, 800, and 1200) rpm with a constant feed of (1) mm. The results of the fatigue
The present paper stresses the direct effect of the situational dimension termed "reality" on authors' thoughts and attitudes. Every text is placed within a particular situation, which has to be correctly identified by the translator as the first and most important step toward a good translation. Hence, the content of any verbal production reflects some part of reality. Comprehending any text therefore includes comprehending the different dimensions of reality as reflected in the text, thus illuminating the connections among reality's features.
Abstract
The study, entitled "Understanding Reality," is a means of fully
Hepatitis is one of the diseases whose incidence has risen markedly in recent years. Hepatitis causes inflammation that destroys liver cells, and it occurs as a result of viruses, bacteria, blood transfusions, and other causes. There are five types of hepatitis viruses, designated (A, B, C, D, E), and the severity of the disease varies by type. Accurate and early diagnosis is the best way to prevent the disease, as it allows infected people to take preventive steps so that they do not transmit the infection to other people, and diagnosis using artificial intelligence gives an accurate and rapid diagnostic result. The analytical method applied to the data relied on a radial basis network to diagnose the
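A radial basis network of the kind mentioned can be sketched as Gaussian basis activations followed by least-squares output weights. The centers, the width parameter gamma, and the toy data in the test are illustrative assumptions, not the paper's clinical features.

```python
import numpy as np

def rbf_design(X, centers, gamma=1.0):
    """Gaussian radial basis activations: one column per center."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def rbf_fit(X, y, centers, gamma=1.0):
    """Least-squares output weights for a radial basis function network."""
    Phi = rbf_design(X, centers, gamma)
    return np.linalg.lstsq(Phi, y, rcond=None)[0]

def rbf_predict(X, centers, w, gamma=1.0):
    """Network output: weighted sum of the basis activations."""
    return rbf_design(X, centers, gamma) @ w
```

In practice the centers are chosen by clustering the training data rather than, as in this sketch, taking every training point as a center.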
Approaches based on photons are among the most common routes to parallel optical computation because of their recent development, ease of production, and low cost; as a result, most researchers have concentrated their efforts on them. The Basic Arithmetic Unit (BAU) is built using a three-step approach that uses three-state optical gates to configure the circuitry for addition, subtraction, and multiplication. This is a new optical computing method based on a radix-2 binary signed-digit (BSD) system that uses the digits -1, 0, and 1. Light with horizontal polarization (LHP) (↔), light with no intensity (LNI) (⥀), and light with vertical polarization (LVP) (↨) is represen
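The radix-2 signed-digit number system itself can be illustrated in a few lines: a BSD numeral is a digit list over {-1, 0, 1}, and the non-adjacent form gives one canonical encoding of each integer. This sketch models only the arithmetic representation, not the optical gates or polarization encoding.

```python
def bsd_value(digits):
    """Value of a binary signed-digit (BSD) numeral, least significant
    digit first, with digits drawn from {-1, 0, 1}."""
    return sum(d * (1 << i) for i, d in enumerate(digits))

def to_naf(n):
    """Non-adjacent form: one canonical BSD encoding of an integer,
    chosen so no two adjacent digits are both nonzero."""
    digits = []
    while n != 0:
        if n % 2:
            d = 2 - (n % 4)  # pick +1 or -1 so (n - d) is divisible by 4
            digits.append(d)
            n -= d
        else:
            digits.append(0)
        n //= 2
    return digits
```

Redundancy is the point of the system: because each integer has many BSD encodings, addition can be organized so that carries do not propagate, which is what makes the representation attractive for parallel optical hardware.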
In this paper, we used four classification methods to classify objects and compared among them: K-Nearest Neighbors (KNN), Stochastic Gradient Descent learning (SGD), Logistic Regression (LR), and Multi-Layer Perceptron (MLP). We used the MCOCO dataset for classification and detection of objects; the dataset images were randomly divided into training and testing sets at a ratio of 7:3. The randomly selected training and testing images were converted from color to gray level, the gray images were enhanced using the histogram equalization method, and each dataset image was resized to 20 x 20. Principal component analysis (PCA) was used for feature extraction, and finally the four classification metho
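The feature-extraction and one of the classification steps can be sketched as PCA via the SVD followed by a simple k-nearest-neighbors vote; the toy cluster data and the value of k in the test are illustrative, not the paper's MCOCO setup.

```python
import numpy as np

def pca_fit(X, k):
    """Principal components: center the data and keep the top-k
    right singular vectors as the projection matrix."""
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:k].T

def pca_transform(X, mu, W):
    """Project (centered) data onto the retained components."""
    return (X - mu) @ W

def knn_predict(Xtr, ytr, Xte, k=3):
    """Majority vote over the k nearest training points (Euclidean)."""
    preds = []
    for x in Xte:
        idx = np.argsort(((Xtr - x) ** 2).sum(axis=1))[:k]
        preds.append(np.bincount(ytr[idx]).argmax())
    return np.array(preds)
```

The same projected features would feed the SGD, LR, and MLP classifiers, which differ only in the decision rule fitted on top of them.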