Iris recognition occupies an important place among biometric approaches because of its accuracy and efficiency. The aim of this paper is to propose an iris identification system based on the fusion of the scale-invariant feature transform (SIFT) and local binary patterns (LBP) for feature extraction. Several steps are applied. First, the input image is converted to grayscale. Second, the iris is localized using the circular Hough transform. Third, the iris region is normalized with Daugman's rubber-sheet model, which remaps it from Cartesian to polar coordinates, followed by histogram equalization to enhance the iris region. Finally, features are extracted using SIFT and LBP, with the sigma and threshold values chosen to achieve the highest recognition rate. The system was implemented in MATLAB 2013, and matching was performed using the city-block distance. The iris recognition system was built using iris images of 30 individuals from the CASIA v4.0 database; each individual has 20 captures of the left and right eyes, giving a total of 600 images. The main findings show that the recognition rates of the proposed system are 98.67% for left eyes and 96.66% for right eyes among the thirty subjects.
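As an illustration only (not the authors' code), a minimal Python sketch of the described pipeline, assuming OpenCV, scikit-image, and SciPy are available; the Hough parameters, LBP settings, and matching threshold are placeholders, the sketch assumes an iris circle is detected, and the SIFT fusion and Daugman normalization steps are only indicated by comments:

```python
import cv2
import numpy as np
from skimage.feature import local_binary_pattern
from scipy.spatial.distance import cityblock

def iris_features(path):
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)        # step 1: grayscale
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1, minDist=100,
                               param1=100, param2=30, minRadius=20, maxRadius=120)
    x, y, r = (int(v) for v in np.around(circles[0, 0]))             # step 2: strongest circle as iris boundary
    iris = gray[max(y - r, 0):y + r, max(x - r, 0):x + r]
    # step 3 (not shown): Daugman rubber-sheet remapping to a normalized polar grid
    iris = cv2.equalizeHist(cv2.resize(iris, (64, 64)))              # contrast enhancement
    lbp = local_binary_pattern(iris, P=8, R=1, method="uniform")     # step 4: LBP codes
    hist, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)
    return hist                                                      # SIFT descriptors would be fused with this histogram

def iris_match(f1, f2, threshold=0.25):
    return cityblock(f1, f2) < threshold                             # step 5: city-block (L1) matching
```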
The purpose of the current study is to analyze the content of the computer textbooks for the intermediate stage in Iraq according to the theory of multiple intelligences, by answering the following question: "What is the percentage of availability of multiple intelligences in the content of the computer textbooks for the intermediate stage (grades I and II) for the academic year 2017-2018?" The researcher followed the descriptive analytical research approach (content analysis) and adopted the explicit idea as the recording unit. The research tool was prepared according to Gardner's classification of multiple intelligences, and its validity and reliability were established. The study found that the percentage of multiple intelligences in the content of the computer textbooks for the intermediate
The Weibull distribution is one of the most widely applied distributions in real life. It is similar to the normal distribution in the breadth of its applications and can be used in many fields, such as industrial engineering to represent replacement and manufacturing times, weather forecasting, and other scientific uses, including reliability studies and survival functions in the medical and communication engineering fields.
In this paper, the scale parameter of the Weibull distribution is estimated using a Bayesian method based on Jeffreys prior information as a first method, which is then enhanced by improving the Jeffreys prior information and using it as a second method.
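For reference, a worked sketch of the Jeffreys-prior step, assuming the common parameterization in which the shape parameter β is known and θ plays the role of the scale parameter (the paper's exact parameterization and its improved prior are not reproduced here):

$$
f(x;\theta,\beta)=\frac{\beta}{\theta}\,x^{\beta-1}e^{-x^{\beta}/\theta}, \qquad
\pi(\theta)\propto\frac{1}{\theta}, \qquad
\pi(\theta\mid x_1,\dots,x_n)\propto\theta^{-(n+1)}e^{-T/\theta}, \quad T=\sum_{i=1}^{n}x_i^{\beta},
$$

so the posterior is an inverse-gamma distribution with shape $n$ and scale $T$, and the Bayes estimator of θ under squared-error loss is $\hat{\theta}=T/(n-1)$.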
Available algorithms for Arabic stemming come in two main types: root-based approaches and stem-based approaches. Both types have problems, which are addressed in the proposed stemmer by combining rules of both main types and relying on Arabic patterns (Tafealat) to identify the added letters. The proposed stemmer achieved a root-exploration ratio of 99.08 and a fault ratio of 0.9.
As an important resource, entangled light sources have been used in developing quantum information technologies such as quantum key distribution (QKD). Few experiments have implemented entanglement-based deterministic QKD protocols, since the security of existing protocols may be compromised in lossy channels. In this work, we report a loss-tolerant deterministic QKD experiment that follows a modified “Ping-Pong” (PP) protocol. The experimental results demonstrate for the first time that a secure deterministic QKD session can be carried out over a channel with an optical loss of 9 dB, based on a telecom-band entangled photon source. This exhibits a conceivable prospect of utilizing entangled light sources in real-life fiber-based
This research deals with estimating the reliability function of the two-parameter exponential distribution using different estimation methods: maximum likelihood, median-first order statistics, ridge regression, modified Thompson-type shrinkage, and single-stage shrinkage. Comparisons among the estimators were made using Monte Carlo simulation based on the mean squared error (MSE) criterion, and they show that the shrinkage methods perform better than the other methods.
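For illustration (not the paper's code), a minimal Monte Carlo sketch in Python, assuming the standard two-parameter exponential with reliability R(t) = exp(-(t - θ)/λ); only the maximum-likelihood estimator is shown, compared by the MSE criterion, and the shrinkage and regression estimators from the abstract are not reproduced:

```python
import numpy as np

rng = np.random.default_rng(0)
theta, lam, t, n, runs = 1.0, 2.0, 3.0, 25, 5000      # illustrative true parameters and design
R_true = np.exp(-(t - theta) / lam)                   # true reliability at time t

est = np.empty(runs)
for i in range(runs):
    x = theta + rng.exponential(lam, size=n)          # two-parameter exponential sample
    theta_hat = x.min()                               # MLE of the location parameter
    lam_hat = x.mean() - theta_hat                    # MLE of the scale parameter
    est[i] = np.exp(-(t - theta_hat) / lam_hat)       # plug-in reliability estimate

mse = np.mean((est - R_true) ** 2)
print(f"true R(t) = {R_true:.4f}, MLE MSE = {mse:.6f}")
```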
Thirty-six bacterial isolates were obtained from various sources (soil, starch, cooked rice, and other foods) and subjected to a series of primary screening tests to identify the best isolate for amylase production. The production zones were detected using Lugol's iodine indicator, and seven isolates were carried through secondary screening by measuring the enzymatic activity and specific enzymatic activity. Isolate A4 was found to be the most efficient for amylase production. This isolate was then identified by microscopy and the VITEK 2 system, in addition to genetic identification through the 16S gene using the polymerase chain reaction (PCR), which yielded a sequence of 1256 bases. In comparison to the available information at the National Center for
Recommender systems are tools for making sense of the huge amount of data available on the internet. Collaborative filtering (CF) is one of the most successful knowledge-discovery methods used in recommendation systems. Memory-based collaborative filtering emphasizes using facts about present users to predict new items for the target user. Similarity measures are the core operations in collaborative filtering, and prediction accuracy depends largely on the similarity calculations. In this study, a combination of weighted parameters and traditional similarity measures is used to calculate the relationships among users over the MovieLens dataset rating matrix. The advantages and disadvantages of each measure are highlighted. From the study, a n
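As a hedged illustration of the memory-based approach described (not the study's implementation), a short Python sketch of user-based collaborative filtering with a co-rating-weighted cosine similarity; the toy rating matrix, the weighting scheme, and all parameter names are assumptions:

```python
import numpy as np

ratings = np.array([                 # rows = users, cols = items, 0 = unrated
    [5, 3, 0, 1],
    [4, 0, 0, 1],
    [1, 1, 0, 5],
    [0, 1, 5, 4],
], dtype=float)

def weighted_cosine(u, v):
    both = (u > 0) & (v > 0)                         # items rated by both users
    if not both.any():
        return 0.0
    cos = np.dot(u[both], v[both]) / (np.linalg.norm(u[both]) * np.linalg.norm(v[both]))
    weight = both.sum() / ((u > 0) | (v > 0)).sum()  # Jaccard-style co-rating weight
    return cos * weight

def predict(target, item):
    sims = np.array([weighted_cosine(ratings[target], ratings[u])
                     for u in range(len(ratings)) if u != target and ratings[u, item] > 0])
    rated = np.array([ratings[u, item] for u in range(len(ratings))
                      if u != target and ratings[u, item] > 0])
    return np.dot(sims, rated) / sims.sum() if sims.sum() > 0 else 0.0

print(predict(target=0, item=2))     # predicted rating of item 2 for user 0
```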
In the present work, Nd:YAG laser systems with different output characteristics were employed to study the drilling of materials used in scientific and industrial fields, including manganese hard steel. Our study examined the parameters affecting the laser drilling of manganese hard steel. Drilling is achieved through the material's absorption of part of the incident laser beam and results from the interaction of the laser beam properties, the material properties, and the focusing conditions of the beam. The results showed that increasing the laser pulse energy above the level used raised the hole diameter and depth and increased the hole taper. In addition, the hole taper was affected by the laser energy, the fo
Compression of the speech signal is an essential field in signal processing, and it is very important in today's world because of limited transmission bandwidth and storage capacity. This paper explores a Contourlet-transform-based methodology for compressing the speech signal. In this methodology, the speech signal is analyzed using Contourlet transform coefficients, with statistical measures used as threshold values, such as the interquartile range (IQR), average absolute deviation (AAD), median absolute deviation (MAD), and standard deviation (STD), followed by run-length encoding. These are applied to speech recorded over different durations (5, 30, and 120 seconds). A comparative study of performance
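For illustration only, a minimal Python sketch of the thresholding-plus-run-length-encoding stage, assuming a MAD-based threshold; a random Laplacian vector stands in for the Contourlet subband coefficients of a speech frame, since the Contourlet decomposition itself is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(1)
coeffs = rng.laplace(scale=0.2, size=1000)            # stand-in transform coefficients

mad = np.median(np.abs(coeffs - np.median(coeffs)))   # median absolute deviation threshold
sparse = np.where(np.abs(coeffs) > mad, coeffs, 0.0)  # zero out small coefficients

def run_length_encode(x):
    """Encode runs of equal values as (value, run_length) pairs."""
    out, i = [], 0
    while i < len(x):
        j = i
        while j < len(x) and x[j] == x[i]:
            j += 1
        out.append((float(x[i]), j - i))
        i = j
    return out

encoded = run_length_encode(sparse)
print(f"kept {np.count_nonzero(sparse)} of {coeffs.size} coefficients, "
      f"{len(encoded)} RLE symbols")
```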