Merging biometrics with cryptography has become increasingly common, and a rich scientific field has emerged for researchers. Biometrics adds a distinctive property to security systems, since biometric features are unique to each individual. In this study, a new method is presented for ciphering data based on fingerprint features. The plaintext message is addressed, according to the positions of minutiae extracted from a fingerprint, into a generated random text file, regardless of the size of the data. The proposed method can be explained in three scenarios. In the first scenario, the message is placed directly inside the random text at the minutiae positions. In the second scenario, the message is encrypted with a chosen word before being ciphered inside the random text. In the third scenario, the encryption process ensures correct restoration of the original message. Experimental results show that the proposed cryptosystem works well and is secure, because of the huge number of fingerprints an attacker would have to try in order to extract the message: every fingerprint but the correct one yields incorrect results, and the recovered text does not represent the original plaintext. The method also ensures that any intentional tampering or accidental damage is detected, because message extraction fails even when the correct fingerprint is used.
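A minimal sketch of the idea behind the first scenario, assuming the minutiae have already been extracted and flattened into integer indices; the function names and the example positions below are illustrative, not the authors' implementation:

```python
import random
import string

def embed_message(message, minutiae_positions, cover_length):
    """Scatter message characters into a random text buffer at minutiae-derived positions."""
    if len(minutiae_positions) < len(message):
        raise ValueError("not enough minutiae positions for the message")
    cover = [random.choice(string.ascii_letters) for _ in range(cover_length)]
    for ch, pos in zip(message, minutiae_positions):
        cover[pos % cover_length] = ch   # place each character at a fingerprint-derived index
    return "".join(cover)

def extract_message(cover, minutiae_positions, message_length):
    """Recover the message by reading the same positions in the same order."""
    return "".join(cover[pos % len(cover)] for pos in minutiae_positions[:message_length])

# Hypothetical minutiae coordinates flattened to indices (e.g. x * width + y)
positions = [412, 1570, 233, 908, 1871, 44, 1299, 655, 77, 1403, 512]
stego = embed_message("HELLO WORLD", positions, cover_length=2048)
print(extract_message(stego, positions, 11))   # -> "HELLO WORLD"
```

Without the correct minutiae positions, an attacker reads random characters, which matches the claim that every fingerprint but the correct one yields an incorrect message.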
This study examines the vibrations produced by hydropower operations in order to improve embankment dam safety. The work consists of two parts. In the first part, ANSYS-CFX was used to build a three-dimensional (3-D) finite volume (FV) model of a vertical Francis turbine unit in the Mosul hydropower plant. The pressure pattern computed from the turbine model was transferred to the dam body to show how operation of the turbine unit affects the dam's stability. The upstream reservoir conditions, various flow rates, and fully open inlet gates were considered. In the second part of the study, a 3-D finite element (FE) model of the Mosul dam was simulated in ANSYS. The operational turbine model's water pressure pattern is conveyed ...
Assessing water quality provides a scientific foundation for the development and management of water resources. The objective of this research is to evaluate the impact of treated effluent from the North Rustumiyia wastewater treatment plant (WWTP) on the quality of the Diyala river. An artificial neural network (ANN) model and factor analysis (FA) based on the Nemerow pollution index (NPI) were applied. The Nemerow pollution index was introduced to define the important water quality parameters for North Al-Rustumiyia for line (F2). The most important parameters for assessing variation in wastewater quality were those used in the model: biochemical oxygen demand (BOD), chemical oxygen demand (COD) ...
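For reference, the comprehensive Nemerow pollution index is commonly computed from the ratios of measured concentrations to their permissible limits; a minimal sketch with illustrative values, not data from the study:

```python
import math

def nemerow_index(concentrations, standards):
    """Comprehensive Nemerow pollution index for one sampling site.

    concentrations: measured values C_i for each parameter (e.g. BOD, COD).
    standards: permissible limits S_i for the same parameters.
    """
    ratios = [c / s for c, s in zip(concentrations, standards)]
    p_avg = sum(ratios) / len(ratios)
    p_max = max(ratios)
    # NPI = sqrt((P_max^2 + P_mean^2) / 2); values above 1 indicate pollution
    return math.sqrt((p_max ** 2 + p_avg ** 2) / 2)

# Illustrative BOD and COD readings against hypothetical limits
print(nemerow_index(concentrations=[35.0, 90.0], standards=[30.0, 100.0]))
```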
Summary
The subject (the meanings of the augmented verbs) is one of the main subjects studied in the science of morphology in the Arabic language. It covers the meaning of each pattern and the additional meaning produced by the augmentation of the verb.
The (strain) is one of the most important meanings within this subject; it occupies a wide area of morphological studies and has drawn the interest of scholars and researchers.
There are two well-known patterns for this meaning: (infaʿala انفعل) and (iftaʿala افتعل). There are also other patterns for the same meaning, though less commonly used than the first two: (tafaʿʿala تفعّل), (tafāʿala تفاعل), (tafaʿlala تفعلل), (ifʿanlala افعنلل), and (ifʿanlā ...).
In this paper, we are mainly concerned with estimating the cascade reliability model (2+1) based on the inverted exponential distribution and with comparing the estimation methods used. The maximum likelihood estimator and the uniformly minimum variance unbiased estimator are used to estimate the strengths and the stress; k = 1, 2, 3, respectively. Then, using the unbiased estimators, we propose a preliminary test single-stage shrinkage (PTSSS) estimator for the case where prior knowledge of the scale parameter is available as an initial value from past experience. The mean squared error (MSE) of the proposed estimator is derived in order to compare the methods. Numerical results about the conduct of the considered ...
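As background for the estimators being compared, the maximum likelihood estimator of the scale parameter of an inverted exponential sample has a simple closed form; a minimal sketch, assuming the parameterisation f(x; θ) = (θ/x²)·e^(−θ/x):

```python
def inverted_exponential_mle(sample):
    """MLE of theta for f(x; theta) = (theta / x**2) * exp(-theta / x).

    Setting d/dtheta [n*log(theta) - theta*sum(1/x_i)] = 0 gives theta_hat = n / sum(1/x_i).
    """
    n = len(sample)
    return n / sum(1.0 / x for x in sample)

# Illustrative strength sample (not the paper's simulation settings)
strength_sample = [2.1, 3.4, 1.8, 2.9, 4.0]
print(inverted_exponential_mle(strength_sample))
```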
Document clustering is the process of organizing a particular electronic corpus of documents into subgroups with similar text features. Formerly, a number of conventional algorithms were applied to perform document clustering. There are current endeavors to enhance clustering performance by employing evolutionary algorithms, and such endeavors have become an emerging topic gaining more attention in recent years. The aim of this paper is to present an up-to-date and self-contained review fully devoted to document clustering via evolutionary algorithms. It first provides a comprehensive inspection of the document clustering model, revealing its various components and related concepts. Then it presents and analyzes the principal research work ...
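To illustrate the general idea behind the family of methods such a review covers, a toy sketch of an evolutionary (mutation-only) clustering loop over term-frequency document vectors is given below; the label-chromosome representation, the fitness function, and all parameter values are illustrative assumptions, not any specific algorithm from the review:

```python
import random

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def fitness(labels, docs, k):
    """Total similarity of each document to the centroid of its assigned cluster."""
    total = 0.0
    for c in range(k):
        members = [docs[i] for i, lab in enumerate(labels) if lab == c]
        if not members:
            continue
        centroid = [sum(col) / len(members) for col in zip(*members)]
        total += sum(cosine(d, centroid) for d in members)
    return total

def evolve(docs, k=2, pop_size=20, generations=50):
    """Mutation-only evolutionary search over cluster-label chromosomes."""
    pop = [[random.randrange(k) for _ in docs] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: fitness(ind, docs, k), reverse=True)
        survivors = pop[: pop_size // 2]
        children = []
        for parent in survivors:
            child = parent[:]
            child[random.randrange(len(docs))] = random.randrange(k)  # mutate one label
            children.append(child)
        pop = survivors + children
    return max(pop, key=lambda ind: fitness(ind, docs, k))

# Tiny toy term-frequency vectors for four documents
docs = [[3, 0, 1], [2, 0, 0], [0, 4, 1], [0, 3, 2]]
print(evolve(docs, k=2))
```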
... Show MoreDiscrete logarithms are applied in many cryptographic problems . For instance , in public key . and for construction of sets with disti nct sums of k-clcments. The purpose o r this paper
is to modify the method ol' informationl1·iding using discrete logarithms , introduce new properties of St - sets , uscdthe direct product of groups to construct cyclic group and finally, present modified method for knapsack &
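For context, the discrete logarithm x with g^x = h in a cyclic group of modest order can be recovered with the baby-step giant-step method; a minimal sketch over the multiplicative group modulo a prime, with illustrative values only:

```python
import math

def discrete_log(g, h, p):
    """Baby-step giant-step: find x with g**x == h (mod p), p prime."""
    m = math.isqrt(p - 1) + 1
    baby = {pow(g, j, p): j for j in range(m)}     # baby steps g^j
    factor = pow(g, -m, p)                         # g^(-m) mod p (Python 3.8+)
    gamma = h
    for i in range(m):
        if gamma in baby:
            return i * m + baby[gamma]             # x = i*m + j
        gamma = (gamma * factor) % p
    return None

# Construct an instance with a known answer and solve it
p, g = 2017, 5
h = pow(g, 123, p)
print(discrete_log(g, h, p), pow(g, discrete_log(g, h, p), p) == h)
```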
True random number generators are essential components for communications to be confidentially secured. In this paper a new method is proposed to generate random sequences of numbers based on the differences between the arrival times of photons detected within a coincidence window between two single-photon counting modules.
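One common way to turn such timing differences into bits, not necessarily the authors' exact post-processing, is to compare successive inter-arrival intervals and discard ties; a minimal sketch with made-up timestamps:

```python
def bits_from_arrival_times(timestamps):
    """Derive random bits by comparing successive photon inter-arrival intervals.

    timestamps: detection times (e.g. in ns) recorded in order.
    Emits 1 if an interval is longer than the next one, 0 if shorter; equal pairs
    are discarded so the comparison stays unbiased.
    """
    intervals = [t2 - t1 for t1, t2 in zip(timestamps, timestamps[1:])]
    bits = []
    for a, b in zip(intervals[::2], intervals[1::2]):   # non-overlapping pairs
        if a > b:
            bits.append(1)
        elif a < b:
            bits.append(0)
    return bits

# Illustrative timestamps only
print(bits_from_arrival_times([0, 13, 29, 41, 70, 78, 102, 131, 140]))
```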
In this study, an efficient compression system is introduced. It is based on a wavelet transform and two types of 3-dimensional (3D) surface representation: cubic Bezier interpolation (CBI) and 1st-order polynomial approximation. Each one is applied to a different scale of the image. CBI is applied to the wide area of the image in order to prune the image components that show large-scale variation, while the 1st-order polynomial is applied to the small area of the residue component (i.e., after subtracting the cubic Bezier surface from the image) in order to prune the locally smooth components and obtain better compression gain. The produced cubic Bezier surface is then subtracted from the image signal to get the residue component. Then, t...
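A minimal numeric sketch of the core step (evaluating a bicubic Bezier patch and subtracting it from an image block to leave a residue); the crude control-point selection below is only an illustration, not the paper's fitting procedure:

```python
from math import comb
import numpy as np

def bernstein3(i, t):
    """Cubic Bernstein basis B_{i,3}(t)."""
    return comb(3, i) * (t ** i) * ((1 - t) ** (3 - i))

def bezier_surface(control, height, width):
    """Evaluate a bicubic Bezier patch (4x4 control grid) on a height x width lattice."""
    us = np.linspace(0.0, 1.0, height)
    vs = np.linspace(0.0, 1.0, width)
    surf = np.zeros((height, width))
    for i in range(4):
        for j in range(4):
            bu = np.array([bernstein3(i, u) for u in us])
            bv = np.array([bernstein3(j, v) for v in vs])
            surf += control[i, j] * np.outer(bu, bv)
    return surf

# Smooth synthetic block: the Bezier surface captures the large-scale trend,
# leaving a near-zero residue for the later polynomial stage to encode.
block = np.fromfunction(lambda y, x: 100 + 0.5 * y + 0.3 * x, (16, 16))
control = block[np.ix_([0, 5, 10, 15], [0, 5, 10, 15])]    # naive control-point choice
residue = block - bezier_surface(control, *block.shape)
print(np.abs(residue).max())
```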