Merging biometrics with cryptography has become increasingly familiar, and a rich scientific field has emerged for researchers. Biometrics adds a distinctive property to security systems, because biometric features are unique to every individual. In this study, a new method is presented for ciphering data based on fingerprint features. The plaintext message is placed, according to the positions of minutiae extracted from a fingerprint, into a generated random text file, regardless of the size of the data. The proposed method can be explained in three scenarios. In the first scenario, the message is inserted directly into the random text at the minutiae positions. In the second scenario, the message is encrypted with a chosen word before being ciphered inside the random text. In the third scenario, the encryption process ensures a correct restoration of the original message. Experimental results show that the proposed cryptosystem works well and is secure, owing to the huge number of fingerprints an attacker would have to try when attempting message extraction: all fingerprints but the correct one yield incorrect results, and the extracted message will not match the original plaintext. The method also ensures that any deliberate tampering or accidental damage is detected through failure to extract a proper message, even when the correct fingerprint is used.
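The first scenario, placing message characters directly at minutiae-derived positions inside a random cover text, can be sketched as follows. This is a minimal illustration only: the function names, the use of ASCII letters as cover, and the simple modulo placement are assumptions, not the authors' implementation.

```python
import random
import string

def embed_at_minutiae(message, minutiae_positions, cover_length):
    """Hide each message character at a minutia-derived position
    inside a freshly generated random cover text (illustrative sketch)."""
    if len(minutiae_positions) < len(message):
        raise ValueError("need at least one minutia position per character")
    cover = [random.choice(string.ascii_letters) for _ in range(cover_length)]
    for ch, pos in zip(message, minutiae_positions):
        cover[pos % cover_length] = ch   # place character at the minutia position
    return "".join(cover)

def extract_at_minutiae(cover, minutiae_positions, msg_len):
    """Recover the message using the same fingerprint-derived positions."""
    return "".join(cover[p % len(cover)] for p in minutiae_positions[:msg_len])
```

Extraction with any other fingerprint's minutiae positions would read characters of the random cover instead, which is the security property the abstract describes.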
As an important resource, entangled light sources have been used in developing quantum information technologies such as quantum key distribution (QKD). Few experiments have implemented entanglement-based deterministic QKD protocols, since the security of existing protocols may be compromised in lossy channels. In this work, we report a loss-tolerant deterministic QKD experiment which follows a modified "Ping-Pong" (PP) protocol. The experimental results demonstrate for the first time that a secure deterministic QKD session can be fulfilled in a channel with an optical loss of 9 dB, based on a telecom-band entangled photon source. This exhibits a conceivable prospect of utilizing entangled light sources in real-life fiber-based
This research deals with estimating the reliability function for the two-parameter exponential distribution using different estimation methods: maximum likelihood, median-first-order statistics, ridge regression, modified Thompson-type shrinkage, and single-stage shrinkage. Comparisons among the estimators were made using Monte Carlo simulation based on the mean squared error (MSE) criterion, and the results conclude that the shrinkage methods perform better than the other methods.
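The Monte Carlo comparison can be illustrated for the maximum-likelihood estimator alone. For the two-parameter exponential with location mu and scale theta, the reliability is R(t) = exp(-(t - mu)/theta), with MLEs mu-hat = min(x) and theta-hat = mean(x) - min(x). The parameter values and replication count below are assumed example settings, not those of the study, and the shrinkage estimators are not reproduced here.

```python
import math
import random

def simulate_mse(mu=1.0, theta=2.0, t=3.0, n=30, reps=2000, seed=7):
    """Monte Carlo MSE of the maximum-likelihood reliability estimator
    R(t) = exp(-(t - mu)/theta) for the two-parameter exponential."""
    rng = random.Random(seed)
    true_r = math.exp(-(t - mu) / theta)
    sq_err = 0.0
    for _ in range(reps):
        sample = [mu + rng.expovariate(1.0 / theta) for _ in range(n)]
        mu_hat = min(sample)                   # MLE of the location parameter
        theta_hat = sum(sample) / n - mu_hat   # MLE of the scale parameter
        r_hat = math.exp(-(t - mu_hat) / theta_hat)
        sq_err += (r_hat - true_r) ** 2
    return sq_err / reps
```

Running the same loop for each competing estimator and comparing the returned MSE values is the comparison scheme the abstract describes.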
The purpose of this research is to define the main factors influencing management-system decisions on sensitive data in the cloud. A framework is proposed to enhance management information system decisions on sensitive information in a cloud environment. Structured interviews were conducted with several security experts working on cloud computing security to investigate the main objective of the framework and the suitability of the instrument, and a pilot study was conducted to test the instrument. The validity and reliability test results show that the study can be expanded and lead to final framework validation. The framework uses multiple levels, related to authorization, authentication, classification and identity anonymity, and save-and-verify, to enhance management
Abstract
A surface-fitting model is developed based on calorimeter data for two well-known brands of household compressors. Correlation equations with ten-coefficient polynomials were found, as functions of the refrigerant saturated condensing and evaporating temperatures in the range of (-35℃ to -10℃), using Matlab software, for cooling capacity, power consumption, and refrigerant mass flow rate.
Additional correlation equations for these variables are provided as a quick selection guide for choosing a proper compressor at ASHRAE standard conditions, covering a swept-volume range of (2.24-11.15) cm3.
The results indicate that these surface-fitting models are accurate to within ±15% for 72 compressor models of cooling capacity
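The ten-coefficient bivariate polynomial form commonly used for compressor maps (for example in AHRI Standard 540) can be evaluated as below. The coefficient values shown are made-up placeholders for a hypothetical cooling-capacity map, not the fitted coefficients from this study.

```python
def compressor_map(coeffs, te, tc):
    """Evaluate a ten-coefficient compressor polynomial (AHRI 540 style):
    X = c0 + c1*Te + c2*Tc + c3*Te^2 + c4*Te*Tc + c5*Tc^2
          + c6*Te^3 + c7*Te^2*Tc + c8*Te*Tc^2 + c9*Tc^3
    where Te is the evaporating and Tc the condensing temperature (deg C)."""
    c = coeffs
    return (c[0] + c[1] * te + c[2] * tc
            + c[3] * te**2 + c[4] * te * tc + c[5] * tc**2
            + c[6] * te**3 + c[7] * te**2 * tc + c[8] * te * tc**2
            + c[9] * tc**3)

# Placeholder coefficients for a hypothetical cooling-capacity map (W)
capacity_coeffs = [250.0, 9.0, -2.0, 0.12, -0.05, 0.01, 0.001, 0.0, 0.0, 0.0]
```

Fitting the ten coefficients for each of cooling capacity, power consumption, and mass flow rate against calorimeter data is the surface-fitting step the abstract refers to.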
It is quite noticeable that the initialization of architectural parameters has a great impact on the whole learning process, so knowing the mathematical properties of a dataset helps give a neural network architecture better expressivity and capacity. In this paper, five random samples of the Volve field dataset were taken. A training set was then specified, and the persistent homology of the dataset was calculated to show the impact of data complexity on the selection of a multilayer perceptron regressor (MLPR) architecture, using a proposed method that provides a well-rounded strategy to compute data complexity. Our method is a compound algorithm composed of the t-SNE method, an alpha-complex algorithm, and a persistence barcode
In our research, several different statics solutions have been implemented in the processing of seismic data in the south of Iraq for a 2D seismic survey line (AK18) of the Abu-Khama project, 32.4 km long, and their corresponding results have been compared in order to find the optimum statics solution. Static solutions based on the tomographic principle, or on combining the low-frequency components of field statics with the high-frequency components of refraction statics, can provide a reasonable statics solution for seismic data in the south of Iraq. The quality of the data was poor, with an unclear seismic signal, but after applying field statics there was an enhancement in data quality. The residual static correction further improved the quality of the seismic data
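Field (elevation) statics of the kind applied above can be sketched minimally: each source and receiver is shifted in time to a flat datum using a replacement velocity. The datum, velocity, and elevation values below are invented example numbers, not survey parameters from this project.

```python
def elevation_static_ms(elevation_m, datum_m=200.0, v_replacement=2000.0):
    """Time shift (ms) that moves a source/receiver from its surface
    elevation to a flat datum, assuming a replacement velocity (m/s)."""
    return (datum_m - elevation_m) / v_replacement * 1000.0

def apply_static(trace, shift_samples):
    """Shift a sampled trace by an integer number of samples, zero-padding
    the end that is vacated by the shift."""
    n = len(trace)
    if shift_samples >= 0:
        return [0.0] * shift_samples + trace[: n - shift_samples]
    return trace[-shift_samples:] + [0.0] * (-shift_samples)
```

Residual statics would then adjust these bulk shifts trace by trace to maximize reflector alignment, which is the refinement step the abstract credits with the final quality improvement.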
In this work, an effective procedure combining Box-Behnken design with an artificial neural network (ANN) and a genetic algorithm (GA) has been utilized to find the optimum wt.% of doping elements (Ce, Y, and Ge) for the doped aluminizing-chromizing of Incoloy 800H. The ANN and the Box-Behnken design method were implemented to minimize the hot corrosion rate kp (10-12 g2.cm-4.s-1) of Incoloy 800H at 900 oC. The ANN was used to estimate predicted values of the hot corrosion rate kp. The optimal wt.% combination of doping elements giving the minimum hot corrosion rate was calculated using the genetic algorithm
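The GA optimization step can be sketched with a toy surrogate standing in for the trained ANN. The surrogate function, the (0, 1) wt.% bounds, and all GA settings below are illustrative assumptions, not the study's model or data.

```python
import random

def genetic_minimize(fitness, bounds, pop_size=30, generations=60, seed=1):
    """Minimal real-coded genetic algorithm: tournament selection,
    blend crossover, Gaussian mutation, and two-member elitism."""
    rng = random.Random(seed)
    dim = len(bounds)

    def clip(x, i):
        lo, hi = bounds[i]
        return min(max(x, lo), hi)

    pop = [[rng.uniform(*bounds[i]) for i in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        new_pop = sorted(pop, key=fitness)[:2]          # elitism: keep two best
        while len(new_pop) < pop_size:
            p1 = min(rng.sample(pop, 3), key=fitness)   # tournament selection
            p2 = min(rng.sample(pop, 3), key=fitness)
            a = rng.random()
            child = [clip(a * x + (1 - a) * y + rng.gauss(0, 0.05), i)
                     for i, (x, y) in enumerate(zip(p1, p2))]
            new_pop.append(child)
        pop = new_pop
    return min(pop, key=fitness)

# Hypothetical smooth surrogate for the ANN-predicted corrosion rate kp
def surrogate_kp(w):
    ce, y, ge = w
    return (ce - 0.3) ** 2 + (y - 0.5) ** 2 + (ge - 0.2) ** 2 + 1.0

best = genetic_minimize(surrogate_kp, [(0.0, 1.0)] * 3)
```

In the study's workflow the surrogate would be the Box-Behnken-trained ANN, and `best` the optimal (Ce, Y, Ge) wt.% combination.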
Abstract:
This research aims to show the importance of oil in achieving economic
security in the Arab world. Oil is not an ordinary commodity, and its
significance comes from the following:
1. The importance of oil as a source of energy.
2. The importance of oil as a raw material for the petrochemical industry.
3. The importance of the oil sector as an area of foreign investment.
4. The importance of oil in marketing activities, transport, insurance
and various services.
In addition to the importance of oil in general, Arab oil has
additional strategic advantages, such as its geographic location, the
magnitude of its reserves, relatively modest production and investment
costs, and the ability to meet the
One of the primary problems on the internet is security, especially as computer use increases in all social and business areas. Secret communication through public and private channels is therefore a major goal of researchers. Information hiding is one method of obtaining a secure communication medium and protecting data during transmission.
This research offers a new method that uses two levels of hiding: the first level hides by embedding and addition, while the second level hides by injection. The first level embeds the secret message, one bit at a time, in the LSB of the FFT coefficients, with the addition of one kashida. Subtraction of two random images (STRI) serves as an RNG to find positions for hiding within the text. The second level is the in
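The LSB-embedding idea in the first level can be illustrated on a plain integer array. This is a generic LSB sketch only: the FFT transform, the kashida insertion, and the STRI position generator of the paper's method are not reproduced, and the positions are simply supplied as a list.

```python
def embed_bits_lsb(carrier, bits, positions):
    """Write each secret bit into the least-significant bit of the carrier
    value at the given (e.g. RNG-chosen) positions."""
    out = list(carrier)
    for bit, pos in zip(bits, positions):
        out[pos] = (out[pos] & ~1) | bit   # clear the LSB, then set it to the bit
    return out

def extract_bits_lsb(carrier, positions, n_bits):
    """Read the hidden bits back from the same positions."""
    return [carrier[p] & 1 for p in positions[:n_bits]]
```

Because only the least-significant bit changes, each carrier value moves by at most 1, which is what keeps the hiding imperceptible.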
Image pattern classification is considered a significant step for image and video processing. Although various image pattern algorithms proposed so far have achieved adequate classification, achieving higher accuracy while reducing the computation time remains challenging to date. A robust image pattern classification method is essential to obtain the desired accuracy. Such a method should accurately classify image blocks into plain, edge, and texture (PET) classes using an efficient feature-extraction mechanism. Moreover, to date, most existing studies have evaluated their methods on specific orthogonal moments, which limits the understanding of their potential application to various Discrete Orthogonal Moments (DOMs). The
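Block classification into plain, edge, and texture (PET) classes is often done with simple activity measures. The variance thresholds and the crude gradient-direction ratio below are illustrative assumptions for a baseline classifier, not the moment-based features the paper studies.

```python
def classify_block_pet(block, low=10.0, high_ratio=2.0):
    """Classify a rectangular grayscale block as 'plain', 'edge', or
    'texture' from its intensity variance and a gradient-direction ratio."""
    n = len(block) * len(block[0])
    flat = [p for row in block for p in row]
    mean = sum(flat) / n
    var = sum((p - mean) ** 2 for p in flat) / n
    if var < low:
        return "plain"          # low activity: nearly uniform block
    # edge blocks concentrate intensity change along one direction
    h_grad = sum(abs(row[i + 1] - row[i])
                 for row in block for i in range(len(row) - 1))
    v_grad = sum(abs(block[j + 1][i] - block[j][i])
                 for j in range(len(block) - 1) for i in range(len(block[0])))
    ratio = max(h_grad, v_grad) / (min(h_grad, v_grad) + 1e-9)
    return "edge" if ratio > high_ratio else "texture"
```

A moment-based method would replace the variance and gradient measures with DOM coefficients, but the three-way PET decision structure stays the same.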