In the modern era, where networks are used to transmit data across distances, such data must be kept safe in transit and in storage. Protection methods are developed to ensure data security, and new schemes are proposed that merge cryptographic principles with other systems to enhance information security. Chaotic maps are one of the interesting systems merged with cryptography for better encryption performance, while biometrics is considered an effective element in many access-security systems. In this paper, two systems, fingerprint biometrics and the chaotic logistic map, are combined in the encryption of a text message to produce a strong cipher that can withstand many types of attacks. Histogram analysis of the ciphertext shows that the resulting cipher is robust: each character in the plaintext has different representations in the ciphertext even when characters are repeated throughout the message. The strength of the generated cipher was measured against brute-force attackers, who were unable to deduce the key from knowledge of plaintext-ciphertext pairs, because each occurrence of a character in the message receives a different shift value and therefore a different representation in the ciphertext.
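As an illustration of this idea, the sketch below derives a logistic-map seed from a fingerprint template and uses successive map iterations as per-character shift values. The key-derivation step (hashing the template), the parameter r = 3.99, and the modulo-256 alphabet are assumptions for illustration; the paper's exact construction may differ.

```python
import hashlib

def fingerprint_seed(template: bytes) -> float:
    """Hypothetical key derivation: hash a fingerprint template into an
    initial condition x0 in (0, 1) for the logistic map."""
    n = int.from_bytes(hashlib.sha256(template).digest()[:8], "big")
    return (n % 10**8) / 10**8 or 0.5   # fall back if the hash maps to 0

def keystream_shift(x: float, r: float = 3.99) -> tuple[float, int]:
    """One chaotic iteration x -> r*x*(1-x); scale the state to a shift."""
    x = r * x * (1 - x)
    return x, int(x * 256)

def encrypt(plaintext: str, x0: float) -> list[int]:
    """Shift each character by a fresh chaotic value, so repeated
    characters receive different ciphertext representations."""
    x, cipher = x0, []
    for ch in plaintext:
        x, shift = keystream_shift(x)
        cipher.append((ord(ch) + shift) % 256)   # assumes Latin-1 text
    return cipher

def decrypt(cipher: list[int], x0: float) -> str:
    x, plain = x0, []
    for c in cipher:
        x, shift = keystream_shift(x)
        plain.append(chr((c - shift) % 256))
    return "".join(plain)

x0 = fingerprint_seed(b"raw fingerprint template bytes")
print(encrypt("HELLO", x0))   # the two L's encrypt to different values
```

Because the map state advances on every character, identical plaintext symbols land on different shift values, which is exactly the property the histogram analysis reflects.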
This paper reports the a.c. and d.c. conductivity and the dielectric behavior of an epoxy (Ep) hybrid composite with 12 vol.% Kevlar-carbon hybrid fibers. D.C. conductivity measurements were conducted on the graded composites using an electrometer over the temperature range 293-413 K. Conductivity was shown to increase with the number of Kevlar-carbon fiber layers (Ep1, Ep2, Ep3), due to the high electrical conductivity of carbon fiber. To identify the mechanism governing conduction, the activation energies in the low-temperature region (LTR) and the high-temperature region (HTR) were calculated; the activation energy values of the hybrid composite decrease with an increasing number of fiber layers. The a.c. conductivity was measured over a range of frequencies.
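The activation energies in the two regions are typically extracted from an Arrhenius fit of the d.c. conductivity (a standard relation for thermally activated conduction; the paper's exact fitting procedure is not shown here):

$$\sigma_{dc}(T) = \sigma_0 \exp\!\left(-\frac{E_a}{k_B T}\right), \qquad E_a = -k_B \,\frac{d(\ln \sigma_{dc})}{d(1/T)}$$

so E_a follows from the slope of ln σ_dc versus 1/T, fitted separately in the LTR and HTR where the slopes differ.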
This investigation aims to study some properties of lightweight aggregate concrete reinforced by mono or hybrid fibers of different sizes and types. The lightweight aggregate considered here was Light Expanded Clay Aggregate, while the adopted fibers included hooked, straight, polypropylene, and glass fibers. Eleven lightweight concrete mixes were considered: one plain mix (without fibers), two mixes reinforced with mono fibers (hooked or straight), six mixes reinforced with double hybrid fibers, and two mixes reinforced with triple hybrid fibers. Hardened concrete properties were investigated in this study.
Regression testing is a crucial phase in the software development lifecycle that ensures new changes or updates to a software system do not introduce defects or adversely affect existing functionality. However, as software systems grow in complexity, the number of test cases in the regression suite can become large, which increases testing time and resource consumption. In addition, the presence of redundant and faulty test cases may reduce the efficiency of the regression testing process. Therefore, this paper presents a new Hybrid Framework to Exclude Similar & Faulty Test Cases in Regression Testing (ETCPM) that utilizes automated code analysis techniques and historical test execution data to identify and exclude such test cases.
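The abstract does not detail ETCPM's internals, but the general idea of excluding similar and historically faulty test cases can be sketched as follows (the Jaccard similarity measure, the thresholds, and the `tests` structure are illustrative assumptions, not the paper's method):

```python
def jaccard(a: set[str], b: set[str]) -> float:
    """Overlap of the code elements two test cases cover."""
    return len(a & b) / len(a | b) if (a or b) else 0.0

def reduce_suite(tests: dict, sim_threshold: float = 0.9,
                 fail_threshold: float = 0.5) -> list[str]:
    """Keep a test only if it is neither a near-duplicate of an already
    kept test nor historically faulty. `tests` maps a test id to
    (coverage_set, historical_failure_rate)."""
    kept: dict[str, set[str]] = {}
    for tid, (coverage, fail_rate) in tests.items():
        if fail_rate >= fail_threshold:
            continue                              # exclude faulty/flaky test
        if any(jaccard(coverage, c) >= sim_threshold for c in kept.values()):
            continue                              # exclude redundant test
        kept[tid] = coverage
    return list(kept)

suite = {
    "t1": ({"mod.a", "mod.b"}, 0.0),
    "t2": ({"mod.a", "mod.b"}, 0.1),   # near-duplicate of t1 -> dropped
    "t3": ({"mod.c"}, 0.8),            # fails too often -> dropped
}
print(reduce_suite(suite))              # ['t1']
```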
The increased use of hybrid PET/CT scanners, which combine detailed anatomical information with functional data, has benefits for both diagnostic and therapeutic purposes. The present study compares the cross sections for producing 18F, 82Sr, and 68Ge via different reactions with incident particle energies up to 60 MeV, as part of systematic studies on particle-induced activation of enriched natNe, natRb, natGa, 18O, 85Rb, and 69Ga targets, together with theoretical calculation of the production yield, calculation of the required target, and a suggestion of the optimum reaction to produce fluorine-18, strontium-82, and germanium-68 for use in hybrid PET/CT scanners.
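For context, the production yield of such activations is commonly computed from the excitation function σ(E) and the stopping power of the target (a standard thick-target activation formula; the paper's exact inputs are not reproduced here):

$$Y = \frac{N_A H}{M}\,\frac{I}{z\,e}\,\left(1 - e^{-\lambda t_{irr}}\right)\int_{E_{out}}^{E_{in}} \sigma(E)\left(\frac{dE}{dx}\right)^{-1} dE$$

where N_A is Avogadro's number, H the isotopic abundance, M the molar mass of the target, I the beam current, z the projectile charge, λ the decay constant of the product, and t_irr the irradiation time.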
The tax information system is one of the most important means that help the tax administration reach the real income of the taxpayer. The research problem arose in the General Authority for Taxes with the following question: does the control exercised by the Central Bank over foreign remittances help reach the real income of the taxpayer? The research gains its importance by focusing on the Central Bank's control over foreign remittances, how this control can be used to feed the tax information system, and the relationship between the Central Bank's control over foreign remittances and the tax information system. The study reached a number of recommendations, the most important of which is the development of an integrated information system.
The study aimed to evaluate the information labels of some local pickle products and to estimate the sodium benzoate therein. Eighty-five samples of locally made pickles were collected from Baghdad city markets, drawn randomly from five different areas (Al-Shula, Al-Bayaa, Al-Nahrawan, Al-Taji, and Abu Ghraib), and divided into groups P1, P2, P3, P4, and P5 accordingly. The information labels of the samples were examined and compared with the Iraqi standard specification for the information card of packaged and canned food (IQS 230). The results showed that 25.9% of the samples lacked the required label information.
The aim of the present research is to investigate the effect of the pH parameter on the feasibility of lead removal from simulated wastewater using an electrochemical system. Electrocoagulation is an electrochemical technology widely used to treat industrial wastewater. The parameters affecting this operation, namely initial metal concentration, applied current, stirrer speed, and electroprocessing contact time, were fixed at 155 ppm, 1.5 A, 150 rpm, and 60 minutes, respectively, while the pH of the simulated wastewater ranged from 2 to 12 in the experiments. The results show that pH is an important parameter affecting the lead removal operation. The best value of the pH parameter is appro
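As background, the theoretical coagulant dose released by the sacrificial anode in electrocoagulation follows Faraday's law (the anode material is not stated in the abstract; iron is assumed here purely for illustration):

$$m = \frac{I\,t\,M}{z\,F}$$

With the stated operating conditions, I = 1.5 A and t = 60 min = 3600 s, an assumed iron anode (M = 55.85 g/mol, z = 2, F = 96485 C/mol) would dissolve m = (1.5 × 3600 × 55.85)/(2 × 96485) ≈ 1.56 g of Fe over the run; pH then governs which hydroxide species form and hence the removal efficiency.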
Background: one of the complications of power bleaching is surface roughening of enamel, which increases the possibility of post-bleaching tooth discoloration. The aim of the present study is to evaluate the effect of toothpastes containing nano-hydroxyapatite, NovaMin, and Kin Sense fluoride on the surface roughness of human tooth enamel after laser bleaching with 35% hydrogen peroxide bleaching gel. Materials and Methods: twenty human incisors were cleaned and their labial surfaces polished up to #1200 grit, then categorized into four equal groups; the first group was kept unbleached as a control, while the remaining three experimental groups were bleached with 35% hydrogen peroxide, and each group was treated with a restorative paste containing one of the tested agents.
Image compression is a type of data compression applied to digital images in order to reduce their high cost of storage and/or transmission. Image compression algorithms can exploit the visual sensitivity and statistical properties of image data to deliver superior results compared with the generic data compression schemes used for other digital data. In the first approach, the input image is divided into blocks of 16 x 16, 32 x 32, or 64 x 64 pixels. Each block is first converted into a string and then encoded using a lossless entropy-coding algorithm known as arithmetic coding: pixel values that occur more frequently are coded with fewer bits than values that occur less frequently.
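A toy sketch of that frequency-driven principle is shown below, using exact fractions to make the interval narrowing explicit (a static order-0 model over one flattened block; real coders use adaptive models and incremental bit output, and the decoder needs the same frequency table):

```python
import math
from collections import Counter
from fractions import Fraction

def arithmetic_encode(block: bytes) -> tuple[Fraction, Fraction]:
    """Narrow an interval in [0, 1): each symbol scales the interval by
    its probability, so frequent pixel values cost fewer output bits."""
    freq = Counter(block)
    total = len(block)
    ranges, cum = {}, Fraction(0)
    for value in sorted(freq):                 # cumulative ranges per value
        p = Fraction(freq[value], total)
        ranges[value] = (cum, cum + p)
        cum += p
    low, high = Fraction(0), Fraction(1)
    for value in block:
        lo, hi = ranges[value]
        width = high - low
        low, high = low + width * lo, low + width * hi
    return low, high                           # any number inside decodes back

# A flattened 16x16 block dominated by one pixel value compresses well.
block = bytes([10] * 200 + [200] * 56)
low, high = arithmetic_encode(block)
print(math.ceil(-math.log2(float(high - low))))  # ~ bits needed, << 256*8
```

The final interval width equals the product of the symbol probabilities, so its negative log2 is the block's information content, which is the bound arithmetic coding approaches.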