A true random TTL pulse generator was implemented and investigated for quantum key distribution systems. The random TTL signals are generated with low-cost components available in local markets. The TTL signals are obtained from true random binary sequences based on the photon arrival time differences registered in coincidence windows between two single-photon detectors. The performance of the true random TTL pulse generator was tested using time-to-digital converters, which give accurate readings of photon arrival times. The proposed true random TTL pulse generator can be used in any quantum key distribution system for random operation of the system's transmitters.
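A minimal sketch of how such a bit stream could be derived in software, assuming timestamp lists from the two detectors and a simple compare-consecutive-differences extraction rule; the coincidence-window width, event format, and extraction rule are illustrative assumptions, not the generator's actual circuit logic.

```python
# Hypothetical sketch: deriving random bits from photon arrival-time differences
# registered in a coincidence window between two detectors.
import numpy as np

def bits_from_time_differences(timestamps_a, timestamps_b, window_ns=10.0):
    """Pair detector timestamps that fall inside a coincidence window and
    turn the arrival-time differences into a binary sequence."""
    diffs = []
    j = 0
    for t_a in timestamps_a:
        # advance the detector-B pointer until it is within the window of t_a
        while j < len(timestamps_b) and timestamps_b[j] < t_a - window_ns:
            j += 1
        if j < len(timestamps_b) and abs(timestamps_b[j] - t_a) <= window_ns:
            diffs.append(timestamps_b[j] - t_a)
    diffs = np.asarray(diffs)
    # compare consecutive differences; discard ties (von Neumann-style debiasing)
    pairs = diffs[: len(diffs) // 2 * 2].reshape(-1, 2)
    keep = pairs[:, 0] != pairs[:, 1]
    return (pairs[keep, 0] > pairs[keep, 1]).astype(np.uint8)

# example: detector timestamps in nanoseconds (hypothetical values)
a = [12.0, 55.0, 101.0, 160.0, 230.0, 300.0]
b = [10.0, 58.0, 104.0, 168.0, 233.0, 299.0]
print(bits_from_time_differences(a, b))   # extracted random bits
```

Each resulting bit could then gate one TTL pulse toward the transmitter.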
Many applied economic studies have found a positive nexus between financial development and poverty reduction in developing countries. Iraq witnessed an increasing rate of poverty during the period 1980-2010 due to many internal and external factors, such as wars, economic sanctions, inflation, a high rate of unemployment, and political and security instability. Investigating solutions to reduce poverty therefore becomes very necessary, and enhancing financial development in Iraq is one of these options. This is because financial development can reduce poverty rates through two channels: the first is direct, via the offering of loans and other financial facilities to the poor, a
Background: The objective of this in vitro study was to evaluate the vertical marginal fit of crowns fabricated with ZrO2 CAD/CAM, before and after porcelain firing cycles and after glaze cycles. Materials and Methods: An acrylic resin model of a left maxillary first molar was prepared and duplicated to obtain a nickel-chromium master die. Ten die stone dies were sent to the CAD/CAM system (Amann Girrbach) for crown fabrication. Marginal gaps along vertical planes were measured at four indentations (mid-mesial, mid-distal, mid-buccal, mid-palatal) before (Time 0) and after porcelain firing cycles (Time 1) and after glaze cycles (Time 2) using a light microscope at a magnification of ×100. One-way ANOVA and LSD tests were performed to determine wh
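An illustrative sketch of the statistical step only: one-way ANOVA followed by LSD-style pairwise comparisons on made-up marginal-gap readings. The values and group sizes are placeholders, not study data, and plain pairwise t-tests are used here as a simplification of pooled-MSE Fisher's LSD.

```python
# One-way ANOVA across the three time points, then pairwise comparisons
# only if the overall test is significant.
import itertools
import numpy as np
from scipy import stats

gaps_um = {
    "Time 0 (as-milled)":     np.array([38.0, 41.5, 36.2, 40.1, 39.3]),
    "Time 1 (after firing)":  np.array([55.4, 58.9, 52.7, 57.3, 56.0]),
    "Time 2 (after glazing)": np.array([61.2, 64.8, 59.5, 63.1, 62.4]),
}

f_stat, p_value = stats.f_oneway(*gaps_um.values())
print(f"one-way ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

if p_value < 0.05:                       # proceed only when ANOVA is significant
    for a, b in itertools.combinations(gaps_um, 2):
        t, p = stats.ttest_ind(gaps_um[a], gaps_um[b])
        print(f"LSD-style comparison {a} vs {b}: t = {t:.2f}, p = {p:.4f}")
```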
The aim of the current study was to develop a nanostructured double-layer delivery system for hydrophobic molecules. The developed double layer consisted of a polyethylene glycol (PEG)-based polymeric coat followed by a gelatin sub-coating of the core hydrophobic molecules containing sodium citrate. The polymeric composition ratio of PEG and the amount of the sub-coating gelatin were optimized using the two-level fractional method. The nanoparticles were characterized using AFM and FT-IR techniques. The size of these nanocapsules was in the range of 39-76 nm depending on the drug loading concentration. The drug was effectively loaded into the PEG-gelatin nanoparticles (≈47%). The release characteristics of the hydrophobic molecules in terms of controlled-releas
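A hypothetical sketch of a two-level design plan, assuming the "two-level fractional method" above refers to a two-level fractional factorial design; the third factor, the generator C = A*B, and all low/high levels are placeholders added so a half-fraction can be shown, not values from the study.

```python
# Build a 2^(3-1) half-fraction design: run the 2^2 full factorial on A and B
# and set the third (assumed) factor from the generator C = A*B.
import itertools

factors = ["PEG_ratio", "gelatin_mg", "citrate_mg"]   # last factor is hypothetical
low_high = {"PEG_ratio": (1.0, 3.0), "gelatin_mg": (50, 150), "citrate_mg": (5, 20)}

runs = []
for a, b in itertools.product((-1, +1), repeat=2):
    c = a * b                                         # generator: C = A*B
    coded = dict(zip(factors, (a, b, c)))
    real = {f: low_high[f][0] if coded[f] < 0 else low_high[f][1] for f in factors}
    runs.append(real)

for i, run in enumerate(runs, 1):
    print(f"run {i}: {run}")
```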
Cryptography is a method used to mask text with an encryption method so that only the authorized user can decrypt and read the message. An intruder may attack in many ways to access the communication channel, such as impersonation, repudiation, denial of service, modification of data, threatening confidentiality, and breaking the availability of services. The high volume of electronic communication between people requires that transactions remain confidential, and cryptography methods give the best solution to this problem. This paper proposes a new cryptography method based on Arabic words; the method consists of two steps, where the first step is binary encoding generation used t
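A minimal sketch of a binary-encoding step for Arabic text, using UTF-8 as a stand-in mapping since the paper's own encoding table is not reproduced here; the helper names and the sample word are illustrative only.

```python
# Stand-in binary encoding for Arabic words: 8 bits per UTF-8 byte.
def arabic_to_binary(text: str) -> str:
    """Encode an Arabic string as a binary sequence."""
    return "".join(f"{byte:08b}" for byte in text.encode("utf-8"))

def binary_to_arabic(bits: str) -> str:
    """Inverse mapping: group bits into bytes and decode back to text."""
    data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode("utf-8")

encoded = arabic_to_binary("سلام")        # example word
print(encoded)
print(binary_to_arabic(encoded))          # round-trips back to the original word
```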
... Show MoreToday with increase using social media, a lot of researchers have interested in topic extraction from Twitter. Twitter is an unstructured short text and messy that it is critical to find topics from tweets. While topic modeling algorithms such as Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA) are originally designed to derive topics from large documents such as articles, and books. They are often less efficient when applied to short text content like Twitter. Luckily, Twitter has many features that represent the interaction between users. Tweets have rich user-generated hashtags as keywords. In this paper, we exploit the hashtags feature to improve topics learned
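A minimal sketch of one common way to exploit hashtags for short texts, hashtag pooling: tweets that share a hashtag are concatenated into a pseudo-document before fitting LDA. The tiny corpus, tags, and topic count below are made up, and this is not necessarily the pooling scheme used in the paper.

```python
# Pool tweets by hashtag, then fit LDA on the pooled pseudo-documents.
from collections import defaultdict
import re
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

tweets = [
    "new phone camera is amazing #tech #gadgets",
    "battery life keeps getting better #tech",
    "great match last night #football",
    "what a goal in extra time #football #sports",
]

pooled = defaultdict(list)
for tweet in tweets:
    for tag in re.findall(r"#(\w+)", tweet):
        pooled[tag].append(re.sub(r"#\w+", "", tweet))   # strip tags from the body

docs = [" ".join(texts) for texts in pooled.values()]
counts = CountVectorizer(stop_words="english").fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)
print(lda.transform(counts))   # per-pseudo-document topic mixtures
```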
Authentication is the process of determining whether someone or something is, in fact, who or what it is declared to be. As the dependence upon computers and computer networks grows, the need for user authentication has increased. A user's claimed identity can be verified by one of several methods. One of the most popular of these methods is based on something the user knows, such as a password or Personal Identification Number (PIN). Biometrics is the science and technology of authentication by identifying a living individual's physiological or behavioral attributes. Keystroke authentication is a new behavioral access control system that identifies legitimate users via their typing behavior. The objective of this paper is to provide user
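A hypothetical sketch of the timing features keystroke authentication typically relies on, dwell time and flight time, computed from made-up key press/release events; the event format and the threshold comparison mentioned in the comments are assumptions, not this paper's scheme.

```python
# Extract dwell times (key held down) and flight times (gap between releasing
# one key and pressing the next) from a sequence of typing events.
from typing import List, Tuple

# events: (key, press_time_ms, release_time_ms), in typing order (hypothetical)
events: List[Tuple[str, float, float]] = [
    ("p", 0.0, 95.0), ("a", 160.0, 240.0), ("s", 310.0, 400.0), ("s", 470.0, 555.0),
]

dwell = [release - press for _, press, release in events]
flight = [events[i + 1][1] - events[i][2] for i in range(len(events) - 1)]

print("dwell times  (ms):", dwell)
print("flight times (ms):", flight)
# A per-user template of these features could then be compared against a login
# attempt with a simple distance threshold to accept or reject the claimant.
```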
Iris research is focused on developing techniques for identifying and locating relevant biometric features, accurate segmentation, and efficient computation, while lending themselves to compression methods. Most iris segmentation methods are based on complex modelling of traits and characteristics, which, in turn, reduces the effectiveness of the system when used as a real-time system. This paper introduces a novel parameterized technique for iris segmentation. The method is based on a number of steps, starting from converting the grayscale eye image to a bit-plane representation and selecting the most significant bit planes, followed by a parameterization of the iris location, resulting in an accurate segmentation of the iris from the origin
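A minimal sketch of the bit-plane step only, using NumPy on a synthetic 8-bit image; keeping the top two planes is an illustrative choice, not the paper's parameterization.

```python
# Decompose an 8-bit grayscale image into its bit planes and keep the most
# significant ones, which carry the coarse structure used for segmentation.
import numpy as np

rng = np.random.default_rng(0)
eye_gray = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)   # stand-in image

# plane k holds bit k of every pixel: shape (8, H, W), plane 7 = most significant
planes = np.stack([(eye_gray >> k) & 1 for k in range(8)])

significant = planes[6:]                  # keep planes 6 and 7 (illustrative)
reconstructed = sum(significant[i] << (6 + i) for i in range(significant.shape[0]))
print(reconstructed.dtype, reconstructed.max())
```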
The quality of Global Navigation Satellite System (GNSS) networks is considerably influenced by the configuration of the observed baselines. This study aims to find an optimal configuration for GNSS baselines, in terms of the number and distribution of baselines, to improve the quality criteria of GNSS networks. The first-order design (FOD) problem was applied in this research to optimize the GNSS network baseline configuration, based on the sequential adjustment method to solve its objective functions.
FOD for optimum precision (FOD-p) was the proposed model, which is based on the design criteria of A-optimality and E-optimality. These design criteria were selected as objective functions of precision, whic
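An illustrative sketch of how the two precision criteria can be evaluated for a candidate baseline configuration, assuming a standard least-squares setup: A-optimality is the trace of the parameter covariance matrix and E-optimality is its largest eigenvalue, with smaller values indicating a better configuration. The tiny design matrix and weights below are made up, and this is not the paper's sequential-adjustment implementation.

```python
# Evaluate A- and E-optimality for a design matrix A and observation weights P.
import numpy as np

def precision_criteria(A: np.ndarray, P: np.ndarray):
    """Return (A-optimality, E-optimality) for the given baseline design."""
    Qx = np.linalg.inv(A.T @ P @ A)           # covariance of estimated coordinates
    a_opt = np.trace(Qx)                      # A-optimality: average variance
    e_opt = np.max(np.linalg.eigvalsh(Qx))    # E-optimality: worst-case variance
    return a_opt, e_opt

# tiny made-up network: 4 baseline observations, 2 unknown station coordinates
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, -1.0], [1.0, 1.0]])
P = np.diag([1.0, 1.0, 0.5, 0.8])             # hypothetical weights
print(precision_criteria(A, P))
```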