The objective of this work is to combine human biometric characteristics with unique attributes of the computer in order to protect computer networks and resource environments through the development of authentication and authorization techniques. On the biometric side, the available methods and algorithms were studied, and the fingerprint was found to be the most suitable, although it has some flaws. The fingerprint algorithm was therefore improved so that it enhances the clarity of the ridge and valley structures in fingerprint images, taking into account the estimated orientation and frequency of the nearby ridges. On the computer side, a computer and its components, like a human, have unique characteristics. A program was developed in the Visual Basic environment to read these computer characteristics and merge them with the human characteristics, producing strong authentication and authorization algorithms that can be used to protect resources stored in computer network environments, through software modules and interactive interfaces built for this purpose.
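The abstract does not detail the Visual Basic modules themselves. As a rough, hypothetical illustration of the underlying idea of binding a biometric value to a machine's unique attributes, the Python sketch below collects a few system identifiers and hashes them together with a fingerprint template hash; the function names, the choice of attributes, and the hashing scheme are assumptions, not the paper's implementation.

```python
import hashlib
import platform
import uuid

def machine_fingerprint() -> str:
    """Collect a few machine attributes (MAC address, hostname, CPU string)
    and hash them into a single hardware identifier (illustrative choice)."""
    attrs = [
        hex(uuid.getnode()),    # MAC address of one network adapter
        platform.node(),        # hostname
        platform.processor(),   # CPU description string
    ]
    return hashlib.sha256("|".join(attrs).encode()).hexdigest()

def combined_credential(biometric_template_hash: str) -> str:
    """Bind the user's fingerprint template hash to this specific computer,
    so the resulting credential is only valid for the (user, machine) pair."""
    return hashlib.sha256(
        (biometric_template_hash + machine_fingerprint()).encode()
    ).hexdigest()

if __name__ == "__main__":
    # The template hash would normally come from the fingerprint matcher.
    demo_template = hashlib.sha256(b"enrolled fingerprint minutiae").hexdigest()
    print(combined_credential(demo_template))
```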
Background: Nanotechnology is a new science that promises a broad range of uses and improved technologies for biological and biomedical applications. One of the reasons behind the intense interest is that nanotechnology permits the synthesis of materials with structures smaller than 100 nanometers. The present work examined the effect of zinc oxide nanoparticles (ZnO NPs) on Streptococcus mutans from human saliva in comparison to de-ionized water. Materials and methods: Streptococcus mutans was isolated from the saliva of forty-eight volunteers of both sexes, aged 18-22 years, then purified and identified according to morphological characteristics and biochemical tests. Different concentrations of ZnO NPs were
Objective: The present study investigates whether exposure to a low-power diode laser induces denaturation in the red blood cell (RBC) membrane protein composition, and determines the irradiation time at which denaturation of the membrane proteins begins. Background: Low-energy lasers have been used extensively in medical applications. Several studies indicated significant positive effects of laser therapy on biological systems; in contrast, other studies reported that lasers induced unwanted changes in cell structure and biological systems. The present work studied the effect of irradiation time of a low-power diode laser on the structure of membrane proteins of human RBCs. Materials and methods: The RBC suspension was divided into five equal
The hydatid materials collected and studied comprised 50 fertile human hydatid cases (33 (66%) females and 17 (34%) males). They were collected from Al-Ramadi General Hospital during the period from December 2003 to July 2004. Cysts were obtained in 40 cases (80%) from the liver, 5 (10%) from the lungs, 3 (6%) from the kidneys and 2 (4%) from the urinary bladder. The specimens were taken from patients of different ages. The in vitro viability of protoscoleces was assessed on the basis of flame cell activity and eosin exclusion, which were considered criteria for determining the death or viability of protoscoleces. In addition to this movement (flame cell activity), another motility like constriction-relaxation
The synthesis, characterization and mesomorphic properties of two new series of triazine-core-based liquid crystals have been investigated. The amino triazine derivatives were characterized by elemental analysis, Fourier transform infrared (FTIR) spectroscopy, 1H NMR and mass spectrometry. The liquid crystalline properties of these compounds were examined by differential scanning calorimetry (DSC) and polarizing optical microscopy (POM). DSC and POM confirmed nematic (N) and columnar mesophase textures of the materials. The mesomorphic behavior was found to depend on the number of methylene units in the alkoxy side chains.
In this paper a new method is proposed to perform N-Radon orthogonal frequency division multiplexing (OFDM), which is equivalent in spectral efficiency to 4-quadrature amplitude modulation (QAM), 16-QAM, 64-QAM, 256-QAM, etc. This non-conventional method is proposed in order to reduce the constellation energy and increase spectral efficiency. The proposed method gives a significant improvement in bit error rate performance, and keeps the bandwidth efficiency and spectrum shape as good as those of conventional Fast Fourier Transform based OFDM. The new structure was tested and compared with conventional OFDM for additive white Gaussian noise, flat, and multipath selective fading channels. Simulation tests were generated for different channels
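The N-Radon mapping itself cannot be reproduced from this summary; the sketch below only illustrates the conventional FFT-based OFDM baseline with 4-QAM over an AWGN channel that the proposed scheme is compared against. The subcarrier count, SNR and power scaling are arbitrary assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

N_SUBCARRIERS = 64   # assumed subcarrier count, not taken from the paper
QAM4 = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)  # unit-energy 4-QAM

def ofdm_transmit(symbol_indices: np.ndarray) -> np.ndarray:
    """Map 4-QAM symbols onto subcarriers and apply the IFFT (one OFDM symbol)."""
    freq = QAM4[symbol_indices]
    return np.fft.ifft(freq) * np.sqrt(N_SUBCARRIERS)   # keep unit average power

def ofdm_receive(time_signal: np.ndarray) -> np.ndarray:
    """FFT back to the frequency domain and detect by nearest constellation point."""
    freq = np.fft.fft(time_signal) / np.sqrt(N_SUBCARRIERS)
    return np.argmin(np.abs(freq[:, None] - QAM4[None, :]), axis=1)

# One OFDM symbol over an AWGN channel at a fixed SNR.
tx_indices = rng.integers(0, 4, N_SUBCARRIERS)
tx = ofdm_transmit(tx_indices)
snr_db = 10
noise_std = np.sqrt(0.5 * 10 ** (-snr_db / 10))
rx = tx + noise_std * (rng.standard_normal(tx.shape) + 1j * rng.standard_normal(tx.shape))
rx_indices = ofdm_receive(rx)
print("symbol errors:", np.count_nonzero(tx_indices != rx_indices))
```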
The huge number of documents on the internet has led to a pressing need for text classification (TC), which is used to organize these text documents. In this research paper, a new model based on the Extreme Learning Machine (ELM) is used. The proposed model consists of several phases: preprocessing, feature extraction, Multiple Linear Regression (MLR) and the ELM. The basic idea of the proposed model is to calculate feature weights using MLR; these weights, together with the extracted features, are introduced as input to the ELM, producing a weighted Extreme Learning Machine (WELM). The results showed a clear advantage of the proposed WELM over the plain ELM.
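A minimal sketch of one plausible reading of the WELM idea follows: per-feature weights are taken from the absolute MLR coefficients, and the weighted features are fed to a basic ELM with a random sigmoid hidden layer and least-squares output weights. The toy data, the weighting rule and the hidden-layer size are assumptions, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlr_feature_weights(X: np.ndarray, y: np.ndarray) -> np.ndarray:
    """One reading of the MLR step: regress the class label on the features and
    use the absolute regression coefficients as per-feature weights."""
    coef, *_ = np.linalg.lstsq(np.c_[X, np.ones(len(X))], y, rcond=None)
    return np.abs(coef[:-1])

class ELM:
    """Minimal Extreme Learning Machine: random hidden layer, analytic output layer."""

    def __init__(self, n_hidden: int = 50):
        self.n_hidden = n_hidden

    def fit(self, X: np.ndarray, y: np.ndarray) -> "ELM":
        self.W = rng.standard_normal((X.shape[1], self.n_hidden))
        self.b = rng.standard_normal(self.n_hidden)
        H = 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))   # sigmoid hidden activations
        self.beta = np.linalg.pinv(H) @ y                   # least-squares output weights
        return self

    def predict(self, X: np.ndarray) -> np.ndarray:
        H = 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))
        return (H @ self.beta > 0.5).astype(int)

# Toy numeric features standing in for extracted text features.
X = rng.standard_normal((200, 20))
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(float)

weights = mlr_feature_weights(X, y)
model = ELM().fit(X * weights, y)            # weighted features -> "WELM"
print("training accuracy:", (model.predict(X * weights) == y).mean())
```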
The quality of Global Navigation Satellite System (GNSS) networks is considerably influenced by the configuration of the observed baselines. This study aims to find an optimal configuration for GNSS baselines, in terms of the number and distribution of baselines, that improves the quality criteria of the GNSS network. The first order design (FOD) problem was applied in this research to optimize the GNSS network baseline configuration, using the sequential adjustment method to solve its objective functions.
FOD for optimum precision (FOD-p) was the proposed model, based on the design criteria of A-optimality and E-optimality. These design criteria were selected as objective functions of precision, which
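The abstract is truncated before the formulation details. As a general illustration only, the sketch below evaluates the A-optimality (trace) and E-optimality (maximum eigenvalue) criteria on the parameter covariance matrix Qxx = (A^T P A)^-1 of a candidate baseline configuration; the toy design matrix and weights are invented for the example and are not from the study.

```python
import numpy as np

def precision_criteria(A: np.ndarray, P: np.ndarray):
    """Evaluate A- and E-optimality for a candidate baseline configuration.

    A : design matrix of the observed baselines (one row per observation)
    P : weight matrix of the observations
    Returns (trace, max eigenvalue) of the parameter covariance matrix
    Qxx = (A^T P A)^-1; smaller values indicate a more precise network.
    """
    Qxx = np.linalg.inv(A.T @ P @ A)
    a_opt = np.trace(Qxx)                     # A-optimality: average variance
    e_opt = np.max(np.linalg.eigvalsh(Qxx))   # E-optimality: worst-case variance
    return a_opt, e_opt

# Toy example: 2 unknown station coordinates, 3 candidate baselines.
A = np.array([
    [ 1.0,  0.0],   # baseline from a fixed station to station 1
    [ 0.0,  1.0],   # baseline from a fixed station to station 2
    [-1.0,  1.0],   # baseline between stations 1 and 2
])
P = np.eye(3)       # equal weights, purely illustrative
print(precision_criteria(A, P))
```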
Today, with the increasing use of social media, many researchers have become interested in topic extraction from Twitter. Tweets are short, unstructured and messy text, which makes it difficult to find topics in them. Topic modeling algorithms such as Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA) were originally designed to derive topics from large documents such as articles and books, and they are often less effective when applied to short text content like tweets. Luckily, Twitter has many features that capture the interaction between users, and tweets carry rich user-generated hashtags as keywords. In this paper, we exploit the hashtag feature to improve the topics learned
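The abstract stops before describing exactly how the hashtags are used. One common way to exploit them for short-text topic modeling is hashtag pooling, sketched below with scikit-learn's LDA on a few invented tweets; the pooling rule, library choice and data are assumptions, not necessarily the paper's method.

```python
from collections import defaultdict
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

# Toy tweets with user-generated hashtags (real data would come from Twitter).
tweets = [
    "new phone camera is amazing #tech",
    "battery life on this laptop is great #tech",
    "the match last night was incredible #sports",
    "our team won the final #sports",
]

# Hashtag pooling: merge all tweets sharing a hashtag into one pseudo-document,
# so the topic model sees longer documents than individual tweets.
pools = defaultdict(list)
for tweet in tweets:
    for token in tweet.split():
        if token.startswith("#"):
            pools[token].append(tweet)
pseudo_docs = [" ".join(group) for group in pools.values()]

vectorizer = CountVectorizer(stop_words="english")
X = vectorizer.fit_transform(pseudo_docs)

lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
terms = vectorizer.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top = [terms[i] for i in topic.argsort()[-5:][::-1]]
    print(f"topic {k}: {top}")
```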