Background: Measurement of hemoglobin A1c (A1C) is a well-established method for assessing long-term glycemic control and has a substantial influence on the quality of care in diabetic patients. The concept of targets is open to criticism; they may be unattainable, or limit what could be attained, and they may also be economically difficult to attain. However, without some form of targeted control of an asymptomatic condition, it becomes difficult to promote care at all. Objectives: The present article aims to address the most recent evidence-based global guidelines on A1C targets for glycemic control in Type 2 Diabetes Mellitus (T2D). Key messages: The rationale for A1C treatment targets includes evidence for microvascular and macrovascular protection and changes in quality of life. More or less stringent A1C goals may be appropriate for individual patients, and goals should be individualized based on: duration of diabetes, age/life expectancy, comorbid conditions, CVD or advanced microvascular complications, hypoglycemia unawareness, and individual patient considerations.
To transfer data securely from sender to receiver, cryptography is one approach used for such purposes. However, to increase the level of data security, DNA was introduced to cryptography as a new concept. DNA can easily be used to store and transfer data, it has become an effective medium for such aims, and it can be used to implement computation. A new cryptography system is proposed, consisting of two phases: an encryption phase and a decryption phase. The encryption phase includes six steps, starting by converting the plaintext to its equivalent ASCII values and converting these to binary values. After that, the binary values are converted to DNA characters and then converted to their equivalent complementary DNA.
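As a rough illustration of the encoding steps described above, the sketch below maps text to ASCII, then to binary, then to DNA bases, and finally to the complementary strand; the 2-bit base mapping (00→A, 01→C, 10→G, 11→T) and the Watson-Crick complement rule are common conventions assumed here, not necessarily the exact tables used in the paper.

# Sketch of the first encryption steps: plaintext -> ASCII -> binary -> DNA -> complement.
BIN_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}   # assumed 2-bit mapping
COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}        # Watson-Crick complements

def text_to_complementary_dna(plaintext: str) -> str:
    bits = "".join(format(ord(ch), "08b") for ch in plaintext)                   # ASCII -> 8-bit binary
    bases = "".join(BIN_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))  # binary -> DNA
    return "".join(COMPLEMENT[b] for b in bases)                                 # DNA -> complementary DNA

print(text_to_complementary_dna("Hi"))   # 'Hi' -> 'GTCTGCCG'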
Computer systems and networks are increasingly used for many types of applications; as a result, the security threats to computers and networks have also increased significantly. Traditionally, password-based user authentication is widely used to authenticate legitimate users, but this method has many loopholes, such as password sharing, brute-force attacks, dictionary attacks, and more. The aim of this paper is to improve the password authentication method using Probabilistic Neural Networks (PNNs) with three types of distance, namely Euclidean Distance, Manhattan Distance, and Euclidean Squared Distance, and four keystroke-dynamics features, namely Dwell Time (DT), Flight Time (FT), a mixture of DT and FT, and finally Up-Up Time (UUT). The resul
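For illustration, the sketch below computes the three distance measures named above between a stored keystroke-dynamics template and a fresh login sample; the feature values are hypothetical placeholders, and the PNN classification stage itself is omitted.

import numpy as np

# Hypothetical keystroke-dynamics feature vectors (e.g. dwell/flight times in seconds).
stored_template = np.array([0.12, 0.09, 0.15, 0.11])
login_sample    = np.array([0.13, 0.10, 0.14, 0.12])

def euclidean(a, b):
    return float(np.sqrt(np.sum((a - b) ** 2)))

def manhattan(a, b):
    return float(np.sum(np.abs(a - b)))

def euclidean_squared(a, b):
    return float(np.sum((a - b) ** 2))

for name, dist in [("Euclidean", euclidean), ("Manhattan", manhattan), ("Euclidean squared", euclidean_squared)]:
    print(name, dist(stored_template, login_sample))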
Biometrics is widely used with security systems nowadays; each biometric modality can be useful and has distinctive properties that provide uniqueness and ambiguity for security systems, especially in communication and network technologies. This paper is about using a biometric feature of the fingerprint, called minutiae, to cipher a text message and ensure safe arrival of the data at the receiver end. The classical cryptosystems (Caesar, Vigenère, etc.) have become obsolete methods of encryption because of high-performance machines whose attacks focus on the repetition of the key to break the cipher. Several cryptography researchers have made efforts to modify and develop the Vigenère cipher by addressing its weaknesses.
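The sketch below shows the general idea of keying a Vigenère cipher with fingerprint minutiae; the minutiae coordinates and the key-derivation rule are hypothetical illustrations, not the scheme actually proposed in the paper.

import string

ALPHABET = string.ascii_uppercase

def minutiae_to_key(minutiae_points):
    # Hypothetical derivation: fold each (x, y) minutia coordinate into one key letter.
    return "".join(ALPHABET[(x + y) % 26] for x, y in minutiae_points)

def vigenere_encrypt(plaintext, key):
    out = []
    for i, ch in enumerate(plaintext.upper()):
        if ch in ALPHABET:
            shift = ALPHABET.index(key[i % len(key)])
            out.append(ALPHABET[(ALPHABET.index(ch) + shift) % 26])
        else:
            out.append(ch)                        # leave spaces/punctuation unchanged
    return "".join(out)

minutiae = [(103, 45), (88, 210), (57, 19)]       # hypothetical minutiae coordinates
key = minutiae_to_key(minutiae)
print(key, vigenere_encrypt("SECRET MESSAGE", key))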
This paper presents the mechanical and electrical design, and the implementation process, of an industrial robot: a 3-DoF SCARA (Selective Compliance Assembly Robot Arm) with two rotations and one translation, used for welding applications. The design process also included the controller design, which was based on a PLC (programmable logic controller), as well as the selection of mechanical and electrical components. The challenge was to use components available in Iraq at reasonable cost. The robot is fully automated using a PLC (Zelio type SR3-B261BD) with 16 inputs and 10 outputs. The PLC was programmed in FBD logic to obtain three different automatic motions with hi
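For context, the kinematics of such an RRP SCARA arm (two rotations plus one translation) has a simple closed form; the sketch below uses hypothetical link lengths, since the paper's mechanical dimensions and PLC program are not reproduced here.

import math

def scara_forward_kinematics(theta1, theta2, d3, l1=0.35, l2=0.25):
    # Forward kinematics of a generic RRP SCARA arm; link lengths l1, l2 (metres)
    # and the joint values below are hypothetical, not the paper's dimensions.
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    z = -d3                                    # prismatic joint moves the tool vertically
    return x, y, z

print(scara_forward_kinematics(math.radians(30), math.radians(45), 0.05))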
The primary objective of this paper is to improve a biometric authentication and classification model using the ear as a distinct part of the face, since it is unchanged over time and unaffected by facial expressions. The proposed model is a new scenario for enhancing ear-recognition accuracy by modifying the AdaBoost algorithm to optimize adaptive learning. To overcome the limitations of image illumination, occlusion, and problems of image registration, the Scale-Invariant Feature Transform (SIFT) technique was used to extract features. Various consecutive phases were used to improve classification accuracy. These phases are image acquisition, preprocessing, filtering, smoothing, and feature extraction. To assess the proposed
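A minimal sketch of the SIFT feature-extraction step mentioned above, assuming OpenCV 4.4 or later (cv2.SIFT_create); the image path is a placeholder, and the preprocessing pipeline and modified-AdaBoost classifier from the paper are not shown.

import cv2

# Placeholder input image; in the paper this would come from the ear-image acquisition phase.
image = cv2.imread("ear.png", cv2.IMREAD_GRAYSCALE)
image = cv2.GaussianBlur(image, (3, 3), 0)            # simple smoothing step

sift = cv2.SIFT_create()
keypoints, descriptors = sift.detectAndCompute(image, None)
print(len(keypoints), descriptors.shape)              # N keypoints, each with a 128-D descriptor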
Image databases are increasing exponentially because of rapid developments in social networking and digital technologies. Searching these databases requires an efficient search technique, and content-based image retrieval (CBIR) is considered one such technique. This paper presents a multistage CBIR approach to address computational-cost issues while reasonably preserving accuracy. In the presented work, the first stage acts as a filter that passes images to the next stage based on SKTP, which is used here in the CBIR domain for the first time. In the second stage, LBP and Canny edge detectors are employed to extract texture and shape features from the query image and the images in the newly constructed database. The p
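As an illustration of the second stage described above, the sketch below extracts an LBP texture histogram and a Canny edge map from a grayscale image using OpenCV and scikit-image; the file name and parameter values are hypothetical, and the first-stage SKTP filter and the similarity-ranking step are not shown.

import cv2
import numpy as np
from skimage.feature import local_binary_pattern

gray = cv2.imread("query.png", cv2.IMREAD_GRAYSCALE)   # placeholder query image

# Texture: uniform LBP with 8 neighbours at radius 1, summarised as a normalised histogram.
lbp = local_binary_pattern(gray, P=8, R=1, method="uniform")
texture_hist, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)

# Shape: Canny edge map, summarised here by a crude edge-density value.
edges = cv2.Canny(gray, threshold1=100, threshold2=200)
edge_density = edges.mean() / 255.0

feature_vector = np.concatenate([texture_hist, [edge_density]])
print(feature_vector.shape)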