Background: Measurement of hemoglobin A1c (A1C) is a well-established method for assessing long-term glycemic control and is a major determinant of the quality of care in diabetic patients. The concept of targets is open to criticism: they may be unattainable, they may limit what could be attained, and they may be economically difficult to reach. However, without some form of targeted control of an asymptomatic condition, it becomes difficult to promote care at all. Objectives: The present article aims to address the most recent evidence-based global guidelines on A1C targets for glycemic control in Type 2 Diabetes Mellitus (T2D). Key messages: The rationale for A1C treatment targets includes evidence for microvascular and macrovascular protection and changes in quality of life. More or less stringent A1C goals may be appropriate for individual patients, and goals should be individualized based on: duration of diabetes, age/life expectancy, comorbid conditions, CVD or advanced microvascular complications, hypoglycemia unawareness, and individual patient considerations.
This work presents a robust and practical algorithm for estimating vehicle travel times on a highway from traffic information extracted from stored camera image sequences. The travel-time estimation strategy relies on identification of the traffic state. Individual vehicle velocities are obtained from detected vehicle positions in two consecutive images by computing the distance covered during the elapsed time, then reconciling the extracted traffic-flow data to develop a scheme that accurately predicts vehicle travel times. The Erbil road database is used to identify road regions around road segments, which are projected into the camera view.
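As a rough illustration of the two-frame speed estimate described above, the sketch below computes a vehicle's speed from its detected road-plane positions in two consecutive images and converts it into a segment travel time. All names and numbers (positions, frame interval, segment length) are illustrative assumptions, not values from the paper.

```python
import math

def vehicle_speed(p1, p2, dt):
    """Speed (m/s) from road-plane positions (m) detected in two consecutive frames dt seconds apart."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    return math.hypot(dx, dy) / dt

def travel_time(segment_length_m, speed_ms):
    """Estimated travel time (s) over a road segment at the observed speed."""
    return segment_length_m / speed_ms

# Example: a vehicle moves 12 m along the road between frames captured 0.5 s apart.
v = vehicle_speed((0.0, 0.0), (12.0, 0.0), 0.5)
t = travel_time(500.0, v)
```

In practice the positions would come from projecting image detections onto road coordinates via the camera calibration, and per-vehicle speeds would be aggregated per traffic state before predicting travel times.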
Recent advances in wireless communication systems have made use of the OFDM technique to achieve high-data-rate transmission. Sensitivity to the frequency offset between the carrier frequencies of the transmitter and the receiver is one of the major problems in OFDM systems: the offset introduces inter-carrier interference into the OFDM symbol, degrading BER performance. In this paper, a Multi-Orthogonal-Band (MOB-OFDM) system based on the Discrete Hartley Transform (DHT) is proposed to improve BER performance. The OFDM spectrum is divided into equal sub-bands, and the data are divided among these bands to form a local OFDM symbol in each sub-band using the DHT. The global OFDM symbol is formed from all sub-bands together using …
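For reference, the Discrete Hartley Transform underlying the proposed MOB-OFDM system is built on the cas kernel, cas(t) = cos(t) + sin(t). A minimal NumPy sketch (not the paper's implementation) is:

```python
import numpy as np

def dht(x):
    """Discrete Hartley Transform: H[k] = sum_n x[n] * cas(2*pi*k*n/N)."""
    N = len(x)
    kn = 2 * np.pi * np.outer(np.arange(N), np.arange(N)) / N
    return (np.cos(kn) + np.sin(kn)) @ x

def idht(X):
    """Up to a 1/N factor, the DHT is its own inverse (the cas matrix is involutory)."""
    return dht(X) / len(X)
```

A convenient property for OFDM use is that the transform maps real inputs to real outputs (for a real sequence it equals Re(FFT) − Im(FFT)), avoiding complex arithmetic in the modulator.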
In recent years, building Spatial Data Infrastructures for governments and companies has gained ample attention. Different categories of geospatial data, such as digital maps, coordinates, web maps, and aerial and satellite images, are required to realize the geospatial data components of Spatial Data Infrastructures. In general, two distinct types of geospatial data sources exist over the Internet: formal and informal. Despite the growth of informal geospatial data sources, integration between different free sources has not been achieved effectively; taking on this task is the main contribution of this research. This article addresses the research question of how …
The increase in wireless applications has led to a spectrum scarcity problem, which is often a significant restriction. Nevertheless, a wide bandwidth (more than two-thirds of the available spectrum) remains wasted due to inappropriate usage, and the quality of service of the system suffers as a consequence. This problem is addressed by cognitive radio, which provides opportunistic sharing or utilization of the spectrum. This paper analyzes the performance of a cognitive radio spectrum-sensing algorithm, the energy detector, implemented in a MATLAB M-file (version R2018b). The signal-to-noise ratio (SNR) vs. probability of detection (Pd) for OFDM, and SNR vs. BER with a cyclic prefix (CP), for the energy detector …
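The energy-detection principle the paper evaluates can be sketched in a few lines: the detector compares the average received power against a threshold to decide whether a primary user occupies the band. The signal model and threshold below are illustrative assumptions (the paper's simulations were done in MATLAB).

```python
import numpy as np

rng = np.random.default_rng(0)

def energy_detect(y, threshold):
    """Declare the band occupied if the average received power exceeds the threshold."""
    return np.mean(np.abs(y) ** 2) > threshold

N = 1024  # number of sensed samples
noise = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)  # unit-power noise
signal = np.exp(2j * np.pi * 0.1 * np.arange(N))  # unit-power primary-user tone (0 dB SNR)
threshold = 1.5  # illustrative; in practice set from a target false-alarm probability

h0 = energy_detect(noise, threshold)           # H0: noise only -> idle
h1 = energy_detect(signal + noise, threshold)  # H1: signal present -> occupied
```

Sweeping the SNR and averaging such decisions over many noise realizations yields the Pd-vs-SNR curves the abstract refers to.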
Photonic Crystal Fiber Interferometers (PCFIs) are widely used for sensing applications. This work presents the fabrication and characterization of a relative humidity sensor based on a Mach-Zehnder Interferometer (MZI) operating in reflection mode. The sensor's operation is based on the adsorption and desorption of water vapour at the silica-air interface within the PCF. Fabrication of this sensor is simple: it only involves cleaving and splicing the PCF with SMF. A PCF (LMA-10) of a certain length is spliced to SMF (Corning-28). The spectrum of the PCFI exhibits good sensitivity to humidity variations. The PCFI response is observed over a range of humidity values (27% RH to 85% RH); the positi…
Background: Bilastine (BLA) is a second-generation H1 antihistamine used to treat allergic rhinoconjunctivitis. Because of its limited solubility, it falls under class II of the Biopharmaceutics Classification System (BCS). The solid dispersion (SD) approach significantly improves the solubility and dissolution rate of poorly soluble medicines. Objective: To improve the solubility and dissolution rate of BLA by formulating a solid dispersion in the form of effervescent granules. Methods: To create BLA SDs, polyvinylpyrrolidone (PVP K30) and poloxamer 188 (PLX188) were mixed in various ratios (1:5, 1:10, and 1:15) using the kneading technique. All formulations were evaluated based on percent yield, drug content, and saturation solubility. The fo…
Orthogonal polynomials and their moments serve as pivotal elements across various fields. Discrete Krawtchouk polynomials (DKraPs) are a versatile family of orthogonal polynomials widely used in fields such as probability theory, signal processing, digital communications, and image processing. Various recurrence algorithms have been proposed to address the challenge of numerical instability at large orders and signal sizes. The computation of DKraP coefficients has typically been performed with sequential algorithms, which are computationally expensive for large order values and polynomial sizes. To this end, this paper introduces a computationally efficient solution that utilizes parall…
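For context, the classical sequential evaluation that such parallel schemes aim to accelerate is the three-term recurrence in the order n. The sketch below uses the standard unnormalized Krawtchouk polynomial K_n(x; p, N); the notation, and the absence of the weighted/normalized variant usually used in image processing, are assumptions, not the paper's formulation.

```python
def krawtchouk(n, x, p, N):
    """Krawtchouk polynomial K_n(x; p, N) via the three-term recurrence in n.

    Recurrence: -x*K_m = p(N-m)*K_{m+1} - [p(N-m) + m(1-p)]*K_m + m(1-p)*K_{m-1},
    with K_0 = 1 and K_1 = 1 - x/(pN).
    """
    K_prev, K = 1.0, 1.0 - x / (p * N)  # K_0 and K_1
    if n == 0:
        return K_prev
    for m in range(1, n):
        K_next = ((p * (N - m) + m * (1 - p) - x) * K - m * (1 - p) * K_prev) / (p * (N - m))
        K_prev, K = K, K_next
    return K
```

Each order depends on the two preceding ones, which is exactly the serial dependency that motivates recurrence reorganizations and parallel evaluation over x.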
Facial emotion recognition has many real-world applications, such as human-robot interaction, eLearning, healthcare, and customer services. The task is not easy, owing to the difficulty of determining an effective feature set that can accurately recognize the emotion conveyed by a facial expression. Graph mining techniques are exploited in this paper to solve the facial emotion recognition problem. After determining the positions of facial landmarks in the face region, twelve different graphs are constructed from four facial components to serve as the source for a sub-graph mining stage using the gSpan algorithm. In each group, a discriminative set of sub-graphs is selected and fed to a Deep Belief Network (DBN) f…
It has become necessary to move from traditional to automated systems in production processes because of their considerable advantages, the most important of which is improving and increasing production. There remains, however, a need to improve and develop the operation of these systems.
The objective of this work is to study time reduction by combining multiple sequences of operations into one process. To carry out this work, a pneumatic system is designed to decrease/increase the time of the sequence that performs a pick-and-place process by optimizing the sequences based on the obstacle dimensions. Three axes are represented using pneumatic cylinders that move according to the sequence used. The system is implemented and con…
Earthquakes occur on existing faults and create new ones; they occur on normal, reverse, and strike-slip faults. The aim of this work is to propose a new unified classification of shallow-depth earthquakes based on faulting style, and to characterize each class. The characterization criteria include the maximum magnitude, focal depth, b-value, return period, and the relations between magnitude, focal depth, and dip of the fault plane. The Global Centroid Moment Tensor (GCMT) catalog, covering the period from Jan. 1976 to Dec. 2017, is the source of the data. We selected only shallow (depth less than 70 km) pure normal, strike-slip, and reverse earthquakes (magnitude ≥ 5) and excluded oblique earthquakes. Th…
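One of the characterization criteria above, the b-value of the Gutenberg-Richter frequency-magnitude relation log10 N = a − b·M, is commonly estimated by maximum likelihood (Aki, 1965). The sketch below shows that standard estimator; the catalog is made up for illustration, and this is not necessarily the estimator used in the study.

```python
import math

def b_value(magnitudes, m_min):
    """Maximum-likelihood b-value (Aki, 1965): b = log10(e) / (mean(M) - m_min)."""
    mags = [m for m in magnitudes if m >= m_min]
    mean_m = sum(mags) / len(mags)
    return math.log10(math.e) / (mean_m - m_min)

# Illustrative catalog with a magnitude cutoff of 5.0, matching the study's
# selection of events with magnitude >= 5 (the magnitudes themselves are invented).
b = b_value([5.2, 5.5, 5.9, 5.3, 5.4, 5.1, 6.2, 5.6, 5.3, 5.5], 5.0)
```

Computing b per faulting-style class in this way is one route to the class-wise characterization the abstract describes.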