Purpose: To compare the central corneal thickness (CCT), minimum corneal thickness (MCT) and corneal power measured using the Scheimpflug-Placido device and optical coherence tomography (OCT) in healthy eyes. Study Design: Descriptive observational. Place and Duration of Study: Al-Kindy College of Medicine, University of Baghdad, from June 2021 to April 2022. Methods: A total of 200 eyes of 200 individuals were enrolled in this study. CCT and MCT measurements were carried out using spectral-domain optical coherence tomography (Optovue) and a Scheimpflug-Placido topographer (Sirius). Agreement between the two approaches was assessed using Bland-Altman analysis. Results: Mean age was 28.54 ± 6.6 years, and mean spherical equivalent of refraction was -3.57 ± 3.35 D. Mean CCT by Optovue and Sirius was 534.13 ± 27.88 μm and 540.2 ± 27.85 μm, respectively; the mean CCT difference between them was -6.070 ± 6.593 μm (p < 0.05). Minimum thickness by Optovue was 526.79 ± 27.81 μm and by Sirius 537.44 ± 27.56 μm, with a mean difference between the two devices of 10.66 ± 6.89 μm (p < 0.001). Net corneal power by OCT was 43.44 ± 1.456 D and mean K by Sirius was 43.597 ± 1.408 D (p < 0.001). The 95% limits of agreement between the two devices were -18.99 to 6.85 μm for CCT, widest for minimum thickness (-24.166 to 2.85 μm), and narrowest for the difference between net corneal power by OCT and mean K by Sirius (-0.87 to 1.18 D). Conclusion: In clinical practice, the two devices cannot be used interchangeably; CCT and keratometry should be evaluated and followed up using the same device.
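For reference, the limits of agreement quoted above follow from the standard Bland-Altman formula, mean difference ± 1.96 × SD of the differences. A minimal Python sketch (the paired arrays are hypothetical stand-ins for the per-eye measurements):

```python
import numpy as np

def bland_altman_limits(a, b):
    """Bland-Altman mean difference and 95% limits of agreement
    for paired measurements a and b (e.g. Optovue vs. Sirius CCT)."""
    d = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    mean_diff = d.mean()
    sd_diff = d.std(ddof=1)               # sample SD of the differences
    lower = mean_diff - 1.96 * sd_diff    # lower limit of agreement
    upper = mean_diff + 1.96 * sd_diff    # upper limit of agreement
    return mean_diff, lower, upper

# Check against the reported CCT summary (-6.070 ± 6.593 um):
# -6.070 - 1.96 * 6.593 = -18.99 and -6.070 + 1.96 * 6.593 = 6.85,
# matching the -18.99 to 6.85 interval quoted above.
```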
Big data analysis has important applications in many areas, such as sensor networks and connected healthcare. The high volume and velocity of big data bring many challenges to data analysis. One possible solution is to summarize the data and provide a manageable data structure that holds a scalable summarization for efficient and effective analysis. This research extends our previous work on developing an effective technique to create, organize, access, and maintain summarizations of big data, and develops algorithms for Bayes classification and entropy discretization of large data sets using the multi-resolution data summarization structure. Bayes classification and data discretization play essential roles in many learning algorithms, such as…
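As a minimal illustration of entropy-based discretization (a single-cut sketch, not the paper's multi-resolution algorithm; all names are hypothetical), the code below chooses the threshold on a numeric attribute that minimizes the weighted class entropy of the two resulting bins:

```python
import numpy as np

def entropy(labels):
    """Shannon entropy of a label array."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

def best_entropy_cut(x, y):
    """Find the threshold on attribute x that minimizes the weighted
    class entropy of the two induced bins (one entropy-split step)."""
    order = np.argsort(x)
    x, y = np.asarray(x)[order], np.asarray(y)[order]
    best_t, best_h = None, np.inf
    for i in range(1, len(x)):
        if x[i] == x[i - 1]:
            continue                       # no valid cut between equal values
        t = (x[i] + x[i - 1]) / 2
        left, right = y[:i], y[i:]
        h = (len(left) * entropy(left) + len(right) * entropy(right)) / len(y)
        if h < best_h:
            best_t, best_h = t, h
    return best_t, best_h
```

Recursing this step on each bin, with a stopping criterion, yields a full entropy discretization of the attribute.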
The research involves preparing gold nanoparticles (AuNPs) and studying the factors that influence the shape, size and distribution of the prepared particles according to the Turkevich method. These factors include reaction temperature, initial heating, concentration of gold ions, concentration and quantity of added citrate, reaction time, and order of reactant addition. The prepared gold nanoparticles were characterized by UV-visible spectroscopy, X-ray diffraction and scanning electron microscopy. The gold nanoparticles formed with an average size in the range of 20-35 nm. The amount of added citrate was varied and studied; in addition, the concentration of added gold ions was varied, and the calibration curve…
This work implements a face recognition system based on two stages: a feature extraction stage and a classification stage. The feature extraction stage consists of Self-Organizing Maps (SOMs) in a hierarchical format, in conjunction with Gabor filters and local image sampling. Different types of SOMs were used, and a comparison between their results is given.
The next stage is the classification stage, which consists of a self-organizing map neural network; the goal of this stage is to find the image most similar to the input image. The proposed method was implemented using C++ packages, and the work yields a successful classifier for a face database consisting of 20…
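A minimal sketch of the core SOM training loop such a system relies on (a plain NumPy version, not the paper's hierarchical C++ implementation; grid size and learning-rate schedule are assumptions):

```python
import numpy as np

def train_som(data, grid=(10, 10), epochs=20, lr0=0.5, sigma0=3.0, seed=0):
    """Train a 2-D self-organizing map: find the best-matching unit (BMU)
    for each sample and pull neighboring weights toward that sample."""
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.random((h, w, data.shape[1]))
    ii, jj = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)            # decaying learning rate
        sigma = sigma0 * (1 - epoch / epochs) + 1  # shrinking neighborhood
        for x in data:
            dist = ((weights - x) ** 2).sum(axis=2)
            bi, bj = np.unravel_index(dist.argmin(), dist.shape)  # BMU
            g = np.exp(-((ii - bi) ** 2 + (jj - bj) ** 2) / (2 * sigma ** 2))
            weights += lr * g[..., None] * (x - weights)
    return weights
```

Classification then amounts to mapping a probe image's feature vector to its BMU and comparing against the BMUs of the enrolled faces.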
In this paper, we compare the tree regression model and the negative binomial regression model. These models represent two types of statistical methods: the first is a nonparametric method, tree regression, which aims to divide the data set into subgroups; the second is a parametric method, negative binomial regression, which is usually used when dealing with medical data, especially with large sample sizes. The methods are compared according to the mean squared error (MSE), using simulation experiments with different sample…
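A minimal sketch of this kind of comparison on simulated overdispersed count data (the simulation settings are hypothetical, and sklearn/statsmodels stand in for whatever software the paper used):

```python
import numpy as np
import statsmodels.api as sm
from sklearn.tree import DecisionTreeRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
mu = np.exp(0.5 + 0.8 * x)                 # log-linear mean
alpha = 0.5                                # NB2 dispersion parameter
# Draw counts with mean mu and variance mu + alpha * mu**2
y = rng.negative_binomial(1 / alpha, 1 / (1 + alpha * mu))

X = sm.add_constant(x)
nb_fit = sm.NegativeBinomial(y, X).fit(disp=0)        # parametric model
tree_fit = DecisionTreeRegressor(max_depth=4).fit(x.reshape(-1, 1), y)

mse_nb = mean_squared_error(y, nb_fit.predict(X))
mse_tree = mean_squared_error(y, tree_fit.predict(x.reshape(-1, 1)))
print(f"NB regression MSE: {mse_nb:.2f}, tree regression MSE: {mse_tree:.2f}")
```

Repeating the draw-and-fit loop over many replications and sample sizes, then averaging the MSEs, gives the comparison criterion the abstract describes.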
Kinematics is the branch of mechanics which deals with the movement of bodies without taking force into account. In robots, forward kinematics and inverse kinematics are important in determining the position and orientation of the end-effector to perform multiple tasks. This paper presents an inverse kinematics analysis for a 5-DOF robotic arm using the MATLAB Robotics Toolbox, with Denavit-Hartenberg (D-H) parameters used to represent the links and joints of the robotic arm. A geometric approach was used in the inverse kinematics solution to determine the joint angles of the robotic arm, and the path of the robotic arm was divided into successive lines to accomplish the required tasks. Therefore, this…
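A minimal Python sketch of the standard D-H link transform that such an analysis builds on (the two-link example parameters are illustrative only, not the paper's 5-DOF arm):

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Homogeneous transform from link i-1 to link i in the standard
    D-H convention: rotate theta about z, translate d along z,
    translate a along x, rotate alpha about x."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0,        sa,       ca,      d],
        [0,         0,        0,      1],
    ])

def forward_kinematics(dh_rows):
    """Chain the per-joint transforms; returns the base-to-end-effector pose."""
    T = np.eye(4)
    for row in dh_rows:
        T = T @ dh_transform(*row)
    return T

# Hypothetical planar 2-link example: (theta, d, a, alpha) per joint
pose = forward_kinematics([(np.pi / 4, 0.0, 0.30, 0.0),
                           (np.pi / 6, 0.0, 0.25, 0.0)])
```

The inverse problem runs the other way: given a target pose, solve (geometrically, as in the paper, or numerically) for the joint angles theta that reproduce it.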
... Show MoreThe effect of different antibiotics on growth pigment and plasmid curing of Serratia marcescens were studied, S. marcescens was cultured in media containing(16_500)µg/ml of antibiotics, curing mutants unable to produce prodigiosin and lost one plasmid band were obtained of of ampicillin, amoxillin, antibiotics concentrations (64 500) µg/ml metheprim, ultracloxam, azithromycin, cephalexin and erythromycin treated with (350 500) µg/ml of The mutant cells rose- light color and and refampicin revealed S.marcescens inhibited ciprodar and tetracyclin, lincomycin did not lost the plasmid band chlaforan
The assessment of data quality from different sources is a key challenge in supporting effective geospatial data integration and promoting collaboration in mapping projects. This paper presents a methodology for assessing positional and shape quality for authoritative large-scale data, such as Ordnance Survey (OS) UK data and General Directorate for Survey (GDS) Iraq data, and Volunteered Geographic Information (VGI), such as OpenStreetMap (OSM) data, with the intention of assessing possible integration. It is based on the measurement of discrepancies among the datasets, addressing positional accuracy and shape fidelity, using standard procedures and also directional statistics. Line feature comparison has been undertaken…
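A minimal sketch of the kinds of discrepancy measures involved (the matched point pairs are hypothetical; the circular mean is the standard directional-statistics estimator, not necessarily the paper's exact procedure):

```python
import numpy as np

def positional_discrepancy(p_ref, p_test):
    """Mean and SD of the Euclidean offsets between matched points from
    two datasets (e.g. OS/GDS reference vs. OSM), in planar map units."""
    d = np.linalg.norm(np.asarray(p_test) - np.asarray(p_ref), axis=1)
    return d.mean(), d.std(ddof=1)

def circular_mean(angles_rad):
    """Mean direction of a set of bearings: average the unit vectors,
    then take the angle of the resultant."""
    s, c = np.sin(angles_rad).sum(), np.cos(angles_rad).sum()
    return np.arctan2(s, c)

# Hypothetical matched vertices (metres) and segment bearing differences
ref = [(10.0, 20.0), (30.5, 41.0)]
osm = [(10.8, 19.6), (31.0, 41.9)]
mean_off, sd_off = positional_discrepancy(ref, osm)
mean_bearing_diff = circular_mean(np.radians([2.0, -1.5, 3.2]))
```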