We consider the problem of calibrating the range measurements of a Light Detection and Ranging (lidar) sensor, accounting for sensor nonlinearity and heteroskedastic, range-dependent measurement error. We solve the calibration problem without using additional hardware, instead exploiting assumptions about the environment surrounding the sensor during the calibration procedure. More specifically, we calibrate the sensor by placing it in an environment where its measurements lie in a 2D plane parallel to the ground. Its measurements then come from fixed objects that extend orthogonally from the ground, so that they may be treated as fixed points in an inertial reference frame. Moreover, we use the intuition that moving the distance sensor within this environment should leave the relative distances and angles among these fixed points unchanged. We thus exploit this intuition to cast the sensor calibration problem as making the measurements comply with the assumption that "fixed features shall have fixed relative distances and angles". The resulting calibration procedure therefore requires neither additional (typically expensive) equipment nor special hardware. As for the proposed estimation strategies, from a mathematical perspective we consider models that lead to analytically solvable equations, so as to enable deployment in embedded systems. Besides proposing the estimators, we analyze their statistical performance both in simulation and in field tests. We report the dependency of the MSE of the calibration procedure on the sensor noise levels, and observe that in field tests the approach can yield a tenfold improvement in the accuracy of the raw measurements.
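As an illustration of the "fixed relative distances" idea, the sketch below estimates a hypothetical additive range bias by minimizing the spread, across sensor poses, of the recomputed distance between two fixed landmarks. The bias-only model, the landmark layout, and the grid search are illustrative stand-ins, not the paper's analytical estimators.

```python
import numpy as np

rng = np.random.default_rng(0)

def landmark_dist(r1, r2, theta):
    # law of cosines: distance between two landmarks observed at ranges
    # r1 and r2 with angular separation theta
    return np.sqrt(r1**2 + r2**2 - 2.0 * r1 * r2 * np.cos(theta))

def cost(b, r1_raw, r2_raw, theta):
    # variance, across poses, of the landmark-to-landmark distance computed
    # from bias-corrected ranges; it vanishes exactly when the "fixed"
    # distance stays fixed
    return np.var(landmark_dist(r1_raw + b, r2_raw + b, theta))

# two fixed landmarks observed from 200 simulated sensor poses
p1, p2 = np.array([2.0, 0.0]), np.array([0.0, 3.0])
poses = rng.uniform(-1.0, 1.0, size=(200, 2))
v1, v2 = p1 - poses, p2 - poses
r1 = np.linalg.norm(v1, axis=1)
r2 = np.linalg.norm(v2, axis=1)
theta = np.arccos(np.clip((v1 * v2).sum(1) / (r1 * r2), -1.0, 1.0))

b_true = 0.05                      # hypothetical additive sensor bias
r1_raw, r2_raw = r1 - b_true, r2 - b_true

# grid search over b (a stand-in for the paper's closed-form estimators)
grid = np.linspace(-0.2, 0.2, 401)
b_hat = grid[np.argmin([cost(b, r1_raw, r2_raw, theta) for b in grid])]
print(round(b_hat, 3))  # recovers approximately 0.05
```

Note that a pure scale error is not identifiable from relative distances alone (scaling all ranges scales all distances uniformly), which is one reason richer models and angle constraints matter in the actual procedure.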
Change detection is a technique for ascertaining the changes of specific features within a certain time interval. The use of remotely sensed images to detect changes in land use and land cover is widely preferred over conventional survey techniques because it is very efficient for assessing change and degradation trends over a region. In this research, two remotely sensed images of Baghdad city, gathered by Landsat-7 ETM+ and Landsat-8 for the two time periods 2000 and 2014, have been used to detect the most important changes. Registration and rectification of the two original images are the first preprocessing steps applied in this work. Change detection using NDVI subtraction has been computed, subtrac…
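The NDVI-subtraction step described above can be sketched as follows; the band arrays and the change threshold are illustrative values, not data from the study, and the bands are assumed to be co-registered floats.

```python
import numpy as np

def ndvi(nir, red):
    # NDVI = (NIR - Red) / (NIR + Red), ranges over [-1, 1]
    return (nir - red) / np.maximum(nir + red, 1e-9)

def ndvi_change(nir_t1, red_t1, nir_t2, red_t2, threshold=0.2):
    # subtract the two NDVI maps; flag pixels whose index moved
    # beyond the threshold between the two dates
    diff = ndvi(nir_t2, red_t2) - ndvi(nir_t1, red_t1)
    return diff, np.abs(diff) > threshold

# toy 2x2 scene: one pixel loses vegetation between the two dates
nir1 = np.array([[0.5, 0.5], [0.5, 0.5]])
red1 = np.array([[0.1, 0.1], [0.1, 0.1]])
nir2 = np.array([[0.5, 0.2], [0.5, 0.5]])
red2 = np.array([[0.1, 0.3], [0.1, 0.1]])
diff, changed = ndvi_change(nir1, red1, nir2, red2)
print(changed.sum())  # → 1
```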
Hyperpigmentation is an increase in the natural color of the skin. The purpose of this study is to evaluate the efficacy and safety of the Q-switched Nd:YAG (1064 & 532 nm) laser in the treatment of skin hyperpigmentation. The study was conducted in the research clinic of the Institute of Laser for Postgraduate Studies, University of Baghdad, from October 2008 to the end of January 2009. After clinical assessment of skin hyperpigmentation color, twenty-six patients were divided according to their lesions: eight patients with freckles, seven patients with melasma, and four patients with tattoos. The tattoo cases were subdivided into two amateur tattoos, one professional tattoo, and one traumatic tattoo. Four patients with post-inflammatory hyperpigment…
Astronomy images are regarded as a main source of information for exploring outer space. To characterize the basic content of the Milky Way galaxy, a galaxy image was classified using the Variable Precision Rough Sets technique, determining the different regions within the galaxy according to the different colors in the image. From the classified image, the percentage of each class, and what that percentage means, can be determined. This technique yields a well-classified image and requires less time to complete the classification process.
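Once an image has been classified, the per-class percentages mentioned above reduce to counting labels. A minimal sketch, using an illustrative label array rather than the galaxy image from the study:

```python
import numpy as np

# hypothetical classified image: each pixel carries a class label (3 classes)
labels = np.array([[0, 0, 1],
                   [2, 1, 1],
                   [2, 2, 2]])

# count pixels per class and convert to percentages of the whole image
classes, counts = np.unique(labels, return_counts=True)
percent = 100.0 * counts / labels.size
for c, p in zip(classes, percent):
    print(f"class {c}: {p:.1f}%")
```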
Bioethanol produced from lignocellulosic feedstock is a renewable substitute for declining fossil fuels. Ultrasound-assisted alkaline pretreatment was investigated to enhance the enzymatic digestibility of waste paper. The pretreatment was conducted over a wide range of conditions, including waste paper concentrations of 1-5%, reaction times of 10-30 min, and temperatures of 30-70°C. The optimum conditions were 4% substrate loading with 25 min treatment time at 60°C, where the maximum reducing sugar obtained was 1.89 g/L. Hydrolysis was conducted with crude cellulolytic enzymes produced by Cellulomonas uda (PTCC 1259). The maximum amount of sugar released and the hydrolysis efficiency were 20.92 g/L and 78.4%, respectively. Sugars…
Background: Predicting changes in mandibular third molar position and eruption is an important clinical concern because third molar retention may be beneficial for orthodontic anchorage. The aims of this study were to assess the mandibular third molar position using medical CT scans and lateral reconstructed radiographs, and to evaluate gender differences. Materials and Methods: The sample of the present study consisted of 39 patients (18 males and 21 females) aged 11-15 years who were attending the Computerized Tomography department of Al-Suwayra General Hospital. The distance from the anterior edge of the ramus to the distal surface of the permanent mandibular second molar and the mesio-distal width of the developing mandibular third molar were…
Extraction of copper (Cu) from aqueous solution using liquid membrane (LM) technology is more effective than the precipitation method, which forms sludge that must be disposed of in landfills. In this work, we formulated a liquid surfactant membrane (LSM) that uses kerosene as the main diluent to remove copper ions from aqueous waste solution, with di-(2-ethylhexyl) phosphoric acid (D2EHPA) as the carrier. This technique offers several advantages, including a one-stage extraction and stripping process, simple operation, and low energy requirements. In this study, the LSM process was used to transport Cu(II) ions from the feed phase to the stripping phase, which was prepared using H2SO4. For LSM p…
In this research two algorithms are applied, Fuzzy C-Means (FCM) and hard K-means (HKM), in order to determine which performs better. The two algorithms are applied to a set of data collected from the Ministry of Planning on the water turbidity of five areas in Baghdad, to determine which of these areas has the least turbid (clearest) water and which months during the year have the least turbid water in the specified area.
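The contrast between the two algorithms can be sketched in a few lines: hard K-means assigns each observation to exactly one cluster, while fuzzy C-means gives each observation a membership degree in every cluster. The 1-D readings below are illustrative, not the Ministry of Planning turbidity data.

```python
import numpy as np

rng = np.random.default_rng(1)

def fcm(X, c=2, m=2.0, iters=100):
    # fuzzy c-means: U[i, j] is the membership of point i in cluster j
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(iters):
        w = U ** m
        centers = (w * X[:, None]).sum(axis=0) / w.sum(axis=0)
        d = np.abs(X[:, None] - centers[None, :]) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))          # standard FCM update
        U = inv / inv.sum(axis=1, keepdims=True)
    return centers, U

def hkm(X, k=2, iters=100):
    # hard k-means: each point belongs to exactly one cluster
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(np.abs(X[:, None] - centers[None, :]), axis=1)
        centers = np.array([X[labels == j].mean() for j in range(k)])
    return centers, labels

# illustrative 1-D turbidity-like readings for six samples
X = np.array([1.0, 1.2, 0.9, 5.0, 5.2, 4.8])
centers_f, U = fcm(X)
centers_h, labels = hkm(X)
print(np.sort(centers_f).round(2), np.sort(centers_h).round(2))
```

On well-separated data like this, both methods recover centers near the two group means; the difference shows in `U`, where borderline points carry split memberships instead of a hard label.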
Flexural members such as reinforced concrete (RC) simply supported beams subjected to two-point loading were analyzed numerically. The Extended Finite Element Method (XFEM) was employed to treat non-smooth behaviour such as discontinuities and singularities. This method is a powerful technique for analyzing the fracture process and crack propagation in concrete. Concrete is a heterogeneous material consisting of coarse aggregate, cement mortar, and air voids distributed in the cement paste. Numerical modeling of the concrete comprises a two-scale model, using mesoscale and macroscale numerical models. The effectiveness and validity of the Meso-Scale Approach (MSA) in modeling reinforced concrete beams w…
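The core XFEM idea of representing a crack without remeshing can be illustrated in 1D: standard shape functions are enriched with a Heaviside function so the displacement field can jump across a crack inside an element. All values below (crack position, nodal and enriched degrees of freedom) are illustrative, not from the beam model in the study.

```python
import numpy as np

x_c = 0.5                                  # crack position inside element [0, 1]
def N(x):
    # standard linear hat functions on the unit element
    return np.array([1.0 - x, x])
def H(x):
    # Heaviside enrichment: flips sign across the crack
    return np.where(x >= x_c, 1.0, -1.0)

u_std = np.array([0.0, 1.0])               # regular nodal displacements
a_enr = np.array([0.25, 0.25])             # enriched dofs (jump strength)

def u(x):
    # XFEM approximation: u(x) = sum N_i u_i + sum N_i H(x) a_i
    return N(x) @ u_std + (N(x) * H(x)) @ a_enr

left, right = u(0.4999), u(0.5001)
print(round(right - left, 2))              # displacement jump across the crack
```

The field is smooth away from `x_c` yet discontinuous across it, which is exactly what lets XFEM track crack propagation on a fixed mesh.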
A set of hydrotreating experiments was carried out on vacuum gas oil in a trickle-bed reactor to study hydrodesulfurization and hydrodenitrogenation based on two model compounds, carbazole (a non-basic nitrogen compound) and acridine (a basic nitrogen compound), which were added at 0-200 ppm to the tested oil; dibenzothiophene was used as the sulfur model compound at 3,000 ppm over commercial CoMo/Al2O3 and prepared PtMo/Al2O3 catalysts. The impregnation method was used to prepare the (0.5% Pt) PtMo/Al2O3. The basic sites were found to be very small, and the two catalysts exhibited good metal-support interaction. In the absence of nitrogen compounds over the tested catalysts in the trickle-bed reactor at temperatures of 523 to 573 K, liquid hourly space v…