We consider the problem of calibrating the range measurements of a Light Detection and Ranging (lidar) sensor, accounting for sensor nonlinearity and heteroskedastic, range-dependent measurement error. We solve the calibration problem without additional hardware, instead exploiting assumptions on the environment surrounding the sensor during the calibration procedure. More specifically, we assume the sensor is placed in an environment such that its measurements lie in a 2D plane parallel to the ground, and that the measurements come from fixed objects extending orthogonally from the ground, so that they may be treated as fixed points in an inertial reference frame. Moreover, we exploit the intuition that moving the distance sensor within this environment should leave the relative distances and angles among these fixed points unchanged. We thus cast the sensor calibration problem as making the measurements comply with the assumption that "fixed features shall have fixed relative distances and angles". The resulting calibration procedure therefore requires neither additional (typically expensive) equipment nor special hardware. As for the proposed estimation strategies, from a mathematical perspective we consider models that lead to analytically solvable equations, so as to enable deployment in embedded systems. Besides proposing the estimators, we analyze their statistical performance both in simulation and in field tests. We report how the MSE of the calibration procedure depends on the sensor noise levels, and observe that in field tests the approach can yield a tenfold improvement in the accuracy of the raw measurements.
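The "fixed features shall have fixed relative distances" idea can be illustrated with a toy calibration sketch: an affine range correction g·r + o is grid-searched so that the pairwise distances among the same fixed features agree across two sensor poses. The affine model, the grid search, and all names are illustrative assumptions; the paper's actual estimators are analytically solvable, which this sketch is not.

```python
import math

def pairwise_dists(ranges, angles):
    """Pairwise Euclidean distances between points given in polar form."""
    pts = [(r * math.cos(a), r * math.sin(a)) for r, a in zip(ranges, angles)]
    return [math.dist(p, q) for i, p in enumerate(pts) for q in pts[i + 1:]]

def calibrate(scan_a, scan_b, gains, offsets):
    """Grid-search the affine correction (g, o) that makes the pairwise
    distances of the same fixed features agree across two sensor poses.
    Each scan is a (ranges, angles) pair with features in the same order."""
    best, best_cost = None, float("inf")
    for g in gains:
        for o in offsets:
            da = pairwise_dists([g * r + o for r in scan_a[0]], scan_a[1])
            db = pairwise_dists([g * r + o for r in scan_b[0]], scan_b[1])
            cost = sum((x - y) ** 2 for x, y in zip(da, db))
            if cost < best_cost:
                best, best_cost = (g, o), cost
    return best
```

With noiseless synthetic data the grid search recovers the exact miscalibration, since only the true correction makes the feature geometry pose-invariant.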
The internet now allows users to transmit digital media with ease. Despite this convenience, it raises several threats to the confidentiality of transferred media content, such as media authentication and integrity verification. For these reasons, data hiding methods and cryptography are used to protect the content of digital media. In this paper, an enhanced image steganography method combined with visual cryptography is proposed. A secret logo (a binary image) of size 128x128 is encrypted by applying (2 out of 2) visual cryptography to it, generating two secret shares. During the embedding process, a cover red, green, and blue (RGB) image of size (512
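As an illustration of the (2 out of 2) visual cryptography step, here is a minimal sketch (the 2x2 subpixel expansion, pattern set, and function names are assumptions for illustration, not the paper's exact scheme): each secret pixel expands to a 2x2 block, white pixels get identical blocks in both shares, and black pixels get complementary blocks, so stacking the shares (pixel-wise OR) turns black pixels fully black.

```python
import random

# 2x2 subpixel patterns, each with exactly two black (1) and two white (0) cells.
PATTERNS = [
    [[1, 1], [0, 0]], [[0, 0], [1, 1]],
    [[1, 0], [1, 0]], [[0, 1], [0, 1]],
    [[1, 0], [0, 1]], [[0, 1], [1, 0]],
]

def make_shares(secret, rng=random):
    """(2,2) visual cryptography: expand each secret pixel (0=white, 1=black)
    into a 2x2 block. Share 2 copies share 1 for white pixels and takes the
    complement for black pixels."""
    h, w = len(secret), len(secret[0])
    s1 = [[0] * (2 * w) for _ in range(2 * h)]
    s2 = [[0] * (2 * w) for _ in range(2 * h)]
    for i in range(h):
        for j in range(w):
            p = rng.choice(PATTERNS)
            for di in range(2):
                for dj in range(2):
                    s1[2 * i + di][2 * j + dj] = p[di][dj]
                    # XOR with the secret bit: copy for white, complement for black
                    s2[2 * i + di][2 * j + dj] = p[di][dj] ^ secret[i][j]
    return s1, s2

def stack(s1, s2):
    """Simulate physically stacking the two transparencies (pixel-wise OR)."""
    return [[a | b for a, b in zip(r1, r2)] for r1, r2 in zip(s1, s2)]
```

Each share alone always carries exactly two black subpixels per block, so a single share reveals nothing about the secret.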
Radiotherapy is the medical use of ionizing radiation, commonly applied to cancerous tumors because of its ability to control cell growth. The amount of radiation used in photon radiation therapy is called the dose (measured in gray, Gy), and depends on the type and stage of the cancer being treated. In this work, we studied the dose distribution delivered to the tumor at different depths (0-20 cm) treated with different field sizes (4×4 to 23×23 cm). Results show that deeper treated areas receive a lower dose rate at the same beam quality and quantity. It was also noted that increasing the field size increases the depth dose at the same depth, even when the radiation energy is constant. The increase in radiation dose is attributed to the scattere
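The qualitative finding that dose falls with depth can be illustrated with a deliberately simplified percentage-depth-dose model. The attenuation coefficient, build-up depth, and functional form below are illustrative assumptions only; real PDD curves also depend on field size and beam energy, as the abstract notes.

```python
import math

def relative_depth_dose(depth_cm, mu_per_cm=0.05, d_max_cm=1.5):
    """Toy percentage-depth-dose: dose builds up to 100% at d_max, then
    falls off roughly exponentially with a linear attenuation coefficient
    mu. All parameter values here are hypothetical, not measured data."""
    if depth_cm <= d_max_cm:
        return 100.0 * depth_cm / d_max_cm  # crude linear build-up region
    return 100.0 * math.exp(-mu_per_cm * (depth_cm - d_max_cm))
```

Evaluating this at increasing depths reproduces the monotone fall-off beyond the build-up maximum reported in the study.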
Abstract: Stars with initial masses between 0.89 and 8.0 M☉ go through an Asymptotic Giant Branch (AGB) phase at the end of their lives, having evolved from the main sequence to the AGB. The calculations, performed with an adopted synthetic model, showed the following results: (1) mass loss on the AGB consists of two phases, one for periods P < 500 days and one for P > 500 days; (2) the mass-loss rate increases exponentially with the pulsation period; (3) the expansion velocity VAGB of our stars is calculated under three assumptions; (4) the terminal velocity depends on several factors such as metallicity and luminosity. The calculations indicated that a superwind (S.W) phase developed on the A
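Result (2), the mass-loss rate rising exponentially with pulsation period, is commonly modeled in synthetic AGB codes with the Vassiliadis & Wood (1993) relation log10(Ṁ) = −11.4 + 0.0123 P. Whether this study used that exact relation is an assumption; the cap value standing in for the superwind limit below is also illustrative.

```python
import math

def mass_loss_rate(period_days):
    """AGB mass-loss rate in solar masses per year, after the often-quoted
    Vassiliadis & Wood (1993) relation log10(Mdot) = -11.4 + 0.0123 * P,
    capped at an illustrative superwind value of 1e-4 Msun/yr for long
    periods (the cap value is a hypothetical placeholder)."""
    rate = 10.0 ** (-11.4 + 0.0123 * period_days)
    return min(rate, 1.0e-4)
```

This reproduces the two-regime behavior in result (1): an exponential rise for P < 500 days and a saturated superwind at long periods.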
Psychological research centers help indirectly connect professionals from the fields of human life, job environment, family life, and psychological infrastructure for psychiatric patients. This research aims to detect job apathy patterns from the behavior of employee groups at the University of Baghdad and the Iraqi Ministry of Higher Education and Scientific Research. This investigation presents an approach that uses data mining techniques to acquire new knowledge, and differs from statistical studies in supporting the researchers' evolving needs. These techniques handle redundant or irrelevant attributes to discover interesting patterns. The principal issue is identifying several important and affective questions taken from
Nowadays, the robotic arm is fast becoming the most popular robotic form used in industry. Issues of remote monitoring and control are therefore important: such a system measures different environmental parameters at a distance from the room and sets various conditions for a desired environment through a wireless communication system operated from a central room. It is thus crucial to create a programming system that can control the movement of each part of an industrial robot to ensure it functions properly. The EDARM ED-7100 is one of the simplest models of robotic arm, with a manual controller for controlling its movement. In order to improve this control s
In this paper, the ability to use corn leaves as a low-cost natural biowaste adsorbent for the removal of Indigo Carmine (IC) dye was studied. A batch-mode system was used to study several parameters: contact time (4 days), dye concentration (10-50 ppm), adsorbent dosage (0.05-0.25 g), pH (2-12), and temperature (30-60 °C). The corn leaf was characterized by Fourier-transform infrared spectroscopy before and after the adsorption of the IC dye, and a scanning electron microscope was used to determine the morphology of the adsorbent material. The experimental data were fitted with several isotherms, with the best fit given by the Freundlich model (R² = 0.9
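The Freundlich fit mentioned above is typically obtained by linear regression in log-log space, since qe = Kf·Ce^(1/n) becomes log qe = log Kf + (1/n) log Ce. A minimal sketch (function names and the stdlib-only least-squares are assumptions; the paper's data are not reproduced here):

```python
import math

def fit_freundlich(Ce, qe):
    """Fit the Freundlich isotherm qe = Kf * Ce**(1/n) by ordinary
    least squares on the log-linearized form; returns (Kf, n)."""
    xs = [math.log(c) for c in Ce]
    ys = [math.log(q) for q in qe]
    m = len(xs)
    mx, my = sum(xs) / m, sum(ys) / m
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return math.exp(intercept), 1.0 / slope  # Kf, n
```

On synthetic data generated from a known isotherm, the regression recovers the parameters exactly.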
The use of the biopolymer chitosan impregnated on granular activated carbon (CHGAC) as an adsorbent for the removal of lead ions (Pb²⁺) from aqueous solution was studied using a batch adsorption mode. The prepared CHGAC was characterized by scanning electron microscopy (SEM) and atomic-absorption spectrophotometry. The adsorption of lead ions onto the chitosan-impregnated granular activated carbon was examined as a function of adsorbent weight, pH, and contact time in a batch system. Langmuir and Freundlich models were employed to analyze the resulting experimental data; the data were better fitted by the Langmuir isotherm model than by the Freundlich model, with a good correlation coefficient. The maximum adsorption capacity calculated f
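The Langmuir fit and maximum adsorption capacity referred to above are commonly obtained from the linearized form Ce/qe = Ce/qmax + 1/(KL·qmax), where the slope gives qmax directly. A minimal sketch (function names and the stdlib-only regression are assumptions, not the study's data):

```python
def fit_langmuir(Ce, qe):
    """Fit the linearized Langmuir isotherm Ce/qe = Ce/qmax + 1/(KL*qmax)
    by ordinary least squares; returns (qmax, KL). qmax is the maximum
    adsorption capacity."""
    ys = [c / q for c, q in zip(Ce, qe)]
    m = len(Ce)
    mx, my = sum(Ce) / m, sum(ys) / m
    slope = (sum((x - mx) * (y - my) for x, y in zip(Ce, ys))
             / sum((x - mx) ** 2 for x in Ce))
    intercept = my - slope * mx
    return 1.0 / slope, slope / intercept  # qmax, KL
```

On synthetic data generated from a known Langmuir isotherm, the linearized fit recovers both parameters.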
In this paper, a suggested formula, as well as a conventional method, for estimating the two parameters (shape and scale) of the Generalized Rayleigh distribution is proposed. A percentile estimator was used for different sample sizes (small, medium, and large) and under several assumed contrasts of the two parameters. Mean squared error (MSE) was used as the performance indicator, and comparisons between the suggested formula and the studied formula were carried out through data analysis and computer simulation according to this indicator. The results show that the suggested method, performed here for the first time (as far as we know), has a clear advantage over t
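A percentile estimator for the Generalized Rayleigh distribution, whose CDF is F(x) = (1 − exp(−(λx)²))^α, matches two sample quantiles to the CDF and solves for (α, λ). The sketch below uses bisection on λ; the quantile levels, bracket, and function names are illustrative assumptions, not the paper's suggested formula (which the abstract does not reproduce).

```python
import math

def gr_quantile(p, alpha, lam):
    """Inverse CDF of the Generalized Rayleigh (Burr type X) distribution:
    F(x) = (1 - exp(-(lam*x)**2))**alpha."""
    return math.sqrt(-math.log(1.0 - p ** (1.0 / alpha))) / lam

def percentile_estimate(x1, p1, x2, p2):
    """Percentile estimator: solve F(x1) = p1 and F(x2) = p2 for
    (alpha, lam). Eliminating alpha gives a one-dimensional root-finding
    problem in lam, solved here by geometric bisection."""
    target = math.log(p1) / math.log(p2)
    g = lambda lam, x: math.log1p(-math.exp(-(lam * x) ** 2))
    lo, hi = 1e-3 / x2, 20.0 / x2  # bracket chosen to avoid underflow in g
    for _ in range(200):
        mid = math.sqrt(lo * hi)
        if g(mid, x1) / g(mid, x2) < target:
            lo = mid
        else:
            hi = mid
    lam = math.sqrt(lo * hi)
    alpha = math.log(p1) / g(lam, x1)
    return alpha, lam
```

Feeding the estimator exact population quantiles recovers the true parameters, which is the sanity check typically run before the MSE simulation study the abstract describes.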
Big data analysis has important applications in many areas, such as sensor networks and connected healthcare. The high volume and velocity of big data pose many challenges to data analysis. One possible solution is to summarize the data and provide a manageable data structure holding a scalable summarization for efficient and effective analysis. This research extends our previous work on an effective technique to create, organize, access, and maintain summaries of big data, and develops algorithms for Bayes classification and entropy discretization of large data sets using the multi-resolution data summarization structure. Bayes classification and data discretization play essential roles in many learning algorithms such a
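Entropy discretization, at its core, picks cut points on a numeric attribute that minimize the class entropy of the resulting bins. A minimal single-split sketch (the core step of Fayyad-Irani-style discretization; whether the paper's summarization-based algorithm works this way is an assumption, and all names are illustrative):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a list of class labels."""
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def best_split(values, labels):
    """Find the cut point on a numeric attribute that minimizes the
    weighted class entropy of the two resulting bins. Returns
    (cut_point, weighted_entropy)."""
    pairs = sorted(zip(values, labels))
    best_cut, best_e = None, float("inf")
    for i in range(1, len(pairs)):
        if pairs[i - 1][0] == pairs[i][0]:
            continue  # no valid cut between equal attribute values
        left = [lab for _, lab in pairs[:i]]
        right = [lab for _, lab in pairs[i:]]
        e = (len(left) * entropy(left) + len(right) * entropy(right)) / len(pairs)
        if e < best_e:
            best_cut = (pairs[i - 1][0] + pairs[i][0]) / 2
            best_e = e
    return best_cut, best_e
```

Applied recursively with a stopping criterion, this yields the discrete intervals that downstream learners such as naive Bayes consume.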