We consider the problem of calibrating the range measurements of a Light Detection and Ranging (lidar) sensor affected by sensor nonlinearity and heteroskedastic, range-dependent measurement error. We solve the calibration problem without using additional hardware, instead exploiting assumptions on the environment surrounding the sensor during the calibration procedure. More specifically, we assume the sensor is placed so that its measurements lie in a 2D plane parallel to the ground and come from fixed objects that extend orthogonally to the ground, so that they may be treated as fixed points in an inertial reference frame. Moreover, we exploit the intuition that moving the distance sensor within this environment should leave the relative distances and angles among these fixed points unchanged. We thus cast the sensor calibration problem as making the measurements comply with the requirement that "fixed features shall have fixed relative distances and angles". The resulting calibration procedure therefore requires neither additional (typically expensive) equipment nor special hardware. As for the proposed estimation strategies, from a mathematical perspective we consider models that lead to analytically solvable equations, so as to enable deployment in embedded systems. Besides proposing the estimators, we analyze their statistical performance both in simulation and in field tests. We report the dependency of the MSE of the calibration procedure on the sensor noise levels, and observe that in field tests the approach can yield a tenfold improvement in the accuracy of the raw measurements.
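As a hedged illustration of this idea only (not the paper's estimators, which are analytic), the Python sketch below fits a hypothetical bias-plus-quadratic range correction r_cal = r + a + b·r² by asking that pairwise distances between the fixed landmarks agree across sensor poses; the correction model, variable names, and least-squares formulation are all assumptions.

    import numpy as np
    from scipy.optimize import least_squares

    def pairwise_dists(points):
        # points: (n, 2) landmark positions reconstructed from one pose
        diff = points[:, None, :] - points[None, :, :]
        return np.linalg.norm(diff, axis=-1)

    def residual(params, scans):
        # scans: list of (ranges, angles) arrays, one per sensor pose, with
        # entries matched to the same fixed landmarks in every pose
        a, b = params
        dists = []
        for ranges, angles in scans:
            # hypothetical correction; the linear gain is pinned to 1 because
            # the overall scale is unidentifiable from relative distances alone
            r = ranges + a + b * ranges**2
            pts = np.stack([r * np.cos(angles), r * np.sin(angles)], axis=1)
            dists.append(pairwise_dists(pts))
        dists = np.stack(dists)
        # "fixed features have fixed relative distances": deviations from the
        # across-pose mean distance matrix form the calibration residual
        return (dists - dists.mean(axis=0)).ravel()

    # fit = least_squares(residual, x0=[0.0, 0.0], args=(scans,))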
Optimizing system performance in dynamic and heterogeneous environments and managing computational tasks efficiently are crucial. This paper therefore examines task scheduling and resource allocation algorithms in depth. The work evaluates five algorithms: Genetic Algorithms (GA), Particle Swarm Optimization (PSO), Ant Colony Optimization (ACO), the Firefly Algorithm (FA), and Simulated Annealing (SA), across various workloads obtained by varying the task-to-node ratio. The paper identifies Finish Time and Deadline as two key performance metrics for gauging the efficacy of an algorithm, and a comprehensive investigation of the behavior of these algorithms across the different workloads was carried out. Results from the experiment …
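As one concrete point of reference, a minimal simulated annealing scheduler for the task-to-node assignment problem might look as follows; the cost model (makespan as a Finish Time proxy), the cooling schedule, and the parameter values are assumptions for illustration, not the paper's experimental setup.

    import math
    import random

    def makespan(assign, task_cost, node_speed):
        # Finish Time proxy: completion time of the most loaded node
        load = [0.0] * len(node_speed)
        for t, n in enumerate(assign):
            load[n] += task_cost[t] / node_speed[n]
        return max(load)

    def sa_schedule(task_cost, node_speed, iters=20000, temp=1.0, alpha=0.9995):
        n_nodes = len(node_speed)
        assign = [random.randrange(n_nodes) for _ in task_cost]
        cur = makespan(assign, task_cost, node_speed)
        best, best_cost = assign[:], cur
        for _ in range(iters):
            cand = assign[:]
            cand[random.randrange(len(cand))] = random.randrange(n_nodes)  # move one task
            cost = makespan(cand, task_cost, node_speed)
            # accept improvements always, uphill moves with Boltzmann probability
            if cost < cur or random.random() < math.exp((cur - cost) / temp):
                assign, cur = cand, cost
                if cur < best_cost:
                    best, best_cost = assign[:], cur
            temp *= alpha
        return best, best_cost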
With its rapid spread, the coronavirus infection shocked the world and had a huge effect on the lives of billions of people. The problem is to find a safe method of diagnosing the infection with fewer casualties. X-ray images have been shown to be an important method for the identification, quantification, and monitoring of diseases. Deep learning algorithms can be utilized to help analyze potentially huge numbers of X-ray examinations. This research developed a retrospective multi-test analysis system to detect suspected COVID-19 cases and used chest X-ray features to assess the progress of the illness in each patient, resulting in a "corona score"; the results were satisfactory compared to the benchmarked techniques. …
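A minimal sketch of how a per-patient score of this kind could be aggregated from per-image network outputs; the backbone, the sigmoid head, and the mean-probability aggregation are assumptions for illustration, not the paper's pipeline.

    import torch
    import torch.nn as nn
    import torchvision.models as models

    # hypothetical binary classifier: COVID-positive probability per X-ray
    backbone = models.resnet18(weights=None)
    backbone.fc = nn.Linear(backbone.fc.in_features, 1)
    backbone.eval()

    def corona_score(xray_batch):
        # xray_batch: (N, 3, 224, 224) tensor of a patient's serial X-rays
        with torch.no_grad():
            logits = backbone(xray_batch).squeeze(1)
            probs = torch.sigmoid(logits)
        # assumed aggregation: mean positive probability over the series
        return probs.mean().item()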
Although the number of stomach tumor patients has decreased markedly over recent decades in Western countries, this illness is still one of the main causes of death in developing countries. The aim of this research is to detect the area of a tumor in stomach images based on fuzzy clustering. The proposed methodology consists of three stages. In the first stage, the stomach images are divided into four quarters and features are elicited from each quarter using the seven invariant moments. In the second stage, Fuzzy C-Means clustering (FCM) is employed for each quarter to group that quarter's features into clusters. In the third stage, the Manhattan distance is calculated among all cluster centers in all quarters to disclose the …
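A hedged sketch of the three stages, assuming each quarter is further tiled into blocks whose Hu-moment vectors are what FCM clusters (the abstract does not specify this granularity); the cluster count c=2 and fuzzifier m=2 are likewise assumptions.

    import numpy as np
    import cv2

    def quarters(img):
        h, w = img.shape
        return [img[:h//2, :w//2], img[:h//2, w//2:],
                img[h//2:, :w//2], img[h//2:, w//2:]]

    def block_hu_moments(quarter, bs=32):
        # Hu's seven invariant moments for each bs x bs block of a quarter
        h, w = quarter.shape
        feats = []
        for y in range(0, h - bs + 1, bs):
            for x in range(0, w - bs + 1, bs):
                m = cv2.moments(quarter[y:y+bs, x:x+bs].astype(np.float32))
                feats.append(cv2.HuMoments(m).ravel())
        return np.array(feats)

    def fcm(X, c=2, m=2.0, iters=100):
        # minimal fuzzy C-means; X is (n_samples, n_features)
        U = np.random.dirichlet(np.ones(c), size=X.shape[0])   # memberships
        for _ in range(iters):
            Um = U ** m
            centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
            d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
            inv = d ** (-2.0 / (m - 1.0))
            U = inv / inv.sum(axis=1, keepdims=True)
        return centers

    # centers_q = [fcm(block_hu_moments(q)) for q in quarters(img)]
    # Manhattan distance between the cluster centres of two quarters:
    # np.abs(centers_q[0][:, None, :] - centers_q[1][None, :, :]).sum(axis=2)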
The present work aims to study the efficiency of using locally available aluminum refuse (after dissolving it in sodium hydroxide) together with different coagulants, such as alum [Al2(SO4)3·18H2O], ferric chloride (FeCl3), and polyaluminum chloride (PACl), to improve water quality. The results showed that using this coagulant in the flocculation process gave high turbidity removal and also improved water quality by precipitating a great deal of the hardness-causing ions. From the experimental results of the jar test, the optimum alum dosages are (25, 50 and 70 ppm), the ferric chloride dosages are (15, 40 and 60 ppm), and the polyaluminum chloride dosages are (10, 35 and 55 ppm) for initial water turbidities of (100, 500 and …
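Jar-test performance at each dosage is commonly summarized as turbidity removal efficiency; a small helper under that assumption (the numbers in the usage comment are made up, not the study's data):

    def removal_efficiency(initial_ntu, final_ntu):
        # percent of the initial turbidity removed in a jar test
        return 100.0 * (initial_ntu - final_ntu) / initial_ntu

    # e.g. removal_efficiency(100.0, 4.0) -> 96.0 (% removal)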
Loanwords are words transferred from one language to another that become an essential part of the borrowing language. Loanwords have come from the source language to the recipient language for many reasons. Detecting these loanwords is a complicated task, since there are no standard specifications for transferring words between languages, and hence detection accuracy is low. This work tries to enhance the accuracy of detecting loanwords between Turkish and Arabic as a case study. In this paper, the proposed system contributes to finding all possible loanwords using any set of characters, either alphabetically or randomly arranged. Then, it processes the distortion in pronunciation and solves the problem of the missing letters …
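A generic building block for this kind of matching (offered only as an illustration, not the paper's method) is an edit distance whose deletion cost is discounted, so that candidate loanwords with missing letters still match dictionary entries; the cost values and threshold are assumptions.

    def edit_distance(a, b, del_cost=0.5, ins_cost=1.0, sub_cost=1.0):
        # Levenshtein distance with a reduced deletion cost, so letters
        # missing from the borrowed form are penalized less
        n, m = len(a), len(b)
        d = [[0.0] * (m + 1) for _ in range(n + 1)]
        for i in range(1, n + 1):
            d[i][0] = i * del_cost
        for j in range(1, m + 1):
            d[0][j] = j * ins_cost
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                d[i][j] = min(d[i-1][j] + del_cost,
                              d[i][j-1] + ins_cost,
                              d[i-1][j-1] + (0.0 if a[i-1] == b[j-1] else sub_cost))
        return d[n][m]

    def loanword_candidates(word, lexicon, threshold=1.5):
        # hypothetical matcher: lexicon entries within a small edit distance
        return [w for w in lexicon if edit_distance(word, w) <= threshold]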
Metal oxide nanoparticles, including iron oxide, are regarded as one of the most important classes of nanomaterials across a wide range of applications due to their optical, magnetic, and electrical properties. Iron oxides are common compounds, widespread in nature, and easily synthesized in the laboratory. In this paper, iron oxide nanoparticles were prepared by co-precipitation of Fe²⁺ and Fe³⁺ ions, using iron(II) and iron(III) sulfate as the precursor materials and NH4OH solution as the precipitating agent, at 90°C. After synthesis, the iron oxide particles were characterized using X-ray diffraction (XRD), Fourier-transform infrared spectroscopy (FTIR), and scanning electron microscopy (SEM). These tests confirmed the obtaining of …
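Assuming the target phase is magnetite, the usual product when Fe²⁺ and Fe³⁺ are co-precipitated in a 1:2 ratio under base (an assumption, since the abstract does not name the phase), the overall reaction is

    Fe²⁺ + 2 Fe³⁺ + 8 OH⁻ → Fe₃O₄ + 4 H₂O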
Effective decision-making is the basis for successfully solving any engineering problem. Many decisions taken in construction projects differ in their nature because of the complex character of such projects. One of the most crucial decisions, and one that can give rise to numerous issues over the course of a construction project, is the selection of the contractor. This study aims to use the ordinal priority approach (OPA) for the contractor selection process in the construction industry. The proposed model involves two computer programs; the first is used to evaluate the decision-makers/experts in construction projects, while the second is used to formul…
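For orientation, the standard OPA model is a linear program: maximize Z subject to Z ≤ i·j·r·(weight at rank r minus weight at rank r+1) for every expert i, criterion j, and alternative rank r, with all weights summing to one. A minimal single-expert sketch with scipy, where the rankings are invented inputs rather than the paper's data:

    import numpy as np
    from scipy.optimize import linprog

    # rankings[j] lists contractors best-to-worst under criterion j
    # (criterion 0 assumed more important); illustrative data only
    rankings = [[0, 1, 2], [1, 0, 2]]
    n_crit, n_alt = len(rankings), 3
    n_w = n_crit * n_alt                     # variables w[j,k], plus Z last

    def widx(j, k):
        return j * n_alt + k

    A_ub, b_ub = [], []
    for j, order in enumerate(rankings):
        cj = j + 1                           # criterion rank (1 = most important)
        for s, k in enumerate(order):
            row = np.zeros(n_w + 1)
            row[-1] = 1.0                    # +Z in  Z - (...) <= 0
            if s < n_alt - 1:
                # Z <= cj * (s+1) * (w_k - w_next)
                row[widx(j, k)] = -cj * (s + 1)
                row[widx(j, order[s + 1])] = cj * (s + 1)
            else:
                # Z <= cj * n_alt * w_last
                row[widx(j, k)] = -cj * n_alt
            A_ub.append(row)
            b_ub.append(0.0)

    c = np.zeros(n_w + 1); c[-1] = -1.0      # maximize Z
    A_eq = [np.append(np.ones(n_w), 0.0)]    # weights sum to one
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * n_w + [(None, None)])
    contractor_weights = res.x[:n_w].reshape(n_crit, n_alt).sum(axis=0)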