We consider the problem of calibrating the range measurements of a Light Detection and Ranging (lidar) sensor, accounting for the sensor's nonlinearity and its heteroskedastic, range-dependent measurement error. We solve the calibration problem without additional hardware, instead exploiting assumptions on the environment surrounding the sensor during the calibration procedure. More specifically, we assume the sensor is calibrated in an environment where its measurements lie in a 2D plane parallel to the ground, and where the measurements come from fixed objects that extend orthogonally to the ground, so that they may be considered fixed points in an inertial reference frame. Moreover, we exploit the intuition that moving the distance sensor within this environment should leave the relative distances and angles among these fixed points unchanged. We thus cast the sensor calibration problem as making the measurements comply with the assumption that "fixed features shall have fixed relative distances and angles". The resulting calibration procedure therefore requires neither additional (typically expensive) equipment nor special hardware. As for the proposed estimation strategies, from a mathematical perspective we consider models that lead to analytically solvable equations, so as to enable deployment in embedded systems. Besides proposing the estimators, we analyze their statistical performance both in simulation and in field tests. We report the dependency of the MSE performance of the calibration procedure on the sensor noise levels, and observe that in field tests the approach can lead to a tenfold improvement in the accuracy of the raw measurements.
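To make the "fixed relative distances" principle concrete, the following is a minimal sketch, not the paper's implementation: it assumes a simple offset-plus-quadratic correction model r_cal = α + r + β·r², placeholder landmark data, and a numerical solver (scipy.optimize), whereas the paper's estimators are derived analytically.

```python
import numpy as np
from scipy.optimize import minimize

def corrected_points(r, theta, alpha, beta):
    """Apply the illustrative correction r_cal = alpha + r + beta*r**2 and map
    range/bearing pairs to 2D points in the sensor frame. Pinning the linear
    term to 1 rules out the degenerate fix of shrinking every range to zero."""
    rc = alpha + r + beta * r**2
    return np.stack([rc * np.cos(theta), rc * np.sin(theta)], axis=-1)

def calibration_loss(params, ranges, bearings):
    """ranges, bearings: (n_poses, n_landmarks) arrays, i.e. the same fixed
    landmarks observed from several sensor poses. The loss is the variance,
    across poses, of every pairwise landmark distance: if the landmarks are
    truly fixed, those distances must not change when the sensor moves."""
    alpha, beta = params
    pts = corrected_points(ranges, bearings, alpha, beta)  # (P, L, 2)
    diffs = pts[:, :, None, :] - pts[:, None, :, :]        # pairwise deltas
    dists = np.linalg.norm(diffs, axis=-1)                 # (P, L, L)
    return np.sum(np.var(dists, axis=0))                   # variance over poses

# Placeholder measurements; a real run would use logged lidar sweeps.
rng = np.random.default_rng(0)
ranges = rng.uniform(1.0, 5.0, size=(10, 4))
bearings = rng.uniform(-np.pi, np.pi, size=(10, 4))
alpha_hat, beta_hat = minimize(calibration_loss, x0=[0.0, 0.0],
                               args=(ranges, bearings)).x
```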
A database is an organized and distributed collection of data that allows the user to access the stored information in a simple and convenient way. However, in the era of big data, traditional data analytics methods may not be able to manage and process such large amounts of data. In order to develop an efficient way of handling big data, this work studies the use of the MapReduce technique to handle big data distributed on the cloud. The approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r
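As a concrete illustration of the MapReduce idea, below is a minimal Hadoop Streaming job in Python that computes a per-channel mean of EEG samples; the line format, the chosen statistic, and the script layout are assumptions for the sketch, not the paper's actual pipeline.

```python
#!/usr/bin/env python3
"""Minimal Hadoop Streaming job: per-channel mean of EEG samples.
Assumes each input line is "channel_id<TAB>sample_value"; both the input
layout and the chosen statistic are illustrative."""
import sys

def mapper():
    # Emit (channel, value) key-value pairs; the Hadoop framework sorts
    # and groups them by key between the map and reduce phases.
    for line in sys.stdin:
        channel, value = line.strip().split("\t")
        print(f"{channel}\t{value}")

def reducer():
    # Lines arrive grouped by channel, so a running sum/count suffices.
    current, total, count = None, 0.0, 0
    for line in sys.stdin:
        channel, value = line.strip().split("\t")
        if channel != current and current is not None:
            print(f"{current}\t{total / count:.6f}")
            total, count = 0.0, 0
        current = channel
        total += float(value)
        count += 1
    if current is not None:
        print(f"{current}\t{total / count:.6f}")

if __name__ == "__main__":
    mapper() if sys.argv[1] == "map" else reducer()
```

Such a script could be launched with something along the lines of `hadoop jar hadoop-streaming.jar -files eeg_stats.py -mapper "eeg_stats.py map" -reducer "eeg_stats.py reduce" -input ... -output ...`, with the exact invocation depending on the cluster setup.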
Variable selection is an essential and necessary task in the statistical modeling field. Several studies have tried to develop and standardize the process of variable selection, but it is difficult to do so. The first question researchers need to ask themselves is which variables are the most significant for describing a given dataset's response. In this paper, a new method for variable selection using Gibbs sampling techniques has been developed. First, the model is defined and the posterior distributions for all the parameters are derived. The new variable selection method is tested using four simulated datasets. The new approach is compared with some existing techniques: Ordinary Least Squares (OLS), Least Absolute Shrinkage and Selection Operator (LASSO)
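To illustrate what Gibbs-sampler-based variable selection can look like, here is a compact stochastic-search (spike-and-slab) sketch in the style of George and McCulloch; the priors, hyperparameters, and the spike-and-slab form are assumptions for the example, not necessarily the paper's exact model.

```python
import numpy as np

def ssvs_gibbs(X, y, n_iter=2000, v0=1e-4, v1=10.0, p=0.5, a0=1.0, b0=1.0, seed=0):
    """Spike-and-slab variable selection via Gibbs sampling (illustrative).
    Model: y = X beta + eps, eps ~ N(0, sigma2 I);
    beta_j | gamma_j ~ N(0, v1) if gamma_j = 1 else N(0, v0);
    gamma_j ~ Bernoulli(p); sigma2 ~ InvGamma(a0, b0).
    For brevity no burn-in samples are discarded."""
    rng = np.random.default_rng(seed)
    n, k = X.shape
    beta, gamma, sigma2 = np.zeros(k), np.ones(k, dtype=bool), 1.0
    keep = np.zeros((n_iter, k))
    XtX, Xty = X.T @ X, X.T @ y
    for t in range(n_iter):
        # 1) beta | gamma, sigma2: Gaussian with prior precision diag(1/v_gamma)
        prior_prec = np.where(gamma, 1.0 / v1, 1.0 / v0)
        A = np.linalg.inv(XtX / sigma2 + np.diag(prior_prec))
        beta = rng.multivariate_normal(A @ Xty / sigma2, A)
        # 2) gamma_j | beta_j: compare spike and slab densities at beta_j
        d1 = p * np.exp(-0.5 * beta**2 / v1) / np.sqrt(v1)
        d0 = (1 - p) * np.exp(-0.5 * beta**2 / v0) / np.sqrt(v0)
        gamma = rng.random(k) < d1 / (d1 + d0)
        # 3) sigma2 | beta: conjugate inverse-gamma update
        resid = y - X @ beta
        sigma2 = 1.0 / rng.gamma(a0 + n / 2, 1.0 / (b0 + resid @ resid / 2))
        keep[t] = gamma
    return keep.mean(axis=0)   # posterior inclusion probability per covariate
```

Covariates whose posterior inclusion probability exceeds, say, 0.5 would then be retained in the model.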
Green nanotechnology is a thrilling and emerging area of science and technology that embraces the principles of green chemistry, with potential benefits for sustainability, safety, and the overall protection of the human race. The green chemistry approach introduces a proper technique for the production, processing, and application of less hazardous chemical substances to reduce threats to human health and the environment. The approach requires in-depth knowledge of the raw materials, particularly in terms of their conversion into nanomaterials and the resultant bioactivities that pose very few harmful effects for people and the environment. In the twenty-first century, nanotechnology has become a systematic
To damp the low-frequency oscillations that occur due to disturbances in the electrical power system, generators are equipped with Power System Stabilizers (PSSs) that provide supplementary feedback stabilizing signals. Low-frequency oscillations in a power system are classified as local mode, intra-area mode, and inter-area mode oscillations. Dual-input multiband PSSs are used to damp out these low-frequency oscillations; among the dual-input PSSs, the PSS4B offers superior transient performance. The Power System Simulator for Engineering (PSS/E) software was adopted to test and evaluate the dynamic performance of the PSS4B model on the Iraqi national grid. The res
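As a toy illustration of why the supplementary damping signal matters (a single-machine swing-equation sketch with illustrative constants, not PSS/E and not the PSS4B structure), consider how raising the effective damping coefficient changes the rotor-angle response:

```python
import numpy as np

def rotor_swing(D, M=6.0, K=1.2, dt=0.01, t_end=30.0, delta0=0.1):
    """Toy single-machine swing equation  M*delta'' + D*delta' + K*delta = 0,
    integrated with forward Euler. D lumps together natural damping plus the
    extra damping torque a stabilizer injects; all numbers are illustrative
    and give a slow electromechanical-style mode (~0.07 Hz)."""
    steps = int(t_end / dt)
    delta, omega = delta0, 0.0
    out = np.empty(steps)
    for i in range(steps):
        domega = -(D * omega + K * delta) / M
        delta += dt * omega
        omega += dt * domega
        out[i] = delta
    return out

no_pss = rotor_swing(D=0.5)    # lightly damped: the oscillation persists
with_pss = rotor_swing(D=4.0)  # same mode, with stabilizer damping added
```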
Cost estimation is considered one of the important tasks in construction project management. Precise estimation of the construction cost affects the success and quality of a construction project. Elemental estimation is a very important stage for the project team because it represents one of the key project elements; it helps in formulating the basis for strategies and execution plans for construction and engineering. Elemental estimation, performed at an early stage, estimates the construction costs from minimal project details, so that it gives an indication for the initial design stage of a project. This paper studies the factors that affect elemental cost estimation as well as the rela
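A minimal numerical illustration of an elemental estimate, with purely hypothetical elements, quantities, and unit rates:

```python
# Elemental estimate = sum over building elements of quantity x unit rate,
# plus an early-stage contingency. All figures below are illustrative.
elements = {
    # element: (quantity, unit, unit_rate)
    "substructure":   (450.0,  "m2 footprint",   120.0),
    "superstructure": (1800.0, "m2 gross floor", 310.0),
    "external_walls": (950.0,  "m2 wall",         85.0),
    "finishes":       (1800.0, "m2 gross floor",  60.0),
}
subtotal = sum(q * rate for q, _, rate in elements.values())
contingency = 0.10 * subtotal      # allowance for early-design uncertainty
estimate = subtotal + contingency
print(f"elemental estimate: {estimate:,.0f} (incl. 10% contingency)")
```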
In this paper, an experimental study has been conducted of the temperature distribution in a space conditioned with a Ventilated Hollow Core Slab (TermoDeck) system. The experiments were carried out on a model room with dimensions of 1 m × 1.2 m × 1 m, built according to a scale factor of 1/4. The temperature distribution was measured by 59 thermocouples fixed at several locations in the test room. Two cases were considered in this work: the first during the unoccupied period at night (without external load), and the other during the day period with an external load of 800 W/m² according to solar heat gain calculations during the summer season in Iraq. All results confirm the use of the TermoDeck system for ventilation and cooling/heating
An optical fiber chemical sensor based on surface plasmon resonance for sensing and measuring the refractive index and concentration of acetic acid is designed and implemented in this work. Optical-grade plastic optical fiber with a diameter of 1000 μm was used, with a core diameter of 980 μm and a cladding of 20 μm. The sensor is fabricated by embedding a small part (10 mm) of the middle of the fiber in a resin block and then polishing it; afterwards, a gold layer about 40 nm thick is deposited, and the acetic acid is placed on the sensing probe.
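The sensing principle can be sketched from the surface-plasmon phase-matching condition: the resonance dip sits where the guided ray's effective index matches n_spp = Re[sqrt(ε_m·n_a² / (ε_m + n_a²))], so a change in the analyte's refractive index shifts the dip. The gold permittivity below is an illustrative textbook-scale value, not a measurement from this work.

```python
import numpy as np

def spp_effective_index(eps_metal, n_analyte):
    """Effective index of the surface plasmon at a metal/analyte interface:
    n_spp = Re[sqrt(eps_m * n_a^2 / (eps_m + n_a^2))]. Resonance occurs where
    the guided ray's n_core*sin(theta) matches this value, so raising the
    analyte index shifts the resonance."""
    eps_a = n_analyte**2
    return np.sqrt(eps_metal * eps_a / (eps_metal + eps_a)).real

eps_gold = -25.0 + 1.5j          # illustrative gold permittivity (near-IR)
for n_a in (1.33, 1.36, 1.39):   # e.g. increasing acetic acid concentration
    print(n_a, spp_effective_index(eps_gold, n_a))
```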
Machine learning offers significant advantages for many difficult problems in the oil and gas industry, especially when it comes to resolving complex challenges in reservoir characterization. Permeability is one of the most difficult petrophysical parameters to predict using conventional logging techniques. Clarifications of the workflow methodology are presented alongside comprehensive models in this study. The purpose of this study is to provide a more robust technique for predicting permeability: previous studies on the Bazirgan field have attempted to do so, but their estimates have been vague, and the methods they employ are outdated and poorly suited to the actual permeability computation. To
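As a sketch of the kind of log-to-permeability regression involved (the feature set, the synthetic data, and the random-forest learner are illustrative assumptions, not the study's exact workflow):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# Hypothetical well-log feature matrix: e.g. gamma ray, bulk density,
# neutron porosity, sonic. Synthetic values stand in for logged data and
# core-measured permeability labels.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 4))
# Target is log-permeability: permeability spans orders of magnitude,
# so regressing on its logarithm is the usual choice.
log_k = 2.0 * X[:, 2] - 1.0 * X[:, 1] + rng.normal(scale=0.3, size=500)

X_tr, X_te, y_tr, y_te = train_test_split(X, log_k, test_size=0.25,
                                          random_state=1)
model = RandomForestRegressor(n_estimators=300, random_state=1)
model.fit(X_tr, y_tr)
print("R^2 on held-out samples:", r2_score(y_te, model.predict(X_te)))
```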
The complexity and variety of the language used in policy and academic documents make the automatic classification of research papers based on the United Nations Sustainable Development Goals (SDGs) somewhat difficult. Using both pre-trained and contextual word embeddings to increase semantic understanding, this study presents a complete deep learning pipeline combining Bidirectional Long Short-Term Memory (BiLSTM) and Convolutional Neural Network (CNN) architectures, which aims primarily to improve the comprehensibility and accuracy of SDG text classification, thereby enabling more effective policy monitoring and research evaluation. Successful document representation via Global Vectors (GloVe) and Bidirectional Encoder Representations from Transformers (BERT)
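A minimal sketch of one plausible CNN-then-BiLSTM arrangement with a frozen GloVe-style embedding layer; the layer sizes, the ordering of the CNN and BiLSTM blocks, and the 17-class output are assumptions for illustration, not the study's reported configuration.

```python
import numpy as np
import tensorflow as tf

VOCAB, SEQ_LEN, EMB_DIM, N_SDGS = 20000, 300, 100, 17

# A GloVe matrix would be loaded here; random values act as a placeholder.
embedding_matrix = np.random.normal(size=(VOCAB, EMB_DIM)).astype("float32")

model = tf.keras.Sequential([
    tf.keras.Input(shape=(SEQ_LEN,)),
    tf.keras.layers.Embedding(
        VOCAB, EMB_DIM,
        embeddings_initializer=tf.keras.initializers.Constant(embedding_matrix),
        trainable=False),                                   # frozen pre-trained vectors
    tf.keras.layers.Conv1D(128, 5, activation="relu"),      # local n-gram features
    tf.keras.layers.MaxPooling1D(2),
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64)),  # long-range context
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(N_SDGS, activation="softmax"),    # one of the 17 SDGs
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```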
This research proposes the application of the dragonfly and fruit fly algorithms to enhance the estimates generated by the Fama-MacBeth model, and compares their performance in this context for the first time. To improve the dragonfly algorithm's effectiveness, three parameter-tuning approaches are investigated: manual parameter tuning (MPT), adaptive tuning by methodology (ATY), and a novel technique called adaptive tuning by performance (APT). Additionally, the study evaluates the estimation performance using kernel weighted regression (KWR) and explores how the dragonfly and fruit fly algorithms can be employed to enhance KWR. All methods are tested using data from the Iraq Stock Exchange, based on the Fama-French three-factor model.
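To make the metaheuristic-plus-KWR combination concrete, below is a basic fruit fly optimization (FOA) loop tuning the bandwidth of a Nadaraya-Watson kernel weighted regression by leave-one-out error; the plain FOA variant, the toy data, and the choice of tuning target are illustrative stand-ins for the paper's methods.

```python
import numpy as np

def nw_loocv_mse(h, x, y):
    """Leave-one-out MSE of a Nadaraya-Watson (kernel weighted regression)
    estimator with a Gaussian kernel and bandwidth h."""
    if h <= 0:
        return np.inf
    w = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2)
    np.fill_diagonal(w, 0.0)                  # leave each point out
    yhat = (w @ y) / np.clip(w.sum(axis=1), 1e-12, None)
    return float(np.mean((y - yhat) ** 2))

def fruit_fly_bandwidth(x, y, n_flies=20, n_iter=100, seed=0):
    """Basic fruit fly optimization: flies search around a swarm location,
    the candidate value ("smell") is the reciprocal of a fly's distance to
    the origin, and the best fly relocates the swarm."""
    rng = np.random.default_rng(seed)
    ax_x, ax_y = rng.uniform(0, 1, size=2)    # swarm location
    best_h, best_f = None, np.inf
    for _ in range(n_iter):
        fx = ax_x + rng.uniform(-0.1, 0.1, n_flies)
        fy = ax_y + rng.uniform(-0.1, 0.1, n_flies)
        dist = np.sqrt(fx**2 + fy**2)
        s = 1.0 / np.clip(dist, 1e-12, None)  # candidate bandwidths
        fit = np.array([nw_loocv_mse(h, x, y) for h in s])
        i = int(np.argmin(fit))
        if fit[i] < best_f:                   # best fly attracts the swarm
            best_f, best_h = fit[i], s[i]
            ax_x, ax_y = fx[i], fy[i]
    return best_h

# Toy data: noisy sine; FOA selects the KWR bandwidth.
rng = np.random.default_rng(1)
x = rng.uniform(0, 6, 120)
y = np.sin(x) + rng.normal(scale=0.2, size=120)
print("selected bandwidth:", fruit_fly_bandwidth(x, y))
```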