We consider the problem of calibrating the range measurements of a Light Detection and Ranging (lidar) sensor affected by nonlinearity and by heteroskedastic, range-dependent measurement error. We solve the calibration problem without additional hardware, exploiting instead assumptions on the environment surrounding the sensor during the calibration procedure. More specifically, we assume that the sensor is placed so that its measurements lie in a 2D plane parallel to the ground, and that these measurements come from fixed objects that extend orthogonally to the ground, so that they can be treated as fixed points in an inertial reference frame. Moreover, we exploit the intuition that, as the sensor moves within this environment, the relative distances and angles among these fixed points should remain unchanged. We thus cast the sensor calibration problem as making its measurements comply with the requirement that "fixed features shall have fixed relative distances and angles". The resulting calibration procedure therefore requires neither additional (typically expensive) equipment nor the deployment of special hardware. As for the proposed estimation strategies, from a mathematical perspective we consider models that lead to analytically solvable equations, so as to enable deployment on embedded systems. Besides proposing the estimators, we analyze their statistical performance both in simulation and in field tests. We report how the mean squared error (MSE) of the calibration procedure depends on the sensor noise level, and observe that in field tests the approach can yield a tenfold improvement in the accuracy of the raw measurements.
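To make the "fixed relative geometry" idea concrete, the following is a minimal, hypothetical Python sketch, not the estimator proposed in the paper: it fits a constant additive range bias by requiring that the pairwise distances between fixed landmarks, reconstructed from two different sensor poses, coincide. Note that under this distance-only criterion a purely multiplicative gain rescales all reconstructed distances equally and is therefore not observable, which is why the sketch restricts itself to an offset; all landmark positions, poses, noise levels, and function names are invented for illustration.

```python
# Hypothetical sketch of the "fixed features have fixed relative distances"
# criterion, with the sensor model simplified to a constant additive range
# bias. All numerical values are made up for illustration only.
import numpy as np
from scipy.optimize import least_squares

def to_xy(ranges, bearings, bias):
    """Polar measurements -> Cartesian points after removing the range bias."""
    r = ranges + bias
    return np.column_stack((r * np.cos(bearings), r * np.sin(bearings)))

def pairwise(xy):
    """Upper-triangular pairwise distances between the landmark points."""
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    return d[np.triu_indices(len(xy), k=1)]

def residuals(bias, scan_a, scan_b):
    """Fixed landmarks: inter-landmark distances must agree across poses."""
    return pairwise(to_xy(*scan_a, bias)) - pairwise(to_xy(*scan_b, bias))

# Synthetic experiment: 8 fixed landmarks observed from 2 sensor positions.
rng = np.random.default_rng(0)
landmarks = rng.uniform(-3.0, 3.0, size=(8, 2))
true_bias = 0.05  # metres, hypothetical

def observe(pose):
    vec = landmarks - pose
    true_r = np.linalg.norm(vec, axis=1)
    bearings = np.arctan2(vec[:, 1], vec[:, 0])
    # heteroskedastic (range-dependent) noise: std = 0.5% of the true range
    raw_r = true_r - true_bias + rng.normal(0.0, 0.005 * true_r)
    return raw_r, bearings

scan_a = observe(np.array([0.0, 0.0]))
scan_b = observe(np.array([1.0, 0.5]))
fit = least_squares(residuals, x0=[0.0], args=(scan_a, scan_b))
print("estimated range bias:", fit.x[0])  # should land near 0.05, up to noise
```

In this toy setting the distance-matching residuals are driven to the noise floor at the correct offset; the paper's actual estimators additionally handle the sensor nonlinearity and the angle-invariance constraint, and are derived so that the resulting equations admit closed-form solutions.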