We consider the problem of calibrating the range measurements of a Light Detection and Ranging (lidar) sensor, accounting for sensor nonlinearity and heteroskedastic, range-dependent measurement error. We solve the calibration problem without additional hardware, instead exploiting assumptions on the environment surrounding the sensor during the calibration procedure. More specifically, we assume the sensor is placed in an environment such that its measurements lie in a 2D plane parallel to the ground. Its measurements then come from fixed objects that extend orthogonally with respect to the ground, so that they may be treated as fixed points in an inertial reference frame. Moreover, we exploit the intuition that moving the distance sensor within this environment should leave the relative distances and angles among these fixed points unchanged. We thus cast the sensor calibration problem as making the measurements comply with the assumption that “fixed features shall have fixed relative distances and angles”. The resulting calibration procedure therefore requires neither additional (typically expensive) equipment nor special hardware. As for the proposed estimation strategies, from a mathematical perspective we consider models that lead to analytically solvable equations, so as to enable deployment in embedded systems. Besides proposing the estimators, we analyze their statistical performance both in simulation and in field tests. We report how the MSE of the calibration procedure depends on the sensor noise levels, and observe that in field tests the approach can yield a tenfold improvement in the accuracy of the raw measurements.
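As a minimal sketch of the "fixed features have fixed relative distances" idea (not the paper's actual estimator): assume a simple affine range correction, a known association of one landmark pair across poses, and a scipy-based least-squares solver; all of these, and the numbers below, are illustrative assumptions.

```python
# Hypothetical sketch: calibrate an assumed affine range correction r_cal = a*r + b
# by requiring that the distance between two fixed landmarks, recomputed from
# several sensor poses, stays as constant as possible.
import numpy as np
from scipy.optimize import least_squares

def pair_distance(a, b, r1, th1, r2, th2):
    # Landmark positions in the sensor frame from (range, bearing), after correction.
    p1 = (a * r1 + b) * np.array([np.cos(th1), np.sin(th1)])
    p2 = (a * r2 + b) * np.array([np.cos(th2), np.sin(th2)])
    return np.linalg.norm(p1 - p2)

def residuals(params, obs):
    a, b = params
    d = np.array([pair_distance(a, b, *o) for o in obs])
    # Deviation of each recomputed distance from their mean, plus a tiny
    # regularizer toward (a, b) = (1, 0) to break the trivial a = b = 0 solution.
    return np.concatenate([d - d.mean(), [1e-3 * (a - 1.0), 1e-3 * b]])

# (r1, theta1, r2, theta2) of the same landmark pair seen from different poses
# (made-up numbers, purely for illustration).
obs = [(2.10, 0.30, 3.05, 1.20), (1.95, 0.52, 2.80, 1.45), (2.40, 0.10, 3.30, 1.05)]
fit = least_squares(residuals, x0=[1.0, 0.0], args=(obs,))
print("estimated gain and offset:", fit.x)
```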
This investigation was carried out to study the treatment and recycling of wastewater in the cotton textile industry for an effluent containing three dyes: direct blue, sulphur black, and vat yellow. Reuse of such effluent can only be made possible by an appropriate treatment method such as chemical coagulation. Ferrous and ferric sulphate, with and without calcium hydroxide, were employed in this study as the chemical coagulants.
The results showed that the percentage removal of direct blue ranged between 91.4% and 94%, for sulphur black between 98.7% and 99.5%, while for vat yellow it was between 97% and 99%.
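For reference, removal percentages of this kind are normally computed from influent and effluent dye concentrations; a trivial sketch with made-up concentrations (the paper's measured values are not reproduced here):

```python
# Percentage removal from hypothetical influent and effluent dye concentrations (mg/L).
def percent_removal(c_in, c_out):
    return 100.0 * (c_in - c_out) / c_in

print(round(percent_removal(50.0, 3.0), 1))  # -> 94.0
```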
In this paper, a new method is investigated that uses evolutionary algorithms (EAs) to cryptanalyze one of the nonlinear stream cipher cryptosystems based on the Linear Feedback Shift Register (LFSR) unit, using a ciphertext-only attack. A Genetic Algorithm (GA) and Ant Colony Optimization (ACO) are used to attack one of these nonlinear cryptosystems, the "shrinking generator", with different lengths of ciphertext and different lengths of the combined LFSRs. GA and ACO demonstrated good performance in finding the initial values of the combined LFSRs. This work can be considered a warning for stream cipher designers to avoid the weak points, which may be …
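A sketch of the shrinking generator the abstract attacks, together with the kind of fitness a GA or ACO search over LFSR initial states could use. The register lengths, tap sets, and fitness definition below are illustrative assumptions, not the paper's setup.

```python
# Fibonacci-style LFSR and the classic shrinking generator (keep bits of LFSR A
# only when LFSR S outputs 1), plus a simple keystream-matching fitness.
def lfsr_stream(state, taps, n):
    """Generate n bits from an LFSR given an initial state (list of bits)."""
    state = list(state)
    out = []
    for _ in range(n):
        out.append(state[-1])
        fb = 0
        for t in taps:
            fb ^= state[t]
        state = [fb] + state[:-1]
    return out

def shrinking_generator(a_state, s_state, a_taps, s_taps, n):
    a_bits = lfsr_stream(a_state, a_taps, 4 * n)   # oversample, then shrink
    s_bits = lfsr_stream(s_state, s_taps, 4 * n)
    z = [a for a, s in zip(a_bits, s_bits) if s == 1]
    return z[:n]

def fitness(candidate_a, candidate_s, a_taps, s_taps, target_keystream):
    """Fraction of matching bits between candidate keystream and a target.
    In a true ciphertext-only attack the score would instead be computed from
    plaintext statistics of ciphertext XOR candidate keystream."""
    z = shrinking_generator(candidate_a, candidate_s, a_taps, s_taps,
                            len(target_keystream))
    matches = sum(1 for x, y in zip(z, target_keystream) if x == y)
    return matches / len(target_keystream)

# Example with hypothetical registers and seeds: a candidate equal to the true
# initial states scores (close to) 1.0.
a_taps, s_taps = [0, 2], [0, 3]
true_z = shrinking_generator([1, 0, 1, 1, 0], [1, 1, 0, 1, 0, 0, 1], a_taps, s_taps, 32)
print("self-match fitness:", fitness([1, 0, 1, 1, 0], [1, 1, 0, 1, 0, 0, 1],
                                      a_taps, s_taps, true_z))
```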
Over the past few years, ear biometrics has attracted a lot of attention. It is a trusted biometric for the identification and recognition of humans due to its consistent shape and rich texture variation. The ear presents an attractive solution since it is visible, ear images are easily captured, and the ear structure remains relatively stable over time. In this paper, a comprehensive review of prior research was conducted to establish the efficacy of utilizing ear features for individual identification through both manually-crafted features and deep-learning approaches. The objective of this review is to present the accuracy rates of person identification systems based on either manually-crafted features such as …
The area of character recognition has received considerable attention from researchers all over the world during the last three decades. This research explores the best sets of feature extraction techniques and studies the accuracy of well-known classifiers for Arabic numerals using statistical methods, in two approaches, and presents a comparative study between them. The first method, a linear discriminant function, yields an accuracy as high as 90% of original grouped cases correctly classified. In the second method, we propose an algorithm; the results show its efficiency, achieving recognition accuracies of 92.9% and 91.4%, which is higher than that of the first method.
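A rough sketch of the first method's pipeline shape (linear discriminant classification of digit feature vectors). The feature set, dataset, and the random stand-in data below are assumptions for illustration only.

```python
# Linear discriminant classification of numeral feature vectors (sketch).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

# X: one row of extracted features per Arabic numeral image; y: digit labels 0..9
# (random stand-in data here, purely to show the pipeline shape).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 64))
y = rng.integers(0, 10, size=500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = LinearDiscriminantAnalysis().fit(X_tr, y_tr)
print("fraction correctly classified:", clf.score(X_te, y_te))
```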
Sound forecasts are essential elements of planning, especially for dealing with seasonality, sudden changes in demand levels, strikes, large fluctuations in the economy, and price-cutting manoeuvres by competitors. Forecasting can help decision makers manage these problems by identifying which techniques are appropriate for their needs. The proposed forecasting model extracts the trend and cyclical components individually by developing the Hodrick–Prescott filter technique. Models fitted to these two real components are then estimated to predict the future behaviour of electricity peak load. Accordingly, the optimal model for the periodic component is estimated using spectrum analysis and a Fourier model …
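An illustrative decomposition of a peak-load series into trend and cyclical parts with a standard Hodrick–Prescott filter; the synthetic series, smoothing parameter, and library choice are assumptions, and the paper's own developments of the filter are not reproduced here.

```python
# Trend/cycle decomposition of a (synthetic) monthly peak-load series.
import numpy as np
from statsmodels.tsa.filters.hp_filter import hpfilter

t = np.arange(120)                                   # e.g. 120 monthly observations
load = 50 + 0.3 * t + 8 * np.sin(2 * np.pi * t / 12) + np.random.normal(0, 1, 120)

cycle, trend = hpfilter(load, lamb=129600)           # common lambda for monthly data
# 'trend' would then be extrapolated and 'cycle' modelled (e.g. with a Fourier fit)
# to forecast future peak load.
print(trend[:5], cycle[:5])
```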
Smart water flooding (low salinity water flooding) has mainly been investigated in sandstone reservoirs. The main reasons for using low salinity water flooding are to improve oil recovery and to support the reservoir pressure.
In this study, two sandstone core plugs with different permeabilities, taken from southern Iraq, were used to examine the effect of injecting water with different ion concentrations on oil recovery. The water types used were formation water, seawater, modified low salinity water, and deionized water.
The effects of water salinity, the injection flow rate, and the permeability of the core plugs were studied in order to determine the best conditions for low salinity water flooding.
In this paper, we use four classification methods to classify objects and compare among them: K-Nearest Neighbors (KNN), Stochastic Gradient Descent (SGD) learning, Logistic Regression (LR), and Multi-Layer Perceptron (MLP). We used the MS COCO dataset for object classification and detection; the dataset images were randomly divided into training and testing sets at a ratio of 7:3, respectively. The randomly selected training and testing images were converted from color to gray level, enhanced using histogram equalization, and resized to 20 x 20 pixels. Principal Component Analysis (PCA) was used for feature extraction, and finally the four classification methods were applied …
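A sketch of the described pipeline (grayscale conversion, histogram equalization, 20 x 20 resize, PCA, then the four classifiers). The stand-in random images, label set, and library choices are assumptions; MS COCO loading is omitted.

```python
# Preprocess -> PCA -> compare KNN, SGD, LR, MLP on stand-in data (sketch).
import numpy as np
from skimage import color, exposure, transform
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import SGDClassifier, LogisticRegression
from sklearn.neural_network import MLPClassifier

def preprocess(img_rgb):
    gray = color.rgb2gray(img_rgb)               # color -> gray level
    gray = exposure.equalize_hist(gray)          # histogram equalization
    return transform.resize(gray, (20, 20)).ravel()

# Stand-in data: 200 random RGB "images" with 3 fake class labels.
rng = np.random.default_rng(1)
images = rng.random((200, 32, 32, 3))
labels = rng.integers(0, 3, size=200)

X = np.array([preprocess(im) for im in images])
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3, random_state=1)

pca = PCA(n_components=50).fit(X_tr)             # feature extraction
X_tr, X_te = pca.transform(X_tr), pca.transform(X_te)

for name, clf in [("KNN", KNeighborsClassifier()),
                  ("SGD", SGDClassifier()),
                  ("LR", LogisticRegression(max_iter=1000)),
                  ("MLP", MLPClassifier(max_iter=500))]:
    clf.fit(X_tr, y_tr)
    print(name, clf.score(X_te, y_te))
```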