We consider the problem of calibrating the range measurements of a Light Detection and Ranging (lidar) sensor affected by sensor nonlinearity and heteroskedastic, range-dependent measurement error. We solve the calibration problem without additional hardware, exploiting instead assumptions on the environment surrounding the sensor during the calibration procedure. More specifically, we assume that the sensor is placed in an environment so that its measurements lie in a 2D plane parallel to the ground, and that these measurements come from fixed objects that extend orthogonally to the ground, so that they may be treated as fixed points in an inertial reference frame. Moreover, we exploit the intuition that, as the distance sensor moves within this environment, its measurements should be such that the relative distances and angles among these fixed points remain the same. We thus cast the sensor calibration problem as making the measurements comply with the assumption that "fixed features shall have fixed relative distances and angles". The resulting calibration procedure therefore requires neither additional (typically expensive) equipment nor the deployment of special hardware. As for the proposed estimation strategies, from a mathematical perspective we consider models that lead to analytically solvable equations, so as to enable deployment in embedded systems. Besides proposing the estimators, we analyze their statistical performance both in simulation and in field tests. We report how the mean squared error (MSE) of the calibration procedure depends on the sensor noise levels, and observe that in field tests the approach can yield a tenfold improvement in the accuracy of the raw measurements.
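To make the "fixed features shall have fixed relative distances" intuition concrete, the following is a minimal synthetic sketch, not the paper's estimator: it assumes a simple additive range-bias model and recovers the bias by least squares on the variation of pairwise landmark distances across sensor poses. All names, the bias-only model, and the scipy-based solver are illustrative assumptions; the paper's models and analytically solvable estimators differ.

    import numpy as np
    from scipy.optimize import least_squares

    # Synthetic setup (illustrative only): fixed landmarks observed from several poses.
    rng = np.random.default_rng(0)
    landmarks = rng.uniform(-5.0, 5.0, size=(6, 2))    # fixed points, unknown to the estimator
    poses = rng.uniform(-1.0, 1.0, size=(4, 2))        # sensor positions during calibration
    rel = landmarks[None, :, :] - poses[:, None, :]
    true_range = np.linalg.norm(rel, axis=2)
    bearing = np.arctan2(rel[..., 1], rel[..., 0])
    TRUE_BIAS = 0.25                                    # assumed additive range bias to recover
    raw_range = true_range - TRUE_BIAS + rng.normal(0.0, 0.01, true_range.shape)

    def residuals(bias):
        d = raw_range + bias                            # corrected ranges
        pts = np.stack([d * np.cos(bearing), d * np.sin(bearing)], axis=2)
        pd = np.linalg.norm(pts[:, :, None, :] - pts[:, None, :, :], axis=3)
        iu = np.triu_indices(pd.shape[1], k=1)
        pd = pd[:, iu[0], iu[1]]                        # pairwise distances, one row per pose
        # Fixed features must keep fixed relative distances: penalize variation across poses.
        return (pd - pd.mean(axis=0)).ravel()

    fit = least_squares(residuals, x0=[0.0])
    print("estimated range bias:", fit.x[0])            # should be close to TRUE_BIAS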
Regression analysis is a cornerstone of statistics, and it relies mostly on the ordinary least squares method. As is well known, however, this method requires several conditions to hold in order to operate accurately, and its results can be unreliable otherwise; the absence of certain conditions can even make it impossible to complete the analysis. Among these conditions is the absence of multicollinearity, and we detect this problem among the independent variables using the Farrar–Glauber test. In addition, because of the requirement that the data be linear and the failure of the latter condition to hold, we resorted to the ...
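As an illustration of the multicollinearity check mentioned above, here is a minimal sketch of the Farrar–Glauber chi-square test in Python; the function name and the synthetic data are assumptions for demonstration, not the study's implementation.

    import numpy as np
    from scipy import stats

    def farrar_glauber_chi2(X):
        """Farrar-Glauber chi-square test for multicollinearity among predictors.
        X: (n, k) array of independent variables. Returns (statistic, p_value)."""
        n, k = X.shape
        R = np.corrcoef(X, rowvar=False)                     # correlation matrix of predictors
        stat = -(n - 1 - (2 * k + 5) / 6.0) * np.log(np.linalg.det(R))
        df = k * (k - 1) / 2.0                               # degrees of freedom
        return stat, stats.chi2.sf(stat, df)

    # Example with two nearly collinear predictors (illustrative data):
    rng = np.random.default_rng(1)
    x1 = rng.normal(size=200)
    x2 = x1 + rng.normal(scale=0.05, size=200)               # almost a copy of x1
    x3 = rng.normal(size=200)
    print(farrar_glauber_chi2(np.column_stack([x1, x2, x3])))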
Facial recognition has been an active field of imaging science. With recent progress in computer vision, it is extensively applied in various areas, especially in law enforcement and security. The human face is a viable biometric that can be effectively used for both identification and verification. Thus far, regardless of the facial model and metrics employed, its main shortcoming is that it requires a facial image against which the comparison is made. Therefore, closed-circuit televisions and a facial database are always needed in an operational system. Over the last few decades, unfortunately, we have experienced an emergence of asymmetric warfare, where acts of terrorism are often committed in secluded areas with no ...
Wireless sensor network (WSN) security is an important component for protecting data from an attacker. To improve security, cryptography technologies are divided into two kinds: symmetric and asymmetric. Protocols for generating a secret key that are based on an asymmetric method take a long time relative to the sensors' limited resources, which decreases network throughput; asymmetric algorithms are complex and reduce network throughput. In this paper, symmetric secret-key encryption for wireless sensor networks (WSNs) is proposed. In this work, 24 experiments are proposed, covering encryption with the AES algorithm in the cases of 1 key, 10 keys, 25 keys, and 50 keys. ...
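For illustration, a minimal sketch of symmetric AES encryption of a single sensor reading is shown below, using the Python cryptography package's AES-GCM mode; the mode, key length, and message format are assumptions for demonstration and do not necessarily match the experiments in the paper.

    from os import urandom
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=128)   # one pre-shared symmetric key
    aesgcm = AESGCM(key)
    nonce = urandom(12)                         # 96-bit nonce, must be unique per message
    reading = b"node-07:temp=23.4"              # hypothetical sensor payload
    ciphertext = aesgcm.encrypt(nonce, reading, None)
    plaintext = aesgcm.decrypt(nonce, ciphertext, None)
    assert plaintext == reading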
With the escalation of cybercriminal activities, the demand for forensic investigations into these crimes has grown significantly. However, the concept of systematic pre-preparation for potential forensic examinations during the software design phase, known as forensic readiness, has only recently gained attention. Against the backdrop of surging urban crime rates, this study aims to conduct a rigorous and precise analysis and forecast of crime rates in Los Angeles, employing advanced Artificial Intelligence (AI) technologies. This research amalgamates diverse datasets encompassing crime history, various socio-economic indicators, and geographical locations to attain a comprehensive understanding of how crimes manifest within the city. ...
Efficient and cost-effective drilling of directional wells necessitates the implementation of best drilling practices and advanced techniques to optimize drilling operations. Failure to adequately consider drilling risks can result in inefficient drilling operations and non-productive time (NPT). Although advanced drilling techniques may be expensive, they offer promising technical solutions for mitigating drilling risks. This paper aims to demonstrate the effectiveness of advanced drilling techniques in mitigating risks and improving drilling operations compared to conventional drilling techniques. Specifically, the advanced drilling techniques employed in the Buzurgan Oil Field include vertical drilling with a mud motor, managed pressure ...
Institutions and companies seek to reduce spending on buildings and services using scientific methods, provided that the same purpose is achieved at a lower cost. On this basis, this paper proposes a model to measure and reduce maintenance costs in one of the public sector institutions in Iraq by using performance indicators that fit the nature of the institution's work and the available data. The paper relied on studying the nature of the institution's work in the maintenance field and examining the type of data available in order to determine the type and number of indicators appropriate for building the model. Maintenance data were collected for the previous six years by reviewing the maintenance and financial departments ...
This research deals with the use of a number of statistical methods, such as the kernel method, watershed, histogram, and cubic spline, to improve the contrast of digital images. The results obtained according to the RMSE and NCC criteria show that the spline method is the most accurate compared with the other statistical methods.
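As a sketch of how such image comparisons are commonly computed, the snippet below implements standard RMSE and zero-mean NCC between two images; the exact metric variants used in the study are not given, so these definitions are assumptions.

    import numpy as np

    def rmse(a, b):
        """Root mean squared error between two images of the same shape."""
        a = a.astype(float)
        b = b.astype(float)
        return float(np.sqrt(np.mean((a - b) ** 2)))

    def ncc(a, b):
        """Normalized cross-correlation (zero-mean) between two images; 1.0 means identical up to gain."""
        a = a.astype(float).ravel() - a.astype(float).mean()
        b = b.astype(float).ravel() - b.astype(float).mean()
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    # Illustrative use with random stand-in images:
    rng = np.random.default_rng(0)
    original = rng.uniform(0, 255, (64, 64))
    enhanced = np.clip(original * 1.1 + 5, 0, 255)
    print("RMSE:", rmse(original, enhanced), "NCC:", ncc(original, enhanced))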
Many events took place in Al Mosul province between 2013 and 2018. These events led to many changes in the study area, including a decrease in agricultural crops and water as the population left the area. It is therefore imperative that planners, decision-makers, and development officials intervene in order to restore the region's environmental and agricultural activity. The aim of this research is to use remote sensing (RS) techniques and a geographic information system (GIS) to detect the change that occurred over the mentioned period. This was achieved through the use of the ArcGIS software package for the purpose of assessing the state of the lands of agricultural crops and ...
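As one common way to quantify such vegetation change (an assumption about the workflow, since the abstract names only ArcGIS), here is a short NDVI-differencing sketch on two co-registered scenes; the band arrays and the loss threshold are placeholders.

    import numpy as np

    def ndvi(nir, red):
        """Normalized difference vegetation index from near-infrared and red bands."""
        nir = nir.astype(float)
        red = red.astype(float)
        return (nir - red) / np.clip(nir + red, 1e-6, None)   # avoid division by zero

    # Placeholder band arrays standing in for co-registered 2013 and 2018 scenes.
    rng = np.random.default_rng(0)
    red_2013, nir_2013 = rng.uniform(0, 1, (100, 100)), rng.uniform(0, 1, (100, 100))
    red_2018, nir_2018 = rng.uniform(0, 1, (100, 100)), rng.uniform(0, 1, (100, 100))

    change = ndvi(nir_2018, red_2018) - ndvi(nir_2013, red_2013)
    vegetation_loss = change < -0.2              # assumed threshold flagging vegetation loss
    print("pixels flagged as vegetation loss:", int(vegetation_loss.sum()))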
Typographic patterns are one of the design elements of commercial advertising because of their ability to deliver a message and information to the recipient smoothly and quickly. There are many different techniques for using typographic patterns in commercial advertisements, including spacing, the spaces between letters, letter height, length, weight, and contrast, and this usage must be studied according to the type of font and how it can be used in advertising campaigns.
Based on the above, this research set out to study (employing typographic patterns in commercial advertising design), in which the researcher formulated his question, for the purpose of reaching a solution to his research problem, as (Is it possible ...