Abstract. Full-waveform airborne laser scanning data have shown their potential to enhance available segmentation and classification approaches through the additional information they can provide. However, this additional information cannot directly provide a valid physical representation of surface features, because many variables affect the backscattered energy as it travels between the sensor and the target. In effect, this produces a mismatch between signals from overlapping flightlines. Direct use of this information is therefore not recommended without adopting a comprehensive radiometric calibration strategy that accounts for all these effects. This paper presents a practical and reliable radiometric calibration routine that accounts for all the variables affecting the backscattered energy, including the essential factor of the angle of incidence. A new, robust incidence-angle estimation approach has been developed and has proven capable of delivering a reliable estimate of the scattering direction of individual echoes. The routine was tested and validated, both visually and statistically, over various land-cover types with simple and challenging surface trends. This confirmed the ability of the approach to deliver an optimal match between overlapping flightlines after calibration, particularly by adopting a parameter that accounts for the angle-of-incidence effect.
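The abstract does not give the authors' estimation method, but a common way to obtain a local incidence angle is to fit a plane to an echo's neighbouring points (normal from the smallest-eigenvalue direction of the local covariance matrix) and take the angle between that normal and the laser beam direction. A minimal sketch under that assumption:

```python
import numpy as np

def incidence_angle(neighbors, beam_dir):
    """Estimate the local incidence angle (radians) for one echo.

    neighbors: (k, 3) array of points around the echo.
    beam_dir:  sensor-to-target direction vector.
    The surface normal is taken as the eigenvector of the local
    covariance matrix with the smallest eigenvalue (PCA plane fit).
    This is an illustrative sketch, not the paper's actual routine.
    """
    centered = neighbors - neighbors.mean(axis=0)
    cov = centered.T @ centered
    _, eigvecs = np.linalg.eigh(cov)        # eigenvalues ascending
    normal = eigvecs[:, 0]                  # smallest-eigenvalue direction
    cos_theta = abs(normal @ beam_dir) / np.linalg.norm(beam_dir)
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))

# Points on a horizontal plane with a vertical beam: incidence angle ~ 0
pts = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [1, 1, 0], [0.5, 0.5, 0]], float)
theta = incidence_angle(pts, np.array([0.0, 0.0, -1.0]))
```

The absolute value on the dot product makes the result independent of the normal's sign, which a PCA fit does not determine.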
Authentication is the process of determining whether someone or something is, in fact, who or what it declares itself to be. As dependence upon computers and computer networks grows, the need for user authentication has increased. A user's claimed identity can be verified by one of several methods. One of the most popular is based on something the user knows, such as a password or Personal Identification Number (PIN). Biometrics is the science and technology of authenticating individuals by their physiological or behavioral attributes. Keystroke authentication is a newer behavioral access-control approach that identifies legitimate users by their typing behavior. The objective of this paper is to provide user
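The abstract does not describe the paper's system, but keystroke-dynamics schemes typically build on two timing features: dwell time (how long a key is held) and flight time (the gap between releasing one key and pressing the next). A minimal sketch, with hypothetical timings:

```python
# Illustrative sketch (not the paper's system): extracting dwell and
# flight times, the timing features commonly used in keystroke dynamics.
def keystroke_features(events):
    """events: list of (key, press_ms, release_ms), in typing order."""
    # Dwell time: how long each key is held down.
    dwell = [release - press for _, press, release in events]
    # Flight time: gap between one key's release and the next key's press.
    flight = [events[i + 1][1] - events[i][2] for i in range(len(events) - 1)]
    return dwell, flight

# Hypothetical timings (milliseconds) for typing "abc"
sample = [("a", 0, 90), ("b", 150, 230), ("c", 300, 370)]
dwell, flight = keystroke_features(sample)
# dwell -> [90, 80, 70], flight -> [60, 70]
```

A verifier would then compare such feature vectors against a stored template of the legitimate user's typing rhythm.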
The quality of Global Navigation Satellite System (GNSS) networks is considerably influenced by the configuration of the observed baselines. This study aims to find an optimal configuration for GNSS baselines, in terms of the number and distribution of baselines, to improve the quality criteria of GNSS networks. The first order design (FOD) problem was applied in this research to optimize the GNSS network baseline configuration, with the sequential adjustment method used to solve its objective functions.
FOD for optimum precision (FOD-p) was the proposed model, which is based on the design criteria of A-optimality and E-optimality. These design criteria were selected as objective functions of precision, which
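The two criteria named above have standard definitions on the parameter covariance matrix of a least-squares adjustment: A-optimality minimizes its trace (the average parameter variance) and E-optimality minimizes its largest eigenvalue (the worst-case variance). A sketch of how both would be evaluated, using an illustrative design matrix rather than any network from the paper:

```python
import numpy as np

def optimality_criteria(design, weights):
    """Evaluate A- and E-optimality for a least-squares design.

    design:  (m, n) design matrix A of the adjustment.
    weights: (m,) observation weights (diagonal of P).
    Illustrative sketch; the paper's exact formulation may differ.
    """
    N = design.T @ np.diag(weights) @ design   # normal matrix A^T P A
    C = np.linalg.inv(N)                       # parameter covariance matrix
    a_opt = np.trace(C)                        # A-criterion: sum of variances
    e_opt = np.max(np.linalg.eigvalsh(C))      # E-criterion: largest eigenvalue
    return a_opt, e_opt

# Hypothetical 3-observation, 2-parameter design with unit weights
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
a_opt, e_opt = optimality_criteria(A, np.ones(3))
```

Candidate baseline configurations can then be compared by these scalar scores: lower values mean a more precise network.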
Today, with the increasing use of social media, many researchers have become interested in topic extraction from Twitter. Tweets are short, unstructured, and messy texts, which makes it difficult to find topics in them. Topic modeling algorithms such as Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA) were originally designed to derive topics from large documents such as articles and books, and they are often less effective when applied to short-text content like Twitter. Fortunately, Twitter has many features that represent the interaction between users; in particular, tweets carry rich user-generated hashtags that act as keywords. In this paper, we exploit the hashtag feature to improve the topics learned
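One common way to exploit hashtags (hashtag pooling, used here as an assumption about the general idea rather than the paper's exact method) is to concatenate all tweets sharing a hashtag into one pseudo-document, giving LDA longer texts to learn from. A sketch with scikit-learn and invented example tweets:

```python
# Hashtag pooling sketch: pool tweets by hashtag, then run LDA on the pools.
from collections import defaultdict
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

tweets = [                                     # hypothetical data
    ("great match tonight", ["#sports"]),
    ("our team wins again", ["#sports"]),
    ("new phone released today", ["#tech"]),
    ("battery life is amazing", ["#tech"]),
]

# Concatenate tweets per hashtag into one pseudo-document each.
pools = defaultdict(list)
for text, tags in tweets:
    for tag in tags:
        pools[tag].append(text)
docs = [" ".join(texts) for texts in pools.values()]

X = CountVectorizer().fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
topic_dist = lda.transform(X)   # per-pool topic proportions; rows sum to 1
```

The longer pseudo-documents give the word co-occurrence statistics that LDA needs and that individual 280-character tweets lack.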
Speech is the essential way for humans to interact with each other and with machines. However, it is always contaminated with different types of environmental noise. Therefore, speech enhancement algorithms (SEAs) have emerged as a significant approach in the speech processing field to suppress background noise and recover the original speech signal. In this paper, a new efficient two-stage SEA with low distortion is proposed in the minimum mean square error sense. The clean signal is estimated by taking advantage of Laplacian speech and noise modeling based on the coefficient distribution of an orthogonal transform (the discrete Krawtchouk-Tchebichef transform). The Discrete Kra
Photonic Crystal Fiber Interferometers (PCFIs) are widely used for sensing applications. This work presents the fabrication and characterization of a relative humidity sensor based on a Mach-Zehnder Interferometer (MZI) operating in reflection mode. The sensor's operation is based on the adsorption and desorption of water vapour at the silica-air interface within the PCF. The fabrication of this sensor is simple: it only involves cleaving the PCF and splicing it to SMF; a PCF (LMA-10) of a certain length is spliced to SMF (Corning-28). The spectrum of the PCFI exhibits good sensitivity to humidity variations. The PCFI response was observed over a humidity range from 27% RH to 85% RH, the positi
The DFT method at the B3LYP/6-31G level was applied in a hypothetical study of the design of six carbon nanotube materials based on [8]circulene, through the cyclic polymerization of two and three [8]circulene molecules. The optimized structure of [8]circulene is saddle-shaped. The design reactions of the six carbon nanotubes were evaluated by calculating the thermodynamic quantities (ΔS, ΔG, and ΔH), and the stability of these hypothetical nanotubes was assessed from the value of the HOMO energy level. The obtained nanotubes have favorable energy gaps, making them potentially useful for solar-cell applications.