Abstract. Full-waveform airborne laser scanning data has shown its potential to enhance available segmentation and classification approaches through the additional information it can provide. However, this additional information cannot directly provide a valid physical representation of surface features, because many variables affect the backscattered energy during its travel between the sensor and the target. In effect, this produces a mismatch between signals from overlapping flightlines. Direct use of this information is therefore not recommended without a comprehensive radiometric calibration strategy that accounts for all of these effects. This paper presents a practical and reliable radiometric calibration routine that accounts for the variables affecting the backscattered energy, including the essential factor of the angle of incidence. A new, robust incidence angle estimation approach has been developed and has proven capable of delivering a reliable estimate of the scattering direction of individual echoes. The routine was tested and validated, both visually and statistically, over various land cover types with simple and challenging surface trends. This demonstrated the ability of the approach to deliver an optimal match between overlapping flightlines after calibration, in particular through a parameter that accounts for the angle of incidence effect.
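As an illustration of the kind of correction such a routine applies, the sketch below shows a commonly used per-echo intensity normalisation that compensates for range and incidence angle; the reference range, the Lambertian cosine correction, and the variable names are assumptions for illustration, not the exact model used in the paper.

```python
import numpy as np

def normalise_intensity(intensity, rng, incidence_angle, ref_range=1000.0):
    """Range- and incidence-angle-normalised echo intensity (minimal sketch).

    - the (range / reference range)^2 term compensates for spherical spreading,
    - dividing by cos(incidence angle) assumes a Lambertian scatterer.
    Both assumptions are illustrative; the paper's routine may differ.
    """
    intensity = np.asarray(intensity, dtype=float)
    rng = np.asarray(rng, dtype=float)
    theta = np.asarray(incidence_angle, dtype=float)  # incidence angle in radians

    cos_theta = np.clip(np.cos(theta), 1e-3, 1.0)  # avoid division by ~0 at grazing angles
    return intensity * (rng / ref_range) ** 2 / cos_theta
```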
Purpose: to demonstrate the possibility of adopting electronic data interchange, along its dimensions (regulatory requirements, technical requirements, human requirements, senior management support), in order to simplify work procedures along their dimensions (modern procedures, clarity of procedures, short procedures, availability of the required information and means, and simplicity of the models used). This is important for keeping municipal services abreast of recent developments: applying electronic data interchange simplifies procedures and removes routine from the performance of the work of municipal departments. The approach was applied to the Municipality of Hilla so that the transformation
The Arabic grammatical theory is characterized by features that distinguish it from the grammars of other languages. It rests on the following equation: in its entirety, it is a homogeneous linguistic system that blends with the social nature of the Arab, his beliefs, and his culture.
This means that the theory was born naturally, after the labour of preserving an integrated heritage, beginning with its legal text (the Koran) and ending with its many distinctive attributes.
Sibawayh carried the founding crucible of that theory, taking it over from his teacher, al-Khalil, and building upon what it had already reached. It is redundant to point to his standing and the status of his book.
Hence my research came to be titled: (c
A band ratioing method is applied to calculate the Salinity Index (SI) and the Normalized Multi-band Drought Index (NMDI) as a pre-processing step to support agricultural decisions in these areas. To separate the land from the other features present in the scene, the classical Maximum Likelihood classification method is used, classifying the study area into multiple classes: Healthy Vegetation (HV), Grasslands (GL), Water (W), Urban (U), and Bare Soil (BS). A Landsat 8 satellite image of an area in the south of Iraq is used, and the land cover is classified according to indicator ranges for both the SI and the NMDI.
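For illustration, the sketch below computes the NMDI and one common salinity index formulation from Landsat 8 reflectance bands; the band choices and the square-root SI formula are assumptions drawn from widely used definitions, not necessarily the exact indicators or thresholds used in the paper.

```python
import numpy as np

def nmdi(nir, swir1, swir2):
    """Normalized Multi-band Drought Index: (NIR - (SWIR1 - SWIR2)) / (NIR + (SWIR1 - SWIR2))."""
    diff = swir1 - swir2
    return (nir - diff) / (nir + diff + 1e-10)  # small epsilon avoids division by zero

def salinity_index(green, red):
    """One common Salinity Index formulation: SI = sqrt(green * red)."""
    return np.sqrt(np.clip(green * red, 0.0, None))

# Assumed Landsat 8 band mapping for the inputs above:
# green = band 3, red = band 4, nir = band 5, swir1 = band 6, swir2 = band 7
# (each supplied as a 2-D reflectance array loaded from the scene).
```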
ABSTRACT
Naproxen (NPX) imprinted polymer liquid electrodes were constructed using precipitation polymerization. The molecularly imprinted polymer (MIP) and the non-imprinted polymer (NIP) were synthesized using NPX as a template. In the precipitation polymerization, styrene (STY) was used as the monomer, N,N-methylenediacrylamide (N,N-MDAM) as the cross-linker, and benzoyl peroxide (BPO) as the initiator. The molecularly imprinted and non-imprinted membranes were prepared using acetophenone (AOPH) and dioctyl phthalate (DOP) as plasticizers in a PVC matrix. The slopes and detection limits of the liquid electrodes ranged from (-18.1, -17.72) mV/decade and (4.0 x 10-
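As an aside on how such electrode characteristics are typically obtained, the sketch below fits measured potential against log10(concentration) to extract a slope in mV/decade; the calibration data and the simple linear fit are illustrative assumptions, not the authors' measurements.

```python
import numpy as np

# Hypothetical calibration data: concentrations (mol/L) and measured potentials (mV)
concentrations = np.array([1e-5, 1e-4, 1e-3, 1e-2, 1e-1])
potentials = np.array([152.0, 134.0, 116.5, 98.0, 80.5])

# The electrode slope is the change in potential per decade of concentration
slope, intercept = np.polyfit(np.log10(concentrations), potentials, 1)
print(f"slope = {slope:.2f} mV/decade")
```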
The performance quality and search speed of a Block Matching (BM) algorithm are affected by the shapes and sizes of the search patterns it uses. In this paper, Kite Cross Hexagonal Search (KCHS) is proposed. This algorithm uses different search patterns (kite, cross, and hexagonal) to search for the best Motion Vector (MV). In the first step, KCHS uses a cross search pattern. In the second step, it uses one of the kite search patterns (up, down, left, or right, depending on the first step). In subsequent steps, it uses large/small Hexagonal Search (HS) patterns. This new algorithm is compared with several known fast block matching algorithms. Comparisons are based on search points and Peak Signal-to-Noise Ratio (PSNR). According to resul
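To make the block-matching idea concrete, here is a minimal sketch of one SAD-based search step over a cross-shaped candidate pattern; the block size, step size, and function names are illustrative assumptions and do not reproduce the full KCHS schedule described above.

```python
import numpy as np

def sad(block_a, block_b):
    """Sum of Absolute Differences between two equally sized blocks."""
    return np.abs(block_a.astype(int) - block_b.astype(int)).sum()

def cross_search_step(ref_frame, cur_block, cx, cy, step=2, block=16):
    """Evaluate a cross-shaped set of candidates around (cx, cy) in the
    reference frame and return the offset with the lowest SAD.  One step
    only; a full KCHS search would continue with kite and hexagonal patterns."""
    candidates = [(0, 0), (step, 0), (-step, 0), (0, step), (0, -step)]
    best_offset, best_cost = (0, 0), float("inf")
    h, w = ref_frame.shape
    for dx, dy in candidates:
        x, y = cx + dx, cy + dy
        if 0 <= x and x + block <= w and 0 <= y and y + block <= h:
            cost = sad(cur_block, ref_frame[y:y + block, x:x + block])
            if cost < best_cost:
                best_offset, best_cost = (dx, dy), cost
    return best_offset, best_cost
```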
Logistic regression is considered one of the statistical methods used to describe and estimate the relationship between a random response variable (Y) and explanatory variables (X) when assumptions such as homogeneity of variance do not hold, since the dependent variable is a binary response that takes two values (one when a specific event occurs and zero when it does not), for example (injured and uninjured, married and unmarried). A large number of explanatory variables gives rise to the problem of multicollinearity, which makes the estimates inaccurate. The maximum likelihood method and the ridge regression method were used to estimate the binary-response logistic regression model by adopting the Jackknife
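As a rough illustration of the combination described (ridge-penalised maximum likelihood for a binary logistic model under multicollinearity), the sketch below fits an L2-penalised logistic regression; the synthetic data, penalty strength, and use of scikit-learn are assumptions for illustration, not the estimators or data from the study.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic, deliberately correlated explanatory variables (multicollinearity)
rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
X = np.column_stack([x1, x1 + 0.05 * rng.normal(size=200), rng.normal(size=200)])
y = (X @ np.array([1.0, 1.0, -0.5]) + rng.normal(size=200) > 0).astype(int)

# The L2 ("ridge") penalty shrinks coefficients and stabilises them under collinearity
model = LogisticRegression(penalty="l2", C=1.0, solver="lbfgs").fit(X, y)
print("coefficients:", model.coef_)
```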
In this paper, previous studies of fuzzy regression are presented. Fuzzy regression is a generalization of the traditional regression model that formulates the relationship between independent and dependent variables in a fuzzy environment. This can be introduced through a non-parametric model as well as a semi-parametric model. Moreover, the results obtained from the previous studies and their conclusions are put forward in this context. We therefore suggest a novel method of estimation via new weights instead of the old weights, and introduce another suggestion based on artificial neural networks.
Paper Type: Review article.