Vegetation monitoring is an important application of remote sensing due to the variation of vegetation types and their distribution. According to NASA monitoring, vegetation cover around the Earth has increased by 5% since 2000, an increase attributed in part to vegetation programs in India. In this research, vegetation monitoring in Baghdad city was carried out using the Normalized Difference Vegetation Index (NDVI) on temporal Landsat satellite images (Landsat 5 TM and Landsat 8 OLI). These images were acquired at different times over the period 2000, 2010, 2015, and 2017. The outcomes of the study demonstrate a change in the Vegetation Cover (VC) in Baghdad city. NDVI generally shows a low value of plant cover. The highest NDVI values occurred in 2000 and the lowest in both 2015 and 2017. This change correlates with climate indices such as precipitation, temperature, and dust storms. This study shows that the NDVI method is a powerful and useful way of monitoring vegetation. The calculated vegetation areas were 43.3, 37.4, 9.1, and 22.7 km2, respectively. The results were evaluated using the Environment for Visualizing Images (ENVI) package, Ver. 4.8.
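As an illustration of the index used above, here is a minimal sketch of the NDVI computation; the band choices and the toy reflectance values are assumptions for illustration, not the study's data.

```python
import numpy as np

def ndvi(nir, red, eps=1e-10):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    For Landsat 5 TM the NIR/Red channels are bands 4 and 3; for
    Landsat 8 OLI they are bands 5 and 4. Values range from about -1
    (water) to +1 (dense vegetation).
    """
    nir = np.asarray(nir, dtype=np.float64)
    red = np.asarray(red, dtype=np.float64)
    return (nir - red) / (nir + red + eps)  # eps guards against division by zero

# Toy 2x2 reflectance patches (illustrative values only)
nir = np.array([[0.5, 0.4], [0.3, 0.1]])
red = np.array([[0.1, 0.2], [0.3, 0.1]])
print(ndvi(nir, red))
```

Bare ground gives NDVI near zero (the two equal-reflectance pixels above), while vegetated pixels push toward +1.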
This paper introduces a Laplace-based modeling approach for the study of transient converter-grid interactions. The proposed approach is based on the development of two-port admittance models of converters and other components, combined with the use of numerical Laplace transforms. The application of a frequency domain method is aimed at the accurate and straightforward computation of transient system responses while preserving the wideband frequency characteristics of power components, such as those due to the use of high-frequency semiconductor switches, electromagnetic interaction between inductive and capacitive components, and wave propagation and frequency dependence in transmission systems.
This paper presents a numerical scheme for solving nonlinear time-fractional differential equations in the sense of Caputo. The method relies on the Laplace transform combined with the modified Adomian method (LMADM), and is compared with the Laplace transform combined with the standard Adomian method (LADM). For comparison purposes, we applied both LMADM and LADM to nonlinear time-fractional differential equations to identify their differences and similarities. Finally, we provide two examples of nonlinear time-fractional differential equations, which show that the current scheme converges with high accuracy using only a small number of terms.
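Both LADM and LMADM represent the nonlinear term of the equation by Adomian polynomials. As a minimal symbolic sketch (not the authors' implementation), the polynomials A_n for an example nonlinearity N(u) = u² can be generated from the standard definition A_n = (1/n!) dⁿ/dλⁿ N(Σ u_k λᵏ) at λ = 0:

```python
import sympy as sp

def adomian_polynomials(f, n_terms):
    """Adomian polynomials A_n for a nonlinearity f(u):
    A_n = (1/n!) d^n/dlam^n f(sum_k u_k lam^k) evaluated at lam = 0."""
    lam = sp.symbols('lambda')
    u = sp.symbols(f'u0:{n_terms}')
    series = sum(u[k] * lam**k for k in range(n_terms))
    expr = f(series)
    return [sp.expand(sp.diff(expr, lam, n).subs(lam, 0) / sp.factorial(n))
            for n in range(n_terms)]

# For N(u) = u**2 the first polynomials are:
# A0 = u0**2, A1 = 2*u0*u1, A2 = u1**2 + 2*u0*u2
A = adomian_polynomials(lambda x: x**2, 3)
print(A)
```

In both LADM and LMADM these polynomials replace the nonlinear term so each series component u_n can be obtained recursively from the Laplace-inverted recurrence.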
A Schiff base, methyl 6-(2-(4-hydroxyphenyl)-2-(1-phenylethylideneamino)acetamido)-3,3-dimethyl-7-oxo-4-thia-1-azabicyclo[3.2.0]heptane-2-carboxylate, and the [Co(II), Ni(II), Cu(II), Zn(II), and Hg(II)] ions were employed to prepare the complexes. Metal analysis (M%), elemental chemical analysis (C.H.N.S), and other standard physico-chemical methods were used. Magnetic susceptibility, conductometric measurements, and FT-IR and UV-visible spectra were used for identification. Theoretical treatment of the generated complexes in the gas phase was performed using the HyperChem-8.07 program for molecular mechanics and semi-empirical computations. The (PM3) approach was used to determine the heat of formation (ΔH˚f), binding energy (ΔEb), an
One of the important differences between multiwavelets and scalar wavelets is that each channel in the filter bank has a vector-valued input and a vector-valued output. A scalar-valued input signal must somehow be converted into a suitable vector-valued signal. This conversion is called preprocessing. Preprocessing is a mapping process performed by a prefilter; a postfilter simply does the opposite.
The most obvious way to get two input rows from a given signal is to repeat the signal. Two rows go into the multifilter bank. This procedure is called “Repeated Row” which introduces oversampling of the data by a factor of 2.
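The repeated-row preprocessing described above can be sketched in a few lines; the averaging postfilter shown is one simple choice of inverse, assumed here for illustration.

```python
import numpy as np

def repeated_row_prefilter(x):
    """'Repeated Row' preprocessing: duplicate the scalar signal into two
    identical rows so the multifilter bank receives a vector-valued input.
    This oversamples the data by a factor of 2."""
    x = np.asarray(x, dtype=np.float64)
    return np.vstack([x, x])  # shape (2, N)

def repeated_row_postfilter(v):
    """One simple inverse mapping: average the two rows back to a scalar."""
    return v.mean(axis=0)

signal = np.array([1.0, 2.0, 3.0, 4.0])
v = repeated_row_prefilter(signal)
print(v.shape)  # two rows enter the multifilter bank
```

The factor-of-2 oversampling is exactly why repeated-row preprocessing is considered wasteful for compression, motivating the critically sampled prefilters discussed next.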
For data compression, where one is trying to find compact transform representations for a
In this paper, an efficient method for compressing color images is presented. It allows progressive transmission and zooming of the image without the need for extra storage. The proposed method is accomplished using cubic Bezier surface (CBI) representation over wide areas of the image in order to prune the image component that shows large-scale variation. The produced cubic Bezier surface is then subtracted from the image signal to obtain the residue component. Next, the bi-orthogonal wavelet transform is applied to decompose the residue component. Both scalar quantization and quadtree coding steps are applied to the produced wavelet sub-bands. Finally, adaptive shift coding is applied to handle the remaining statistical redundancy and attain e
In this paper, an adaptive polynomial compression technique is introduced, based on hard and soft thresholding of the transformed residual image, that efficiently exploits both the spatial and frequency domains. The technique starts by applying polynomial coding in the spatial domain, followed in the frequency domain by the discrete wavelet transform (DWT), which is utilized to decompose the residual image on a hard- and soft-thresholding basis. The results showed the improvement of the adaptive technique compared to the traditional polynomial coding technique.
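The hard and soft thresholding rules applied to the wavelet coefficients have standard closed forms; a minimal sketch follows (the threshold value is illustrative, not the paper's):

```python
import numpy as np

def hard_threshold(coeffs, t):
    """Hard rule: zero out coefficients with magnitude below t, keep the rest."""
    return np.where(np.abs(coeffs) >= t, coeffs, 0.0)

def soft_threshold(coeffs, t):
    """Soft rule: shrink all coefficients toward zero by t
    (those with magnitude below t become zero)."""
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - t, 0.0)

c = np.array([-3.0, -0.5, 0.2, 1.5, 4.0])  # toy wavelet coefficients
print(hard_threshold(c, 1.0))
print(soft_threshold(c, 1.0))
```

Hard thresholding preserves the surviving coefficients exactly, while soft thresholding also shrinks them, trading a small bias for smoother reconstructions.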
Identifying people by their ears has recently received considerable attention in the literature. Accurate segmentation of the ear region is vital in order to make successful person identification decisions. This paper presents an effective approach for ear region segmentation from color ear images. Firstly, the RGB color model was converted to the HSV color model. Secondly, thresholding was utilized to segment the ear region. Finally, morphological operations were applied to remove small islands and fill the gaps. The proposed method was tested on a database consisting of 105 ear images taken from the right sides of 105 subjects. The experimental results of the proposed approach on a variety of ear images revealed that this approac
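The three-step pipeline (HSV conversion, thresholding, morphology) can be sketched with the standard library and numpy alone; the threshold values, the toy image, and the 3x3 dilation here are illustrative assumptions, not the paper's parameters.

```python
import colorsys
import numpy as np

def rgb_to_hsv(img):
    """Per-pixel RGB -> HSV conversion (channel values in [0, 1])."""
    h, w, _ = img.shape
    out = np.empty_like(img)
    for i in range(h):
        for j in range(w):
            out[i, j] = colorsys.rgb_to_hsv(*img[i, j])
    return out

def threshold_mask(hsv, s_min=0.2, v_min=0.3):
    """Keep pixels that are sufficiently saturated and bright
    (illustrative thresholds; a real system would tune these)."""
    return (hsv[..., 1] >= s_min) & (hsv[..., 2] >= v_min)

def dilate(mask):
    """3x3 binary dilation: a pixel is set if any 8-neighbor is set.
    Dilation (with erosion) is the building block for gap filling."""
    padded = np.pad(mask, 1)
    out = np.zeros_like(mask)
    for di in (0, 1, 2):
        for dj in (0, 1, 2):
            out |= padded[di:di + mask.shape[0], dj:dj + mask.shape[1]]
    return out

# Toy 3x3 "image": one bright, saturated pixel in the center
img = np.zeros((3, 3, 3))
img[1, 1] = (0.8, 0.4, 0.3)  # reddish, skin-like pixel (illustrative)
mask = threshold_mask(rgb_to_hsv(img))
print(mask.astype(int))
print(dilate(mask).astype(int))  # dilation grows the segmented region
```

A production pipeline would use vectorized conversion and library morphology, but the logic per step is the same.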
Image content verification is intended to confirm the validity of an image, i.e., to test whether the image has experienced any alteration since it was created. Digital watermarking has become a promising technique for image content verification in light of its excellent performance and its capability for tamper detection.
In this study, a new scheme for image verification relying on two-dimensional chaotic maps and the Discrete Wavelet Transform (DWT) is introduced. The Arnold transform is first applied to the host image (H) for scrambling as a pretreatment stage; then the scrambled host image is partitioned into sub-blocks of size 2×2, in which a 2D DWT is utilized on ea
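The Arnold transform used for scrambling is the cat map (x, y) → (x + y, x + 2y) mod N on an N×N image; a minimal sketch with its exact inverse follows (the iteration count and the toy host image are illustrative):

```python
import numpy as np

def arnold_scramble(img, iterations=1):
    """Arnold cat map on an N x N image: (x, y) -> (x + y, x + 2y) mod N.
    The map is a bijection, so repeated application only permutes pixels."""
    n = img.shape[0]
    assert img.shape[0] == img.shape[1], "Arnold map needs a square image"
    out = img.copy()
    x, y = np.meshgrid(np.arange(n), np.arange(n), indexing='ij')
    for _ in range(iterations):
        scrambled = np.empty_like(out)
        scrambled[(x + y) % n, (x + 2 * y) % n] = out[x, y]
        out = scrambled
    return out

def arnold_unscramble(img, iterations=1):
    """Exact inverse map: (x, y) -> (2x - y, -x + y) mod N."""
    n = img.shape[0]
    out = img.copy()
    x, y = np.meshgrid(np.arange(n), np.arange(n), indexing='ij')
    for _ in range(iterations):
        scrambled = np.empty_like(out)
        scrambled[(2 * x - y) % n, (-x + y) % n] = out[x, y]
        out = scrambled
    return out

host = np.arange(16).reshape(4, 4)  # toy 4x4 host image
assert np.array_equal(arnold_unscramble(arnold_scramble(host, 3), 3), host)
```

Because the map matrix [[1, 1], [1, 2]] has determinant 1, the inverse [[2, -1], [-1, 1]] recovers the original exactly, which is what makes the scrambling a lossless pretreatment.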
The source and channel coding for wireless data transmission can reduce distortion, complexity, and delay in multimedia services. In this paper, a joint source-channel coding is proposed for orthogonal frequency division multiplexing - interleave division multiple access (OFDM-IDMA) systems to transmit compressed images over noisy channels. OFDM-IDMA combines the advantages of both OFDM and IDMA, where OFDM removes inter-symbol interference (ISI) problems and IDMA removes multiple access interference (MAI). Convolutional coding is used as the channel coding, while a hybrid compression method is used as the source coding scheme. The hybrid compression scheme is based on wavelet transform, bit plane slicing, polynomi
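One of the listed source-coding steps, bit plane slicing, decomposes an 8-bit image into eight binary planes; a minimal sketch (the toy image is illustrative):

```python
import numpy as np

def bit_planes(img):
    """Slice an 8-bit image into 8 binary bit planes (LSB first)."""
    return [(img >> b) & 1 for b in range(8)]

def reassemble(planes):
    """Rebuild the image exactly from its bit planes."""
    return sum(p << b for b, p in enumerate(planes)).astype(np.uint8)

img = np.array([[0, 255], [170, 85]], dtype=np.uint8)
planes = bit_planes(img)
print([p.tolist() for p in planes])
```

The high-order planes carry most of the visual information, so a joint source-channel scheme can protect them more strongly than the noise-like low-order planes.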
Most of today’s techniques encrypt all of the image data, which consumes a tremendous amount of time and computational payload. This work introduces a selective image encryption technique that encrypts predetermined portions of the original image data in order to reduce the encryption/decryption time and the computational complexity of processing the huge image data. The technique applies a compression algorithm based on the Discrete Cosine Transform (DCT). Two approaches are implemented based on color space conversion (YCbCr and RGB) as a preprocessing step for the compression phases, where the resultant compressed sequence is selectively encrypted using a randomly generated combined secret key.
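A minimal sketch of the selective idea: transform a block with an orthonormal DCT and XOR-encrypt only the low-frequency corner, where most of the signal energy concentrates. The block size, quantization, and key generation here are illustrative assumptions, not the paper's scheme.

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II basis matrix (rows are cosine basis vectors)."""
    k = np.arange(n)
    c = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    c[0] *= 1 / np.sqrt(2)
    return c * np.sqrt(2 / n)

def selective_encrypt(block, key_stream, keep=2):
    """DCT-transform a block, quantize, and XOR-encrypt only the
    low-frequency keep x keep corner, leaving the rest untouched."""
    C = dct_matrix(block.shape[0])
    q = np.round(C @ block @ C.T).astype(np.int64)  # coarse quantization
    q[:keep, :keep] ^= key_stream                   # selective encryption
    return q

def selective_decrypt(q, key_stream, keep=2):
    C = dct_matrix(q.shape[0])
    q = q.copy()
    q[:keep, :keep] ^= key_stream                   # XOR is its own inverse
    return C.T @ q.astype(np.float64) @ C           # inverse DCT

rng = np.random.default_rng(seed=42)                # stands in for the secret key
key = rng.integers(0, 256, size=(2, 2))
block = np.full((4, 4), 100.0)                      # toy flat image block
enc = selective_encrypt(block, key)
dec = selective_decrypt(enc, key)
```

Encrypting only the perceptually dominant coefficients is what buys the time and complexity reduction: the bulk of the compressed stream passes through unencrypted.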
The results showed a significant reduct