ArcHydro is a model developed for building hydrologic information systems that synthesize geospatial and temporal water-resources data in support of hydrologic modeling and analysis. Raster-based digital elevation models (DEMs) play an important role in distributed hydrologic modeling supported by geographic information systems (GIS). DEM data have been used to derive hydrological features that serve as inputs to various models. Elevation data are currently available from several major sources and at different spatial resolutions. Detailed delineation of drainage networks is the first step in many natural resource management studies. Compared with interpretation from aerial photographs or topographic maps, automated extraction of drainage networks from DEMs is efficient and has received considerable attention. This study aims to extract drainage networks from a DEM of the Lesser Zab River Basin. Composition parameters of the drainage network, including the number of streams and stream lengths, are derived from the DEM, in addition to the delineation of catchment areas in the basin. The results of this application can be used to create input files for many hydrologic models.
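A standard first step in drainage-network extraction from a DEM is computing a flow direction for each cell. The following is a minimal sketch of the common D8 rule (steepest descent among the eight neighbors); the tiny DEM grid here is a made-up illustration, not data from the Lesser Zab basin, and real workflows would operate on raster tiles via GIS tooling.

```python
import math

# A tiny synthetic DEM (elevation values); purely illustrative --
# actual studies would read raster DEM data for the basin instead.
dem = [
    [9, 8, 7, 8],
    [8, 6, 5, 7],
    [7, 5, 3, 6],
    [8, 6, 4, 5],
]

# The eight D8 neighbor offsets around a center cell.
OFFSETS = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
           (0, 1), (1, -1), (1, 0), (1, 1)]

def d8_flow_direction(dem, r, c):
    """Return the (dr, dc) offset of the steepest-descent neighbor,
    or None if the cell is a pit (no lower neighbor)."""
    best, best_slope = None, 0.0
    for dr, dc in OFFSETS:
        nr, nc = r + dr, c + dc
        if 0 <= nr < len(dem) and 0 <= nc < len(dem[0]):
            dist = math.hypot(dr, dc)  # 1 for edges, sqrt(2) for diagonals
            slope = (dem[r][c] - dem[nr][nc]) / dist
            if slope > best_slope:
                best, best_slope = (dr, dc), slope
    return best

# Water at cell (1, 1) drains diagonally toward the basin's low point.
print(d8_flow_direction(dem, 1, 1))
```

Accumulating flow along these directions then yields the stream network and catchment boundaries that the abstract describes.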
Two oxidative desulfurization strategies, based on oxidation/adsorption and oxidation/extraction, were evaluated for the desulfurization of AL-Ahdab (AHD) sour crude oil (3.9 wt% sulfur content). In the oxidation process, a homogeneous oxidizing agent comprising hydrogen peroxide and formic acid was used. Activated carbons served as sorbent/catalyst in the oxidation/adsorption process, while acetonitrile was used as the extraction solvent in the oxidation/extraction process. For the oxidation/adsorption scheme, the experimental results indicated that the oxidative desulfurization efficiency was enhanced by using activated carbon as catalyst/sorbent. The effects of the operating conditions (contact time, temperature, …
In this research, the behaviour of the standard Hueckel edge detection algorithm is analysed using three-dimensional representations of the edge goodness criterion after applying the algorithm to a real high-texture satellite image, and the edge goodness criterion is analysed statistically. The Hueckel algorithm exhibited an exponential relationship between execution time and the disk radius used. The restrictions Hueckel stated in his papers are adopted in this research. The shape and malformation of the resulting edges are discussed, since this is the first practical study applying the Hueckel edge detection algorithm to a real high-texture image containing ramp edges (a satellite image).
Fractal image compression offers desirable properties such as fast decoding and very good rate-distortion curves, but suffers from a high encoding time. Fractal image compression requires partitioning the image into range blocks. In this work, we introduce an improved partitioning process based on a merge approach, since some ranges are connected to others. This paper presents a method that reduces the encoding time of the technique by reducing the number of range blocks, based on computing statistical measures between them. Experimental results on standard images show that the proposed method decreases the encoding time while keeping the visual quality acceptable.
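The idea of pruning the range-block pool with statistical measures can be sketched as follows. This is not the paper's exact procedure; the block size, the choice of mean and variance as the measures, and the similarity threshold are all illustrative assumptions.

```python
import statistics

# Hypothetical 8x8 grayscale image; values are synthetic.
image = [[(r * 8 + c) % 17 for c in range(8)] for r in range(8)]
BLOCK = 4          # range-block size (assumed)
THRESHOLD = 2.0    # max difference in mean/variance to call blocks similar

def block_stats(img, r0, c0, size):
    """Mean and population variance of one range block."""
    pixels = [img[r][c] for r in range(r0, r0 + size)
                        for c in range(c0, c0 + size)]
    return statistics.mean(pixels), statistics.pvariance(pixels)

# Statistics for every non-overlapping range block.
stats = {(r0, c0): block_stats(image, r0, c0, BLOCK)
         for r0 in range(0, 8, BLOCK) for c0 in range(0, 8, BLOCK)}

# Group blocks whose mean and variance both fall within THRESHOLD:
# only one representative per group needs a full domain-block search,
# which is where the encoding-time saving comes from.
representatives = []
for (m, v) in stats.values():
    if not any(abs(m - m2) <= THRESHOLD and abs(v - v2) <= THRESHOLD
               for m2, v2 in representatives):
        representatives.append((m, v))

print(len(stats), "blocks reduced to", len(representatives), "representatives")
```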
Digital audio requires transmitting large volumes of audio information through the most common communication systems, which in turn poses challenges for both storage and archiving. In this paper, an efficient audio compression scheme is proposed based on a combined transform coding scheme. It consists of: i) a bi-orthogonal (9/7-tap) wavelet transform to decompose the audio signal into low- and high-frequency sub-bands; ii) a DCT applied to the produced sub-bands to de-correlate the signal; iii) progressive hierarchical quantization of the combined transform output, followed by traditional run-length encoding (RLE); and iv) finally, LZW coding to generate the output bitstream.
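The RLE stage of the pipeline is the simplest to illustrate: quantized transform coefficients tend to contain long zero runs, which run-length encoding collapses before the final LZW pass. A minimal lossless sketch (the coefficient values below are made up):

```python
from itertools import groupby

def rle_encode(values):
    """Classic run-length encoding: a list of (value, run_length) pairs."""
    return [(v, len(list(run))) for v, run in groupby(values)]

def rle_decode(pairs):
    """Inverse of rle_encode; expands each run back to raw values."""
    return [v for v, n in pairs for _ in range(n)]

# Quantized transform coefficients are dominated by zero runs,
# which is exactly the redundancy RLE exploits.
coeffs = [5, 0, 0, 0, 0, -2, -2, 0, 0, 1]
encoded = rle_encode(coeffs)
print(encoded)
assert rle_decode(encoded) == coeffs  # the stage is lossless
```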
Fractal image compression depends on representing an image using affine transformations. The main concern for researchers in the discipline of fractal image compression (FIC) is to decrease the encoding time needed to compress image data. The basic premise is that each portion of the image is similar to other portions of the same image. Many models have been developed for this process. The presence of fractals was initially noticed and exploited using the Iterated Function System (IFS), which is used for encoding images. In this paper, fractal image compression and its variants are reviewed along with other techniques. A summarized review of contributions is given to assess the fulfillment of fractal image compression …
The volume of sensitive and important data has grown rapidly in recent decades with the tremendous expansion of networking infrastructure and communications. Securing this growing volume of data has therefore become necessary, using different cipher techniques and methods to ensure the goals of security: integrity, confidentiality, and availability. This paper presents a proposed hybrid text cryptography method that encrypts sensitive data using several encryption algorithms, such as Caesar, Vigenère, Affine, and multiplicative ciphers. This hybrid text cryptography method aims to make the encryption process more secure and effective. The hybrid text cryptography method depends on a circular queue. Using circular …
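Chaining the four named classical ciphers can be sketched as below. This assumes a 26-letter uppercase alphabet; the specific keys and the cipher order are illustrative, and the paper's circular-queue key scheduling is not reproduced here.

```python
A = ord('A')

def caesar(text, shift):
    """Shift each letter by a fixed amount (mod 26)."""
    return ''.join(chr((ord(ch) - A + shift) % 26 + A) for ch in text)

def multiplicative(text, key):
    """E(x) = (key * x) mod 26; key must be coprime with 26."""
    return ''.join(chr(((ord(ch) - A) * key) % 26 + A) for ch in text)

def affine(text, a, b):
    """E(x) = (a*x + b) mod 26; a must be coprime with 26."""
    return ''.join(chr(((ord(ch) - A) * a + b) % 26 + A) for ch in text)

def vigenere(text, key):
    """Shift each letter by the matching (repeating) key letter."""
    return ''.join(chr((ord(ch) - A + ord(key[i % len(key)]) - A) % 26 + A)
                   for i, ch in enumerate(text))

def hybrid_encrypt(plain):
    """Apply the four ciphers in sequence (illustrative order and keys)."""
    step = caesar(plain, 3)
    step = multiplicative(step, 5)
    step = affine(step, 7, 11)
    return vigenere(step, "KEY")

print(hybrid_encrypt("HELLO"))  # -> HAFRM
```

Each stage is individually invertible (the multiplicative and affine keys are coprime with 26), so the composed cipher is decryptable by applying the inverses in reverse order.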