<span>Digital audio requires transmitting large volumes of audio information through the most common communication systems, which in turn poses challenges in both storage and archiving. In this paper, an efficient audio compression scheme is proposed. It relies on a combined transform coding scheme consisting of: i) a bi-orthogonal (9/7-tap) wavelet transform to decompose the audio signal into one low and multiple high sub-bands; ii) a DCT applied to the produced sub-bands to de-correlate the signal; iii) progressive hierarchical quantization of the combined-transform output, followed by traditional run-length encoding (RLE); and iv) LZW coding to generate the final output bitstream. Peak signal-to-noise ratio (PSNR) and compression ratio (CR) were used as measures in a comparative analysis of the overall system performance. Many audio test samples of various sizes and features were utilized to evaluate the performance behavior. The simulation results demonstrate the efficiency of these combined transforms when LZW is used for data compression. The compression results are encouraging and show a remarkable reduction in audio file size with good fidelity.</span>
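The two evaluation measures named above, PSNR and CR, can be sketched as follows; this is a minimal illustration assuming 16-bit PCM samples held in NumPy arrays (the array names and the example sizes are hypothetical, not taken from the paper):

```python
import numpy as np

def psnr(original, reconstructed, peak=32767.0):
    """Peak signal-to-noise ratio in dB for 16-bit PCM audio."""
    mse = np.mean((original.astype(np.float64) - reconstructed.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(peak ** 2 / mse)

def compression_ratio(original_bytes, compressed_bytes):
    """CR = original size / compressed size."""
    return original_bytes / compressed_bytes

# Hypothetical example: a short signal and a slightly distorted reconstruction.
x = np.array([100, -200, 300, -400], dtype=np.int16)
y = np.array([101, -199, 301, -401], dtype=np.int16)
print(round(psnr(x, y), 1))              # → 90.3
print(compression_ratio(4096, 512))      # → 8.0
```

Higher PSNR indicates better fidelity of the reconstructed audio, while a higher CR indicates a smaller compressed file.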
In the presence of deep submicron noise, providing reliable and energy-efficient network-on-chip operation is becoming a challenging objective. In this study, the authors propose a hybrid automatic repeat request (HARQ)-based coding scheme that simultaneously reduces crosstalk-induced bus delay and provides multi-bit error protection while achieving high energy savings. This is achieved by calculating two-dimensional parities and duplicating all the bits, which provides single-error correction and detection of up to six errors. The error correction reduces the performance degradation caused by retransmissions; combined with voltage swing reduction, made possible by the scheme's high error-detection capability, this yields high energy savings. The res
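The two-dimensional parity idea described above can be sketched as follows; this is a generic illustration of row/column parity with single-error correction, not the authors' exact codec, and the block dimensions are hypothetical:

```python
import numpy as np

def encode_2d_parity(bits, rows, cols):
    """Arrange data bits in a rows x cols matrix and compute row/column parities."""
    m = np.array(bits, dtype=np.uint8).reshape(rows, cols)
    row_par = m.sum(axis=1) % 2          # one parity bit per row
    col_par = m.sum(axis=0) % 2          # one parity bit per column
    return m, row_par, col_par

def correct_single_error(m, row_par, col_par):
    """Locate and flip a single-bit error at the failing row/column intersection."""
    bad_rows = np.where(m.sum(axis=1) % 2 != row_par)[0]
    bad_cols = np.where(m.sum(axis=0) % 2 != col_par)[0]
    if len(bad_rows) == 1 and len(bad_cols) == 1:
        m[bad_rows[0], bad_cols[0]] ^= 1  # correct the flipped bit
    return m

# Hypothetical 4x4 data block with one injected channel error.
data = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1, 1, 0, 0, 0, 1, 1]
m, rp, cp = encode_2d_parity(data, 4, 4)
received = m.copy()
received[2, 1] ^= 1                       # single-bit error on the bus
fixed = correct_single_error(received, rp, cp)
print(np.array_equal(fixed, m))           # → True
```

Correcting the single-bit case locally avoids a retransmission, which is the performance benefit the abstract refers to; multi-bit patterns that only fail detection would instead trigger the HARQ retransmission path.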
Rhizobium bacteria were isolated from the root nodules of Medicago sativa plants and, based on morphological and some biochemical properties, characterized as Sinorhizobium meliloti. We studied the ability of this isolate, as well as that of Agrobacterium rhizogenes R1601, to produce the auxin indole acetic acid (IAA). As a control, both isolates were similarly tested in the absence of L-tryptophan. IAA was identified by checking the colour reactions with Salkowski's reagent. Low amounts (23.69 and 26.77 µg/ml) of IAA were produced by S. meliloti and A. rhizogenes after 24 and 72 hours of incubation, respectively. S. meliloti was
Background. Dental implantation has become a standard procedure with high success rates, relying on achieving osseointegration between the implant surface and the surrounding bone tissue. Polyether ether ketone (PEEK) is a promising alternative to traditional dental implant materials such as titanium, but its osseointegration capability is limited by its hydrophobic nature and low surface roughness. Objective. The aim of the study is to increase the surface roughness and hydrophilicity of PEEK by treating the surface with piranha solution and then coating it with epigallocatechin-3-gallate (EGCG) by the electrospraying technique. Materials and Methods. The study includes four groups intended to investigate the effect of pir
High-performance self-consolidating concrete (HP-SCC) is one of the most complex types of concrete; it has the capacity to consolidate under its own weight, excellent homogeneity, and high durability. This study focuses on the possibility of using industrial by-products such as silica fume (SF) in the preparation of HP-SCC enhanced with discrete steel fibers (DSF) and monofilament polypropylene fibers (PPF). The experimental results showed that using DSF at a volume fraction of 0.50% yielded marked improvements in the mechanical properties of HP-SCC: the compressive strength, splitting tensile strength, flexural strength, and elastic modulus improved by about 65.7%, 70.5%, 41.7%, and 80.3% at 28 days of age, respectively
The current study was performed to detect and quantify epicatechin in two tea samples of Camellia sinensis (black and green tea) by thin-layer chromatography (TLC) and high-performance liquid chromatography (HPLC). Epicatechin was extracted from black and green tea by two different methods: maceration (cold extraction, at room temperature) and decoction (hot extraction, with direct heat), each using three different solvents: absolute ethanol, 50% aqueous ethanol, and water. The crude extracts of the two tea samples obtained from the two methods were fractionated using two solvents of different polarity (chloroform and
A simple, precise, rapid, and accurate reversed-phase high-performance liquid chromatographic method has been developed for the determination of guaifenesin in pure form, pharmaceutical formulations, and industrial effluent. Chromatography was carried out on a Supelco L7 reversed-phase column (25 cm × 4.6 mm, 5 µm), using a mixture of methanol-acetonitrile-water (80:10:10 v/v/v) as the mobile phase at a flow rate of 1.0 ml·min-1. Detection was performed at 254 nm at ambient temperature. The retention time for guaifenesin was found to be 2.4 minutes. The calibration curve was linear (r = 0.9998) over the concentration range 0.08 to 0.8 mg/ml. The limit of detection (LOD) and limit of quantification (LOQ) were found to be 6 µg/ml and 18 µg/ml, respectively
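A linear calibration of the kind reported above can be fitted and checked as follows; this is a minimal sketch using hypothetical concentration/peak-area pairs (not the study's data), with LOD and LOQ estimated from the residual standard deviation and the slope in the ICH style, which may differ from the method the authors actually used:

```python
import numpy as np

# Hypothetical calibration data: concentration (mg/ml) vs. detector response.
conc = np.array([0.08, 0.2, 0.4, 0.6, 0.8])
area = np.array([10.1, 25.0, 50.2, 74.8, 100.3])

# Linear least-squares fit and correlation coefficient.
slope, intercept = np.polyfit(conc, area, 1)
r = np.corrcoef(conc, area)[0, 1]
print(round(r, 4))

# ICH-style estimates: LOD = 3.3*sigma/slope, LOQ = 10*sigma/slope,
# where sigma is the residual standard deviation of the fit.
sigma = np.std(area - (slope * conc + intercept), ddof=2)
lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(lod < loq)                          # → True
```

By construction LOQ exceeds LOD by a factor of about three, consistent with the 6 µg/ml and 18 µg/ml values reported in the abstract.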
This article aims to determine the time-dependent heat coefficient together with the temperature solution for a type of semi-linear time-fractional inverse source problem, applying a method based on a finite difference scheme and Tikhonov regularization. An unconditionally stable implicit finite difference scheme is used as the direct (forward) solver, while the inverse problem is reformulated as a nonlinear least-squares minimization and solved efficiently with the MATLAB routine lsqnonlin from the Optimization Toolbox. Since the problem is generally ill-posed, any error included in the input data will produce a large error in the output data. Therefore, the Tikhonov regularization technique is applied
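The stabilizing role of Tikhonov regularization described above can be illustrated on a small linear least-squares problem; this is a generic sketch, not the article's fractional-PDE formulation, with a hypothetical ill-conditioned matrix standing in for the discretized forward operator and MATLAB's lsqnonlin replaced by a closed-form normal-equations solve:

```python
import numpy as np

def tikhonov_solve(A, b, lam):
    """Solve min ||A x - b||^2 + lam * ||x||^2 via the regularized normal equations."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

# Hypothetical ill-conditioned system with noisy data.
rng = np.random.default_rng(0)
A = np.vander(np.linspace(0, 1, 20), 8, increasing=True)  # poorly conditioned
x_true = np.ones(8)
b = A @ x_true + 1e-3 * rng.standard_normal(20)           # noisy measurements

x_naive = np.linalg.lstsq(A, b, rcond=None)[0]            # unregularized solution
x_reg = tikhonov_solve(A, b, lam=1e-6)                    # regularized solution

# The penalty term damps the solution norm while keeping the data misfit small.
print(np.linalg.norm(x_reg) <= np.linalg.norm(x_naive) + 1e-9)  # → True
```

Choosing the regularization parameter `lam` trades data fidelity against stability; too small a value lets input noise dominate the recovered coefficient, which is exactly the ill-posedness the abstract warns about.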