In this paper, three techniques for image compression are implemented: three-dimensional (3-D) two-level discrete wavelet transform (DWT), 3-D two-level discrete multi-wavelet transform (DMWT), and a 3-D two-level hybrid (wavelet-multiwavelet) transform. Daubechies and Haar filters are used in the discrete wavelet transform, and critically sampled preprocessing is used in the discrete multi-wavelet transform. The aim is to increase the compression ratio (CR) as the level of the 3-D transformation increases, so the compression ratio is measured at each level. To assess compression quality, image data properties were measured, namely image entropy (He), percent root-mean-square difference (PRD %), energy retained (Er), and peak signal-to-noise ratio (PSNR). Based on the testing results, a comparison between the three techniques is presented. CR is the same for all three techniques and is largest at the 2nd level of the 3-D transform. The hybrid
technique has the highest PSNR values at the 1st and 2nd levels of 3-D and the lowest PRD % values, so the 3-D two-level hybrid is the best of the three techniques for image compression.
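The quality metrics named above have standard definitions; the following is a minimal sketch of how they can be computed (the function names and the toy 4×4 image are illustrative, not the paper's code or data):

```python
import numpy as np

def psnr(original, reconstructed, peak=255.0):
    """Peak signal-to-noise ratio in dB."""
    mse = np.mean((original.astype(float) - reconstructed.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

def prd_percent(original, reconstructed):
    """Percent root-mean-square difference between the two images."""
    o, r = original.astype(float), reconstructed.astype(float)
    return 100.0 * np.sqrt(np.sum((o - r) ** 2) / np.sum(o ** 2))

def energy_retained(original, reconstructed):
    """Percentage of the original signal energy present after reconstruction."""
    o, r = original.astype(float), reconstructed.astype(float)
    return 100.0 * np.sum(r ** 2) / np.sum(o ** 2)

# Tiny demonstration on a synthetic 4x4 "image".
img = np.arange(16, dtype=float).reshape(4, 4)
noisy = img + 1.0  # a uniform error of 1 gray level, so MSE = 1
print(round(psnr(img, noisy), 2))  # 10*log10(255^2 / 1) ≈ 48.13 dB
```

In a real pipeline the reconstructed image would come from inverting the quantized DWT/DMWT coefficients rather than from synthetic noise.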
Journal of Theoretical and Applied Information Technology is a peer-reviewed electronic journal for research and review papers, with the aim of promoting and publishing original high-quality research dealing with theoretical and scientific aspects of all disciplines of IT (Information Technology).
The data preprocessing step is important in web usage mining because of the nature of log data, which are heterogeneous, unstructured, and noisy. To keep pattern-discovery algorithms scalable and efficient, a preprocessing step must be applied. In this study, the sequential methodologies used to preprocess web server log data are comprehensively evaluated and examined, with emphasis on the sub-phases of data cleansing, user identification, and session identification.
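One common session-identification heuristic is a time-oriented cutoff: consecutive requests from the same user (here approximated by IP address) belong to one session unless the gap exceeds a timeout, typically 30 minutes. A minimal sketch under these assumptions (the log entries and the 30-minute figure are illustrative):

```python
from datetime import datetime, timedelta

# Hypothetical parsed log entries: (ip, timestamp, url). In practice these
# come from cleansed server-log lines (robots, images, errors removed).
entries = [
    ("10.0.0.1", datetime(2024, 1, 1, 9, 0), "/home"),
    ("10.0.0.1", datetime(2024, 1, 1, 9, 10), "/products"),
    ("10.0.0.1", datetime(2024, 1, 1, 10, 30), "/home"),  # >30 min gap: new session
    ("10.0.0.2", datetime(2024, 1, 1, 9, 5), "/home"),
]

TIMEOUT = timedelta(minutes=30)

def sessionize(entries):
    """Group requests into sessions: same user (IP) and gaps between
    consecutive requests no larger than the timeout."""
    sessions = {}
    for ip, ts, url in sorted(entries, key=lambda e: (e[0], e[1])):
        user_sessions = sessions.setdefault(ip, [])
        if user_sessions and ts - user_sessions[-1][-1][1] <= TIMEOUT:
            user_sessions[-1].append((ip, ts, url))  # continue current session
        else:
            user_sessions.append([(ip, ts, url)])    # start a new session
    return sessions

result = sessionize(entries)
print(len(result["10.0.0.1"]))  # 2 sessions for the first user
```

Real deployments refine this with user-agent and referrer fields, since one IP can hide several users behind a proxy.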
The estimation of the regular regression model requires several assumptions to be satisfied, such as linearity. One problem arises when the regression curve is partitioned into two (or more) parts joined at threshold point(s); this situation is a violation of the linearity assumption. Therefore, the multiphase regression model has received increasing attention as an alternative approach that describes changes in the behavior of the phenomenon through threshold-point estimation. The maximum likelihood estimator (MLE) has been used for both the model and threshold-point estimation. However, the MLE is not resistant to violations such as the existence of outliers or heavy-tailed error distributions. The main goal of t
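Under normally distributed errors, maximizing the likelihood of a two-phase model is equivalent to choosing the threshold that minimizes the total sum of squared residuals of the two segment fits. A minimal sketch of that grid-search idea on synthetic data (not the paper's estimator or data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-phase data: the slope changes at x = 5 (illustrative only).
x = np.linspace(0, 10, 100)
y = np.where(x < 5, 1.0 + 2.0 * x, 11.0 - 1.5 * (x - 5)) + rng.normal(0, 0.2, x.size)

def sse_of_fit(xs, ys):
    """Sum of squared residuals of a straight-line fit to one segment."""
    if xs.size < 2:
        return np.inf
    resid = ys - np.polyval(np.polyfit(xs, ys, 1), xs)
    return float(resid @ resid)

def fit_threshold(x, y):
    """Least-squares analogue of the MLE under normal errors: pick the
    threshold minimizing total SSE over the two segments."""
    best_t, best_sse = None, np.inf
    for t in x[5:-5]:  # keep a few points in each segment
        left = x < t
        total = sse_of_fit(x[left], y[left]) + sse_of_fit(x[~left], y[~left])
        if total < best_sse:
            best_t, best_sse = t, total
    return best_t

print(round(fit_threshold(x, y), 2))  # close to the true threshold of 5
```

This is exactly the estimator that outliers or heavy-tailed errors can distort, since squared residuals give extreme points large influence.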
This research aims to choose the appropriate probability distribution for the reliability analysis of an item, using data collected on the operating and stoppage times of the case study.
The appropriate probability distribution is the one for which the data lie on or close to the fitted line of the probability plot and pass a goodness-of-fit test.
Minitab 17 software was used for this purpose after arranging the collected data and entering it into the program.
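The same distribution-identification workflow can be sketched outside Minitab; below, candidate distributions are fitted and compared with the Kolmogorov-Smirnov statistic (smaller means the data sit closer to the fitted line). The time-between-failure data here are simulated, not the case study's records:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical time-between-failure data (hours); real data would come
# from the recorded operating and stoppage times.
tbf = rng.weibull(1.5, 200) * 100.0

candidates = {
    "weibull": stats.weibull_min,
    "lognormal": stats.lognorm,
    "exponential": stats.expon,
}

results = {}
for name, dist in candidates.items():
    params = dist.fit(tbf, floc=0)                    # fit with location fixed at 0
    stat, p = stats.kstest(tbf, dist.cdf, args=params)  # goodness-of-fit statistic
    results[name] = stat
    print(f"{name}: KS statistic = {stat:.3f}")
```

The distribution with the smallest statistic (here, the Weibull, since the data were generated from one) would then be carried into the reliability analysis.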
Due to severe scouring, many bridges have failed worldwide. Therefore, the safety of an existing bridge (after construction) depends mainly on continuous monitoring of local scour at the substructure, whereas the safety of a bridge before construction depends mainly on the estimation of local scour at the substructure. Estimating local scour at bridge piers is usually done using the available formulae, almost all of which were derived from laboratory data. It is therefore essential to test the performance of proposed local scour formulae against field data. In this study, the performance of selected bridge scour estimation formulae was validated and sta
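One widely used laboratory-derived pier scour formula is the CSU equation from FHWA's HEC-18 manual; it may or may not be among the formulae this study selected, and the numbers below are illustrative, not the study's field data:

```python
import math

def csu_pier_scour(y1, a, v1, k1=1.0, k2=1.0, k3=1.1, g=9.81):
    """Local pier scour depth (m) from the CSU/HEC-18 relation
    ys = 2.0 * y1 * K1*K2*K3 * (a/y1)^0.65 * Fr^0.43  (SI units).
    y1: approach flow depth (m), a: pier width (m), v1: approach velocity (m/s);
    K factors account for pier nose shape, flow attack angle, and bed condition."""
    fr = v1 / math.sqrt(g * y1)  # approach Froude number
    return 2.0 * y1 * k1 * k2 * k3 * (a / y1) ** 0.65 * fr ** 0.43

# Illustrative inputs: 3 m deep flow at 1.5 m/s approaching a 1.2 m wide pier.
ys = csu_pier_scour(y1=3.0, a=1.2, v1=1.5)
print(round(ys, 2))  # ≈ 2.09 m of predicted local scour
```

Validating such a formula against field measurements means comparing predictions like this against surveyed scour depths at real piers.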
This study examines the direct influence of single and multiple doses of cold atmospheric plasma (CAP) on the platelet count of mice for different exposure times (15, 30, 60, and 120 s). The influence of CAP on the mice was measured 1, 2, 3, 7, and 14 days after exposure.
The results indicate that low doses of CAP had a stimulatory effect on platelets in the first hours after exposure (day 1), whereas the high dose was inhibitory. Two weeks after exposure, the platelet count had returned to normal, comparable to the control, indicating that the plasma effect had worn off by this period.
A Strength Pareto Evolutionary Algorithm 2 (SPEA2) approach for solving the multi-objective environmental/economic power dispatch (EEPD) problem is presented in this paper. In the past, minimizing fuel cost was the single objective of the economic power dispatch problem. Since the Clean Air Act amendments were applied to reduce SO2 and NOx emissions from power plants, utilities have changed their strategies to reduce pollution and atmospheric emissions as well; adding emission minimization as a second objective makes economic power dispatch (EPD) a multi-objective problem with conflicting objectives. SPEA2 is the improved version of SPEA, with better fitness assignment, density estimation, an
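At the core of SPEA2's fitness assignment is Pareto dominance between objective vectors. A minimal sketch of that test and of extracting the nondominated front, using hypothetical (fuel cost, emission) pairs rather than any real dispatch data:

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization):
    no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

# Hypothetical (fuel cost $/h, emission ton/h) pairs for candidate dispatches.
solutions = [(605.0, 0.22), (610.0, 0.19), (620.0, 0.21), (600.0, 0.25)]

# Nondominated front: the cost-emission trade-off curve SPEA2 evolves toward.
front = [s for s in solutions
         if not any(dominates(o, s) for o in solutions if o != s)]
print(front)  # (620.0, 0.21) is dominated by (610.0, 0.19) and drops out
```

SPEA2 builds on this dominance check with strength values, a k-nearest-neighbor density estimate, and an external archive of nondominated solutions.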
In this study, optical fibers were designed and implemented as a chemical sensor based on surface plasmon resonance (SPR) to estimate the age of the oil used in electrical transformers. The study relies on the refractive indices of the oil. The sensor was created by embedding the center portion of the optical fiber in a resin block, followed by polishing and tapering; the tapering time was 50 min. The multi-mode optical fiber was coated with a 60 nm thick gold layer over a deposition length of 4 cm. The sensor's resonance wavelength was 415 nm. The primary sensor parameters were calculated, including sensitivity (6.25), signal-to-noise ratio (2.38), figure of merit (4.88), and accuracy (3.2).
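The sensor parameters listed above have standard SPR definitions: sensitivity is the resonance-wavelength shift per refractive-index unit, SNR relates that shift to the dip width, and the figure of merit normalizes sensitivity by the dip width. A minimal sketch of those definitions with illustrative numbers (not the paper's measurements or units):

```python
def sensitivity(d_lambda_res, d_n):
    """Bulk sensitivity: resonance-wavelength shift per refractive-index unit."""
    return d_lambda_res / d_n

def signal_to_noise(d_lambda_res, fwhm):
    """SNR of the SPR dip: wavelength shift divided by the dip's FWHM."""
    return d_lambda_res / fwhm

def figure_of_merit(sens, fwhm):
    """FOM: sensitivity divided by the resonance dip width."""
    return sens / fwhm

# Illustrative values: a 10 nm shift for dn = 0.01 RIU, with a 5 nm wide dip.
shift, dn, fwhm = 10.0, 0.01, 5.0
s = sensitivity(shift, dn)  # 1000 nm/RIU
print(s, signal_to_noise(shift, fwhm), figure_of_merit(s, fwhm))
```

In practice the shift and dip width are read off the transmission spectrum as the oil's refractive index (and hence its age) changes.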
Brainstorming has been a common approach in many industries, but its results are not always accurate, especially when procuring automobile spare parts. This approach was replaced with a scientific, optimized method that is highly reliable; hence the decision to optimize the inventory inflation budget based on the spare-parts and miscellaneous costs of a typical automobile industry. Some factors required to achieve this goal were investigated. The investigation found that spare parts (consumables and non-consumables) were the items mostly used at Innoson Vehicle Manufacturing (IVM), Nigeria, with miscellaneous costs incorporated to augment the cost of spare parts. The inflation rate was considered first due to the market's