Fractal image compression (FIC) represents an image using affine transformations. The main concern for researchers in this discipline is reducing the encoding time needed to compress image data. The basic premise is that each portion of an image is similar to other portions of the same image. Many models have been developed for this process; the presence of fractals was first noticed and exploited through the Iterated Function System (IFS), which is used for encoding images. This paper reviews fractal image compression and its variants alongside other techniques. Contributions are summarized to assess the achievements of fractal image compression, with a focus on block indexing methods based on moment descriptors. Block indexing classifies the domain and range blocks using moments to generate an invariant descriptor, which reduces the long encoding time. A comparison between the block indexing technique and other fractal image compression techniques on the Lena image demonstrates the importance of block indexing in saving encoding time and achieving a better compression ratio while maintaining image quality.
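As a minimal sketch of the block indexing idea described above (the bucketing scheme and thresholds here are illustrative assumptions, not the paper's actual descriptor), blocks can be grouped under a moment-based key so that each range block is only compared against domain blocks sharing its index:

```python
import numpy as np

def moment_descriptor(block: np.ndarray) -> int:
    # Hypothetical descriptor: bucket a block by its first two statistical
    # moments (mean and variance).  Blocks with different descriptors need
    # never be compared, which is what shortens the encoding search.
    m, v = block.mean(), block.var()
    return int(m // 32) * 4 + min(int(v // 500), 3)

def index_blocks(img: np.ndarray, size: int) -> dict:
    # Group every non-overlapping size x size block under its descriptor.
    buckets: dict = {}
    for r in range(0, img.shape[0] - size + 1, size):
        for c in range(0, img.shape[1] - size + 1, size):
            key = moment_descriptor(img[r:r + size, c:c + size])
            buckets.setdefault(key, []).append((r, c))
    return buckets

rng = np.random.default_rng(0)
img = rng.integers(0, 256, (64, 64)).astype(float)
buckets = index_blocks(img, 8)
```

During encoding, a range block's descriptor is computed once and only the matching bucket of domain blocks is searched, instead of the full image.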
Reservoir characterization is an important component of hydrocarbon exploration and production, requiring the integration of different disciplines for accurate subsurface modeling. This comprehensive paper examines the complex interplay of rock types, rock typing methods, and geological modeling techniques for improving reservoir quality assessment. Petrophysical parameters such as porosity, shale volume, water saturation, and permeability play a dominant role as key indicators of reservoir properties, fluid behavior, and hydrocarbon potential. The paper examines various rock classification techniques, focusing on rock grouping methods and self-organizing maps (SOMs) to identify specific and
This work aims to investigate the tensile and compressive strengths of heat-cured acrylic resin denture base material after adding styrene-butadiene (S-B) to polymethyl methacrylate (PMMA). The most well-known issue in prosthodontic practice is fracture of the denture base. All samples were blends of 90% or 80% PMMA with 10% or 20% S-B powder dissolved in oxolane (tetrahydrofuran). The blends were cut into specimens of dimensions 100 x 10 x 2.5 mm for the tensile tests, while the compression test specimens were shaped into cylinders 12.7 mm in diameter and 20 mm in length. The experimental results show a significant increase in both tensile and compressive strengths when compared to the control.
A new approach for estimating the baud time (or baud rate) of a random binary signal is presented. This approach utilizes the spectrum of the signal after nonlinear processing, so that the estimation error can be reduced simply by increasing the number of processed samples instead of increasing the sampling rate. The spectrum of the processed signal is shown to give an accurate estimate of the baud time when there is no a priori information and no restrictive assumptions. The performance of the estimator for random binary square waves perturbed by white Gaussian noise and intersymbol interference (ISI) is evaluated and compared with that of the conventional zero-crossing-detector estimator.
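A common way to realize this kind of estimator, sketched below under assumed parameters (a smooth-differentiate-square nonlinearity, 1 kHz sampling, 50-baud signal — none of these come from the abstract), is to turn symbol transitions into a pulse train whose spectrum carries a line at the baud rate:

```python
import numpy as np

rng = np.random.default_rng(0)

fs = 1000.0              # sampling rate, Hz (assumed for illustration)
baud = 50.0              # true baud rate, symbols/s -> baud time 20 ms
sps = int(fs / baud)     # samples per symbol

# Random binary NRZ waveform at +/-1 levels with additive white Gaussian noise.
symbols = rng.integers(0, 2, 400) * 2 - 1
x = np.repeat(symbols, sps).astype(float)
x += 0.1 * rng.standard_normal(x.size)

# Nonlinear processing: smooth, differentiate, square.  Each symbol
# transition becomes a positive pulse, so the pulses form a train whose
# spectrum develops a line at the baud rate.
x_s = np.convolve(x, np.ones(5) / 5, mode="same")
y = np.diff(x_s) ** 2

spec = np.abs(np.fft.rfft(y - y.mean()))
freqs = np.fft.rfftfreq(y.size, d=1.0 / fs)

# The spectral peak away from DC is the baud-rate estimate; averaging more
# samples sharpens this line instead of requiring a higher sampling rate.
band = freqs > 5.0
est = freqs[band][np.argmax(spec[band])]
print(f"estimated baud rate: {est:.1f} Hz")
```

Note how the frequency resolution, and hence the estimation error, improves with the number of processed samples rather than with the sampling rate, which matches the abstract's stated advantage.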
This paper proposes a new password generation technique based on mouse motion and special click locations, recognized by the number of clicks, to protect sensitive data in different companies. Using two or three special click-point locations per user is proposed to increase password complexity. Unlike other currently available random password generators, the path and the number of clicks are set by the administrator, and authorized users must be trained on them.
This method aims to increase the number of combinations for graphical password generation using mouse motion for a limited number of users. A mathematical model is developed to calculate the performance
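The combination count behind such a model can be sketched as follows (the grid size and the ordered-distinct-cells rule are hypothetical assumptions for illustration, not the paper's actual model):

```python
from math import perm

def password_combinations(n: int, k: int) -> int:
    # Hypothetical model: the screen is an n x n grid of click targets and
    # a password is an ordered sequence of k distinct cells, so the count
    # is the number of k-permutations of n*n cells.
    cells = n * n
    return perm(cells, k)

# A 10 x 10 grid with 3 ordered click points: 100 * 99 * 98 sequences.
print(password_combinations(10, 3))
```

Adding one more click point multiplies the search space by roughly the number of remaining cells, which is why two or three click points already raise the complexity substantially.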
Solid dispersion (SD) is one of the most widely used methods to resolve issues associated with poorly soluble drugs. The present study was carried out to enhance the solubility and dissolution rate of aceclofenac (ACE), a BCS class II drug with pH-dependent solubility, by the SD method. An effervescence-assisted fusion technique (EFSD) using different hydrophilic carriers (mannitol, urea, Soluplus®, poloxamer 188, and poloxamer 407) in the presence of an effervescent base (sodium bicarbonate and citric acid) at different drug:carrier:effervescent base ratios, together with the conventional fusion technique (FSD), was used to prepare ACE SDs. Solubility, dissolution rate, Fourier-transform infrared spectroscopy (FTIR), powder X-ray diffraction
In this work, a novel technique for obtaining accurate solutions to nonlinear problems, a multi-step combination of the Laplace transform with the variational iteration method (MSLVIM), is introduced. Compared with the traditional variational approach, it overcomes several difficulties and provides more accurate solutions, extending the convergence region to cover larger intervals and giving a continuous representation of the approximate analytic solution, which yields better information about the solution over the whole time interval. The technique also makes the general Lagrange multiplier easier to obtain, reducing both time and calculations. It converges rapidly to the exact solution with simply computable terms.
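For context, the variational iteration method that this multi-step Laplace variant builds on is usually written through a correction functional of the standard form below (this is the generic textbook form, not an equation taken from the paper):

```latex
u_{n+1}(t) = u_n(t) + \int_{0}^{t} \lambda(s)\,\Big( L u_n(s) + N \tilde{u}_n(s) - g(s) \Big)\, ds
```

where $L$ and $N$ are the linear and nonlinear operators, $g$ is the source term, $\tilde{u}_n$ is a restricted variation, and $\lambda$ is the general Lagrange multiplier identified via variational theory. The multi-step variant applies this iteration on successive subintervals, which is what extends the convergence region over the whole time interval.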
Influence of some factors on somatic embryo induction and germination of date palm cv. Barhi using the cell suspension culture technique
Multiple elimination (de-multiple) is one of the seismic processing steps used to remove the effects of multiples and delineate the correct primary reflectors. Applying normal moveout to flatten the primaries makes it possible to eliminate multiples by transforming the data to the frequency-wavenumber (f-k) domain: the flattened primaries align with the zero-wavenumber axis, while other reflection types (multiples and random noise) are distributed elsewhere. A dip filter is then applied to pass the aligned data and reject the rest, separating primaries from multiples once the data are transformed back from the frequency-wavenumber domain to the time-distance domain. Hence, a suggested name for this technique is the normal moveout-frequency-wavenumber domain
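The f-k dip-filtering step can be sketched on synthetic data as below (the gather dimensions, event geometry, and pass-band width are illustrative assumptions, not values from the abstract): a flattened primary maps onto the zero-wavenumber axis of the 2-D spectrum, so a narrow pass band around k = 0 keeps it and rejects the dipping multiple.

```python
import numpy as np

rng = np.random.default_rng(1)
nt, nx = 256, 64                         # time samples, traces
data = rng.standard_normal((nt, nx)) * 0.05   # background random noise

data[100, :] += 1.0                      # flattened (NMO-corrected) primary
for ix in range(nx):                     # dipping event standing in for a multiple
    data[40 + 2 * ix, ix] += 1.0

fk = np.fft.fft2(data)                   # transform to the f-k domain
k = np.fft.fftfreq(nx)                   # wavenumber axis (cycles/trace)

# Dip filter: pass only wavenumbers near zero, where flat events live.
mask = np.abs(k) < 0.03
fk *= mask[None, :]

filtered = np.fft.ifft2(fk).real         # back to the time-distance domain
```

After the inverse transform, the flat primary at t = 100 survives essentially intact while the dipping event is strongly attenuated, which is the separation the abstract describes.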