Fractal image compression (FIC) depends on representing an image using affine transformations. The main concern for researchers in the discipline is to decrease the encoding time needed to compress image data. The basic premise is that each portion of the image is similar to other portions of the same image. Many models have been developed for this process. The presence of fractals was initially noticed and handled using the Iterated Function System (IFS), which is used for encoding images. In this paper, fractal image compression and its variants are reviewed along with other techniques. A summary of contributions is given to assess the achievements of fractal image compression, specifically for block indexing methods based on the moment descriptor. Block indexing classifies the domain and range blocks using moments to generate an invariant descriptor, which reduces the long encoding time. A comparison is performed between the block indexing technique and other fractal image compression techniques on the Lena image to determine the importance of block indexing in saving encoding time and achieving a better compression ratio while maintaining image quality.
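The moment-based indexing idea can be illustrated with a minimal sketch (the function names, block size, and the single second-order moment descriptor are illustrative assumptions, not the reviewed papers' exact formulation): each domain block is reduced to one moment value, and blocks are bucketed by that value so a range block need only be compared against its own bucket rather than the whole domain pool.

```python
import numpy as np

def moment_descriptor(block):
    """Sum of second-order central moments (mu20 + mu02) of a block,
    normalized by total intensity; a one-number block signature."""
    h, w = block.shape
    y, x = np.mgrid[0:h, 0:w]
    total = block.sum() + 1e-12
    cx = (x * block).sum() / total
    cy = (y * block).sum() / total
    mu20 = ((x - cx) ** 2 * block).sum() / total
    mu02 = ((y - cy) ** 2 * block).sum() / total
    return mu20 + mu02

def index_domain_blocks(image, size=8, step=8, n_bins=16):
    """Bucket all domain blocks by their quantized moment descriptor."""
    coords, descs = [], []
    for i in range(0, image.shape[0] - size + 1, step):
        for j in range(0, image.shape[1] - size + 1, step):
            coords.append((i, j))
            descs.append(moment_descriptor(image[i:i + size, j:j + size]))
    descs = np.array(descs)
    lo, hi = descs.min(), descs.max() + 1e-9
    buckets = {}
    for (i, j), d in zip(coords, descs):
        b = min(int((d - lo) / (hi - lo) * n_bins), n_bins - 1)
        buckets.setdefault(b, []).append((i, j))
    return buckets
```

During encoding, a range block's descriptor selects a single bucket, shrinking the search from all domain blocks to a small candidate list, which is the source of the encoding-time saving.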
This research applies the Quality Rating Evaluation method to assess the quality of production, through which product quality is determined in order to estimate the volume of sales and hence the company's profits. The most important function is to satisfy the consumer at reasonable prices. The methods were applied to a product (toothpaste) at the General Company for Vegetable Oil – Almaamoon Factory. The company has obtained ISO 9001:2008 certification. Random samples of the final product intended for sale were collected from the store during the months of February, April, June, October, and December of 2011 to determine the quality rating through the application of these methods.
A number of aqueous samples were collected from the river Tigris in Baghdad city and enriched ~1000 times using solid phase extraction (SPE); trace concentrations of some polychlorinated biphenyls (PCBs) were then extracted using an aqueous two-phase system (ATPS) composed of 1-methylpyridinium chloride [MePy]Cl and KH2PO4 salt. High performance liquid chromatography coupled with ultraviolet detection (HPLC-UV) was used for quantification. Extraction under the optimized conditions of pH, solvent composition, duration, and temperature gave a PCB recovery of about 91%. The limit of detection (LOD) and limit of quantification (LOQ) for the analyses are 0.11–0.62 µg·L−1 and 2.67–3.43 µg·L−1 respectively, with relative standard deviation
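LOD and LOQ figures of this kind are conventionally derived from the calibration curve as LOD = 3.3·σ/S and LOQ = 10·σ/S, where σ is the residual standard deviation of the regression and S its slope. The sketch below shows that standard computation; the concentration and peak-area values are invented for illustration and are not the paper's measurements.

```python
import numpy as np

# Hypothetical calibration data: standard concentrations (ug/L) vs. HPLC-UV peak area
conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
area = np.array([12.1, 24.5, 48.2, 97.0, 193.5])

# Linear calibration: area = slope * conc + intercept
slope, intercept = np.polyfit(conc, area, 1)
residuals = area - (slope * conc + intercept)
sigma = residuals.std(ddof=2)  # residual standard deviation (n - 2 dof)

lod = 3.3 * sigma / slope   # limit of detection
loq = 10.0 * sigma / slope  # limit of quantification
```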
Identifying breast cancer using artificial intelligence technologies is valuable and has a great influence on the early detection of disease. It can also save lives by giving patients a better chance of being treated in the earlier stages of cancer. During the last decade, deep neural networks (DNN) and machine learning (ML) systems have been widely used by almost every segment of medical centers due to their accurate identification and recognition of diseases, especially when trained on many datasets/samples. In this paper, a DNN with two hidden layers is proposed, with a reduced number of additions and multiplications in each neuron. The number of bits and binary points of the inputs and weights can be changed using the mask configuration.
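The idea of configurable bit-widths and binary points for inputs and weights can be sketched with fixed-point quantization in a two-hidden-layer network. This is a minimal illustration; the layer sizes and the (bits, frac) parameters are assumptions, not the paper's mask scheme.

```python
import numpy as np

def quantize(x, bits=8, frac=4):
    """Fixed-point quantization: 'bits' total bits, 'frac' fractional bits
    (i.e., the binary point sits 'frac' bits from the right)."""
    scale = 2 ** frac
    lo, hi = -(2 ** (bits - 1)), 2 ** (bits - 1) - 1
    return np.clip(np.round(x * scale), lo, hi) / scale

def forward(x, w1, b1, w2, b2, w3, b3, bits=8, frac=4):
    """Two-hidden-layer MLP with quantized inputs and weights."""
    q = lambda a: quantize(a, bits, frac)
    h1 = np.maximum(0.0, q(x) @ q(w1) + b1)   # ReLU hidden layer 1
    h2 = np.maximum(0.0, q(h1) @ q(w2) + b2)  # ReLU hidden layer 2
    return q(h2) @ q(w3) + b3                 # linear output (class scores)
```

Lowering `bits` and `frac` trades accuracy for cheaper arithmetic per neuron, which is the trade-off the proposed reduction targets.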
In this article we derive two reliability mathematical expressions for two kinds of s-out-of-k stress-strength model systems. Both stress and strength are assumed to have an Inverse Lomax distribution with unknown shape parameters and a common known scale parameter. The increase and decrease in the real values of the two reliabilities are studied according to the increase and decrease in the distribution parameters. Two estimation methods are used to estimate the distribution parameters and the reliabilities: maximum likelihood and regression. A comparison is made between the estimators based on a simulation study using the mean squared error criterion, which revealed that the maximum likelihood estimator performs best.
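For the Inverse Lomax distribution with CDF F(x) = (1 + β/x)^(−α), the shape MLE has a closed form when the scale β is known: α̂ = n / Σ ln(1 + β/xᵢ). A minimal simulation sketch of sampling and estimation (the parameter values are illustrative, not the paper's simulation design):

```python
import numpy as np

def rinvlomax(n, alpha, beta, rng):
    """Sample from Inverse Lomax by inverting F(x) = (1 + beta/x)^(-alpha):
    x = beta / (u^(-1/alpha) - 1) for u ~ Uniform(0, 1)."""
    u = rng.random(n)
    return beta / (u ** (-1.0 / alpha) - 1.0)

def mle_alpha(x, beta):
    """Closed-form MLE of the shape parameter when the scale beta is known."""
    return len(x) / np.log1p(beta / x).sum()
```

Repeating this over many replications and averaging the squared error (α̂ − α)² gives the mean-squared-error comparison the abstract refers to.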
In recent years, Global Navigation Satellite System (GNSS) technology has been frequently employed for monitoring Earth crust deformation and movement. Such applications necessitate high positional accuracy, which can be achieved by processing GPS/GNSS data with scientific software such as BERNESE, GAMIT, and GIPSY-OASIS. Nevertheless, these scientific software packages are sophisticated and have not been published as free open-source software. Therefore, this study evaluates an alternative solution, GNSS online processing services, which are freely available. In this study, eight years of GNSS raw data for the TEHN station, which is located in Iran, have been downloaded from the UNAVCO website.
The basic concepts of some near open subgraphs and near rough, near exact, and near fuzzy graphs are introduced and sufficiently illustrated. The Gm-closure space induced by closure operators is used to generalize the basic rough graph concepts. We introduce near exactness and near roughness by applying the near concepts to improve the accuracy of the definability of graphs. We give a new definition of a membership function to find near interior, near boundary, and near exterior vertices. Moreover, proved results, examples, and counterexamples are provided. The Gm-closure structure suggested in this paper opens up the way for applying a rich amount of topological facts and methods in the process of granular computing.
In this work, a joint quadrature for the numerical solution of double integrals is presented. The method is based on combining two rules of the same precision level to form a rule of higher precision. Numerical results of the present method with a lower level of precision are presented and compared with those produced by the existing high-precision Gauss-Legendre five-point rule in two variables, which has the same number of functional evaluations. The efficiency of the proposed method is justified with numerical examples. From an application point of view, the determination of the center of gravity is given special consideration in the present scheme. Convergence analysis is provided to validate the current method.
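As a point of reference, the standard tensor-product Gauss-Legendre rule that such two-variable schemes are benchmarked against can be sketched as follows (this is the ordinary product rule, not the paper's joint quadrature):

```python
import numpy as np

def gauss_legendre_2d(f, a, b, c, d, n=5):
    """Tensor-product n-point Gauss-Legendre rule for the double
    integral of f(x, y) over the rectangle [a, b] x [c, d]."""
    t, w = np.polynomial.legendre.leggauss(n)  # nodes/weights on [-1, 1]
    # Affine map of nodes to the target intervals
    xm, xr = 0.5 * (a + b), 0.5 * (b - a)
    ym, yr = 0.5 * (c + d), 0.5 * (d - c)
    X = xm + xr * t
    Y = ym + yr * t
    W = np.outer(w, w) * xr * yr  # combined 2D weights with Jacobian
    return (W * f(X[:, None], Y[None, :])).sum()
```

The center-of-gravity application follows from the same routine, since a centroid coordinate is a ratio of two double integrals, e.g. x̄ = ∬ x dA / ∬ dA.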
The concept of separation axioms plays a key role in general topology and all generalized forms of topologies. Continuing the study of gpα-closed sets, we utilize this concept to introduce new separation axioms, namely gpα-regular and gpα-normal spaces, and establish their characterizations. New spaces, namely gpα-Tk for k = 0, 1, 2, are also studied.