In this work, the normal approach between two bodies, a sphere and a rough flat surface, was studied and calculated with the aid of an image-processing technique. Four metals with different work-hardening indices were used as surface specimens, and by capturing images at a resolution of 0.006565 mm/pixel a good estimate of the normal approach was obtained. The indentation (compression) tests were carried out on a Monsanto tensometer in the strength-of-materials laboratory of the Mechanical Engineering Department. A light-section measuring microscope (BK 70x50) was used to determine the surface-texture parameters of the profile: the standard deviation of asperity peak heights, the centre-line average, the asperity density, and the asperity radius. A Gaussian distribution of asperity peak heights was assumed in calculating the theoretical normal approach in the elastic and plastic regions, and the theoretical values were compared with those obtained experimentally to verify the results.
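As a concrete illustration of the Gaussian-asperity assumption, the sketch below computes the elastic load supported by a rough flat pressed against a smooth plane in the Greenwood-Williamson style, using the same kind of texture parameters the paper measures (peak-height standard deviation, asperity density, asperity radius). The model choice and all numerical values are illustrative assumptions, not the paper's data.

```python
# Minimal Greenwood-Williamson-style sketch: asperity peak heights are Gaussian,
# every peak higher than the separation d deforms Hertzianly, and the total load
# is the integral of the individual contributions. Parameter values are invented.
import numpy as np

def gw_elastic_load(d, sigma=0.8e-6, eta=1.2e10, beta=12e-6, E_star=110e9):
    """Elastic load per unit nominal area at separation d (m).

    sigma  : std deviation of asperity peak heights (m)
    eta    : asperity density (peaks / m^2)
    beta   : mean asperity tip radius (m)
    E_star : effective elastic modulus (Pa)
    """
    z = np.linspace(d, d + 6 * sigma, 4000)        # only peaks with z > d touch
    phi = np.exp(-z**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
    w = z - d                                      # normal approach of each asperity
    integrand = w**1.5 * phi
    return (4.0 / 3.0) * eta * E_star * np.sqrt(beta) * integrand.sum() * (z[1] - z[0])

for d in (2.0e-6, 1.0e-6, 0.5e-6):                 # smaller separation -> larger load
    print(f"d = {d:.1e} m  ->  p = {gw_elastic_load(d) / 1e6:.2f} MPa")
```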
Background: The success and maintenance of indirect dental restorations are closely related to marginal accuracy, which is affected by many factors, such as the preparation design, the fabrication technique, and the time of pouring the final impression after it is taken. The purpose of this in vitro study was to evaluate the effect of different pouring times of a conventional impression on the vertical marginal gap of full-contour zirconia crowns in comparison with a digital impression technique. Materials and Methods: Forty sound, recently extracted human permanent maxillary first premolars of comparable size and shape were collected. Standardized preparation of all tooth samples was carried out to receive full-contour zirconia crown restorations …
In this research, a MOORA approach based on a Taguchi design was used to convert the multi-performance problem into a single-performance problem for nine experiments arranged in a Taguchi L9 orthogonal array for the carburization operation. The main variables with the greatest effect on the carburizing operation are the carburization temperature (°C), the carburization time (hr), and the tempering temperature (°C). The study also focused on calculating the depth of carbon penetration, the hardness value, and the optimal values obtained during optimization by the Taguchi approach and the MOORA method for multiple parameters. In this study, the carburization process was carried out at temperatures between 850 and 950 °C for 2 to 6 hours …
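The MOORA ratio system referred to above reduces several responses to one score per experiment: each criterion column is vector-normalized, and beneficial criteria are added while non-beneficial ones are subtracted. The sketch below runs this on placeholder numbers for an L9 array; the response values and criterion choices are illustrative, not the paper's measurements.

```python
# MOORA ratio-system sketch for a Taguchi L9 design (9 experiments).
# Columns: 0 = surface hardness (beneficial), 1 = case depth in mm (beneficial).
# All numbers are invented placeholders.
import numpy as np

X = np.array([[610, 0.62], [655, 0.80], [700, 0.95],
              [640, 0.70], [690, 0.90], [625, 0.66],
              [675, 0.85], [615, 0.60], [660, 0.78]], dtype=float)
beneficial = np.array([True, True])

norm = X / np.sqrt((X**2).sum(axis=0))             # vector normalization per criterion
scores = np.where(beneficial, norm, -norm).sum(axis=1)
print("assessment values:", np.round(scores, 4))
print("best experiment (0-based):", scores.argmax())
```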
This paper introduces an innovative image-encryption method called "Two-Fold Cryptography," which leverages the Henon map in a dual-layer encryption framework. By applying two distinct encryption processes, this approach offers enhanced security for images. Key parameters generated by the Henon map dynamically shape both stages of encryption, creating a sophisticated and robust security system. The findings reveal that Two-Fold Cryptography provides a notable improvement in image protection, outperforming traditional single-layer encryption techniques.
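The abstract does not spell out the two layers, so the following is only a plausible sketch of how Henon-map orbits (x' = 1 - a*x^2 + y, y' = b*x) can drive two distinct stages: a keystream-XOR layer followed by a chaotic pixel permutation. Seeds, scaling, and the layer design are assumptions for illustration.

```python
# Dual-layer image encryption sketch keyed by two Henon-map orbits:
# layer 1 XORs a keystream, layer 2 permutes pixel positions.
import numpy as np

def henon_orbit(n, x, y, a=1.4, b=0.3, burn=1000):
    xs = np.empty(n)
    for i in range(n + burn):                  # burn-in hides the initial condition
        x, y = 1 - a * x * x + y, b * x
        if i >= burn:
            xs[i - burn] = x
    return xs

def keystream(n, key):
    return (np.abs(henon_orbit(n, *key)) * 1e6 % 256).astype(np.uint8)

def encrypt(img, key1=(0.1, 0.3), key2=(0.2, 0.15)):
    flat = img.ravel().astype(np.uint8)
    layer1 = flat ^ keystream(flat.size, key1)         # layer 1: keystream XOR
    perm = np.argsort(henon_orbit(flat.size, *key2))   # layer 2: chaotic permutation
    return layer1[perm].reshape(img.shape)

def decrypt(cipher, key1=(0.1, 0.3), key2=(0.2, 0.15)):
    flat = cipher.ravel()
    perm = np.argsort(henon_orbit(flat.size, *key2))
    inv = np.empty_like(perm); inv[perm] = np.arange(perm.size)
    return (flat[inv] ^ keystream(flat.size, key1)).reshape(cipher.shape)

img = (np.arange(64, dtype=np.uint8).reshape(8, 8) * 3)
assert np.array_equal(decrypt(encrypt(img)), img)      # round-trip check
```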
In this work, satellite images of Razaza Lake and the surrounding district in Karbala province are classified for the years 1990, 1999, and 2014 using two software packages (MATLAB 7.12 and ERDAS Imagine 2014). Proposed unsupervised and supervised classification methods implemented in MATLAB have been used, namely the mean-value method and Singular Value Decomposition, respectively, while the unsupervised K-means and supervised maximum-likelihood classifier methods are applied in ERDAS Imagine. The goal is to obtain the most accurate results, compare the results of each method, and calculate the changes that took place in 1999 and 2014 relative to 1990. The classification results indicated that …
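The K-means step above is performed in ERDAS Imagine; purely as an illustration of the algorithm, the sketch below clusters the pixels of a multi-band image into k spectral classes in Python. Band count, k, and the iteration budget are arbitrary choices.

```python
# K-means classification sketch: each pixel's band values form a feature
# vector, and pixels are grouped into k spectral classes (e.g. water,
# vegetation, bare soil) by alternating assignment and centroid updates.
import numpy as np

def kmeans_classify(image, k=3, iters=20, seed=0):
    """image: H x W x bands array -> H x W label map."""
    h, w, b = image.shape
    X = image.reshape(-1, b).astype(float)
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]   # random initial centroids
    for _ in range(iters):
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)                       # nearest-centroid assignment
        for j in range(k):
            pts = X[labels == j]
            if len(pts):
                centers[j] = pts.mean(axis=0)           # recompute centroids
    return labels.reshape(h, w)
```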
In this paper, a method is proposed to increase the compression ratio for color images by dividing the image into non-overlapping blocks and applying a different compression ratio to each block depending on the importance of the information it contains. In regions that contain important information the compression ratio is reduced to prevent loss of that information, while in smooth regions, which carry no important information, a high compression ratio is used. The proposed method shows better results when compared with classical methods (wavelet and DCT).
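The mechanism can be sketched as follows: block variance stands in for the paper's importance measure, and smooth (low-variance) blocks are quantized far more coarsely than detailed ones, which is where the extra compression comes from. The block size, threshold, and quantization steps are illustrative assumptions, not the paper's codec.

```python
# Variance-driven adaptive quantization sketch on a grayscale image
# (a color image would be processed per channel). Coarser quantization
# means fewer symbol levels and hence a higher compression ratio.
import numpy as np

def adaptive_quantize(img, block=8, var_thresh=100.0, q_fine=4, q_coarse=32):
    out = img.astype(float).copy()
    h, w = img.shape
    for i in range(0, h - h % block, block):
        for j in range(0, w - w % block, block):
            blk = out[i:i+block, j:j+block]
            q = q_fine if blk.var() > var_thresh else q_coarse  # important block? quantize finely
            out[i:i+block, j:j+block] = np.round(blk / q) * q
    return np.clip(out, 0, 255).astype(np.uint8)
```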
In this paper, we designed a new, efficient stream-cipher cryptosystem that depends on a chaotic map to encrypt (decrypt) different types of digital images. The designed encryption system passed all the basic efficiency criteria (randomness, MSE, PSNR, histogram analysis, and key space) that were applied both to the keystream extracted from the random generator and to the digital images after the encryption process was completed.
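Two of the listed criteria, MSE and PSNR, can be computed directly between the plain image and the cipher image; for a strong cipher the MSE should be large and the PSNR correspondingly low. The sketch below assumes 8-bit images (peak value 255).

```python
# MSE / PSNR evaluation sketch for judging how different the cipher image
# is from the plain image: high MSE and low PSNR between the two is the
# desired outcome for an image cryptosystem.
import numpy as np

def mse(a, b):
    return np.mean((a.astype(float) - b.astype(float)) ** 2)

def psnr(a, b, peak=255.0):
    m = mse(a, b)
    return float("inf") if m == 0 else 10.0 * np.log10(peak * peak / m)
```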
Fractal image compression depends on representing an image using affine transformations. The main concern for researchers in the discipline of fractal image compression (FIC) is to decrease the encoding time needed to compress image data. The basic premise is that each portion of the image is similar to other portions of the same image, and many models have been developed around this idea. The presence of fractals was initially noticed and handled using Iterated Function Systems (IFS), which are used for encoding images. In this paper, a review of fractal image compression and its variants is presented along with other techniques, and a summarized review of contributions is given to assess the fulfillment of fractal image compression …
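To make the IFS idea concrete, the sketch below iterates three contractive affine maps whose attractor is the Sierpinski triangle; fractal compression generalizes this by searching for affine maps whose attractor approximates a given image. The specific maps are the textbook example, not anything from the reviewed papers.

```python
# "Chaos game" rendering of an IFS attractor: repeatedly apply a randomly
# chosen contractive affine map; after a short transient, the orbit traces
# out the attractor (here, the Sierpinski triangle).
import numpy as np

MAPS = [lambda p: 0.5 * p,                          # shrink toward (0, 0)
        lambda p: 0.5 * p + np.array([0.5, 0.0]),   # shrink toward (1, 0)
        lambda p: 0.5 * p + np.array([0.25, 0.5])]  # shrink toward (0.5, 1)

rng = np.random.default_rng(0)
p, points = np.array([0.0, 0.0]), []
for i in range(20000):
    p = MAPS[rng.integers(3)](p)
    if i > 100:                    # discard the transient before the attractor
        points.append(p.copy())
points = np.array(points)          # scatter-plot `points` to see the triangle
```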
Fractal image compression offers some desirable properties, such as fast image decoding and very good rate-distortion curves, but it suffers from a high encoding time. Fractal image compression requires a partitioning of the image into range blocks. In this work, we introduce an improved partitioning process by means of a merge approach, since some ranges are connected to others. This paper presents a method that reduces the encoding time of the technique by reducing the number of range blocks based on computing statistical measures between them. Experimental results on standard images show that the proposed method decreases the encoding time while the visual quality of the results remains acceptable.
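The paper's exact merge rule is not reproduced here, but the sketch below illustrates the principle: range blocks whose statistical measures (mean and variance) fall close to those of an already-coded block reuse its mapping, so the expensive domain search runs on far fewer blocks. The tolerances are illustrative.

```python
# Range-block reduction sketch: group 8x8 range blocks by (mean, variance)
# similarity; only one representative per group needs a full domain search.
import numpy as np

def reduce_ranges(img, block=8, mean_tol=4.0, var_tol=25.0):
    reps, assign = [], {}                     # representatives and block -> group map
    h, w = img.shape
    for i in range(0, h - h % block, block):
        for j in range(0, w - w % block, block):
            blk = img[i:i+block, j:j+block].astype(float)
            m, v = blk.mean(), blk.var()
            for k, (rm, rv) in enumerate(reps):
                if abs(m - rm) < mean_tol and abs(v - rv) < var_tol:
                    assign[(i, j)] = k        # reuse group k's transform
                    break
            else:
                assign[(i, j)] = len(reps)    # new group: must be searched
                reps.append((m, v))
    return reps, assign
```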
In this paper, an algorithm is introduced through which more data can be embedded than with regular spatial-domain methods. The secret data is compressed using Huffman coding, and this compressed data is then embedded using the Laplacian sharpening method. We used Laplacian filters to determine the effective hiding places; then, based on a threshold value, we selected the places with the highest values produced by these filters for embedding the watermark. The aim of this work is to increase the capacity of the embedded information by using Huffman coding and, at the same time, to increase the security of the algorithm by hiding the data in the places with the highest edge values, where it is less noticeable.
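A sketch of the embedding stage follows; the Huffman step is assumed to have already produced the bit sequence `bits`. The Laplacian response is computed on the pixels with their least significant bits masked out, so the extractor can reproduce exactly the same ordering from the stego image. The kernel and the strongest-first selection policy are illustrative choices, not necessarily the paper's.

```python
# LSB embedding guided by a Laplacian edge score. The score is computed from
# img & 0xFE (LSBs masked), so embedding does not disturb the ordering and the
# receiver can recompute it from the stego image.
import numpy as np

def laplacian_response(img):
    f = (img & 0xFE).astype(float)            # ignore LSBs when scoring
    r = np.zeros_like(f)
    r[1:-1, 1:-1] = (f[:-2, 1:-1] + f[2:, 1:-1] +
                     f[1:-1, :-2] + f[1:-1, 2:] - 4.0 * f[1:-1, 1:-1])
    return np.abs(r)

def embed(img, bits):
    """bits: sequence of 0/1 (e.g. the Huffman-coded secret)."""
    order = np.argsort(laplacian_response(img).ravel())[::-1]  # strongest edges first
    stego = img.ravel().copy()
    for pos, bit in zip(order, bits):
        stego[pos] = (stego[pos] & 0xFE) | bit
    return stego.reshape(img.shape)

def extract(stego, n_bits):
    order = np.argsort(laplacian_response(stego).ravel())[::-1]
    return [int(stego.ravel()[pos] & 1) for pos in order[:n_bits]]
```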