In this work, a study and calculation of the normal approach between two bodies, a sphere and a rough flat surface, was conducted with the aid of an image processing technique. Four metals with different work-hardening indices were used as surface specimens, and by capturing images at a resolution of 0.006565 mm/pixel a good estimate of the normal approach was obtained. The compression tests were carried out in the strength of materials laboratory of the mechanical engineering department, and a Monsanto tensometer was used to conduct the indentation tests.
A light-section measuring microscope (BK 70x50) was used to determine the surface texture parameters of the profile, such as the standard deviation of asperity peak heights, the centre line average, the asperity density, and the asperity tip radius.
A Gaussian distribution of asperity peak heights was assumed in calculating the theoretical values of the normal approach in the elastic and plastic regions, and these were compared with the experimental values to verify the results.
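For reference only, a minimal sketch of the classical Greenwood-Williamson relations commonly paired with such a Gaussian assumption; the symbols below (asperity density \eta, tip radius \beta, peak-height standard deviation \sigma, nominal area A_n, effective modulus E^*) are generic placeholders, not values from the paper:

```latex
% Gaussian distribution of asperity peak heights (standard deviation \sigma)
\phi(z) = \frac{1}{\sigma\sqrt{2\pi}}\exp\!\left(-\frac{z^{2}}{2\sigma^{2}}\right)

% Expected number of contacting asperities at mean separation d
N = \eta A_n \int_{d}^{\infty} \phi(z)\,dz

% Elastic load carried by the contacting asperities (Hertzian contact, tip radius \beta)
P = \tfrac{4}{3}\,\eta A_n E^{*} \beta^{1/2} \int_{d}^{\infty} (z-d)^{3/2}\,\phi(z)\,dz
```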
This article investigates the decline of language loyalty in the age of audiovisual nearness. It is a socio-linguistic review of previous literature related to language disloyalty, and it reviews current theoretical efforts on the impact of the audiovisual nearness created by social media on language loyalty. A descriptive design is used. The argument behind this review is that the audiovisual nearness provided by social media negatively affects language loyalty. The article concludes that current theoretical efforts have paid much attention to the relationship between audiovisual nearness and language loyalty. Such efforts have highlighted the fact that social media platforms have provided an unprecedented nearness that provoke in
This paper suggests two recognition methods that depend on extracting features via principal component analysis applied in the wavelet domain (multi-wavelet). In the first method, the recognition space is enlarged by calculating the eigenstructure of the diagonal sub-image details at five depths of the wavelet transform; the effective eigen range selected here forms the basis for image recognition. In the second method, an invariant wavelet space across all projections is obtained, and a new recursive form that represents an invariant space for any image resolution obtained from the wavelet transform is adopted. In this way, all the major problems that affect the image and
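A minimal sketch of the general idea (eigenstructure of diagonal wavelet details), assuming NumPy and PyWavelets; the 'db1' wavelet, depth of 5, and component count are illustrative choices, not the paper's exact multi-wavelet setup:

```python
# Hypothetical sketch: PCA eigenstructure of diagonal wavelet details (not the paper's exact method).
import numpy as np
import pywt  # PyWavelets

def diagonal_detail_features(images, wavelet="db1", depth=5):
    """Stack flattened diagonal (HH) detail coefficients from `depth` wavelet levels."""
    feats = []
    for img in images:
        coeffs = pywt.wavedec2(img, wavelet, level=depth)
        # coeffs = [cA_n, (cH_n, cV_n, cD_n), ..., (cH_1, cV_1, cD_1)]
        diag = np.concatenate([d[2].ravel() for d in coeffs[1:]])
        feats.append(diag)
    return np.asarray(feats)

def pca_eigenstructure(features, n_components=10):
    """Leading eigenvalues/eigenvectors of the feature covariance (the 'eigen range')."""
    centred = features - features.mean(axis=0)
    cov = np.cov(centred, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)              # eigenvalues in ascending order
    order = np.argsort(vals)[::-1][:n_components]
    return vals[order], vecs[:, order]
```

For large images the covariance matrix becomes very large, so in practice the Gram-matrix (eigenfaces-style) trick is usually preferred; the direct form is kept here only for clarity.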
In this research, the behaviour of the standard Hueckel edge detection algorithm is analysed using three-dimensional representations of the edge goodness criterion after applying it to a real high-texture satellite image, and the edge goodness criterion is analysed statistically. The Hueckel edge detection algorithm showed an exponential growth of execution time with the disk radius used. The restrictions that Hueckel stated in his papers are adopted in this research. A discussion of the resulting edge shape and malformation is presented, since this is the first practical study applying the Hueckel edge detection algorithm to a real high-texture image containing ramp edges (a satellite image).
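A small sketch of how such an exponential time-versus-radius trend can be fitted; the timing values below are hypothetical placeholders, not the paper's measurements:

```python
# Illustrative only: fitting an exponential model t(r) = a * exp(b * r) to timing data.
import numpy as np

radii = np.array([4.0, 6.0, 8.0, 10.0, 12.0])   # disk radii in pixels (assumed values)
times = np.array([0.8, 1.9, 4.3, 9.8, 22.5])     # execution times in seconds (assumed values)

# ln t = ln a + b * r  ->  ordinary linear least squares in log space
b, ln_a = np.polyfit(radii, np.log(times), 1)
a = np.exp(ln_a)
print(f"t(r) ~ {a:.3f} * exp({b:.3f} * r)")
```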
Securing multimedia data has grown in importance over the last few decades as a means of safeguarding multimedia content from unauthorised users. Generally speaking, a number of methods have been employed to hide important visual data from eavesdroppers, one of which is chaotic encryption. This review article examines chaotic encryption methods currently in use, highlighting their benefits and drawbacks in terms of their suitability for image security.
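A minimal sketch of one generic chaotic-encryption idea (a logistic-map keystream XORed with the pixel values); this is an illustration of the family of methods, not a specific scheme from the review, and the map parameters are arbitrary:

```python
# Generic chaotic image encryption sketch: logistic-map keystream XOR (symmetric).
import numpy as np

def logistic_keystream(length, x0=0.61803398, r=3.99):
    """Byte keystream from the logistic map x_{n+1} = r * x_n * (1 - x_n)."""
    x, out = x0, np.empty(length, dtype=np.uint8)
    for i in range(length):
        x = r * x * (1.0 - x)
        out[i] = int(x * 256) % 256
    return out

def chaotic_xor(image, x0=0.61803398, r=3.99):
    """Encrypt/decrypt a uint8 image by XOR with the chaotic keystream."""
    flat = image.ravel()
    ks = logistic_keystream(flat.size, x0, r)
    return np.bitwise_xor(flat, ks).reshape(image.shape)

# Round trip: decrypting with the same key parameters recovers the image.
img = (np.random.rand(64, 64) * 255).astype(np.uint8)
enc = chaotic_xor(img)
assert np.array_equal(chaotic_xor(enc), img)
```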
In this paper, the behaviour of the quality of the gradient computed on an image is presented as a function of noise error. The cross-correlation coefficient (ccc) between the derivatives of the original image before and after introducing noise shows a dramatic decline compared with the ccc of the corresponding images before taking derivatives. Mathematical equations have been constructed to describe the relation between the ccc and the noise parameter.
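A minimal sketch of the comparison described above: the correlation between a clean and a noisy image versus the correlation between their gradients. The noise level and the simple finite-difference gradient are assumptions, not the paper's exact setup:

```python
# Compare ccc of images with ccc of their gradient magnitudes under additive Gaussian noise.
import numpy as np

def ccc(a, b):
    """Pearson cross-correlation coefficient between two arrays."""
    return np.corrcoef(a.ravel(), b.ravel())[0, 1]

def grad_mag(img):
    """Gradient magnitude from first-order finite differences."""
    gy, gx = np.gradient(img.astype(float))
    return np.hypot(gx, gy)

rng = np.random.default_rng(0)
clean = rng.random((128, 128))
noisy = clean + rng.normal(0.0, 0.05, clean.shape)   # additive Gaussian noise (sigma assumed)

print("ccc(images):    ", ccc(clean, noisy))
print("ccc(gradients): ", ccc(grad_mag(clean), grad_mag(noisy)))  # typically much lower
```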
OpenStreetMap (OSM), recognised for its current and readily accessible spatial database, frequently serves regions lacking precise data at the necessary granularity. Global collaboration among OSM contributors presents challenges to data quality and uniformity, exacerbated by the sheer volume of input and indistinct data annotation protocols. This study presents a methodological improvement in the spatial accuracy of OSM datasets centred over Baghdad, Iraq, utilising data derived from OSM services and satellite imagery. An analytical focus was placed on two geometric correction methods: a two-dimensional polynomial affine transformation and a two-dimensional polynomial conformal transformation. The former involves twelve coefficients for ad
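A minimal sketch of least-squares estimation for the two correction models named above, written in their first-order forms (a 6-parameter affine and a 4-parameter conformal transformation); the control points are hypothetical, and the paper's higher-order polynomial variant is not reproduced here:

```python
# Least-squares fitting of 2D affine and conformal (similarity) corrections from control points.
import numpy as np

def fit_affine(src, dst):
    """x' = a0 + a1*x + a2*y,  y' = b0 + b1*x + b2*y (solved independently per axis)."""
    A = np.column_stack([np.ones(len(src)), src[:, 0], src[:, 1]])
    ax, *_ = np.linalg.lstsq(A, dst[:, 0], rcond=None)
    ay, *_ = np.linalg.lstsq(A, dst[:, 1], rcond=None)
    return ax, ay

def fit_conformal(src, dst):
    """x' = c*x - d*y + tx,  y' = d*x + c*y + ty (rotation, uniform scale, shift)."""
    n = len(src)
    A = np.zeros((2 * n, 4))
    A[0::2] = np.column_stack([src[:, 0], -src[:, 1], np.ones(n), np.zeros(n)])
    A[1::2] = np.column_stack([src[:, 1],  src[:, 0], np.zeros(n), np.ones(n)])
    params, *_ = np.linalg.lstsq(A, dst.reshape(-1), rcond=None)
    return params  # [c, d, tx, ty]

# Hypothetical control points: OSM coordinates mapped to satellite-derived reference coordinates.
src = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0], [0.5, 0.5]])
dst = src @ np.array([[1.01, 0.02], [-0.02, 0.99]]).T + np.array([10.0, 20.0])
print(fit_affine(src, dst))
print(fit_conformal(src, dst))
```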
The main objective of this research is to use methods of calculus for solving integral equations when the deceleration is a function of time; the integral equation used in this research is of the Volterra type.
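A generic numerical sketch for a Volterra integral equation of the second kind, u(t) = f(t) + the integral from 0 to t of K(t, s) u(s) ds, solved with the trapezoidal rule; the kernel and forcing term below are illustrative and not taken from the paper:

```python
# Forward-marching trapezoidal solver for a Volterra equation of the second kind.
import numpy as np

def solve_volterra(f, K, t_max, n):
    """Solve u(t) = f(t) + int_0^t K(t, s) u(s) ds on [0, t_max] with n steps."""
    t = np.linspace(0.0, t_max, n + 1)
    h = t_max / n
    u = np.zeros(n + 1)
    u[0] = f(t[0])
    for i in range(1, n + 1):
        # already-known part of the trapezoidal quadrature
        s = 0.5 * K(t[i], t[0]) * u[0] + sum(K(t[i], t[j]) * u[j] for j in range(1, i))
        # the unknown endpoint u[i] carries weight h/2, so solve for it explicitly
        u[i] = (f(t[i]) + h * s) / (1.0 - 0.5 * h * K(t[i], t[i]))
    return t, u

# Check against a known solution: u(t) = exp(t) satisfies u(t) = 1 + int_0^t u(s) ds.
t, u = solve_volterra(f=lambda t: 1.0, K=lambda t, s: 1.0, t_max=1.0, n=100)
print(abs(u[-1] - np.e))  # small discretisation error
```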