In this work, the normal approach between two bodies, a sphere and a rough flat surface, was studied and calculated with the aid of an image processing technique. Four metals with different work-hardening indices were used as surface specimens, and by capturing images at a resolution of 0.006565 mm/pixel a good estimate of the normal approach was obtained. The compression tests were carried out in the strength of materials laboratory of the mechanical engineering department, and a Monsanto tensometer was used to conduct the indentation tests. A light-section measuring microscope (BK 70x50) was used to determine the surface texture parameters, such as the standard deviation of asperity peak heights, centre line average, asperity density, and radius of asperities. A Gaussian distribution of asperity peak heights was assumed in calculating the theoretical value of the normal approach in the elastic and plastic regions, and the theoretical values were compared with those obtained experimentally to verify the results.
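The abstract does not reproduce the governing contact equations. As a rough illustration of the kind of theoretical estimate implied, the sketch below evaluates a Greenwood-Williamson-style elastic rough-contact model with a Gaussian distribution of asperity peak heights; every parameter value and the function name are hypothetical placeholders, not the paper's measured surface data.

```python
import numpy as np
from scipy import integrate, stats

# Greenwood-Williamson-style elastic rough-contact estimate (illustrative sketch
# only; every parameter value below is a hypothetical placeholder, not the
# paper's measured surface data).
eta = 1.2e10        # asperity density per m^2 (assumed)
beta = 50e-6        # mean asperity tip radius, m (assumed)
sigma = 1.5e-6      # standard deviation of asperity peak heights, m (assumed)
E_star = 110e9      # effective elastic modulus of the pair, Pa (assumed)
A_nominal = 1e-4    # nominal (apparent) contact area, m^2 (assumed)

def elastic_load(d):
    """Total elastic load for separation d between the mean asperity plane and
    the flat, assuming Gaussian peak heights and Hertzian asperity tips."""
    phi = stats.norm(scale=sigma).pdf
    integrand = lambda z: (z - d) ** 1.5 * phi(z)
    value, _ = integrate.quad(integrand, d, 10.0 * sigma)
    return (4.0 / 3.0) * eta * A_nominal * E_star * np.sqrt(beta) * value

# Tabulate load against separation; the normal approach follows from the change
# in separation relative to the unloaded (first-touch) position.
for d in np.linspace(0.5 * sigma, 3.0 * sigma, 6):
    print(f"separation = {d:.2e} m   load = {elastic_load(d):.3e} N")
```

The plastic regime would require an additional hardness- or flow-pressure-based term beyond the Hertzian contribution shown here, which this sketch omits.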
Modern civilization increasingly relies on sustainable and eco-friendly data centers as the core hubs of intelligent computing. However, these data centers, while vital, also face heightened vulnerability to hacking due to their role as the convergence points of numerous network connection nodes. Recognizing and addressing this vulnerability, particularly within the confines of green data centers, is a pressing concern. This paper proposes a novel approach to mitigate this threat by leveraging swarm intelligence techniques to detect prospective and hidden compromised devices within the data center environment. The core objective is to ensure sustainable intelligent computing through a colony strategy. The research primarily focuses on the …
The present theoretical study analyzes the legacy of the Chicago School of Urban Sociology and evaluates it in light of the growth and development of the city of Chicago and the establishment of sociology there. Sociology became a recognized academic discipline in the United States of America in the late nineteenth century, particularly after the establishment of the first department of sociology at the University of Chicago in 1892, during a period of rapid industrialization and sustained growth of the city. The Chicago School relied on Chicago in particular, as one of the American cities that grew and expanded rapidly in the first two decades of the twentieth century. At the end of the nineteenth centur…
The article presents the results of an analysis of metaphor as one of the main devices Lyudmila Ulitskaya uses in the novel “Sincerely Yours Shurik” to shape the image of its protagonist. The main purpose of the article is to examine the metaphors that helped the author build the image of the main character, Shurik, through the stages of his life path, closely bound up with the people around him, to whom he is always happy to be useful (hence the title "Sincerely Yours"); among these people, the female images of his relatives, girlfriends, and others stand out as a special layer in the narrative. In the course of the study, the following tasks were solved: the metaphors that make up the image of the …
NeighShrink is an efficient image denoising algorithm based on the discrete wavelet transform (DWT). Its disadvantage is that it uses a suboptimal universal threshold and an identical neighbouring window size in all wavelet subbands. Dengwen and Wengang proposed an improved method that determines an optimal threshold and neighbouring window size for every subband using Stein's unbiased risk estimate (SURE). Its denoising performance is considerably superior to NeighShrink, and it also outperforms SURE-LET, an up-to-date denoising algorithm based on SURE. In this paper, different wavelet transform families are used with this improved method; the results show that the Haar wavelet has the lowest performance among …
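The improved method's per-subband, SURE-based selection of threshold and window size is not spelled out in this excerpt. Purely as an illustration of the underlying shrinkage rule, the sketch below implements basic NeighShrink with the universal threshold using PyWavelets; the function name, default arguments, and the MAD noise estimator are assumptions, not the authors' code.

```python
import numpy as np
import pywt

def neighshrink(image, wavelet="db4", level=2, win=3):
    """Minimal NeighShrink-style denoiser (illustrative sketch): each detail
    coefficient is shrunk by max(0, 1 - lambda^2 / S^2), where S^2 is the sum of
    squared coefficients in a win x win neighbourhood and lambda is the universal
    threshold.  The SURE-based per-subband selection of threshold and window
    size from the improved method is not reproduced here."""
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    # Estimate the noise level from the finest diagonal subband (MAD estimator).
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    lam2 = 2.0 * sigma ** 2 * np.log(image.size)   # squared universal threshold
    pad = win // 2
    new_coeffs = [coeffs[0]]
    for detail in coeffs[1:]:
        shrunk = []
        for band in detail:
            padded = np.pad(band ** 2, pad, mode="reflect")
            # S^2: sum of squared coefficients over the sliding neighbourhood.
            s2 = sum(padded[i:i + band.shape[0], j:j + band.shape[1]]
                     for i in range(win) for j in range(win))
            factor = np.clip(1.0 - lam2 / np.maximum(s2, 1e-12), 0.0, None)
            shrunk.append(band * factor)
        new_coeffs.append(tuple(shrunk))
    return pywt.waverec2(new_coeffs, wavelet)

# Example usage: denoised = neighshrink(noisy_image, wavelet="haar")
```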
Groupwise non-rigid image alignment is a difficult non-linear optimization problem involving many parameters and often large datasets. Previous methods have explored various metrics and optimization strategies. Good results have previously been achieved with simple metrics, but they require complex optimization, often with many unintuitive parameters that need careful tuning for each dataset. In this chapter, the problem is restructured to use a simpler, iterative optimization algorithm with very few free parameters. The warps are refined using an iterative Levenberg-Marquardt minimization to the mean, based on updating the locations of a small number of points and incorporating a stiffness constraint. This optimization approach is eff…
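The chapter's actual groupwise formulation is not given in this excerpt. As a toy one-dimensional analogue of the stated idea (iteratively warping each example towards the current mean by least-squares refinement of a few control-point shifts, with a stiffness penalty), the following sketch uses SciPy's Levenberg-Marquardt solver; all names, the piecewise-linear warp, and the parameter values are illustrative assumptions.

```python
import numpy as np
from scipy.interpolate import interp1d
from scipy.optimize import least_squares

def groupwise_align_1d(signals, n_ctrl=5, stiffness=5.0, n_iter=10):
    """Toy 1-D analogue of groupwise alignment to the mean (illustrative sketch
    only): each signal is warped by a piecewise-linear warp defined at a few
    control points, and the control-point shifts are refined by a
    Levenberg-Marquardt least-squares fit to the current mean, plus a stiffness
    penalty on neighbouring shifts.  Signals must be longer than n_ctrl."""
    n, length = signals.shape
    x = np.arange(length, dtype=float)
    ctrl = np.linspace(0, length - 1, n_ctrl)
    shifts = np.zeros((n, n_ctrl))

    def warp(signal, s):
        # Piecewise-linear warp: resample the signal at x + interpolated shift.
        disp = np.interp(x, ctrl, s)
        return interp1d(x, signal, bounds_error=False,
                        fill_value=(signal[0], signal[-1]))(x + disp)

    for _ in range(n_iter):
        mean = np.mean([warp(signals[i], shifts[i]) for i in range(n)], axis=0)
        for i in range(n):
            def residuals(s):
                data_term = warp(signals[i], s) - mean
                smooth_term = stiffness * np.diff(s, 2)   # stiffness constraint
                return np.concatenate([data_term, smooth_term])
            shifts[i] = least_squares(residuals, shifts[i], method="lm").x
    return np.array([warp(signals[i], shifts[i]) for i in range(n)]), shifts
```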
Security concerns in the transfer of medical images have recently drawn a lot of attention to medical image encryption. Moreover, medical images are constantly being produced and circulated online, necessitating safeguards against their inappropriate use. To improve the design of the AES standard for medical image encryption, this research presents several new criteria, created to meet requirements for both higher security and higher performance. First, the pixels in the image are diffused to mix them up randomly and disperse them across the whole image. Rather than using rounds, the suggested technique utilizes a cascad…
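The modified, cascade-based AES itself is truncated in this excerpt and is not reproduced here. Purely as a baseline illustration of the two stages named (pixel diffusion followed by encryption), the sketch below scatters the pixels with a key-seeded permutation and then applies standard AES in CTR mode via PyCryptodome; the function name, key handling, and mode choice are assumptions, not the paper's scheme.

```python
import numpy as np
from Crypto.Cipher import AES   # PyCryptodome

def encrypt_image(pixels, key, nonce):
    """Illustrative baseline only (not the paper's modified AES): first diffuse
    the pixel positions with a key-seeded permutation, then encrypt the raw
    bytes with standard AES in CTR mode."""
    flat = pixels.astype(np.uint8).ravel()
    # Diffusion step: scatter pixels with a deterministic, key-derived permutation.
    rng = np.random.default_rng(int.from_bytes(key, "big"))
    perm = rng.permutation(flat.size)
    diffused = flat[perm]
    cipher = AES.new(key, AES.MODE_CTR, nonce=nonce)
    encrypted = np.frombuffer(cipher.encrypt(diffused.tobytes()), dtype=np.uint8)
    return encrypted.reshape(pixels.shape), perm

# Example usage with a random 8-bit grayscale "image" and a (demo-only) zero key.
img = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
key, nonce = bytes(16), bytes(8)
cipher_img, perm = encrypt_image(img, key, nonce)
```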
This paper investigated the treatment of textile wastewater polluted with aniline blue (AB) by an electrocoagulation process using stainless steel mesh electrodes in a horizontal arrangement. The experimental design applied response surface methodology (RSM) to find the mathematical model, by adjusting the current density (4-20 mA/cm²), the distance between electrodes (0.5-3 cm), the salt concentration (50-600 mg/l), the initial dye concentration (50-250 mg/l), the pH value (2-12), and the experimental time (5-20 min). The results showed that time is the most important parameter affecting the performance of the electrocoagulation system. Maximum removal efficiency (96 %) was obtained at a current density of 20 mA/cm², distance be…
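The abstract names RSM but not the fitted model. As a generic illustration of an RSM-style second-order fit, the sketch below regresses removal efficiency on two of the six factors with scikit-learn; the factor levels and efficiency values are made-up placeholders, not the paper's measurements.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

# Illustrative RSM-style quadratic fit (sketch only; the design points and
# removal efficiencies below are placeholders, and only two of the six studied
# factors are included for brevity).
# Factors: current density j (mA/cm^2) and electrolysis time t (min).
X = np.array([[4, 5], [4, 12.5], [4, 20],
              [12, 5], [12, 12.5], [12, 20],
              [20, 5], [20, 12.5], [20, 20]], dtype=float)
y = np.array([48, 60, 70, 62, 75, 86, 70, 85, 96], dtype=float)   # removal, %

# Full second-order response surface: linear, interaction and squared terms.
quad = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(quad.fit_transform(X), y)

terms = quad.get_feature_names_out(["j", "t"])
print(dict(zip(terms, np.round(model.coef_, 4))))
print("predicted removal at j=20, t=20:",
      model.predict(quad.transform([[20.0, 20.0]]))[0])
```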
Power plants built to generate electricity have become common worldwide; one such type is the steam power plant. In such plants, various moving parts of heavy machines generate a lot of noise, and operators are subjected to high noise levels. Exposure to high noise levels leads to psychological as well as physiological problems and various other ill effects. It results in deteriorated work efficiency, although the exact nature of the effect on work performance is still unknown. To predict work efficiency deterioration, neuro-fuzzy tools are being used in research. It has been established that a neuro-fuzzy computing system helps in the identification and analysis of fuzzy models. The last decade has seen substantial growth in the development of various neuro-fuzzy systems…
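The specific neuro-fuzzy architecture used in the study is not described in this excerpt. As a minimal ANFIS-style sketch of predicting efficiency deterioration from noise exposure, the code below fits first-order Sugeno rule consequents by gradient descent; the noise levels and deterioration targets are synthetic placeholders, not the study's data.

```python
import numpy as np

# Minimal ANFIS-style sketch (illustrative only; the noise levels and the
# "work-efficiency deterioration" targets are synthetic, not study data).
# One input (noise exposure, dB), two Gaussian fuzzy sets, first-order Sugeno rules.
rng = np.random.default_rng(0)
noise_db = rng.uniform(70, 110, 200)                        # synthetic exposure levels
target = 0.8 * (noise_db - 70) + rng.normal(0, 2, 200)      # synthetic deterioration, %

x = (noise_db - 70.0) / 40.0                                # scale input to [0, 1]
centres, widths = np.array([0.25, 0.75]), np.array([0.25, 0.25])  # premise parameters
p, q = np.zeros(2), np.zeros(2)                             # Sugeno consequents p*x + q
lr = 0.1

for _ in range(3000):
    mu = np.exp(-((x[:, None] - centres) ** 2) / (2 * widths ** 2))
    w = mu / mu.sum(axis=1, keepdims=True)        # normalised rule firing strengths
    pred = (w * (p * x[:, None] + q)).sum(axis=1)
    err = pred - target
    # Gradient descent on the consequent parameters only (a full ANFIS would
    # also tune the membership centres and widths, typically with hybrid learning).
    p -= lr * (err[:, None] * w * x[:, None]).mean(axis=0)
    q -= lr * (err[:, None] * w).mean(axis=0)

print("training RMSE (%):", np.sqrt(np.mean(err ** 2)))
```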