In this work, the normal approach between two bodies, a sphere and a rough flat surface, was studied and calculated with the aid of an image processing technique. Four metals with different work-hardening indices were used as surface specimens, and by capturing images at a resolution of 0.006565 mm/pixel a good estimate of the normal approach was obtained. The compression tests were carried out in the strength of materials laboratory of the mechanical engineering department, and a Monsanto tensometer was used to conduct the indentation tests.
A light-section measuring microscope (BK 70x50) was used to determine the surface texture parameters of the profile, such as the standard deviation of asperity peak heights, the centre line average, the asperity density, and the radius of the asperities.
A Gaussian distribution of asperity peak heights was assumed in calculating the theoretical values of the normal approach in the elastic and plastic regions, and these were compared with the experimentally obtained values to verify the results.
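For orientation, a Gaussian asperity-height assumption typically enters such a calculation through a Greenwood-Williamson-type formulation; the sketch below is generic and not necessarily the exact expression used in the paper (the sphere-on-rough-flat geometry and the plastic regime require additional terms):

\phi(z) = \frac{1}{\sigma\sqrt{2\pi}} \exp\!\left(-\frac{z^{2}}{2\sigma^{2}}\right), \qquad P_{e} = \frac{4}{3}\,\eta A_{n} E^{*} R^{1/2} \int_{d}^{\infty} (z-d)^{3/2}\,\phi(z)\,dz

where \sigma is the standard deviation of asperity peak heights, \eta the asperity density, R the asperity radius, A_{n} the nominal contact area, E^{*} the effective elastic modulus, and d the separation between the smooth body and the mean plane of asperity peaks; the normal approach of an individual asperity is z - d.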
Since the 1990s, Iraqi youth have been exposed to Mexican soap operas dubbed into classical Arabic. The stories and ideas presented by these series were almost entirely new to young minds at that time, a culture completely different from our own culture of social relations, and since each series ran to more than 100 episodes, exposure to them left young people caught up in an addiction to their ideas and stories. They differ from the foreign films (mostly American) that we used to watch, which take only two hours at most; such films contain diverse stories and may be forgotten, whereas with a series the memory sometimes retains the events and characters in their entirety, and you may remember th
The effect of short-range correlations on the inelastic longitudinal Coulomb form factors for the lowest four excited 2+ states in 18O is analyzed. This effect (which depends on the correlation parameter β) is inserted into the ground-state charge density distribution through a Jastrow-type correlation function. The single-particle harmonic oscillator wave function is used with an oscillator size parameter b. The parameters β and b are considered free parameters, adjusted for each excited state separately so as to reproduce the experimental root mean square charge radius of 18O. The model space of 18O does not contribute to the transition charge density. As a result, the inelastic Coulomb form factor of 18O
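For reference, one widely used Jastrow-type correlation function of the kind described here is

f(r_{ij}) = 1 - \exp\!\left(-\beta\, r_{ij}^{2}\right),

where r_{ij} is the relative distance between nucleons i and j; the exact functional form adopted in the paper is not shown in this excerpt, so this should be read as an assumed standard choice. The correlated two-body density follows from multiplying each pair wave function by f(r_{ij}), and a larger β confines the correlation hole to shorter relative distances.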
This paper proposes a new encryption method that combines two cipher algorithms, DES and AES, to generate hybrid keys. This combination strengthens the proposed W-method by generating highly randomized keys. The reliability of any encryption technique rests on two points. The first is key generation: the approach merges 64 bits from DES with 64 bits from AES to produce a 128-bit root key from which the remaining 15 keys are derived, which raises the complexity of the ciphering process; moreover, each derivation shifts the key by only one bit to the right. The second is the nature of the encryption process itself: it uses the two keys and mixes one round of DES with one round of AES to reduce the execution time. The W-method deals with
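A minimal sketch of the key-derivation idea as described above, assuming the "merge" is a simple concatenation of the two 64-bit values and that each of the 15 subsequent keys is a one-bit right rotation of the previous one; the actual W-method construction may differ, and the input values in the example are placeholders:

def derive_round_keys(des_bits: int, aes_bits: int, extra_keys: int = 15):
    # Sketch only: concatenate a 64-bit DES-derived value with a 64-bit
    # AES-derived value into a 128-bit root key, then generate each further
    # key by rotating the previous key a single bit to the right.
    root = ((des_bits & (2**64 - 1)) << 64) | (aes_bits & (2**64 - 1))
    keys = [root]
    k = root
    for _ in range(extra_keys):
        k = ((k >> 1) | ((k & 1) << 127)) & (2**128 - 1)  # 1-bit right rotation
        keys.append(k)
    return keys

# Example with arbitrary placeholder inputs (not values from the paper):
keys = derive_round_keys(0x0123456789ABCDEF, 0xFEDCBA9876543210)
print(len(keys), hex(keys[0]), hex(keys[1]))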
A new flow injection (FI) manifold design coupled with a merging-zone technique was studied for the spectrophotometric determination of sulfamethoxazole. The semi-automated FI method has many advantages, being fast, simple, highly accurate, and economical, with high sample throughput. The suggested method is based on the formation of an orange-coloured product of SMZ with 1,2-naphthoquinone-4-sulphonic acid sodium salt (NQS) in alkaline (NaOH) medium, measured at λmax 496 nm. The linear range for sulfamethoxazole was 3-100 μg mL-1, the limit of detection (LOD) was 0.593 μg mL-1, the RSD% was about 1.25, and the recovery was 100.73%. All the physical and chemical parameters that affect the stability and development of
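As an illustration of how figures of merit of this kind are commonly obtained from a calibration series (the numbers below are hypothetical and not data from the study):

import numpy as np

# Hypothetical calibration points: concentration (ug/mL) vs. absorbance.
conc = np.array([3.0, 10.0, 25.0, 50.0, 75.0, 100.0])
absorbance = np.array([0.031, 0.102, 0.251, 0.499, 0.748, 1.003])

slope, intercept = np.polyfit(conc, absorbance, 1)      # least-squares calibration line
residuals = absorbance - (slope * conc + intercept)
s_y = residuals.std(ddof=2)                             # standard error of the fit

lod = 3.3 * s_y / slope                                 # common LOD estimate (3.3*sigma/slope)
print(f"slope = {slope:.5f}, LOD ~ {lod:.3f} ug/mL")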
Most companies use social media data for business, and sentiment analysis automatically gathers, analyses, and summarizes this type of data. Managing unstructured social media data is difficult, and noisy data is a challenge for sentiment analysis. Since over 50% of the sentiment analysis process is data pre-processing, processing big social media data is also challenging. If pre-processing is carried out correctly, data accuracy may improve, and the rest of the sentiment analysis workflow depends heavily on it. Because no pre-processing technique works well in all situations or with all data sources, choosing the most important ones is crucial, and prioritization is an effective technique for doing so. As one of many Multi-Criteria Decision Mak
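To illustrate the prioritization idea, here is a minimal weighted-sum ranking of pre-processing techniques against several criteria; the techniques, criteria, weights, and scores are all hypothetical, and the paper's actual multi-criteria method is not specified in this excerpt:

# Hypothetical decision matrix: each technique scored against three criteria
# (e.g. accuracy gain, runtime saving, robustness); all values are illustrative.
scores = {
    "stop-word removal":   [0.7, 0.9, 0.8],
    "stemming":            [0.6, 0.8, 0.9],
    "spelling correction": [0.8, 0.4, 0.5],
    "negation handling":   [0.9, 0.6, 0.4],
}
weights = [0.5, 0.2, 0.3]   # assumed relative importance of the criteria

def weighted_score(name):
    return sum(w * s for w, s in zip(weights, scores[name]))

for name in sorted(scores, key=weighted_score, reverse=True):
    print(f"{name}: {weighted_score(name):.2f}")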
Background: Debonding and fracture of artificial teeth from denture bases are a common clinical problem, and the bonding of artificial teeth to heat-cured acrylic and high-impact heat-cured acrylic denture base materials processed by autoclave is not well known. The aim of this study was to evaluate the effect of the autoclave processing method on the shear bond strength of artificial teeth to a heat-cured denture base material and a high-impact heat-cured denture base material. Materials and methods: Heat-polymerized (Vertex) and high-impact (Vertex) acrylic resins were used. Teeth were processed to each of the denture base materials after the application of different surface treatments. The sample (which consists of an artificial tooth attached to the dentur
The investigative film belongs to the documentary genre and is considered one of the important forms of film, attracting high viewership, which gives it great importance in transmitting information and in building community awareness through the advertising function performed by the cinematic image. The researcher addresses the concept of advertising and its development through the stages of time, as well as the functions of advertising. The researcher then deals with the cinematic technical elements that contribute to the success of investigative advertising work, such as camera movements, music, sound effects, dialogue, etc., which enrich the investigative picture with realism and the flow of informat
Image compression is one of the data compression types applied to digital images in order to reduce their high cost of storage and/or transmission. Image compression algorithms can exploit the visual sensitivity and statistical properties of image data to deliver superior results compared with generic data compression schemes used for other digital data. In the first approach, the input image is divided into blocks of 16 x 16, 32 x 32, or 64 x 64 pixels. Each block is first converted into a string and then encoded using a lossless entropy-coding algorithm known as arithmetic coding, in which pixel values with a higher frequency of occurrence are coded in fewer bits than pixel values of lower occurrence
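A rough sketch of the block-partitioning step described above (the block size, array shapes, and placeholder data are assumptions for illustration; the arithmetic coder itself is not reproduced here):

import numpy as np

def image_to_block_strings(image, block=16):
    # Split a grayscale image into block x block tiles and flatten each tile
    # into a 1-D sequence of pixel values ready for entropy coding
    # (e.g. arithmetic coding). Assumes dimensions are multiples of the block size.
    h, w = image.shape
    tiles = []
    for r in range(0, h, block):
        for c in range(0, w, block):
            tiles.append(image[r:r + block, c:c + block].flatten().tolist())
    return tiles

# Example with a random 64 x 64 image (placeholder data):
img = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)
print(len(image_to_block_strings(img, block=16)))   # 16 tiles of 256 values each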