Frictional heat is generated when the clutch starts to engage. During this operation the surface temperature rises rapidly because of the speed difference between the driving and driven parts. The influence of the friction-facing thickness on the contact-pressure distribution of multi-disc clutches has been investigated using a numerical approach (the finite element method). The contact analysis has been carried out for a multiple-disc dry clutch (piston, clutch discs, separators and pressure plate). The results present the distribution of the contact pressure on all the surfaces of the friction discs in the friction clutch system. Axisymmetric finite element models have been developed to accomplish the contact analysis in this work. The thickness of the friction facing of a clutch disc is a significant parameter that affects the elastic and thermal behavior of a dry friction clutch. The results proved that the magnitudes of the contact pressure increase dramatically as the thickness of the friction facing decreases.
Beta Distribution
Abstract
Gamma and Beta distributions are very important in practice in various areas of statistics and its applications, including reliability and quality control of production. There are a number of methods to generate data that behave according to these distributions. These methods are based primarily on the shape parameters of each distribution, on the relationships between these two distributions, and on their relationships with some other probability distributions.
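One such relationship between the two distributions: if X ~ Gamma(a, 1) and Y ~ Gamma(b, 1) are independent, then X / (X + Y) ~ Beta(a, b). A minimal generation sketch using only the Python standard library (the parameter values a = 2, b = 5 are illustrative, not taken from the paper):

```python
import random

def beta_from_gammas(a, b, rng=random):
    """Generate one Beta(a, b) variate via the Gamma-Beta relationship:
    if X ~ Gamma(a, 1) and Y ~ Gamma(b, 1) independently,
    then X / (X + Y) ~ Beta(a, b)."""
    x = rng.gammavariate(a, 1.0)
    y = rng.gammavariate(b, 1.0)
    return x / (x + y)

# Draw a sample and compare its mean with the theoretical value a / (a + b)
random.seed(1)
sample = [beta_from_gammas(2.0, 5.0) for _ in range(100_000)]
mean = sum(sample) / len(sample)
print(round(mean, 3))  # theoretical mean is 2 / (2 + 5) ≈ 0.286
```

The same construction gives a generator for any (a, b), which is why the shape parameters of the Gamma components fully determine the Beta sample.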
This paper presents the Taguchi approach for the optimization of hardness of a (Cu-Al-Ni) shape memory alloy. The influence of powder metallurgy parameters on hardness has been investigated; the Taguchi technique and ANOVA were used for the analysis. Nine experimental runs based on Taguchi's L9 orthogonal array (OA) were performed for two parameters (pressure and sintering temperature), each studied at three levels: (300, 500 and 700) MPa and (700, 800 and 900) °C, respectively. The main effects and the signal-to-noise (S/N) ratio were studied, and analysis of variance (ANOVA) was used to investigate the micro-hardness characteristics of the shape memory alloy. After application, the results of the study showed the hei…
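For a hardness response, "larger is better", so the Taguchi S/N ratio is conventionally computed as S/N = -10·log10((1/n)·Σ 1/yᵢ²). A small sketch of that calculation (the hardness replicates below are hypothetical, not the paper's measurements):

```python
import math

def sn_larger_is_better(values):
    """Taguchi signal-to-noise ratio for a larger-the-better response:
    S/N = -10 * log10( (1/n) * sum(1 / y_i**2) )."""
    n = len(values)
    mean_inv_sq = sum(1.0 / (y * y) for y in values) / n
    return -10.0 * math.log10(mean_inv_sq)

# Hypothetical micro-hardness replicates (HV) for one run of the L9 array
run_hardness = [182.0, 179.5, 184.2]
sn = sn_larger_is_better(run_hardness)
print(round(sn, 2))
```

In a full Taguchi study this value is computed per L9 run, and the level of each factor with the highest mean S/N is taken as optimal.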
This paper presents an experimental and theoretical analysis investigating the two-phase flow-boiling heat transfer coefficient and pressure drop of the refrigerant R-134a in the evaporator test section of a refrigeration system under different operating conditions. The test conditions considered are: heat flux (13.7-36.6) kW/m2, mass flux (52-105) kg/m2.s, vapor quality (0.2-1) and saturation temperature (-15 to -3.7) °C. Experiments were carried out using a test rig for a 310 W capacity refrigeration system, which was designed and constructed in the current work. Investigation of the experimental results has revealed that the enhancement in local heat trans…
Biomass is a popular renewable carbon source because it has great potential as a substitute for scarce fossil fuels and has been used to make essential compounds such as 5-hydroxymethylfurfural (HMF). One of the main components of biomass, glucose, has been extensively studied as a precursor for the production of HMF. Several efforts have been made to find efficient and reproducible procedures for the synthesis of HMF, a platform chemical used in the manufacture of fuels and other high-value compounds. Sulfonated graphite (SG) was produced from spent dry batteries and utilized as a catalyst to convert glucose to 5-hydroxymethylfurfural (HMF). Temperature, reaction time, and catalyst loading were the variables studied. When dimethyl sulfo…
Currently, the prominence of the automatic multi-document summarization task stems from the rapid increase of information on the Internet. Automatic document summarization technology is progressing and may offer a solution to the problem of information overload.
An automatic text summarization system faces the challenge of producing a high-quality summary. In this study, the design of a generic text summarization model based on sentence extraction has been redirected into a more semantic measure that reflects individually the two significant objectives, content coverage and diversity, when generating summaries from multiple documents as an explicit optimization model. The two proposed models have then been coupled and def…
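A standard way to trade off content coverage against diversity in extractive summarization is greedy maximal-marginal-relevance (MMR) style selection. The sketch below uses simple word-overlap (Jaccard) similarity and illustrates only the general idea, not the paper's own optimization model; the sentences and parameter λ = 0.7 are made up:

```python
def jaccard(a, b):
    """Word-overlap similarity between two sentences (as word sets)."""
    sa, sb = set(a.lower().split()), set(b.lower().split())
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

def mmr_summary(sentences, query, k=2, lam=0.7):
    """Greedily pick k sentences, balancing coverage of the query (relevance)
    against redundancy with sentences already selected (diversity)."""
    selected = []
    candidates = list(sentences)
    while candidates and len(selected) < k:
        def score(s):
            relevance = jaccard(s, query)
            redundancy = max((jaccard(s, t) for t in selected), default=0.0)
            return lam * relevance - (1 - lam) * redundancy
        best = max(candidates, key=score)
        selected.append(best)
        candidates.remove(best)
    return selected

docs = [
    "information overload is a growing problem on the internet",
    "summarization systems reduce information overload",
    "cats are popular pets",
]
summary = mmr_summary(docs, "information overload on the internet", k=2)
print(summary)
```

Raising λ favors coverage; lowering it penalizes redundancy more strongly, pushing the summary toward diversity.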
Fluoroscopic images are a class of medical images in which image quality determines correct diagnosis. The main difficulty is de-noising and how to keep the balance between degradation of the noisy image, on one side, and edge and fine-detail preservation, on the other, especially when fluoroscopic images contain black-and-white type noise with high density. Previous filters could usually handle low/medium densities of black-and-white noise, at the expense of edge and fine-detail preservation, and fail with the high noise densities that corrupt the images. Therefore, this paper proposes a new Multi-Line algorithm that deals with images highly corrupted by high-density black-and-white noise. The experiments achieved i…
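The Multi-Line algorithm itself is not reproduced here; as a point of reference, a basic switching median filter, a common baseline for black-and-white (salt-and-pepper) noise, replaces only pixels at the extreme values 0 or 255 with the median of their 3×3 neighborhood, leaving other pixels untouched. A minimal sketch on a toy 2D array:

```python
def switching_median(img):
    """Replace only extreme-valued pixels (0 or 255, assumed to be noise)
    with the median of their 3x3 neighborhood; other pixels are kept.
    `img` is a 2D list of 8-bit gray levels."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for i in range(h):
        for j in range(w):
            if img[i][j] not in (0, 255):
                continue  # pixel treated as noise-free
            window = [img[y][x]
                      for y in range(max(0, i - 1), min(h, i + 2))
                      for x in range(max(0, j - 1), min(w, j + 2))]
            window.sort()
            out[i][j] = window[len(window) // 2]
    return out

noisy = [
    [10, 12, 255],
    [11,  0,  13],
    [12, 14,  12],
]
cleaned = switching_median(noisy)
print(cleaned)
```

At high noise densities the neighborhood itself fills with 0/255 values and this simple baseline breaks down, which is the regime the paper's algorithm targets.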
Financial fraud remains an ever-increasing problem in the financial industry, with numerous consequences. The detection of fraudulent online transactions made via credit cards has traditionally been done using data mining (DM) techniques. However, fraud detection on credit card transactions (CCTs), itself a DM problem, has become a serious challenge for two major reasons: (i) the frequent changes in the patterns of normal and fraudulent online activity, and (ii) the skewed nature of credit-card fraud datasets. The detection of fraudulent CCTs depends mainly on the data-sampling approach. This paper proposes a combined SVM-MPSO-MMPSO technique for credit card fraud detection. The dataset of CCTs, which co…
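A common first step for the skewed-class problem mentioned above is to rebalance the training set. The sketch below shows plain random oversampling of the minority (fraud) class using only Python's standard library; the rows, labels, and counts are hypothetical, and the paper's MPSO/MMPSO sampling is not reproduced here:

```python
import random
from collections import Counter

def random_oversample(rows, labels, seed=0):
    """Duplicate minority-class rows at random until both classes
    have the same number of examples (binary labels 0/1)."""
    rng = random.Random(seed)
    by_class = {0: [], 1: []}
    for row, y in zip(rows, labels):
        by_class[y].append(row)
    minority = min(by_class, key=lambda y: len(by_class[y]))
    majority = 1 - minority
    need = len(by_class[majority]) - len(by_class[minority])
    extra = [rng.choice(by_class[minority]) for _ in range(need)]
    return rows + extra, labels + [minority] * need

# Hypothetical skewed transaction set: 6 legitimate (0) vs 2 fraudulent (1)
X = [[i] for i in range(8)]
y = [0, 0, 0, 0, 0, 0, 1, 1]
X2, y2 = random_oversample(X, y)
print(Counter(y2))  # classes are now balanced
```

Oversampling must be applied only to the training split, never before the train/test split, or the evaluation will leak duplicated fraud rows.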
In the lifetime processes of some systems, most data cannot belong to one single population; in fact, they can represent several subpopulations. In such a case, a single known distribution cannot be used to model the data. Instead, a mixture of distributions is used to model the data and classify the observations into several subgroups. A mixture of Rayleigh distributions is well suited to lifetime processes. This paper aims to infer the model parameters by the expectation-maximization (EM) algorithm through the maximum likelihood function. The technique is applied to simulated data following several scenarios. The accuracy of estimation has been examined by the average mean square error (AMSE) and the average classification success rate (ACSR). T…
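For a two-component Rayleigh mixture with density p·f(x; σ₁) + (1−p)·f(x; σ₂), where f(x; σ) = (x/σ²)·exp(−x²/2σ²), the EM updates have a closed form: the E-step computes responsibilities, and the M-step sets σₖ² = Σᵢ γᵢₖ xᵢ² / (2 Σᵢ γᵢₖ). A self-contained sketch under assumed starting values and simulated data (not the paper's scenarios; a Rayleigh(σ) variate is drawn as a Weibull with shape 2 and scale σ√2):

```python
import math
import random

def rayleigh_pdf(x, sigma):
    """Rayleigh density f(x; sigma) = (x / sigma^2) * exp(-x^2 / (2 sigma^2))."""
    return (x / sigma**2) * math.exp(-x * x / (2 * sigma**2))

def em_rayleigh_mixture(data, sigma1, sigma2, p=0.5, iters=200):
    """EM for a two-component Rayleigh mixture.
    E-step: responsibility gamma_i of component 1 for each point.
    M-step: sigma_k^2 = sum(gamma_ik * x_i^2) / (2 * sum(gamma_ik))."""
    for _ in range(iters):
        # E-step: posterior probability each point came from component 1
        gammas = []
        for x in data:
            a = p * rayleigh_pdf(x, sigma1)
            b = (1 - p) * rayleigh_pdf(x, sigma2)
            gammas.append(a / (a + b))
        # M-step: weighted maximum-likelihood updates
        w1 = sum(gammas)
        w2 = len(data) - w1
        sigma1 = math.sqrt(sum(g * x * x for g, x in zip(gammas, data)) / (2 * w1))
        sigma2 = math.sqrt(sum((1 - g) * x * x for g, x in zip(gammas, data)) / (2 * w2))
        p = w1 / len(data)
    return p, sigma1, sigma2

# Simulate a mixture: 40% Rayleigh(sigma=1), 60% Rayleigh(sigma=5)
rng = random.Random(42)
data = [rng.weibullvariate(math.sqrt(2) * (1.0 if rng.random() < 0.4 else 5.0), 2)
        for _ in range(5000)]
p_hat, s1_hat, s2_hat = em_rayleigh_mixture(data, sigma1=0.5, sigma2=3.0)
print(round(p_hat, 2), round(s1_hat, 2), round(s2_hat, 2))
```

The fitted responsibilities γᵢ also give the classification into subgroups that the ACSR measures: point i is assigned to component 1 when γᵢ > 0.5.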