This paper describes a new finishing process in which newly prepared magnetic abrasives are used to finish brass plate effectively, a material that is difficult to polish by conventional machining processes. The Taguchi experimental design method was adopted to evaluate the effect of the process parameters on the improvement of surface roughness and hardness achieved by magnetic abrasive polishing. The process parameters are the applied current to the inductor, the working gap between the workpiece and the inductor, the rotational speed, and the volume of powder. An analysis of variance (ANOVA) was carried out using statistical software to identify the optimal conditions for better surface roughness and hardness. Regression models for both surface roughness and hardness were obtained using the MINITAB statistical software. The experimental results indicate that rotational speed is the most significant parameter for the change in surface roughness (ΔRa), while the volume of powder is the most significant parameter for the change in surface hardness (ΔHa). Overall, magnetic abrasive polishing proved very useful for finishing the brass alloy plate.
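A minimal sketch of the kind of analysis described above (ANOVA over the four process parameters plus a fitted regression model), using Python's statsmodels in place of MINITAB; the column names and data values are invented placeholders, not the paper's measurements.

```python
# Sketch: ANOVA / regression of the four process parameters against the change
# in surface roughness. All data values below are hypothetical placeholders.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

df = pd.DataFrame({
    "current":  [2, 2, 3, 3, 4, 4, 2, 3, 4],                     # applied current to the inductor (A)
    "gap":      [1.0, 1.5, 2.0, 1.0, 1.5, 2.0, 2.0, 1.0, 1.5],    # working gap (mm)
    "speed":    [200, 400, 600, 600, 200, 400, 400, 600, 200],    # rotational speed (rpm)
    "powder":   [1, 2, 3, 2, 3, 1, 3, 1, 2],                      # volume of powder (cm^3)
    "delta_Ra": [0.12, 0.18, 0.25, 0.22, 0.10, 0.15, 0.17, 0.24, 0.09],
})

model = ols("delta_Ra ~ current + gap + speed + powder", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))   # ANOVA table: which factor is most significant
print(model.params)                      # coefficients of the fitted regression model
```

The same pattern applies to the hardness response (ΔHa) by swapping the dependent variable.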
Predicting the network traffic of web pages is an area that has received increasing attention in recent years. Modeling traffic helps in devising strategies for distributing network loads, identifying user behavior and malicious traffic, and forecasting future trends. Many statistical and intelligent methods have been studied for predicting web traffic from network traffic time series. In this paper, the use of machine learning algorithms to model Wikipedia traffic using Google's time series dataset is studied. Two time series data sets were used for data generalization, building a set of machine learning models (XGBoost, Logistic Regression, Linear Regression, and Random Forest), and comparing the performance of the models using (SMAPE) and
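As a small aside on the evaluation metric mentioned above, a sketch of one common definition of the symmetric mean absolute percentage error (SMAPE) follows; the exact variant used in the paper may differ.

```python
import numpy as np

def smape(actual, forecast):
    """Symmetric mean absolute percentage error (one common definition), in percent."""
    actual = np.asarray(actual, dtype=float)
    forecast = np.asarray(forecast, dtype=float)
    denom = (np.abs(actual) + np.abs(forecast)) / 2.0
    # Treat points where both values are zero as zero error to avoid 0/0.
    safe = np.where(denom == 0, 1.0, denom)
    diff = np.where(denom == 0, 0.0, np.abs(forecast - actual) / safe)
    return 100.0 * diff.mean()

print(smape([10, 20, 30], [12, 18, 33]))  # prints the SMAPE in percent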
Speech is the essential means of interaction between humans and between human and machine. However, it is always contaminated by different types of environmental noise. Therefore, speech enhancement algorithms (SEA) have emerged as a significant approach in the speech processing field for suppressing background noise and recovering the original speech signal. In this paper, a new efficient two-stage SEA with low distortion is proposed in the minimum mean square error sense. The clean signal is estimated by exploiting Laplacian modeling of the speech and noise, based on the distribution of orthogonal transform (Discrete Krawtchouk-Tchebichef transform) coefficients. The Discrete Kra
R. Vasuki [1] proved fixed point theorems for expansive mappings in Menger spaces. R. Gujetiya et al. [2] presented an extension of Vasuki's main result to four expansive mappings in Menger space. In this article, an important lemma is given to prove that the iteration sequence is Cauchy under suitable conditions in a Menger probabilistic G-metric space (shortly, an MPGM-space). The lemma is then used to obtain three common fixed point theorems for expansive type mappings.
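For orientation only, the classical notion on an ordinary metric space $(X,d)$ calls a self-map $T$ expansive when

$$d(Tx, Ty) \;\ge\; k\, d(x, y) \qquad \text{for all } x, y \in X \text{ and some fixed } k > 1;$$

the article itself works with the probabilistic analogue of this condition in MPGM-spaces, stated in terms of distribution functions rather than a single metric.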
Over the last two decades, audio compression has become the topic of much research because of the importance of this field, which bears directly on storage capacity and transmission requirements. The rapid development of the computer industry has increased the demand for high-quality audio data, and accordingly the development of audio compression technologies has become highly important; lossy and lossless are the two categories of compression. This paper aims to review the techniques of lossy audio compression and to summarize the importance and uses of each method.
Abstract: To study the effect of nickel chloride on the bone composition of mice, a number of biophysical and biochemical parameters were used. The animals were divided into control and experimental groups, and the experimental animals were further subdivided into three groups, I, II and III, according to the dose of nickel chloride (NiCl2) administered to them, i.e. 5.8, 12.8 and 28.2 mg/kg body weight, respectively. Femur bones were obtained by sacrificing the animals three weeks after weaning, the dose being administered once a week. The percentage loss between the wet weight and dry weight of the femur in control animals was found to be 32.5 ± 1.5. In the three experimental groups I, II and III, the percentage loss was 30.4 ± 1.4, 35.3 ± 2.3 and 38.9 ± 2.2, respectively. The percentage loss between the wet we
The accelerated curing methods adopted in the experimental work are the 55°C and 82°C methods of the British standard. The concrete mix, with a characteristic compressive strength of 35 MPa, is designed according to ACI 211.1; the mix proportion is 1:2.65:3.82 for cement, fine aggregate and coarse aggregate, respectively. The concrete was reinforced with different volume fractions (0.25, 0.5 and 0.75%) of glass, carbon and polypropylene fibers. The experimental results showed that the accelerated curing method at 82°C gives a higher compressive strength than the 55°C method for all concrete mixes. In addition, the fiber reinforced concrete with 0.75% fibers gives the maximum compressive, flexural and splitting tensile strength for all types of fibers.
In this paper we introduce a new type of functions called generalized regular continuous functions. These functions are weaker than regular continuous functions and stronger than regular generalized continuous functions. We also study some characterizations and basic properties of generalized regular continuous functions. Moreover, we study other types of generalized regular continuous functions and the relations among them.
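The weaker/stronger relationship stated above can be summarised as a schematic implication chain between the three classes of maps:

$$\text{regular continuous} \;\Longrightarrow\; \text{generalized regular continuous} \;\Longrightarrow\; \text{regular generalized continuous}.$$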
The huge number of documents on the internet has created a pressing need for text classification (TC), which is used to organize these text documents. In this research paper, a new model based on the Extreme Learning Machine (ELM) is used. The proposed model consists of several phases: preprocessing, feature extraction, Multiple Linear Regression (MLR), and ELM. The basic idea of the proposed model is to compute feature weights using MLR. These feature weights, together with the extracted features, are introduced as input to the ELM, producing a weighted Extreme Learning Machine (WELM). The results showed that the proposed WELM performs markedly better than the plain ELM.
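A minimal sketch of the idea described above, not the authors' exact model: per-feature weights are estimated with multiple linear regression, the feature matrix is rescaled by those weights, and a basic ELM (random hidden layer plus least-squares readout) is trained on the weighted features. All names, settings and the synthetic data are assumptions made for illustration.

```python
# Sketch of a "weighted ELM": MLR-derived feature weights feeding a basic ELM.
import numpy as np
from sklearn.linear_model import LinearRegression

def mlr_feature_weights(X, y):
    """Use the magnitude of MLR coefficients as per-feature weights (assumed scheme)."""
    mlr = LinearRegression().fit(X, y)
    w = np.abs(mlr.coef_)
    return w / (w.max() + 1e-12)           # normalise weights to [0, 1]

class SimpleELM:
    def __init__(self, n_hidden=200, seed=0):
        self.n_hidden, self.rng = n_hidden, np.random.default_rng(seed)

    def fit(self, X, y):
        # Random input weights and biases, fixed after initialisation.
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)    # random hidden-layer features
        # One-hot encode the class labels, then solve the readout by least squares.
        self.classes_, y_idx = np.unique(y, return_inverse=True)
        T = np.eye(len(self.classes_))[y_idx]
        self.beta = np.linalg.pinv(H) @ T
        return self

    def predict(self, X):
        H = np.tanh(X @ self.W + self.b)
        return self.classes_[np.argmax(H @ self.beta, axis=1)]

# Usage sketch with synthetic data standing in for extracted text features.
X = np.random.rand(300, 50)
y = (X[:, 0] + 0.5 * X[:, 1] > 0.8).astype(int)
weights = mlr_feature_weights(X, y)         # MLR-derived feature weights
welm = SimpleELM().fit(X * weights, y)      # ELM trained on weighted features
print((welm.predict(X * weights) == y).mean())
```

A plain ELM baseline is obtained by simply omitting the feature weighting step, which is the comparison the abstract describes.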