This paper describes a new finishing process in which magnetic abrasives were newly developed to effectively finish brass plate, a material that is very difficult to polish by conventional machining processes. The Taguchi experimental design method was adopted to evaluate the effect of the process parameters on the improvement of surface roughness and hardness achieved by magnetic abrasive polishing. The process parameters are the applied current to the inductor, the working gap between the workpiece and the inductor, the rotational speed, and the volume of powder. An analysis of variance (ANOVA) was carried out using statistical software to identify the optimal conditions for better surface roughness and hardness. Regression models for both surface roughness and hardness were obtained through a statistical approach using the MINITAB statistical software. Experimental results indicated that rotational speed is the most significant parameter for the change in surface roughness (ΔRa), while for the change in surface hardness (ΔHa) the volume of powder is the most significant one. As a result, it was seen that magnetic abrasive polishing is very useful for finishing brass alloy plate.
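In a Taguchi analysis such as the one above, parameter levels are ranked by signal-to-noise (S/N) ratios; since a larger improvement in ΔRa is better, the larger-the-better S/N ratio applies. A minimal sketch of that calculation follows (the replicate values and the two speed levels are hypothetical illustrations, not data from the paper):

```python
import math

def sn_larger_is_better(ys):
    """Taguchi larger-the-better S/N ratio: -10*log10(mean(1/y^2))."""
    return -10.0 * math.log10(sum(1.0 / y ** 2 for y in ys) / len(ys))

# Hypothetical replicate measurements of delta-Ra (um) at two levels
# of rotational speed; the level with the higher S/N ratio is preferred.
speed_level_1 = [0.20, 0.22, 0.19]
speed_level_2 = [0.35, 0.33, 0.36]
print(sn_larger_is_better(speed_level_1))
print(sn_larger_is_better(speed_level_2))
```

The level yielding the larger S/N ratio is selected as optimal for that factor, and the spread of S/N ratios across levels indicates the factor's significance.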
Copper oxide thin films were deposited on glass substrates using the Successive Ionic Layer Adsorption and Reaction (SILAR) method at room temperature. The thickness of the thin films was around 0.43 µm. The copper oxide thin films were annealed in air at 200, 300, and 400 °C for 45 min. The structural properties of the films were characterized by X-ray diffraction (XRD). The XRD patterns indicated the presence of polycrystalline CuO. The average grain size was calculated from the X-ray patterns, and it was found that the grain size increased with increasing annealing temperature. Optical transmission microscopy (OTM) and atomic force microscopy (AFM) were also used. Direct band gap values of 2.2 eV for the as-deposited sample and 2, 1.5, and 1.4 eV at 200, 300, and 400 °C, respectively, were obtained.
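The average grain (crystallite) size is commonly estimated from XRD peak broadening with the Scherrer equation, D = Kλ/(β·cos θ). A sketch under the usual assumptions (Cu-Kα radiation, shape factor K ≈ 0.9); the peak position and width below are hypothetical, not values from the paper:

```python
import math

def scherrer_size(two_theta_deg, fwhm_deg, wavelength_nm=0.15406, k=0.9):
    """Crystallite size D = K*lambda/(beta*cos(theta)).
    two_theta_deg: peak position (2-theta), fwhm_deg: peak width; both in degrees."""
    theta = math.radians(two_theta_deg / 2.0)
    beta = math.radians(fwhm_deg)          # FWHM converted to radians
    return k * wavelength_nm / (beta * math.cos(theta))

# Hypothetical CuO reflection: 2-theta = 38.7 deg, FWHM = 0.45 deg
print(round(scherrer_size(38.7, 0.45), 1), "nm")
```

A narrower peak (smaller FWHM) gives a larger crystallite size, which is consistent with the observed peak sharpening as annealing temperature increases.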
Most available methods for synthetic unit hydrograph (SUH) derivation involve manual, subjective fitting of a hydrograph through a few data points. The use of probability distributions for the derivation of synthetic hydrographs has received much attention because of their similarity to unit hydrograph properties. In this paper, the use of two flexible probability distributions is presented. For each distribution, the unknown parameters were derived in terms of the time to peak (tp) and the peak discharge (Qp). A simple Matlab program was prepared for calculating these parameters, and their validity was checked by comparison with field data. Application to field data shows that the gamma and lognormal distributions fit well.
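For a gamma-distribution SUH of unit area, requiring the pdf to peak at (tp, Qp) yields the relation Qp·tp = (k−1)^k·e^{−(k−1)}/Γ(k) for the shape k, with scale θ = tp/(k−1). The sketch below solves this by bisection; it is our own illustration of the kind of calculation the Matlab program performs (function name and the example tp, Qp values are ours, and the hydrograph ordinates are assumed dimensionless, i.e. unit area):

```python
import math

def gamma_suh_params(tp, qp):
    """Solve for the gamma shape k and scale theta so that the pdf peaks
    at (tp, qp): qp*tp = (k-1)**k * exp(-(k-1)) / Gamma(k), theta = tp/(k-1)."""
    target = qp * tp

    def g(k):
        # (k-1)**k * exp(-(k-1)) / Gamma(k), computed in log space for stability
        return math.exp(k * math.log(k - 1.0) - (k - 1.0) - math.lgamma(k))

    lo, hi = 1.0 + 1e-9, 500.0
    for _ in range(200):                  # bisection: g(k) increases with k
        mid = 0.5 * (lo + hi)
        if g(mid) < target:
            lo = mid
        else:
            hi = mid
    k = 0.5 * (lo + hi)
    return k, tp / (k - 1.0)

# Hypothetical example: time to peak 6 h, peak ordinate 0.09 per hour
k, theta = gamma_suh_params(tp=6.0, qp=0.09)
print(k, theta)
```

A check of the solver: for tp = 1 and Qp = 4e⁻² (the peak value of a gamma pdf with k = 3, θ = 0.5), it recovers k ≈ 3.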
The increasing availability of computing power over the past two decades has been used to develop new techniques for optimizing the solution of estimation problems. Today's computational capacity and the widespread availability of computers have enabled the development of a new generation of intelligent computing techniques. This paper presents one of a new class of stochastic search algorithms, known as the Canonical Genetic Algorithm (CGA), for optimizing the maximum likelihood function. The strategy is composed of three main steps: recombination, mutation, and selection. The experimental design is based on simulating the CGA with different parameter values, and the results are compared with those of the method of moments, based on the MSE values obtained from both.
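A canonical GA is binary-coded and iterates the three steps named above: fitness-proportional (roulette) selection, one-point crossover, and bit-flip mutation. The sketch below (our own minimal illustration, not the paper's implementation) maximizes the exponential log-likelihood over a bounded rate interval, which lets the GA result be checked against the closed-form MLE 1/x̄; all names and parameter settings are assumptions:

```python
import math
import random

random.seed(1)
data = [random.expovariate(2.0) for _ in range(200)]   # synthetic sample

def loglik(lam):
    """Exponential log-likelihood: n*ln(lam) - lam*sum(x)."""
    return len(data) * math.log(lam) - lam * sum(data)

BITS, LO, HI = 16, 0.01, 10.0                          # binary chromosome setup

def decode(bits):
    v = int("".join(map(str, bits)), 2)
    return LO + (HI - LO) * v / (2 ** BITS - 1)

def canonical_ga(pop_size=60, gens=80, pc=0.9, pm=0.02):
    pop = [[random.randint(0, 1) for _ in range(BITS)] for _ in range(pop_size)]
    for _ in range(gens):
        fits = [loglik(decode(c)) for c in pop]
        base = min(fits)                               # shift so weights > 0
        w = [f - base + 1e-9 for f in fits]
        new = []
        while len(new) < pop_size:
            p1, p2 = random.choices(pop, weights=w, k=2)   # roulette selection
            c1, c2 = p1[:], p2[:]
            if random.random() < pc:                       # one-point crossover
                cut = random.randint(1, BITS - 1)
                c1, c2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            for c in (c1, c2):                             # bit-flip mutation
                for i in range(BITS):
                    if random.random() < pm:
                        c[i] ^= 1
            new += [c1, c2]
        pop = new[:pop_size]
    return max((decode(c) for c in pop), key=loglik)

lam_ga = canonical_ga()
lam_mle = len(data) / sum(data)      # closed-form MLE, for comparison
print(lam_ga, lam_mle)
```

For this toy problem the GA converges close to the analytic MLE; the same loop applies unchanged to likelihoods with no closed-form maximizer, which is where the CGA is actually useful.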
The temperature influence on the fluorescence lifetime, quantum yield, and non-radiative rate parameter of coumarin 460 dye dissolved in methanol was investigated over the temperature range 160–300 K. A single-photon counting technique was used for measuring the fluorescence decay curves. A noticeable decrease of the fluorescence lifetime with increasing temperature was observed. A non-radiative activation energy of 10.57 kJ·mol⁻¹ was measured with the help of an Arrhenius plot.
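The Arrhenius plot extracts the activation energy from the slope of ln(k_nr) versus 1/T, since k_nr = A·exp(−Ea/RT) gives ln k_nr = ln A − (Ea/R)(1/T). A sketch of the fit, using synthetic rates generated at the paper's Ea = 10.57 kJ/mol (the pre-exponential factor and the temperature grid are hypothetical):

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def arrhenius_ea(temps_k, k_nr):
    """Least-squares slope of ln(k_nr) vs 1/T gives -Ea/R; returns Ea in J/mol."""
    x = [1.0 / t for t in temps_k]
    y = [math.log(k) for k in k_nr]
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    slope = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) \
        / sum((xi - xbar) ** 2 for xi in x)
    return -slope * R

# Synthetic non-radiative rates following Ea = 10.57 kJ/mol, A = 1e9 s^-1
temps = [160.0, 200.0, 240.0, 280.0, 300.0]
rates = [1e9 * math.exp(-10570.0 / (R * t)) for t in temps]
print(arrhenius_ea(temps, rates) / 1000.0, "kJ/mol")
```

With noise-free synthetic data the fit recovers the input activation energy exactly; with measured lifetimes, k_nr is first obtained from the lifetime and quantum yield at each temperature.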
The aim of this paper is to find a Bayes estimator under a new loss function that combines symmetric and asymmetric loss functions, namely the proposed entropy loss function, which merges the entropy loss function with the squared log error loss function, the latter being quite asymmetric in nature. The Bayes estimators of the exponential distribution under the proposed function and under its ingredient loss functions are then compared using the standard mean squared error (MSE) and the bias quantity (MBias). Random data were generated by simulation to estimate the exponential distribution parameters for different sample sizes (n = 10, 50, 100) and (N = 1000), taking initial
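The comparison criteria MSE and MBias are computed over simulation replications. The sketch below evaluates them for the MLE of the exponential rate (1/sample mean) as a stand-in estimator; it illustrates only the simulation framework with the paper's sample sizes, not the paper's Bayes estimators or the proposed loss function:

```python
import random

def simulate_mse_mbias(theta, n, reps=1000, seed=0):
    """MSE and mean bias of the MLE (n / sum of sample) of the exponential
    rate theta, estimated over `reps` simulated samples of size n."""
    rng = random.Random(seed)
    ests = []
    for _ in range(reps):
        sample = [rng.expovariate(theta) for _ in range(n)]
        ests.append(n / sum(sample))
    mse = sum((e - theta) ** 2 for e in ests) / reps
    mbias = sum(e - theta for e in ests) / reps
    return mse, mbias

for n in (10, 50, 100):           # the sample sizes used in the paper
    print(n, simulate_mse_mbias(theta=1.0, n=n))
```

Replacing the MLE line with any Bayes estimator formula reuses the same loop; the estimator with the smaller MSE (and MBias closer to zero) at each n is preferred.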
The present study investigated the use of pretreated fish bone (PTFB) as a new-surface, natural-waste, and low-cost adsorbent for the adsorption of Methyl green (MG, as a model toxic basic dye) from aqueous solutions. The functional groups and surface morphology of the untreated fish bone (FB) and pretreated fish bone were characterized using Fourier transform infrared spectroscopy (FTIR), scanning electron microscopy (SEM), and energy-dispersive X-ray spectroscopy (EDS), respectively. The effect of operating parameters including contact time, pH, adsorbent dose, temperature, and inorganic salt was evaluated. The Langmuir, Freundlich, and Temkin adsorption isotherm models were studied, and the results showed that the adsorption of the basic dye followed the Freundlich isotherm model.
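The Freundlich isotherm q_e = K_F·C_e^{1/n} is usually fitted in its linearized form, ln q_e = ln K_F + (1/n)·ln C_e, by least squares on the equilibrium data. A sketch with synthetic data (the concentrations and the K_F, n values are hypothetical, not the paper's measurements):

```python
import math

def fit_freundlich(ce, qe):
    """Least-squares fit of ln(qe) = ln(KF) + (1/n)*ln(Ce); returns (KF, n)."""
    x = [math.log(c) for c in ce]
    y = [math.log(q) for q in qe]
    m = len(x)
    xbar, ybar = sum(x) / m, sum(y) / m
    slope = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) \
        / sum((xi - xbar) ** 2 for xi in x)
    intercept = ybar - slope * xbar
    return math.exp(intercept), 1.0 / slope

# Synthetic equilibrium data generated from KF = 3.2, n = 2.5
ce = [1.0, 2.0, 5.0, 10.0, 20.0]     # equilibrium concentrations (mg/L)
qe = [3.2 * c ** (1.0 / 2.5) for c in ce]   # adsorbed amounts (mg/g)
kf, n = fit_freundlich(ce, qe)
print(kf, n)
```

In practice the model choice (Langmuir vs. Freundlich vs. Temkin) is made by comparing the correlation coefficients of the respective linearized fits.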
The biometric-based generation of keys represents the utilization of features extracted from human anatomical (physiological) traits, like a fingerprint or retina, or behavioral traits, like a signature. The retina biometric has inherent robustness; therefore, it is capable of generating random keys with a higher security level compared to the other biometric traits. In this paper, an effective system to generate secure, robust, and unique random keys based on retina features is proposed for cryptographic applications. The retina features are extracted using the glowworm swarm optimization (GSO) algorithm, which provides promising results in experiments using standard retina databases. Additionally, in order t
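A common final step in biometric key generation is to quantize the extracted feature vector into a stable representation (so small extraction noise maps to the same value) and then hash it into a fixed-length key. The sketch below is our own generic illustration of that step using SHA-256, not the paper's GSO-based pipeline; the feature vectors and quantization step are hypothetical:

```python
import hashlib

def features_to_key(features, step=0.05):
    """Quantize each feature to a grid of width `step` (tolerating small
    extraction noise), then hash the quantized vector into a 256-bit key."""
    quantized = bytes(int(f / step) % 256 for f in features)
    return hashlib.sha256(quantized).hexdigest()

# Two noisy readings of the same (hypothetical) retina feature vector
a = [0.231, 0.874, 0.402, 0.655]
b = [0.233, 0.871, 0.404, 0.654]
key = features_to_key(a)
print(key)
```

Both readings fall in the same quantization cells and therefore yield the same 256-bit key, while a different feature vector yields an unrelated key.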
A condensed study was done to compare the ordinary estimators, in particular the maximum likelihood estimator and the robust estimator, for estimating the parameters of the mixed model of order one, namely the ARMA(1,1) model. A simulation study was carried out for a variety of model configurations using small, moderate, and large sample sizes, and some new results were obtained. MAPE was used as the statistical criterion for comparison.
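The comparison criterion is the mean absolute percentage error, MAPE = (100/n)·Σ|y_t − ŷ_t|/|y_t|. A minimal sketch (the example series is hypothetical):

```python
def mape(actual, predicted):
    """Mean absolute percentage error, in percent; actual values must be nonzero."""
    return 100.0 / len(actual) * sum(
        abs(a - p) / abs(a) for a, p in zip(actual, predicted))

print(mape([100.0, 200.0, 400.0], [110.0, 180.0, 400.0]))
```

The estimator producing the lower MAPE across the simulated series is judged the better one at that sample size.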