This research investigates manganese (Mn) extraction from electric arc furnace steel slag (EAFS) using liquid-liquid extraction (LLE). The slag was chemically characterized by X-ray fluorescence, X-ray diffraction, and atomic absorption spectroscopy. The work consisted of two parts: the first was an extensive study of the variables that affect the rate of leaching Mn from the slag (reaction time, nitric acid concentration, solid-to-liquid ratio, and stirring speed), and the second evaluated the extraction of Mn from the leach solution. The results showed that 83.5% of the Mn could be leached from the slag at a temperature of 25 °C, a nitric acid concentration of 2 M, a time of 90 min, an S/L ratio of 1/100, and a stirring speed of 700 rpm. Extraction of 94.7% of the Mn from the nitric acid solution was achieved using octyl pyrophosphoric acid (OPPA) in kerosene at a contact time of 12 min, 50% OPPA in kerosene, a stirring speed of 900 rpm, and an organic-to-aqueous phase ratio (O/A) of 4/1. Kerosene was the most effective diluent for extracting Mn, compared with benzene and toluene.
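The reported 94.7% extraction at O/A = 4/1 implies a distribution ratio via the standard single-stage solvent-extraction mass balance, E = D·r / (1 + D·r), where r is the organic-to-aqueous phase ratio. The abstract does not report D; the back-calculation below is an illustrative sketch using only that textbook relation, not a result from the paper.

```python
def distribution_ratio(E, r):
    """Back-calculate the distribution ratio D from the extraction
    fraction E and the phase ratio r = O/A, via D = E / ((1 - E) * r)."""
    return E / ((1.0 - E) * r)

def extraction_fraction(D, r):
    """Single-stage extraction fraction: E = D*r / (1 + D*r)."""
    return D * r / (1.0 + D * r)

# Values from the abstract: 94.7% Mn extracted at O/A = 4/1
D = distribution_ratio(0.947, 4.0)
print(round(D, 2))                             # → 4.47
print(round(extraction_fraction(D, 4.0), 3))   # → 0.947 (recovers the input)
```

The round-trip check confirms the two formulas are consistent inverses of each other.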
This paper applies a subspace identification method to bilinear systems, using "three-block" and "four-block" subspace algorithms. In these algorithms, the input signal to the system does not have to be white. Simulations show that the "four-block" algorithm converges faster, and the dimensions of the matrices involved are significantly smaller, so its computational complexity is lower than that of the "three-block" algorithm.
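The specific three-block and four-block bilinear algorithms are not reproduced here; the sketch below only illustrates the core idea shared by subspace methods, namely that the singular values of a block-Hankel matrix built from the system's Markov parameters reveal the system order. The first-order system (A = 0.5, B = C = 1) is an arbitrary toy example, not from the paper.

```python
import numpy as np

# Markov parameters g_k = C A^k B of a toy first-order system (A=0.5, B=C=1)
g = np.array([0.5**k for k in range(10)])

# Hankel matrix H[i, j] = g[i + j]; its rank equals the system order
n = 5
H = np.array([[g[i + j] for j in range(n)] for i in range(n)])

# The number of significant singular values estimates the order
s = np.linalg.svd(H, compute_uv=False)
order = int(np.sum(s > 1e-10 * s[0]))
print(order)  # → 1 (a first-order system)
```

In practice, subspace algorithms build such block matrices from input/output data rather than from known Markov parameters, and the three-block vs. four-block variants differ in how those data blocks are arranged.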
Interval methods for verified integration of initial value problems (IVPs) for ODEs have been used for more than 40 years. For many classes of IVPs, these methods can compute guaranteed error bounds for the flow of an ODE, where traditional methods provide only approximations to a solution. Overestimation, however, is a potential drawback of verified methods: for some problems, the computed error bounds become overly pessimistic, or the integration even breaks down. The dependency problem and the wrapping effect are particular sources of overestimation in interval computations. Berz (see [1]) and his co-workers have developed Taylor model methods, which extend interval arithmetic with symbolic computations.
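The dependency problem mentioned above can be demonstrated in a few lines: naive interval arithmetic treats each occurrence of a variable as independent, so even x - x fails to evaluate to [0, 0]. A minimal sketch with a hand-rolled interval type:

```python
from dataclasses import dataclass

@dataclass
class Interval:
    lo: float
    hi: float
    def __sub__(self, other):
        # Interval subtraction: [a, b] - [c, d] = [a - d, b - c]
        return Interval(self.lo - other.hi, self.hi - other.lo)

x = Interval(1.0, 2.0)
d = x - x
print(d.lo, d.hi)  # → -1.0 1.0, although x - x is exactly 0 for any point value
```

The true range of x - x is {0}, but interval evaluation yields [-1, 1]: a pure overestimation. Taylor model methods reduce exactly this effect by carrying symbolic polynomial terms that cancel, leaving only a small interval remainder.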
A coin has two sides. Although steganography conceals the existence of a message, it is not completely secure; it is not meant to supersede cryptography but to supplement it. The main goal of this method is to minimize the number of LSBs that are changed when substituting them with the bits of the characters of the secret message. This decreases the distortion (noise) introduced into the pixels of the stego-image and, as a result, increases the immunity of the stego-image against visual attack. The experiments show that the proposed method gives a good enhancement to the steganography technique, and no difference between the cover-image and the stego-image can be seen by the human visual system (HVS).
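The abstract does not describe the paper's specific LSB-minimization scheme, so the sketch below shows only plain LSB substitution, the baseline the proposed method improves on: each message bit replaces the least significant bit of one pixel, changing that pixel's value by at most 1.

```python
def embed_lsb(pixels, message_bits):
    """Replace the least significant bit of each pixel with one message bit."""
    out = list(pixels)
    for i, bit in enumerate(message_bits):
        out[i] = (out[i] & ~1) | bit   # clear LSB, then set it to the bit
    return out

def extract_lsb(pixels, n_bits):
    """Recover the first n_bits message bits from the pixel LSBs."""
    return [p & 1 for p in pixels[:n_bits]]

cover = [200, 201, 202, 203, 204, 205, 206, 207]  # toy 8-pixel cover
bits = [1, 0, 1, 1, 0, 0, 1, 0]                   # one character's bits
stego = embed_lsb(cover, bits)
print(extract_lsb(stego, 8))                      # → [1, 0, 1, 1, 0, 0, 1, 0]
print(max(abs(c - s) for c, s in zip(cover, stego)))  # → 1 (per-pixel change ≤ 1)
```

A scheme like the paper's would additionally reorder or encode the message bits so that fewer LSBs actually flip.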
Incorporating waste byproducts into concrete is an innovative and promising way to minimize the environmental impact of waste materials while maintaining and/or improving concrete's mechanical characteristics and strength. The proper application of sawdust as a pozzolan in the building industry remains a significant challenge. Consequently, this study experimentally evaluated sawdust as a fill material. In particular, using sawdust as a fine aggregate in concrete offers a realistic structural and economical option for the construction of lightweight structural systems. Failure under four-point loading was investigated for six concrete-filled steel tube (CFST) specimens.
Abstract
The non-homogeneous Poisson process is one of the statistical subjects that is important in other sciences and has wide application in different areas, such as waiting lines (queues), repairable systems, computer and communication systems, and reliability theory, among many others. It is also used to model phenomena that occur in a non-constant way over time (events whose rate changes with time).
This research deals with some of the basic concepts related to the non-homogeneous Poisson process and applies two of its models, the power law model and the Musa-Okumoto model, for estimation.
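For the power law model with mean function m(t) = (t/θ)^β, event times can be simulated by the standard inversion method: pass unit-rate Poisson arrival times through m⁻¹. The parameter values below are arbitrary illustrations, not from the research.

```python
import random

def simulate_power_law_nhpp(beta, theta, t_max, rng):
    """Simulate event times of an NHPP with mean function m(t) = (t/theta)**beta
    by the inversion method: if S_1 < S_2 < ... are unit-rate Poisson arrivals,
    then T_i = m^{-1}(S_i) are the NHPP event times."""
    times, s = [], 0.0
    while True:
        s += rng.expovariate(1.0)          # next unit-rate arrival
        t = theta * s ** (1.0 / beta)      # m^{-1}(s) = theta * s**(1/beta)
        if t > t_max:
            return times
        times.append(t)

rng = random.Random(42)
# beta < 1 gives a decreasing intensity (reliability growth); beta > 1, increasing
events = simulate_power_law_nhpp(beta=0.8, theta=1.0, t_max=100.0, rng=rng)
print(len(events))   # expected count is m(100) = 100**0.8 ≈ 39.8
```

Given observed failure times, the power law parameters are usually fitted by maximum likelihood; the simulation above is only a sketch of the process itself.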
This research includes the synthesis of some new N-aroyl-N′-aryl thiourea derivatives, namely N-benzoyl-N′-(p-aminophenyl)thiourea (STU1), N-benzoyl-N′-(thiazole)thiourea (STU2), and N-acetyl-N′-(dibenzyl)thiourea (STU3). The substituted thiourea derivatives were prepared by reacting the acids with thionyl chloride and then treating the product with potassium thiocyanate to afford the corresponding N-aroyl isothiocyanates, which react directly with primary and secondary aryl amines. The purity of the synthesized compounds was checked by melting point measurement and thin-layer chromatography (TLC), and their structures were identified by spectral methods (FTIR, 1H-NMR, and 13C-NMR).
The prediction of time series for time-related phenomena, in particular with autoregressive integrated moving average (ARIMA) models, is one of the important topics in time series analysis in applied statistics. Its importance lies in the basic stages of analyzing the structure, modeling, and the conditions that must be satisfied by the stochastic process. This paper deals with two prediction methods: the first is a special case of the autoregressive integrated moving average model, ARIMA(0,1,1), which reduces to the random walk model when the moving average parameter equals zero; the second is the exponentially weighted moving average (EWMA). Both were applied to monthly traffic data.
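The EWMA and ARIMA(0,1,1) connection is a standard result: the one-step-ahead EWMA forecast f[t+1] = λ·y[t] + (1 − λ)·f[t] is the optimal forecast of an ARIMA(0,1,1) model with moving average parameter θ = 1 − λ (and λ = 1 recovers the random walk forecast f[t+1] = y[t]). A minimal sketch with illustrative data, not the paper's series:

```python
def ewma_forecasts(y, lam, init=None):
    """One-step-ahead EWMA forecasts: f[t+1] = lam * y[t] + (1 - lam) * f[t].
    Equivalent to the optimal ARIMA(0,1,1) forecast with theta = 1 - lam."""
    f = [y[0] if init is None else init]   # initialize with the first observation
    for obs in y:
        f.append(lam * obs + (1 - lam) * f[-1])
    return f[1:]   # forecasts made after observing y[0], y[1], ...

y = [10.0, 12.0, 11.0, 13.0]
print(ewma_forecasts(y, lam=0.5))  # → [10.0, 11.0, 11.0, 12.0]
```

With lam=1.0 the same function returns [10.0, 12.0, 11.0, 13.0], i.e. the random walk ("last value") forecast.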
Abstract
The Rayleigh distribution is one of the important distributions used for analyzing lifetime data and has applications in reliability studies and physical interpretations. This paper introduces four different methods to estimate the scale parameter and the reliability function of the generalized Rayleigh distribution: maximum likelihood, Bayes, modified Bayes, and the minimax estimator under a squared error loss function. The comparison is carried out through a simulation procedure.
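For the classical one-parameter Rayleigh distribution, f(x) = (x/σ²)·exp(−x²/2σ²), the maximum likelihood estimator has the closed form σ̂² = Σx_i²/(2n), and the reliability function is R(t) = exp(−t²/2σ²). The sketch below covers only this classical ML case, not the generalized Rayleigh distribution or the Bayes and minimax estimators studied in the paper.

```python
import math
import random

def rayleigh_mle_scale(data):
    """ML estimate of the Rayleigh scale: sigma_hat = sqrt(sum(x^2) / (2n))."""
    return math.sqrt(sum(x * x for x in data) / (2 * len(data)))

def rayleigh_reliability(t, sigma):
    """Reliability (survival) function: R(t) = exp(-t^2 / (2 sigma^2))."""
    return math.exp(-t * t / (2 * sigma * sigma))

rng = random.Random(0)
sigma_true = 2.0
# Inverse-CDF sampling: X = sigma * sqrt(-2 ln(1 - U)) is Rayleigh(sigma)
sample = [sigma_true * math.sqrt(-2 * math.log(1.0 - rng.random()))
          for _ in range(5000)]

sigma_hat = rayleigh_mle_scale(sample)
print(sigma_hat)                          # close to the true value 2.0
print(rayleigh_reliability(2.0, sigma_hat))  # estimated P(lifetime > 2)
```

A simulation comparison like the paper's would repeat this over many samples and compare estimators by mean squared error.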
The main objectives of this study were to investigate the effects of the maximum size of coarse attapulgite aggregate and the micro steel fiber content on the fresh and some mechanical properties of steel fiber reinforced lightweight self-compacting concrete (SFLWSCC). Two series of mixes were used depending on the maximum aggregate size (12.5 and 19 mm); for each series, three different steel fiber contents were used (0.5%, 1%, and 1.5%). To evaluate the fresh properties, slump flow, T500, V-funnel time, and J-ring tests were carried out. Compressive strength, splitting tensile strength, flexural tensile strength, and calculated equilibrium density tests were done to evaluate the mechanical properties.