Water contamination is a pressing global concern, especially regarding the presence of nitrate ions. This research addresses the issue by developing effective adsorbents for removing nitrate ions from aqueous solutions. Two adsorbents, chitosan-zeolite-zirconium (Cs-Ze-Zr) composite beads and chitosan-bentonite-zirconium (Cs-Bn-Zr) composite beads, were prepared. The study involved continuous experimentation in a fixed-bed column with varying bed heights (1.5 and 3 cm) and inlet flow rates (1 and 3 ml/min). The results showed that the breakthrough time increased with higher bed heights for both Cs-Ze-Zr and Cs-Bn-Zr composite beads, whereas an increase in flow rate decreased the breakthrough time. Notably, Cs-Ze-Zr and Cs-Bn-Zr achieved removal efficiencies of 87.23% and 92.02%, respectively. The optimal conditions were an inlet flow rate of 1 ml/min, a bed height of 3 cm, and initial concentrations of 400 mg/L and 600 mg/L for Cs-Ze-Zr and Cs-Bn-Zr, respectively.
A finite element study capable of predicting crack initiation and simulating crack propagation in human bone is presented. The material model is implemented in a MATLAB finite element package, which allows extension to any geometry and any load configuration. The fracture mechanics parameters for transverse and longitudinal crack propagation in human bone are analyzed. Fracture toughness as well as stress and strain contours are generated and thoroughly evaluated. A discussion is given of how this knowledge needs to be extended to allow prediction of whole-bone fracture from external loading, to aid the design of protective systems.
Support vector machines (SVMs) are supervised learning models that analyze data for classification or regression. For classification, the SVM is widely used because it selects an optimal hyperplane that separates two classes. SVMs offer very good accuracy and are extremely robust compared with other classification methods such as logistic regression, random forest, k-nearest neighbors, and the naïve Bayes model. However, working with large datasets can make training time-consuming and inefficient. In this paper, the SVM is modified by using a stochastic gradient descent process. The modified method, stochastic gradient descent SVM (SGD-SVM), was checked using two simulated datasets. Since the classification of different ca
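The paper's SGD-SVM is not reproduced here, but the core idea of training a linear SVM by stochastic gradient descent on the regularized hinge loss can be sketched as follows (a minimal illustration; the learning rate, regularization constant, and synthetic data are assumptions, not values from the paper):

```python
import numpy as np

def sgd_svm(X, y, lr=0.01, lam=0.01, epochs=50, seed=0):
    """Train a linear SVM on labels y in {-1, +1} by SGD on the
    regularized hinge loss: lam/2 * ||w||^2 + mean(max(0, 1 - y*(Xw+b)))."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        for i in rng.permutation(n):       # one pass in random order
            if y[i] * (X[i] @ w + b) < 1:  # margin violated: hinge gradient
                w = (1 - lr * lam) * w + lr * y[i] * X[i]
                b += lr * y[i]
            else:                          # only the regularizer acts
                w = (1 - lr * lam) * w
    return w, b

# Two well-separated Gaussian blobs as a toy dataset
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(2, 1, (100, 2)), rng.normal(-2, 1, (100, 2))])
y = np.array([1] * 100 + [-1] * 100)
w, b = sgd_svm(X, y)
acc = np.mean(np.sign(X @ w + b) == y)
```

Processing one randomly chosen sample per update is what makes the method scale to large datasets: each step costs O(d) instead of the full-batch O(nd).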
In this research, the contents of the electronic cigarette (vape) and the emergence of the phenomenon of electronic smoking (vaping) were investigated. Although smoking is one of the oldest topics on which many articles and studies have been conducted, electronic smoking had not been studied through statistical scientific research; this research therefore seeks to define the concept of electronic smoking, sample the studied data, and treat it in a scientific way. The research included a statistical analysis using factor analysis of a randomly drawn sample of size (70) observations from some colleges in Bab Al-Muadham in Baghdad, where the KMO and Bartlett tests
The objective of this paper is to improve the overall quality of infrared (IR) images by proposing an enhancement algorithm. The algorithm is based on two methods: adaptive histogram equalization (AHE) and contrast-limited adaptive histogram equalization (CLAHE). The contribution of this paper is to assess how well the proposed contrast-enhancement procedures perform on infrared images, and to identify the strategy most appropriate for incorporation into commercial infrared imaging applications.
The database for this paper consists of night-vision infrared images taken with a Zenmuse camera (FLIR Systems, Inc.) mounted on a Matrice 100 drone in Karbala city. The experimental tests showed sign
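As a simplified illustration of the histogram-equalization principle underlying AHE and CLAHE (global equalization only, without the tiling or clip limits of the paper's methods), a minimal sketch:

```python
import numpy as np

def hist_equalize(img):
    """Global histogram equalization of an 8-bit grayscale image:
    remap intensities through the normalized cumulative histogram."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    # Stretch the CDF to cover the full 0-255 intensity range
    remap = (cdf - cdf.min()) * 255 // (cdf.max() - cdf.min())
    return remap[img].astype(np.uint8)

# A low-contrast test image: intensities confined to 64-190
img = np.arange(64, 192, 2, dtype=np.uint8).reshape(8, 8)
out = hist_equalize(img)
```

AHE applies the same remapping per local tile, and CLAHE additionally clips the histogram before computing the CDF to limit noise amplification.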
Estimation has attracted great interest in engineering and statistical applications and in various applied and human sciences; its methods have helped to identify many random processes accurately.
In this paper, methods were used to estimate the reliability function, the hazard (risk) function, and the distribution parameters, namely the moment method and the maximum likelihood method. An experimental study was conducted by simulation to compare the methods and show which of them is suitable for practical application, based on observations generated from the Rayleigh logarithmic (RL) distribution with different sample sizes.
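The two estimation methods being compared can be illustrated on the standard Rayleigh distribution, where both have closed forms (a stand-in for the paper's Rayleigh logarithmic distribution, whose estimators are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(2)
sigma_true, n = 2.0, 2000
x = rng.rayleigh(sigma_true, n)  # simulated Rayleigh sample

# Maximum likelihood: sigma^2 = sum(x^2) / (2n)
sigma_mle = np.sqrt(np.sum(x**2) / (2 * n))

# Method of moments: E[X] = sigma * sqrt(pi/2)  =>  sigma = mean(x) * sqrt(2/pi)
sigma_mom = np.mean(x) * np.sqrt(2 / np.pi)
```

A simulation comparison like the paper's repeats this over many samples and sample sizes and ranks the methods by mean squared error.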
The influence of a square nozzle combined with helical tape inserted in a constant-heat-flux tube on heat transfer enhancement was investigated experimentally for turbulent airflow at Reynolds numbers ranging from 7000 to 14500. Three pitch ratios for the square nozzle (PR = 5.8, 7.7, and 11.6), corresponding to three numbers of square nozzles (N = 3, 4, and 5), and a constant pitch ratio for the helical tape were used. The results showed that the Nusselt number and friction factor for the combination with winglets were up to 33.8% and 21.4% higher, respectively, than for the nozzle alone at pitch ratio PR = 5.8. The maximum thermal performance for the combination with winglets was about 1.351 at a pitch ratio of 5.8. Nusselt numb
The Weibull distribution is considered one of the most widely applied distributions in real life. It is similar to the normal distribution in its range of applications and can be applied in many fields, such as industrial engineering (to represent replacement and manufacturing times), weather forecasting, and other scientific uses in reliability studies and survival analysis in the medical and communication engineering fields.
In this paper, the scale parameter of the Weibull distribution is estimated using a Bayesian method based on the Jeffreys prior as a first method, which is then enhanced by modifying the Jeffreys prior and used as a second method.
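When the shape parameter k is known, Bayesian estimation of the Weibull scale under the Jeffreys prior has a closed form: the transformed data t_i = x_i^k are exponential with mean theta = lambda^k, the prior p(theta) proportional to 1/theta yields an inverse-gamma posterior IG(n, sum(t_i)), and the posterior mean under squared-error loss is sum(t_i)/(n-1). A sketch under these assumptions (the paper's modified prior is not reproduced):

```python
import numpy as np

rng = np.random.default_rng(3)
k, lam_true, n = 2.0, 1.5, 1000
x = lam_true * rng.weibull(k, n)   # Weibull(shape=k, scale=lam_true) sample

# t = x^k is exponential with mean theta = lam^k; with the Jeffreys prior
# p(theta) ~ 1/theta the posterior is inverse-gamma(n, sum(t)), whose mean
# is the Bayes estimate of theta under squared-error loss.
t_sum = np.sum(x**k)
theta_bayes = t_sum / (n - 1)
lam_bayes = theta_bayes ** (1 / k)  # back-transform to the scale parameter
```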
Free-radical copolymerization of styrene/methyl methacrylate was carried out chemically under nitrogen in the presence of benzoyl peroxide as initiator at a concentration of 2 × 10⁻³ molar at 70 °C, in benzene as solvent, up to a certain low conversion. FT-IR spectra were used to determine the monomer reactivity ratios, which were obtained by the conventional linearization methods of Fineman-Ross (F-R) and Kelen-Tüdos (K-T). The experimental results gave average values for the styrene (r1)/methyl methacrylate (r2) system of r1 = 0.45 and r2 = 0.38 by the F-R method and r1 = 0.49 and r2 = 0.35 by the K-T method, indicating a random distri
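The Fineman-Ross linearization rearranges the Mayo-Lewis copolymerization equation into G = r1·F − r2, with G = x(y−1)/y and F = x²/y, where x is the monomer feed ratio and y the copolymer composition ratio; a straight-line fit of G against F then yields r1 as the slope and r2 as the negated intercept. A sketch with noise-free synthetic data generated from the reported F-R values (the feed-ratio grid is an assumption):

```python
import numpy as np

r1, r2 = 0.45, 0.38                      # reported F-R reactivity ratios
x = np.linspace(0.2, 5.0, 10)            # feed ratios [M1]/[M2] (assumed)
y = x * (r1 * x + 1) / (x + r2)          # Mayo-Lewis copolymer ratio

G = x * (y - 1) / y                      # Fineman-Ross ordinate
F = x**2 / y                             # Fineman-Ross abscissa
slope, intercept = np.polyfit(F, G, 1)   # fit G = r1*F - r2
r1_est, r2_est = slope, -intercept
```

With experimental (noisy) compositions, the same fit gives the reactivity ratios along with residuals that indicate how well the terminal model holds.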
The esterification of oleic acid with 2-ethylhexanol in the presence of sulfuric acid as a homogeneous catalyst was investigated in this work to produce 2-ethylhexyl oleate (biodiesel) by semi-batch reactive distillation. The effects of reaction temperature (100 to 130 °C), 2-ethylhexanol:oleic acid molar ratio (1:1 to 1:3), and catalyst concentration (0.2 to 1 wt%) were studied. The highest conversion, 97%, was achieved at a reaction temperature of 130 °C, a free fatty acid to alcohol molar ratio of 1:2, and a catalyst concentration of 1 wt%. A simulation based on first principles of reactive distillation was implemented in MATLAB to describe the process, and good agreement with the experiments was achieved.
This research aims to study dimension-reduction methods that overcome the curse of dimensionality when traditional methods fail to provide good parameter estimates, so the problem must be dealt with directly. Two approaches were used to handle high-dimensional data: the non-classical sliced inverse regression (SIR) method together with the proposed weighted standard SIR (WSIR) method, and principal component analysis (PCA), the general method used for reducing dimensions. SIR and PCA are based on linear combinations of a subset of the original explanatory variables, which may suffer from the problems of heterogeneity and linear
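PCA, the baseline dimension-reduction method mentioned above, projects centered data onto the leading right singular vectors; a minimal sketch (SIR and WSIR, which also exploit the response variable, are not shown, and the synthetic data is an assumption):

```python
import numpy as np

def pca(X, k):
    """Project centered data onto its k leading principal components."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T, Vt[:k]   # scores and component directions

# High-dimensional data whose variance lies mostly in 2 latent directions
rng = np.random.default_rng(4)
Z = rng.normal(size=(200, 2))                  # 2 latent factors
A = rng.normal(size=(2, 10))                   # loadings into 10 dimensions
X = Z @ A + 0.01 * rng.normal(size=(200, 10))  # small isotropic noise
scores, components = pca(X, 2)
```

Because the top singular vectors capture the directions of maximal variance, two components reconstruct this data almost exactly.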