In this study, the photodegradation of Congo red (CR) dye in aqueous solution was investigated using Au-Pd/TiO2 as a photocatalyst. The dye concentration, photocatalyst dosage, amount of H2O2, pH of the medium, and temperature were examined to find their optimum values. A dye concentration of 28 ppm was found to be the best. The optimum amount of photocatalyst was 0.09 g per 75 mL of dye solution, giving ~96% degradation after 12 hours of irradiation, while the best amount of hydrogen peroxide was 7 μL per 75 mL of dye solution, with ~97% degradation after 10 hours of irradiation; pH 5 was the best value for carrying out the reaction at the highest degradation percentage. In addition, temperature was tested over the range 25-55 °C, and the photodegradation percentage was found to increase with rising temperature (~98% degradation after 4 hours of irradiation at 55 °C). The activation energy of the reaction, calculated from the Arrhenius law, was 34.8016 kJ/mol. The thermodynamic functions ΔH#, ΔG#, and ΔS# were obtained: ΔH# and ΔG# are positive, meaning that the reaction is endothermic and non-spontaneous, respectively, while ΔS# is negative, indicating that the reactants are more disordered than the activated intermediate formed. The kinetics of the reaction were studied, and the photocatalytic reaction was found to follow pseudo-first-order kinetics.
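As a brief illustrative sketch (not taken from the study's data), pseudo-first-order kinetics imply ln(C0/Ct) = k·t, and the Arrhenius law k = A·exp(-Ea/RT) allows Ea to be recovered from rate constants measured at several temperatures. The Python snippet below uses entirely hypothetical concentration-time values and rate constants to show how k and Ea would typically be extracted.

```python
import numpy as np

R = 8.314  # gas constant, J/(mol*K)

# Hypothetical concentration-time data (ppm) at one temperature
t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])        # irradiation time, hours
C = np.array([28.0, 18.5, 12.3, 8.1, 5.4])      # dye concentration, ppm

# Pseudo-first-order fit: ln(C0/C) = k * t, so the slope of ln(C0/C) vs t is k
k, _ = np.polyfit(t, np.log(C[0] / C), 1)
print(f"rate constant k = {k:.3f} 1/h")

# Arrhenius law: ln k = ln A - Ea/(R*T); slope of ln k vs 1/T equals -Ea/R
T = np.array([298.15, 308.15, 318.15, 328.15])  # K, i.e. 25-55 C
k_T = np.array([0.20, 0.33, 0.52, 0.78])        # hypothetical rate constants, 1/h
slope, _ = np.polyfit(1.0 / T, np.log(k_T), 1)
Ea = -slope * R / 1000.0                        # kJ/mol
print(f"activation energy Ea = {Ea:.1f} kJ/mol")
```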
Perchloroethylene (PERC) is commonly used as a dry-cleaning solvent and has been linked to many deleterious effects on biological systems. This study aimed to investigate the harmful effects associated with PERC exposure among dry-cleaning workers. The study was carried out on 58 adults in two groups: a PERC-exposed group of thirty-two male dry-cleaning workers who use PERC as a dry-cleaning solvent, and twenty-six healthy non-exposed subjects. The history of PERC exposure, use of personal protective equipment (PPE), and safety measures of the exposed group were recorded. A blood sample was taken from each participant for measurement of hematological markers and liver and kidney function tests. The results showed that 28.1% of the workers were usin…
Logistic regression is one of the statistical methods used to describe and estimate the relationship between a random response variable (Y) and explanatory variables (X), in which the dependent variable is a binary response taking two values (one when a specific event occurs and zero when it does not), such as injured/uninjured or married/unmarried. A large number of explanatory variables leads to the problem of multicollinearity, which makes the estimates inaccurate. The maximum likelihood method and the ridge regression method were used to estimate a binary-response logistic regression model, adopting the Jackna…
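As an illustrative sketch only (not the authors' implementation), a binary logistic regression with an L2 (ridge-type) penalty can be fitted as follows, with a jackknife (leave-one-out) loop to gauge coefficient variability; the data, penalty strength, and weights are all hypothetical.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical data: 100 observations, 8 explanatory variables with induced multicollinearity
n, p = 100, 8
X = rng.normal(size=(n, p))
X[:, 1] = X[:, 0] + 0.05 * rng.normal(size=n)     # two nearly collinear columns
y = (X[:, 0] - 0.5 * X[:, 2] + rng.normal(size=n) > 0).astype(int)

# Ridge-penalized (L2) binary logistic regression; C is the inverse penalty strength
model = LogisticRegression(penalty="l2", C=1.0, solver="lbfgs", max_iter=1000)
model.fit(X, y)
print("coefficients:", model.coef_.round(3))

# Jackknife (leave-one-out) estimate of the coefficients' standard errors
coefs = []
for i in range(n):
    mask = np.arange(n) != i
    m = LogisticRegression(penalty="l2", C=1.0, solver="lbfgs", max_iter=1000)
    m.fit(X[mask], y[mask])
    coefs.append(m.coef_.ravel())
coefs = np.array(coefs)
print("jackknife std. error:", ((n - 1) * coefs.var(axis=0, ddof=0)) ** 0.5)
```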
A data-centric technique, data aggregation via a modified algorithm based on fuzzy clustering combined with a Voronoi diagram, called the modified Voronoi Fuzzy Clustering Algorithm (VFCA), is presented in this paper. In the modified algorithm, the sensed area is divided into a number of Voronoi cells by applying a Voronoi diagram, and these cells are clustered by the fuzzy C-means (FCM) method to reduce the transmission distance. An appropriate cluster head (CH) is then elected for each cluster. Three parameters are used in this election process: the energy, the distance between the CH and its neighboring sensors, and the packet loss values. Furthermore, data aggregation is employed in each CH to reduce the amount of data transmission, which le…
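As a rough illustrative sketch (not the paper's VFCA implementation), the snippet below builds Voronoi cells from hypothetical sensor positions, clusters the cell sites with a small NumPy fuzzy C-means routine, and elects a cluster head per cluster from assumed energy, distance, and packet-loss values; all data and weights are hypothetical.

```python
import numpy as np
from scipy.spatial import Voronoi

rng = np.random.default_rng(1)

# Hypothetical sensor node positions in a 100 m x 100 m field
nodes = rng.uniform(0, 100, size=(60, 2))
vor = Voronoi(nodes)            # one Voronoi cell per node (geometry in vor.regions/vor.vertices)

def fuzzy_c_means(X, c, m=2.0, iters=100):
    """Minimal fuzzy C-means: returns cluster centers and the membership matrix U."""
    U = rng.dirichlet(np.ones(c), size=len(X))                 # random initial memberships
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-9
        U = 1.0 / (d ** (2 / (m - 1)))
        U /= U.sum(axis=1, keepdims=True)
    return centers, U

centers, U = fuzzy_c_means(nodes, c=4)
labels = U.argmax(axis=1)       # assign each node (Voronoi cell site) to a cluster

# Hypothetical per-node residual energy (J) and packet-loss ratios
energy = rng.uniform(0.5, 2.0, size=len(nodes))
packet_loss = rng.uniform(0.0, 0.2, size=len(nodes))

# Elect a cluster head per cluster: favor high energy, low distance to center, low packet loss
for k in range(4):
    idx = np.where(labels == k)[0]
    dist = np.linalg.norm(nodes[idx] - centers[k], axis=1)
    score = energy[idx] - 0.01 * dist - 5.0 * packet_loss[idx]  # hypothetical weighting
    ch = idx[score.argmax()]
    print(f"cluster {k}: head = node {ch}")
```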
Modern power systems are huge networks consisting of electrical energy sources and static and lumped load components connected over long distances by AC transmission lines. Voltage improvement is an important aspect of the power system; if the issue is not dealt with properly, it may lead to voltage collapse. In this paper, HVDC links (bipolar connections) were inserted into a power system in order to improve the voltage profile. The load flow was simulated with the Electrical Transient Analyzer Program (ETAP 16), in which the Newton-Raphson method is used. The load flow simulation studies show a significant enhancement of power system performance after applying the HVDC links to the Kurdistan power system. Th…
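As a minimal illustrative sketch (unrelated to the ETAP model in the paper), the Newton-Raphson load flow idea can be shown on a hypothetical two-bus system: the power mismatch equations at the load bus are driven to zero by iteratively updating the bus voltage angle and magnitude, here with a finite-difference Jacobian.

```python
import numpy as np

# Hypothetical 2-bus system: slack bus (V1 = 1.0 pu, angle 0) supplies a load bus
# through a lossless line of reactance X = 0.1 pu. The load draws P = 1.0, Q = 0.4 pu.
V1, X = 1.0, 0.1
P_spec, Q_spec = -1.0, -0.4          # specified net injections at the load bus

def mismatch(x):
    """Active/reactive power mismatch at the load bus for state x = [delta, V2]."""
    delta, V2 = x
    P_inj = V1 * V2 * np.sin(delta) / X
    Q_inj = (V2**2 - V1 * V2 * np.cos(delta)) / X
    return np.array([P_inj - P_spec, Q_inj - Q_spec])

x = np.array([0.0, 1.0])             # flat start: delta = 0 rad, V2 = 1.0 pu
for it in range(1, 11):
    f = mismatch(x)
    if np.max(np.abs(f)) < 1e-8:
        break
    # Finite-difference Jacobian (production solvers use the analytic Jacobian)
    J = np.zeros((2, 2))
    for j in range(2):
        dx = np.zeros(2)
        dx[j] = 1e-7
        J[:, j] = (mismatch(x + dx) - f) / 1e-7
    x = x - np.linalg.solve(J, f)    # Newton-Raphson update

print(f"angle = {np.degrees(x[0]):.2f} deg, V2 = {x[1]:.4f} pu after {it} iterations")
```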
The main goal of this in vivo study was to evaluate the effect of a 532 nm Q-switched Nd:YAG laser in combination with 20% human serum albumin (as a welding aid) on liver tissue repair, clinically and histologically. The animals used in this study were 21 male rabbits divided into three main groups: a control group (3 rabbits), a conventionally treated group (9 rabbits), and a laser-treated group (9 rabbits). The two treated groups (conventional and laser-treated) each consisted of three sub-groups, depending on the response evaluation at three different periods. The laser group was treated using the 532 nm Q-switched Nd:YAG laser after applying human serum albumin immediately to the incised liver tissue. The energy was 460 mJ, and 4 Hz fr…
The aim of this paper is to present a weak form of -light functions by using -open sets, namely -light functions, and to offer new concepts of disconnected spaces and totally disconnected spaces. The relations between them have been studied. Also, new forms of -totally disconnected and inversely -totally disconnected functions have been defined, and some examples and facts have been presented.
The support vector machine (SVM) is a type of supervised learning model that can be used for classification or regression, depending on the dataset. SVM classifies data points by determining the best separating hyperplane between two or more groups. Working with enormous datasets, however, can cause a variety of issues, including poor accuracy and long computation times. In this research, SVM was extended by applying several kernel transformations: linear, polynomial, radial basis, and multi-layer kernels. The non-linear SVM classification model was illustrated and summarized in an algorithm using the kernel trick. The proposed method was examined using three simulated datasets with different sample…
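A minimal illustrative sketch (not the paper's algorithm) of comparing SVM kernels in Python, assuming scikit-learn and a synthetic dataset; the sigmoid kernel is used here as a stand-in for the multi-layer kernel named above.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

# Synthetic binary classification data (stand-in for the simulated datasets)
X, y = make_classification(n_samples=500, n_features=10, n_informative=5, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# Kernel trick: fit one SVM per kernel and compare held-out accuracy
for kernel in ["linear", "poly", "rbf", "sigmoid"]:
    clf = SVC(kernel=kernel, gamma="scale")
    clf.fit(X_train, y_train)
    acc = accuracy_score(y_test, clf.predict(X_test))
    print(f"{kernel:8s} kernel: test accuracy = {acc:.3f}")
```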
This study includes estimating the scale parameter, location parameter, and reliability function of the Extreme Value (EXV) distribution by two methods, namely:
- Maximum Likelihood Method (MLE).
- Probability Weighted Moments Method (PWM).
Simulation was used to generate the samples required to estimate the parameters and the reliability function, with different sample sizes (n = 10, 25, 50, 100) and specified real values for the parameters; the simulation experiments were replicated (RP = 1000) times.
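As an illustrative sketch only (assuming a Gumbel-type extreme value distribution and hypothetical true parameter values, not the study's settings), the snippet below generates replicated samples, estimates the location and scale parameters by maximum likelihood (MLE) and by probability weighted moments (PWM), and evaluates the reliability function R(t) = 1 - F(t).

```python
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(2)
EULER_GAMMA = 0.5772156649

# Hypothetical true parameters of a Gumbel-type extreme value distribution
mu_true, sigma_true = 5.0, 2.0
n, RP = 50, 1000                      # sample size and number of replications

def pwm_estimates(x):
    """Probability weighted moments estimators of the Gumbel location and scale."""
    x = np.sort(x)
    b0 = x.mean()
    b1 = np.sum((np.arange(len(x)) / (len(x) - 1)) * x) / len(x)
    sigma = (2 * b1 - b0) / np.log(2)
    mu = b0 - EULER_GAMMA * sigma
    return mu, sigma

mle, pwm = [], []
for _ in range(RP):
    sample = gumbel_r.rvs(loc=mu_true, scale=sigma_true, size=n, random_state=rng)
    loc_hat, scale_hat = gumbel_r.fit(sample)        # maximum likelihood estimates
    mle.append((loc_hat, scale_hat))
    pwm.append(pwm_estimates(sample))

mle, pwm = np.array(mle), np.array(pwm)
print("MLE mean (mu, sigma):", mle.mean(axis=0).round(3))
print("PWM mean (mu, sigma):", pwm.mean(axis=0).round(3))

# Reliability function R(t) = 1 - F(t) at t = 8, using the average MLE estimates
t = 8.0
print("R(t) estimate:", 1 - gumbel_r.cdf(t, loc=mle[:, 0].mean(), scale=mle[:, 1].mean()))
```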
This paper introduces some properties of separation axioms called α-feeble regular and α-feeble normal spaces (which are weaker than the usual axioms) by using elements of a graph, which are essential parts of the α-topological spaces that we study. It also presents some related concepts, studies their properties, and examines some relationships between them.