In this study, a low-cost biosorbent, dead mushroom biomass (DMB) granules, was used to investigate the optimum conditions for Pb(II), Cu(II), and Ni(II) biosorption from aqueous solutions. Various physicochemical parameters were studied, including initial metal ion concentration, equilibrium time, pH, agitation speed, particle diameter, and adsorbent dosage. Five mathematical models describing the biosorption equilibrium and isotherm constants were tested to find the maximum uptake capacities: the Langmuir, Freundlich, Redlich-Peterson, Sips, and Khan models. The Pb(II) and Ni(II) biosorption results were best fitted by the Langmuir model, with maximum uptake capacities of 44.67 and 29.17 mg/g for these two ions, respectively, whereas for Cu(II) the corresponding value was 31.65 mg/g, obtained with the Khan model. The kinetic study showed that the optimum agitation speed was 400 rpm, at which the best removal efficiency and minimum surface mass transfer resistance (MSMTR) were achieved. A pseudo-second-order kinetic model gave the best fit to the experimental data (R² = 0.99), yielding MSMTR values of 4.69 × 10⁻⁵, 4.45 × 10⁻⁶, and 1.12 × 10⁻⁶ m/s for Pb(II), Cu(II), and Ni(II), respectively. The thermodynamic study showed that the biosorption process was spontaneous and exothermic in nature.
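The Langmuir fit above can be sketched numerically. The q_max value of 44.67 mg/g for Pb(II) is taken from the abstract; the affinity constant b (L/mg) below is a hypothetical placeholder for illustration, not a value reported by the study.

```python
# Minimal sketch of the Langmuir isotherm, q_e = q_max * b * C_e / (1 + b * C_e),
# evaluated for Pb(II) with the reported maximum uptake capacity.

def langmuir_uptake(c_e, q_max, b):
    """Equilibrium uptake q_e (mg/g) at equilibrium concentration c_e (mg/L)."""
    return q_max * b * c_e / (1.0 + b * c_e)

q_max_pb = 44.67   # mg/g, maximum uptake reported for Pb(II)
b = 0.05           # L/mg, assumed affinity constant (illustrative only)

for c_e in (10, 50, 200):
    print(f"C_e = {c_e:4d} mg/L -> q_e = {langmuir_uptake(c_e, q_max_pb, b):.2f} mg/g")
```

As C_e grows, the predicted uptake saturates toward q_max, which is what makes the Langmuir form suitable for extracting a maximum uptake capacity from equilibrium data.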
This paper presents a new secret diffusion scheme, called Round Key Permutation (RKP), based on a nonlinear, dynamic, and pseudorandom permutation for encrypting images block by block. Images are considered particular data because of their size and information content: they are two-dimensional in nature and characterized by high redundancy and strong correlation. First, the permutation table is calculated from the master key and sub-keys. Second, the pixels of each block to be encrypted are scrambled according to the permutation table. Thereafter, the AES encryption algorithm is used in the proposed cryptosystem, replacing the linear permutation of the ShiftRows step with the nonlinear and secret permutation.
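The two steps described above, deriving a permutation table from the keys and scrambling a block's pixels with it, can be sketched as follows. The key-to-table derivation here (a SHA-256-seeded Fisher-Yates shuffle) and the block size are hypothetical stand-ins for illustration, not the paper's actual key schedule.

```python
# Sketch of a key-dependent block permutation: derive a pseudorandom
# permutation table from a (master key, round sub-key) pair, then use it
# to scramble and unscramble the pixels of one block.
import hashlib
import random

BLOCK = 16  # pixels per block (assumed block size for illustration)

def permutation_table(master_key: bytes, sub_key: bytes, n: int = BLOCK):
    """Deterministically derive a permutation of range(n) from the keys."""
    seed = hashlib.sha256(master_key + sub_key).digest()
    rng = random.Random(seed)
    table = list(range(n))
    rng.shuffle(table)          # Fisher-Yates shuffle -> pseudorandom permutation
    return table

def scramble(block, table):
    """Reorder pixels: output position i receives input pixel table[i]."""
    return [block[table[i]] for i in range(len(block))]

def unscramble(block, table):
    """Invert scramble() using the same permutation table."""
    out = [0] * len(block)
    for i, src in enumerate(table):
        out[src] = block[i]
    return out

tbl = permutation_table(b"master", b"round-1")
pixels = list(range(100, 100 + BLOCK))
assert unscramble(scramble(pixels, tbl), tbl) == pixels
print("round-trip OK, table:", tbl)
```

Because the table depends on both the master key and the round sub-key, each round can scramble with a different secret permutation, which is the property the scheme uses in place of the fixed, linear ShiftRows step.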
This research aims at studying the relation between fair value and the quality of financial reports, in order to achieve a number of aims, such as:
1- Throwing light on the problems of measurement based on historical cost, as they pave the way towards the fair value method in accounting measurement.
2- Giving a general definition of fair value in accounting by analyzing the theoretical aspects related to the subject and the scientific bases on which the relevant accounting treatments depend.
3- Exhibiting the characteristics that fair value could add to accounting information.
The study problem is summarized in that the e
Segmentation is the process of partitioning digital images into different parts depending on texture, color, or intensity, and it can be used in different fields to segment and isolate an area of interest. In this work, images of the Moon were obtained through observations at the Astronomy and Space Department, College of Science, University of Baghdad (using telescopes and a CCD camera). Different segmentation methods were used to segment lunar craters. Craters are caused by celestial objects, such as asteroids and meteorites, crashing into the surface of the Moon. Thousands of craters appear on the Moon's surface, ranging in size from a meter to many kilometers, and they provide insights into the age and geology of the Moon.
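Intensity-based segmentation, one of the approaches named above, can be illustrated with a toy example: pixels darker than a cut-off (crater interiors and shadows) form one segment. The 6x6 "image" and the threshold value below are invented for demonstration, not data from the study.

```python
# Toy intensity-threshold segmentation: bright lunar surface vs. dark craters.

image = [
    [200, 200, 198, 201, 199, 200],
    [200,  90,  85, 200, 200, 200],
    [199,  80,  75,  88, 200, 199],
    [200,  92,  86, 200, 198, 200],
    [200, 200, 200, 200,  70, 200],
    [201, 199, 200, 200, 200, 200],
]

THRESHOLD = 120  # assumed intensity cut-off separating crater from surroundings

def segment(img, threshold):
    """Return a binary mask: 1 where intensity < threshold (crater), else 0."""
    return [[1 if px < threshold else 0 for px in row] for row in img]

mask = segment(image, THRESHOLD)
crater_pixels = sum(map(sum, mask))
print(f"crater pixels: {crater_pixels} of {len(image) * len(image[0])}")
```

Real crater segmentation would operate on CCD frames and typically combine thresholding with edge or region methods, but the binary-mask output has the same shape as in this sketch.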
Optimizing system performance in dynamic and heterogeneous environments, and managing computational tasks efficiently, are crucial. This paper therefore examines task scheduling and resource allocation algorithms in some depth. The work evaluates five algorithms: Genetic Algorithms (GA), Particle Swarm Optimization (PSO), Ant Colony Optimization (ACO), the Firefly Algorithm (FA), and Simulated Annealing (SA), across various workloads obtained by varying the task-to-node ratio. The paper identifies Finish Time and Deadline as two key performance metrics for gauging the efficacy of an algorithm, and a comprehensive investigation of the behavior of these algorithms across different workloads was carried out. Results from the experiment
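The two metrics named above, Finish Time and Deadline, can be made concrete by evaluating them for one candidate schedule. The task set, node speeds, and the greedy earliest-finish assignment below are illustrative assumptions, not the paper's experimental setup or any of its five algorithms.

```python
# Sketch: assign tasks to heterogeneous nodes greedily, then report the
# makespan (overall finish time) and how many tasks met their deadlines.

tasks = [  # (task id, work units, deadline)
    ("t1", 40, 30), ("t2", 10, 15), ("t3", 25, 40), ("t4", 30, 35),
]
node_speed = {"n1": 2.0, "n2": 1.0}  # work units processed per time unit

def greedy_earliest_finish(tasks, node_speed):
    free_at = {n: 0.0 for n in node_speed}   # when each node becomes idle
    schedule = {}
    for tid, work, deadline in tasks:
        # place the task on the node where it would finish soonest
        node = min(free_at, key=lambda n: free_at[n] + work / node_speed[n])
        finish = free_at[node] + work / node_speed[node]
        free_at[node] = finish
        schedule[tid] = (node, finish, finish <= deadline)
    return schedule

sched = greedy_earliest_finish(tasks, node_speed)
makespan = max(f for _, f, _ in sched.values())
met = sum(ok for _, _, ok in sched.values())
print(f"makespan = {makespan}, deadlines met = {met}/{len(tasks)}")
```

Metaheuristics such as GA or PSO search over many such task-to-node assignments, scoring each one with exactly these kinds of finish-time and deadline measures.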