This study assessed the advantage of using earthworms in combination with punch waste and nutrients to remediate drill cuttings contaminated with hydrocarbons. Analyses were performed on days 0, 7, 14, 21, and 28 of the experiment. Two hydrocarbon concentrations (20000 mg/kg and 40000 mg/kg) were tested with three earthworm group sizes: five, ten, and twenty earthworms. After 28 days, the total petroleum hydrocarbon (TPH) concentration of 20000 mg/kg was reduced to 13200 mg/kg, 9800 mg/kg, and 6300 mg/kg in the treatments with five, ten, and twenty earthworms, respectively. Likewise, the 40000 mg/kg TPH concentration was reduced to 22000 mg/kg, 10100 mg/kg, and 4200 mg/kg in the corresponding treatments. Degradation increased significantly with both the number of earthworms and the duration of the experiment. The results show that TPH at these concentrations can be reduced to acceptable levels using the selected earthworm genus Allolobophora, and that this bioremediation approach can be considered an additional option for dealing with local petroleum-contaminated sites.
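The removal efficiencies implied by the reported concentrations can be checked directly. The following sketch (not part of the study) computes the percentage of TPH removed after 28 days for each treatment, using only the numbers quoted in the abstract:

```python
def percent_reduction(initial, final):
    """Percentage of TPH removed, from initial and final mg/kg values."""
    return 100.0 * (initial - final) / initial

# 20000 mg/kg treatments: 5, 10, and 20 earthworms
low = {n: percent_reduction(20000, c)
       for n, c in [(5, 13200), (10, 9800), (20, 6300)]}

# 40000 mg/kg treatments, same group sizes
high = {n: percent_reduction(40000, c)
        for n, c in [(5, 22000), (10, 10100), (20, 4200)]}

print(low)
print(high)
```

This makes the trend explicit: removal rises from roughly a third of the initial TPH with five earthworms to well over two thirds with twenty.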
This paper is devoted to an inverse problem of determining a discontinuous space-wise dependent heat source in a linear parabolic equation from measurements at the final moment. In the existing literature, an accurate solution to inverse problems with an unknown space-wise dependent heat source is impossible without introducing some type of regularization; here, however, we determine the unknown discontinuous space-wise dependent heat source accurately using the Haar wavelet collocation method (HWCM) without applying any regularization technique. The HWCM is based on a finite-difference scheme combined with a Haar wavelet approximation of the inverse problem. In contrast to othe
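The core object in any Haar wavelet collocation scheme is the Haar matrix evaluated at the collocation points. As a minimal sketch, not taken from the paper, the standard construction on [0, 1) with collocation points t_l = (l + 0.5)/m can be written as:

```python
import numpy as np

def haar_matrix(m):
    """m x m Haar matrix H with H[i, l] = h_i(t_l) at the collocation
    points t_l = (l + 0.5)/m.  m must be a power of two."""
    t = (np.arange(m) + 0.5) / m
    H = np.zeros((m, m))
    H[0, :] = 1.0                      # scaling function h_1(t) = 1
    i, j = 1, 0
    while i < m:
        for k in range(2 ** j):        # translations at scale j
            lo, mid, hi = k / 2**j, (k + 0.5) / 2**j, (k + 1) / 2**j
            H[i, (t >= lo) & (t < mid)] = 1.0   # positive half-pulse
            H[i, (t >= mid) & (t < hi)] = -1.0  # negative half-pulse
            i += 1
        j += 1
    return H

H = haar_matrix(4)
print(H)
```

In an HWCM discretization, the unknown source is expanded in the rows of this matrix and the expansion coefficients are solved for from the collocated finite-difference system.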
Quantum key distribution (QKD) provides unconditional security in theory. However, practical QKD systems face challenges in maximizing the secure key rate and extending transmission distances. In this paper, we present a comparative study of the BB84 protocol using coincidence detection over two different quantum channels: free space and underwater. Simulated seawater served as an example of an underwater quantum channel. Different single-photon detection modules were used on Bob's side to capture the coincidence counts. The results showed that increasing the mean photon number generally leads to a higher rate of coincidence detection and therefore a higher potential secure key rate. The secure key rat
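The trade-off behind raising the mean photon number can be illustrated with the Poisson statistics of a weak coherent pulse. The following sketch (an illustration under standard assumptions, not the paper's model) shows that a larger mean photon number μ raises the probability of multi-photon pulses, which drives up coincidence counts but is also the security-relevant quantity in BB84:

```python
import math

def photon_number_prob(mu, n):
    """Poisson probability that a weak coherent pulse with mean
    photon number mu contains exactly n photons."""
    return math.exp(-mu) * mu ** n / math.factorial(n)

def multiphoton_prob(mu):
    """Probability that a pulse carries more than one photon."""
    return 1.0 - photon_number_prob(mu, 0) - photon_number_prob(mu, 1)

for mu in (0.1, 0.5, 1.0):
    print(mu, multiphoton_prob(mu))
```

This is why practical BB84 sources keep μ well below one: the multi-photon fraction, and hence the information potentially leaked to an eavesdropper, grows quickly with μ.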
In this work, a fiber-optic biomedical sensor was manufactured to detect hemoglobin percentages in the blood. SPR-based coreless optical fiber sensors were developed and implemented using single and multiple optical fibers, and were used to calculate the refractive indices and hemoglobin concentrations of blood samples. A gold film about 40 nm thick was deposited on the sensing region of the fiber to increase the sensitivity of the sensor. The optical fiber used in this work has a diameter of 125 μm, no core, and consists of a pure silica glass rod with an acrylate coating. The buffer was removed from a 4 cm length of the fiber before the splicing process was carried out. It is found in practice that when the sensitive refractive i
The objective of an Optimal Power Flow (OPF) algorithm is to find a steady-state operating point that minimizes generation cost, losses, etc., while maintaining acceptable system performance in terms of limits on generator real and reactive powers, line flow limits, and so on. The OPF solution includes an objective function; a common choice is the active power generation cost. A linear programming method is proposed to solve the OPF problem. The Linear Programming (LP) approach transforms the nonlinear optimization problem into an iterative algorithm that in each iteration solves a linear optimization problem obtained by linearizing both the objective function and the constraints. A computer program, written in MATLAB environme
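The LP objective being minimized in each iteration can be illustrated with a toy dispatch problem. The sketch below (hypothetical units, not from the paper) solves the linear cost-minimization subproblem — minimize Σ c_i·P_i subject to a demand-balance constraint and generator limits — by merit order, which is optimal when costs are linear and the only coupling constraint is the power balance:

```python
def dispatch(demand, gens):
    """Merit-order solution of the linear dispatch LP:
    minimize sum(c_i * P_i)  s.t.  sum(P_i) = demand,
    Pmin_i <= P_i <= Pmax_i.   gens: list of (cost, pmin, pmax)."""
    # start every unit at its minimum output
    P = [pmin for _, pmin, _ in gens]
    remaining = demand - sum(P)
    # fill the remaining demand from the cheapest unit upward
    for idx in sorted(range(len(gens)), key=lambda i: gens[i][0]):
        cost, pmin, pmax = gens[idx]
        add = min(pmax - pmin, remaining)
        P[idx] += add
        remaining -= add
    return P

# three hypothetical units: ($/MWh, Pmin MW, Pmax MW)
gens = [(20, 10, 100), (30, 10, 80), (50, 0, 60)]
print(dispatch(150, gens))
```

A full LP-based OPF adds the linearized line-flow and reactive-power constraints and re-solves this subproblem at each iteration around the current operating point.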
The development of information systems in recent years has produced various methods of gathering information to evaluate IS performance. The most common approach used to collect information is the survey. This method, however, suffers one major drawback: decision makers spend considerable time transferring data from survey sheets into analytical programs. This paper therefore proposes a method called ‘survey algorithm based on R programming language’, or SABR, for transferring data from survey sheets into R environments by treating the arrangement of data as a relational format. R and the relational data format provide an excellent opportunity to manage and analyse the accumulated data. Moreover, a survey syste
In this work, a new predictive voltage-tracking control algorithm for a Proton Exchange Membrane Fuel Cell (PEMFC) model is developed, using a neural network technique based on an on-line auto-tuning intelligent algorithm. The aim of the proposed robust feedback nonlinear neural predictive voltage controller is to find, precisely and quickly, the optimal hydrogen partial pressure action that controls the stack terminal voltage of the PEMFC model over an N-step-ahead prediction horizon. Chaotic Particle Swarm Optimization (CPSO) is implemented as a stable and robust on-line auto-tuning algorithm to find the optimal weights of the proposed predictive neural network controller, improving system performance in terms of fast-tracking de
This work presents a Surface Plasmon Resonance (SPR)-based plastic optical fiber sensor for estimating the concentration and refractive index of sugar in human blood serum. The sensor is fabricated by embedding a small section (10 mm) in the middle of the fiber in a resin block, polishing it, and then depositing a gold layer about 40 nm thick. The blood serum is placed on the gold-coated core of an optical-grade plastic optical fiber with a 980 µm core diameter.
The evolution of the Internet of Things (IoT) has connected billions of heterogeneous physical devices, improving the quality of human life by collecting data from their environments. These devices, however, generate huge volumes of data that require large storage and high computational capability, and cloud computing can be used to store such big data. IoT device data are transferred using two protocols: Message Queuing Telemetry Transport (MQTT) and Hypertext Transfer Protocol (HTTP). This paper aims to build a high-performance, more reliable system through efficient use of resources. To this end, load balancing in cloud computing is used to dynamically distribute the workload across nodes and avoid overloading any individual r
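One common way to distribute workload across cloud nodes is greedy least-loaded assignment: each incoming task goes to the node with the smallest current load. The sketch below (an illustrative scheme with made-up task names, not the paper's algorithm) implements this with a min-heap keyed on node load:

```python
import heapq

def balance(tasks, n_nodes):
    """Assign each (name, load) task to the currently least-loaded
    node, a greedy 'least-connections' style of load balancing."""
    heap = [(0, node) for node in range(n_nodes)]  # (load, node id)
    heapq.heapify(heap)
    assignment = {node: [] for node in range(n_nodes)}
    for name, load in tasks:
        node_load, node = heapq.heappop(heap)      # lightest node
        assignment[node].append(name)
        heapq.heappush(heap, (node_load + load, node))
    return assignment

tasks = [("t1", 5), ("t2", 3), ("t3", 8), ("t4", 2), ("t5", 4)]
print(balance(tasks, 3))
```

Each assignment costs O(log n) heap operations, so the balancer scales to many nodes; dynamic schemes in real cloud platforms additionally update node loads from live telemetry rather than from the submitted task sizes alone.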