Abstract: Background: Optical biosensors offer excellent properties and methods for detecting bacteria compared to traditional analytical techniques, allowing direct detection of many biological and chemical materials. Bacteria occur in the human body, as in other living organisms, in both non-pathogenic and pathogenic forms. One such bacterium is Escherichia coli (E. coli), which is found in the human body in both its natural and pathogenic forms. Pathogenic E. coli causes many diseases, including infections of the stomach, intestines, and urinary system. Aim of the study: to sense and differentiate between normal-flora and pathogenic E. coli. Material and method: The optical biosensor was constructed from a multi-mode, no-core, multi-mode (MM-NOC-MM) optical fibre that differentiates between pathogenic and non-pathogenic E. coli by measuring the change in transmitted light intensity, using a 410 nm laser diode as the light source. The MM-NOC-MM fibre was connected to the optical spectrum analyser (OSA, HR2000) through an adapter and finally to a computer to display the results. Results: The transmitted light intensity recorded for the pathogenic bacteria was lower than that recorded for the non-pathogenic bacteria. Conclusion: These results were obtained because the chosen laser wavelength matches an absorption band of E. coli bacteria.
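A minimal sketch (not the authors' acquisition software) of how the transmitted spectra recorded by the spectrometer could be compared around the 410 nm laser line; the file names and CSV layout (wavelength, intensity columns) are assumptions.

```python
import numpy as np

def mean_intensity(csv_path, centre_nm=410.0, half_window_nm=5.0):
    """Average transmitted intensity in a narrow band around the laser line."""
    data = np.loadtxt(csv_path, delimiter=",", skiprows=1)  # columns: wavelength_nm, intensity
    wl, inten = data[:, 0], data[:, 1]
    band = (wl >= centre_nm - half_window_nm) & (wl <= centre_nm + half_window_nm)
    return inten[band].mean()

# Hypothetical recordings of the two sample types
i_normal = mean_intensity("e_coli_normal_flora.csv")
i_pathogenic = mean_intensity("e_coli_pathogenic.csv")
print(f"Normal flora: {i_normal:.1f} counts, pathogenic: {i_pathogenic:.1f} counts")
print("Pathogenic sample attenuates more light" if i_pathogenic < i_normal
      else "No attenuation difference observed")
```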
Optimizing system performance and managing computational tasks efficiently are crucial in dynamic and heterogeneous environments. This paper therefore examines task scheduling and resource allocation algorithms in depth. The work evaluates five algorithms: Genetic Algorithms (GA), Particle Swarm Optimization (PSO), Ant Colony Optimization (ACO), the Firefly Algorithm (FA), and Simulated Annealing (SA), across workloads obtained by varying the task-to-node ratio. The paper identifies Finish Time and Deadline as two key performance metrics for gauging the efficacy of an algorithm, and a comprehensive investigation of the behaviors of these algorithms across the different workloads was carried out. Results from the experiment
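As an illustration of the kind of scheduler being benchmarked, the sketch below shows a toy genetic algorithm that assigns tasks to nodes and scores each chromosome by its finish time (makespan); the task lengths, node count, and GA settings are invented for demonstration and are not taken from the paper.

```python
import random

TASKS = [random.randint(1, 20) for _ in range(30)]   # task lengths (arbitrary units)
NODES = 5                                            # changing this varies the task-to-node ratio

def makespan(assignment):
    """Finish time: the load of the most heavily loaded node."""
    loads = [0] * NODES
    for task_len, node in zip(TASKS, assignment):
        loads[node] += task_len
    return max(loads)

def genetic_schedule(pop_size=40, generations=200, mutation_rate=0.05):
    pop = [[random.randrange(NODES) for _ in TASKS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=makespan)                       # lower makespan = fitter
        survivors = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, len(TASKS))
            child = a[:cut] + b[cut:]                # one-point crossover
            for i in range(len(child)):              # random mutation of node assignments
                if random.random() < mutation_rate:
                    child[i] = random.randrange(NODES)
            children.append(child)
        pop = survivors + children
    return min(pop, key=makespan)

best = genetic_schedule()
print("Best finish time (makespan):", makespan(best))
```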
The present work aims to study the efficiency of using locally available aluminum refuse (after dissolving it in sodium hydroxide) with different coagulants, such as alum [Al2(SO4)3.18H2O], ferric chloride (FeCl3), and polyaluminum chloride (PACl), to improve water quality. The results showed that using this coagulant in the flocculation process gave high turbidity removal and improved water quality by precipitating a great deal of the ions causing hardness. From the experimental jar-test results, the optimum alum dosages were (25, 50 and 70 ppm), the ferric chloride dosages were (15, 40 and 60 ppm), and the polyaluminum chloride dosages were (10, 35 and 55 ppm) for initial water turbidity (100, 500 an
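For reference, the removal efficiency usually reported for jar-test data can be computed as follows; the residual-turbidity figure in the example is a placeholder, not one of the paper's results.

```python
def removal_efficiency(initial_ntu, final_ntu):
    """Percentage of turbidity removed by coagulation/flocculation."""
    return 100.0 * (initial_ntu - final_ntu) / initial_ntu

# Example: an initial turbidity of 100 NTU reduced to a hypothetical 4 NTU
print(f"{removal_efficiency(100, 4):.1f}% turbidity removal")
```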
This study includes estimating the scale parameter, location parameter, and reliability function of the Extreme Value (EXV) distribution by two methods, namely:
- Maximum Likelihood Method (MLE).
- Probability Weighted Moments Method (PWM).
Simulation was used to generate the required samples for estimating the parameters and the reliability function at different sample sizes (n = 10, 25, 50, 100), with real values given for the parameters, and the simulation experiments were replicated (RP = 1000) times.
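A hedged sketch of this simulation set-up, assuming the Type I (Gumbel) form of the extreme value distribution; the "true" parameter values below are placeholders, since they are not stated here.

```python
import numpy as np
from scipy import stats

TRUE_LOC, TRUE_SCALE = 2.0, 1.5            # assumed values for illustration only

def pwm_gumbel(x):
    """Probability-weighted-moments estimators for the Gumbel distribution."""
    x = np.sort(x)
    n = len(x)
    b0 = x.mean()
    b1 = np.sum((np.arange(1, n + 1) - 1) / (n - 1) * x) / n
    scale = (2 * b1 - b0) / np.log(2)
    loc = b0 - np.euler_gamma * scale
    return loc, scale

rng = np.random.default_rng(0)
for n in (10, 25, 50, 100):                # sample sizes used in the study
    errs_mle, errs_pwm = [], []
    for _ in range(1000):                  # RP = 1000 replications
        sample = stats.gumbel_r.rvs(loc=TRUE_LOC, scale=TRUE_SCALE, size=n, random_state=rng)
        loc_mle, scale_mle = stats.gumbel_r.fit(sample)      # maximum likelihood fit
        loc_pwm, scale_pwm = pwm_gumbel(sample)              # PWM fit
        errs_mle.append((loc_mle - TRUE_LOC) ** 2 + (scale_mle - TRUE_SCALE) ** 2)
        errs_pwm.append((loc_pwm - TRUE_LOC) ** 2 + (scale_pwm - TRUE_SCALE) ** 2)
    print(f"n={n:3d}  MSE(MLE)={np.mean(errs_mle):.4f}  MSE(PWM)={np.mean(errs_pwm):.4f}")

# Reliability function R(t) = 1 - F(t), evaluated here at t = 3 with the last MLE fit
print("R(3) =", stats.gumbel_r.sf(3, loc=loc_mle, scale=scale_mle))
```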
Text categorization refers to the process of grouping text or documents into classes or categories according to their content. The text categorization process consists of three phases: preprocessing, feature extraction, and classification. In comparison to the English language, only a few studies have been done to categorize and classify the Arabic language. For a variety of applications, such as text classification and clustering, Arabic text representation is a difficult task because the Arabic language is noted for its richness, diversity, and complicated morphology. This paper presents a comprehensive analysis and comparison of research in the last five years based on the dataset, year, algorithms, and the accuracy th
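A minimal sketch of the three-phase pipeline the surveyed work follows (preprocessing, feature extraction, classification) using scikit-learn; the tiny Arabic corpus and its labels are invented purely for demonstration.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Toy Arabic documents (weather vs. sports) standing in for a real labelled corpus
docs = ["الطقس اليوم حار جدا", "فاز الفريق بالمباراة", "انخفضت درجات الحرارة", "سجل اللاعب هدفين"]
labels = ["weather", "sports", "weather", "sports"]

model = make_pipeline(
    TfidfVectorizer(),   # feature extraction: TF-IDF term weights (tokenization = preprocessing)
    MultinomialNB(),     # classification phase
)
model.fit(docs, labels)
print(model.predict(["تعادل الفريقان في المباراة"]))   # expected label: sports
```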
This paper is concerned with the design and implementation of an image compression method based on the biorthogonal tap-9/7 discrete wavelet transform (DWT) and quadtree coding. As a first step, color correlation is handled using the YUV color representation instead of RGB. Then, the chromatic sub-bands are downsampled, and the data of each color band is transformed using the wavelet transform. The produced wavelet sub-bands are quantized using a hierarchical scalar quantization method. The quantized detail coefficients are coded using quadtree coding followed by Lempel-Ziv-Welch (LZW) encoding, while the approximation coefficients are coded using delta coding followed by LZW encoding. The test results indicated that the compression results are com
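A rough sketch of the transform-and-quantization stage only, assuming PyWavelets' 'bior4.4' filters as the CDF 9/7 pair; the color handling, quadtree, delta, and LZW coding stages described above are omitted, and the quantization step size is an arbitrary assumption.

```python
import numpy as np
import pywt

image = np.random.rand(256, 256) * 255          # placeholder for a real luminance (Y) channel

# Single-level 2-D DWT: approximation (LL) and detail (LH, HL, HH) sub-bands
LL, (LH, HL, HH) = pywt.dwt2(image, "bior4.4")

STEP = 8.0                                      # assumed scalar quantization step
q_detail = [np.round(band / STEP).astype(np.int32) for band in (LH, HL, HH)]

# In the paper the approximation band is delta-coded and the details quadtree + LZW coded;
# here we only dequantize and invert the transform to gauge the quantization error.
recon = pywt.idwt2((LL, tuple(q * STEP for q in q_detail)), "bior4.4")
print("Reconstruction RMSE:", np.sqrt(np.mean((recon[:256, :256] - image) ** 2)))
```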
Metal oxide nanoparticles, including iron oxide, are considered among the most important classes of nanomaterials for a wide range of applications due to their optical, magnetic, and electrical properties. Iron oxides are common compounds, widespread in nature and easily synthesized in the laboratory. In this paper, iron oxide nanoparticles were prepared by co-precipitation of Fe2+ and Fe3+ ions, using iron (II and III) sulfate as the precursor material and NH4OH solution as solvent at 90°C. After the synthesis of the iron oxide particles, they were characterized using X-ray diffraction (XRD), Fourier-transform infrared spectroscopy (FTIR), and scanning electron microscopy (SEM). These tests confirmed the obtaining o
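Assuming the product of this Fe2+/Fe3+ co-precipitation is magnetite, as is typical for this route (the abstract itself does not name the phase obtained), the overall reaction is commonly written as:

```latex
\mathrm{Fe^{2+} + 2\,Fe^{3+} + 8\,OH^{-} \longrightarrow Fe_3O_4 + 4\,H_2O}
```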