This study examined the use of electrocoagulation (EC) to remove ciprofloxacin (CIP) and levofloxacin (LVX) from aqueous solutions. Under optimum conditions, the removal efficiencies were 93.47% for CIP and 88.00% for LVX. Adsorption isotherm models were applied to identify the mechanisms governing CIP and LVX elimination by the EC method. The findings showed that the adsorption of CIP and LVX on iron hydroxide flocs followed the Sips isotherm, with correlation coefficients (R²) of 0.939 and 0.937, respectively. Three kinetic models were also evaluated; the data fitted the second-order model, indicating that a chemical adsorption mechanism controlled the removal of CIP and LVX, with R² of 0.944 for CIP and 0.941 for LVX. For binary-system removal, efficiencies were 93.00, 90.10, and 96.30% for CIP and 91.80, 96.10, and 92.97% for LVX at CIP:LVX ratios of 1:1, 1:4, and 4:1, respectively. For a single operation, the electrode consumption (ELC) was 0.208 g and the electrical energy consumption (EEC) was 3.21 kWh m−3. The operating cost was estimated at 0.613 US$ m−3.
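The Sips isotherm and the pseudo-second-order kinetic model named in this abstract have standard textbook forms; for the reader's reference (these symbols are the conventional ones, not the paper's fitted parameters):

```latex
% Sips isotherm: adsorbed amount q_e at equilibrium concentration C_e,
% with maximum capacity q_m, affinity constant K_s, and heterogeneity exponent n
q_e = \frac{q_m \, (K_s C_e)^{n}}{1 + (K_s C_e)^{n}}

% Pseudo-second-order kinetics (linearized form), the model whose good fit
% is usually read as evidence of chemisorption
\frac{t}{q_t} = \frac{1}{k_2 \, q_e^{2}} + \frac{t}{q_e}
```

The R² values reported above would come from fitting these expressions to the experimental uptake data.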
The feature extraction step plays a major role in proper object classification and recognition, and it depends mainly on correct object detection in the given scene. Object detection algorithms may introduce noise that distorts the final object shape. This paper introduces a novel approach for filling the holes in a detected object, for better object detection and correct feature extraction. The method is based on the definition of a hole as a background (black) pixel surrounded by a connected boundary region; it therefore tries to find a connected contour region that surrounds each background pixel, using a roadmap racing algorithm. The method shows good results on 2D objects.
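The hole definition used above (a background pixel fully enclosed by the object) can be illustrated with a standard border flood fill: any background pixel not reachable from the image border is a hole. This is a minimal baseline sketch, not the paper's roadmap-racing algorithm:

```python
from collections import deque

def fill_holes(mask):
    """Fill holes in a binary 2D mask (list of lists of 0/1).

    A hole is a 0-pixel that cannot be reached from the image border
    through 4-connected 0-pixels; such pixels are set to 1.
    """
    h, w = len(mask), len(mask[0])
    outside = [[False] * w for _ in range(h)]
    q = deque()
    # Seed the flood fill with every background pixel on the border.
    for y in range(h):
        for x in range(w):
            if (y in (0, h - 1) or x in (0, w - 1)) and mask[y][x] == 0:
                outside[y][x] = True
                q.append((y, x))
    # Propagate "outside" through 4-connected background pixels.
    while q:
        y, x = q.popleft()
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w \
                    and mask[ny][nx] == 0 and not outside[ny][nx]:
                outside[ny][nx] = True
                q.append((ny, nx))
    # Background pixels never reached from the border are holes: fill them.
    return [[1 if mask[y][x] == 1 or not outside[y][x] else 0
             for x in range(w)] for y in range(h)]
```

For example, a 3×3 ring of 1s inside a 5×5 image has its single interior 0 filled, while the exterior background stays 0.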
Keywords: object filling, object detection, objec
Steganography can be defined as the art and science of hiding information within computer-readable data, such that the stego-cover cannot be distinguished from the original, whether by eye or by computer inspection of statistical samples. This paper presents a new method to hide text within text characters. The systematic method uses the structure of an invisible character to hide and extract secret texts. The creation of the secret message comprises four main stages: using the letters from the original message, selecting a suitable cover text, dividing the cover text into blocks, hiding the secret text using the invisible character, and comparing the cover text and stego-object. This study uses an invisible character (white space
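The general idea of invisible-character text steganography can be sketched as follows. This is an illustrative scheme using Unicode zero-width characters to encode the secret's bits, not the paper's exact block-based method:

```python
# Zero-width space encodes bit 0; zero-width non-joiner encodes bit 1.
# Both render invisibly, so the stego text looks identical to the cover.
ZW0, ZW1 = "\u200b", "\u200c"

def hide(cover: str, secret: str) -> str:
    """Append the secret's bits as invisible characters to the cover text."""
    bits = "".join(f"{byte:08b}" for byte in secret.encode("utf-8"))
    return cover + "".join(ZW1 if b == "1" else ZW0 for b in bits)

def extract(stego: str) -> str:
    """Recover the secret from the invisible characters in the stego text."""
    bits = "".join("1" if ch == ZW1 else "0"
                   for ch in stego if ch in (ZW0, ZW1))
    data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode("utf-8")
```

A round trip such as `extract(hide("some cover text", "secret"))` returns `"secret"`, while the stego string displays exactly like the cover.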
Over the past few years, ear biometrics has attracted a lot of attention. It is a trusted biometric for the identification and recognition of humans due to its consistent shape and rich texture variation. The ear presents an attractive solution since it is visible, ear images are easily captured, and the ear structure remains relatively stable over time. In this paper, a comprehensive review of prior research was conducted to establish the efficacy of using ear features for individual identification, through both manually crafted features and deep-learning approaches. The objective of this model is to present the accuracy rate of person identification systems based on either manually crafted features such as D
This dissertation studies the topological structure in graph theory, introduces some related concepts, and generalizes them into new topological spaces constructed from the elements of a graph. To this end, it presents theorems, propositions, and corollaries that are available in the literature and proves those that are not. Moreover, it studies the relationships between many concepts and examines their equivalence properties, such as local connectedness, convexity, intervals, and compactness. In addition, it introduces separation axioms in α-topological spaces that are weaker than the standard ones, namely α-feebly Hausdorff, α-feebly regular, and α-feebly normal, and studies their properties. Furthermor
This paper is concerned with introducing and studying the first new approximation operators using a mixed degree system and the second new approximation operators using a mixed degree system, which are the core concepts of this paper. The approximations of graphs using the first lower and first upper operators are less accurate than those obtained using the second lower and second upper operators, since the first accuracy is less than the second. For this reason, we study the properties of the second lower and second upper operators in detail. Furthermore, we summarize the results for the properties of the second lower and second upper approximation operators when the graph G is arbitrary, serial 1, serial 2, reflexive, symmetric, tra
In cognitive radio networks, there are two important probabilities. The first, the probability of detection, matters to primary users because it indicates their level of protection from secondary users; the second, the probability of false alarm, matters to secondary users because it determines their use of an unoccupied channel. Cooperative sensing can improve both probabilities. A new approach to determining optimal values for these probabilities is proposed for multiple secondary users: an optimal threshold is discovered for each detection curve, and then the optimal thresholds are found jointly. To get the aggregated throughput over transmission
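The threshold trade-off described above can be illustrated with a toy Monte Carlo energy detector: raising the decision threshold lowers the false-alarm probability but also lowers the detection probability, which is what tracing the detection curve captures. This is a hypothetical simulation (the sample counts, SNR, and function names are assumptions, not from the paper):

```python
import math
import random

def energy_detector(threshold, n_samples=500, snr=1.0, trials=2000, seed=0):
    """Monte Carlo estimate of (Pd, Pfa) for a simple energy detector.

    The detector declares the channel occupied when the average sample
    energy exceeds `threshold`. Noise is unit-variance Gaussian; when the
    primary signal is present the total variance rises to 1 + snr.
    """
    rng = random.Random(seed)

    def avg_energy(signal_present):
        scale = math.sqrt(1.0 + (snr if signal_present else 0.0))
        return sum((rng.gauss(0, 1) * scale) ** 2
                   for _ in range(n_samples)) / n_samples

    pd = sum(avg_energy(True) > threshold for _ in range(trials)) / trials
    pfa = sum(avg_energy(False) > threshold for _ in range(trials)) / trials
    return pd, pfa
```

Sweeping `threshold` and recording the resulting (Pd, Pfa) pairs traces one secondary user's detection curve; picking a per-user threshold on each curve and then optimizing across users mirrors the joint-threshold idea in the abstract.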