The adsorption of Cr(VI) from aqueous solution by spent tea leaves (STL) was studied at different initial Cr(VI) concentrations, adsorbent doses, pH values and contact times in batch isotherm experiments. The experiments were carried out at 30°C, and the effects of the four parameters on chromium uptake were examined to establish a mathematical model describing the percentage removal of Cr(VI). The analysis showed that the experimental data were adequately fitted by a second-order polynomial model, with a correlation coefficient of R2 = 0.9891. The optimum operating parameters of initial Cr(VI) concentration, adsorbent dose, pH and contact time were 50 mg/L, 0.7625 g, 3 and 100 min, respectively. Under these conditions, the maximum percentage removal of Cr(VI) was 92.88%. The amount of Cr(VI) adsorbed onto STL was strongly affected by the solution pH. Equilibrium data were modeled with the Langmuir and Freundlich isotherms. The Langmuir model represented the equilibrium data very well, with a correlation factor closer to unity than that of the Freundlich model. The maximum monolayer adsorption capacity was found to be 47.98 mg/g at the optimum conditions. The saturated adsorbent was regenerated by base treatment and was found to be efficiently reusable after a fourth cycle at the optimum conditions; for safe disposal, the base containing a high concentration of Cr(VI) was precipitated as barium chromate.
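The Langmuir and Freundlich isotherms named in the abstract have standard closed forms. As an illustrative sketch only (the parameter values below are placeholders, not the study's fitted constants apart from the reported 47.98 mg/g monolayer capacity):

```python
def langmuir_qe(ce, qm, kl):
    """Langmuir isotherm: equilibrium uptake qe (mg/g) for liquid-phase
    concentration Ce (mg/L); qm is the monolayer capacity, kl the
    Langmuir constant. qe approaches qm as Ce grows (monolayer saturation)."""
    return qm * kl * ce / (1.0 + kl * ce)

def freundlich_qe(ce, kf, n):
    """Freundlich isotherm: qe = Kf * Ce^(1/n) (empirical multilayer model)."""
    return kf * ce ** (1.0 / n)
```

Fitting either form to the measured (Ce, qe) pairs and comparing correlation factors is the comparison the abstract reports, with Langmuir the better fit here.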
Results showed that the optimum conditions for inulinase production from the isolate Kluyveromyces marxianus AY2 by submerged culture were achieved using inulin as the carbon source at a concentration of 2%, with a 1:1 mixture of yeast extract and ammonium sulphate at a concentration of 1%, at an initial pH of 5.5, after incubation for 42 hours at 30ºC.
Concentrations of heavy metals (copper Cu, iron Fe, manganese Mn, cadmium Cd, and lead Pb) were studied in the river crab Sesarma boulengeri (outer carapace and internal tissues) caught at two stations on the Shatt Al-Arab river (the Salhia and Aldeir areas). Element concentrations were measured by flame atomic absorption spectrophotometry. Concentrations of heavy metals in the internal tissues were higher than in the outer carapace at both stations, with the highest values being 95.21 mg/kg for iron during spring, as well as 55 mg/kg for copper and 39.09 mg/kg for manganese. The study showed seasonal changes in the concentrations of the studied heavy metals in the tissues of the river crab.
ECG is an important tool for the primary diagnosis of heart diseases, as it shows the electrophysiology of the heart. In our method, a single maternal abdominal ECG signal is taken as the input signal, and the maternal P-QRS-T complexes of the original signal are averaged, repeated and taken as a reference signal. LMS and RLS adaptive filter algorithms are then applied. The results showed that the fetal ECGs were successfully detected. The accuracy on the Daisy database was up to 84% for LMS and 88% for RLS, while on PhysioNet it was up to 98% and 96% for LMS and RLS, respectively.
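The LMS step described above admits a compact sketch. This is an illustrative adaptive-canceller skeleton on synthetic signals (not the paper's ECG data or parameter settings): the reference input models the maternal template stream, and the error output plays the role of the extracted fetal component.

```python
import numpy as np

def lms_cancel(d, x, mu=0.05, order=4):
    """LMS adaptive noise cancellation.

    d: primary signal (e.g. abdominal ECG = maternal + fetal components)
    x: reference signal (e.g. averaged/repeated maternal P-QRS-T template)
    Returns the error signal e = d - y, which keeps whatever the reference
    cannot predict (the fetal part in the paper's setting), plus the weights.
    """
    d = np.asarray(d, dtype=float)
    x = np.asarray(x, dtype=float)
    n = len(d)
    w = np.zeros(order)
    e = np.zeros(n)
    for i in range(order, n):
        u = x[i - order:i]            # most recent reference samples
        y = w @ u                     # filter output: maternal estimate
        e[i] = d[i] - y               # residual after cancellation
        w = w + 2 * mu * e[i] * u     # LMS gradient-descent weight update
    return e, w
```

An RLS version replaces the gradient update with a recursive least-squares gain, trading higher per-sample cost for faster convergence, consistent with the accuracy differences the abstract reports.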
Content-based image retrieval (CBIR) is a technique used to retrieve images from an image database. However, the CBIR process suffers from low accuracy when retrieving images from an extensive image database and does not ensure the privacy of images. This paper aims to address the accuracy issue using deep learning techniques, namely the CNN method. It also provides the necessary privacy for images using the fully homomorphic encryption scheme of Cheon, Kim, Kim, and Song (CKKS). To achieve these aims, a system named RCNN_CKKS has been proposed, consisting of two parts. The first part (offline processing) extracts automated high-level features based on a flattening layer in a convolutional neural network (CNN) and then stores these features in a
Data mining plays a most important role in healthcare for discovering hidden relationships in big datasets, especially in breast cancer diagnostics, breast cancer being among the most common causes of death in the world. In this paper, two algorithms, decision tree and K-Nearest Neighbour, are applied to diagnose breast cancer grade in order to reduce its risk to patients. In the decision tree with feature selection, the Gini index gives an accuracy of 87.83%, while feature selection with entropy gives an accuracy of 86.77%. In both cases, Age appeared as the most effective parameter, particularly when Age < 49.5, whereas Ki67 appeared as the second most effective parameter. Furthermore, K-Nearest Neighbour is based on the minimu
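The Gini criterion behind the decision-tree splits reported above (e.g. on Age < 49.5) has a simple standard definition; a minimal plain-Python sketch, using illustrative data rather than the paper's dataset:

```python
from collections import Counter

def gini(labels):
    """Gini impurity of a label list: 1 - sum over classes of p_k^2.
    0 means a pure node; higher means more mixed classes."""
    n = len(labels)
    if n == 0:
        return 0.0
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def split_gini(ages, labels, threshold):
    """Weighted Gini impurity after splitting samples on Age < threshold;
    the tree picks the threshold that minimizes this value."""
    left = [l for a, l in zip(ages, labels) if a < threshold]
    right = [l for a, l in zip(ages, labels) if a >= threshold]
    n = len(labels)
    return len(left) / n * gini(left) + len(right) / n * gini(right)
```

A split such as Age < 49.5 being "most effective" means it yields the largest impurity reduction at the root.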
In digital images, protecting sensitive visual information against unauthorized access is considered a critical issue, and robust encryption methods are the best way to preserve such information. This paper introduces a model designed to enhance the performance of the Tiny Encryption Algorithm (TEA) in encrypting images. Two approaches have been suggested for the image cipher process as a preprocessing step before applying TEA. This step aims to de-correlate and weaken the relationships between adjacent pixel values in preparation for the encryption process. The first approach applies an affine transformation for image encryption in two layers, utilizing a different key set for each layer. Th
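TEA itself, the base cipher being enhanced here, is a well-documented 64-bit block cipher with a 128-bit key. As a reference point (a minimal sketch of the standard algorithm, not the paper's enhanced model), one block round-trip looks like:

```python
def tea_encrypt(v0, v1, key, rounds=32):
    """Encrypt one 64-bit block (two 32-bit halves) with a 4-word key,
    using TEA's standard constant delta = 0x9E3779B9 and 32 rounds."""
    delta, mask, s = 0x9E3779B9, 0xFFFFFFFF, 0
    for _ in range(rounds):
        s = (s + delta) & mask
        v0 = (v0 + (((v1 << 4) + key[0]) ^ (v1 + s) ^ ((v1 >> 5) + key[1]))) & mask
        v1 = (v1 + (((v0 << 4) + key[2]) ^ (v0 + s) ^ ((v0 >> 5) + key[3]))) & mask
    return v0, v1

def tea_decrypt(v0, v1, key, rounds=32):
    """Invert tea_encrypt by running the rounds in reverse order."""
    delta, mask = 0x9E3779B9, 0xFFFFFFFF
    s = (delta * rounds) & mask
    for _ in range(rounds):
        v1 = (v1 - (((v0 << 4) + key[2]) ^ (v0 + s) ^ ((v0 >> 5) + key[3]))) & mask
        v0 = (v0 - (((v1 << 4) + key[0]) ^ (v1 + s) ^ ((v1 >> 5) + key[1]))) & mask
        s = (s - delta) & mask
    return v0, v1
```

The paper's preprocessing (e.g. the two-layer affine transformation) would run over the pixel data before blocks are fed to a cipher like this.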
Text-based image clustering (TBIC) is an insufficient approach for clustering related web images. It is a challenging task to abstract the visual features of images with the support of textual information in a database. In content-based image clustering (CBIC), image data are clustered on the basis of specific features such as texture, color, boundaries and shape. In this paper, an effective CBIC technique is presented, which uses texture and statistical features of the images. The statistical features, or color moments (mean, skewness, standard deviation, kurtosis, and variance), are extracted from the images. These features are collected in a one-dimensional array, and a genetic algorithm (GA) is then applied for image clustering.
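The five color moments listed in the abstract have standard statistical definitions. A minimal pure-Python sketch for a single color channel (population moments over the flattened pixel values; illustrative, not the paper's implementation):

```python
import math

def color_moments(channel):
    """Return [mean, std, skewness, kurtosis, variance] for one color
    channel given as a flat list of pixel values."""
    n = len(channel)
    mean = sum(channel) / n
    var = sum((v - mean) ** 2 for v in channel) / n
    std = math.sqrt(var)
    # Standardized third and fourth moments; zero for a constant channel.
    skew = sum((v - mean) ** 3 for v in channel) / n / std ** 3 if std else 0.0
    kurt = sum((v - mean) ** 4 for v in channel) / n / std ** 4 if std else 0.0
    return [mean, std, skew, kurt, var]
```

Concatenating these per-channel moments (plus texture features) gives the one-dimensional feature array that the GA then clusters.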
With the escalation of cybercriminal activities, the demand for forensic investigations into these crimes has grown significantly. However, the concept of systematic pre-preparation for potential forensic examinations during the software design phase, known as forensic readiness, has only recently gained attention. Against the backdrop of surging urban crime rates, this study aims to conduct a rigorous and precise analysis and forecast of crime rates in Los Angeles, employing advanced Artificial Intelligence (AI) technologies. This research amalgamates diverse datasets encompassing crime history, various socio-economic indicators, and geographical locations to attain a comprehensive understanding of how crimes manifest within the city. Lev
Numeral recognition is considered an essential preliminary step for optical character recognition, document understanding, and other tasks. Although several handwritten numeral recognition algorithms have been proposed so far, achieving adequate recognition accuracy and execution time remains challenging to date. In particular, recognition accuracy depends on the feature extraction mechanism. As such, a fast and robust numeral recognition method is essential, one that meets the desired accuracy by extracting the features efficiently while maintaining a fast implementation time. Furthermore, to date most of the existing studies have evaluated their methods only in clean environments, thus limiting understanding of their potential a
On the Internet, nothing is secure, and as we need means of protecting our data, the use of passwords has become important in the electronic world. To prevent hacking and to protect databases that contain important information such as ID cards and banking information, the proposed system stores the username after hashing it with the SHA-256 algorithm, and strong passwords are saved to repel attackers using one of two methods. The first method adds a random salt to the password using a CSPRNG, then hashes it with SHA-256 and stores it on the website. The second method uses the PBKDF2 algorithm, which salts the passwords and stretches them (deriving the password) before being ha