The development of efficient and environmentally friendly catalysts for the electro-oxidation of hydrazine derivatives is of great importance in various industrial applications. In this study, we report the use of graphite-based catalysts for the electro-oxidation of hydrazine derivatives, with sodium chloride as a green and sustainable chemical mediator. Graphite, a layered carbon material with exceptional properties, offers numerous advantages as a catalyst, including high surface area, excellent electrical conductivity, and chemical stability; these characteristics make it well suited to promoting electrochemical reactions. Sodium chloride (NaCl), a readily available and inexpensive salt, serves as a green alternative to the oxidants traditionally used in hydrazine oxidation processes. By replacing conventional oxidizing agents with NaCl, we aim to reduce the environmental impact associated with producing and disposing of hazardous chemicals. The process transforms the HN–NH bond of hydrazines into the azo (N=N) bond, giving azo compounds, which are important organic molecules with diverse applications in organic synthesis. The approach was demonstrated on 13 examples, affording azo compounds in moderate to excellent yields. Because electricity serves as the terminal oxidant, the strategy is environmentally benign, and its high efficiency and mild reaction conditions make it valuable for synthesizing azo derivatives even from hydrazines bearing diverse functional groups. Through systematic experiments, we evaluated the catalytic performance of the graphite-based catalysts in the electro-oxidation of hydrazine derivatives.
The catalysts demonstrated remarkable activity, efficiently converting hydrazine derivatives into the desired products. Moreover, the system exhibited good stability and recyclability, suggesting its suitability for practical applications.
Logistic regression is a statistical method used to describe and estimate the relationship between a random response variable (Y) and explanatory variables (X). Here the dependent variable is a binary response taking two values: one when a specific event occurs and zero when it does not (e.g., injured/uninjured, married/unmarried). A large number of explanatory variables gives rise to multicollinearity, which makes the estimates inaccurate. The maximum likelihood method and ridge regression were therefore used to estimate the binary-response logistic regression model, adopting the Jackknife…
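As a hedged illustration of the modelling problem this abstract describes, the sketch below fits a binary-response logistic regression with an L2 (ridge-style) penalty to synthetic data whose predictors are nearly collinear; all variable names and values are invented for illustration, not the paper's data or code.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic example: two nearly collinear explanatory variables (a classic
# source of multicollinearity) and a binary 0/1 response.
rng = np.random.default_rng(0)
n = 500
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)   # almost a copy of x1
X = np.column_stack([x1, x2])
p = 1 / (1 + np.exp(-(1.5 * x1 - 0.5)))    # true model uses x1 only
y = rng.binomial(1, p)                     # 1 = event occurred, 0 = it did not

# penalty="l2" shrinks coefficients, stabilising the estimates that plain
# maximum likelihood would render unstable under collinearity.
model = LogisticRegression(penalty="l2", C=1.0).fit(X, y)
acc = model.score(X, y)
```

A Jackknife-style assessment, as the abstract mentions, would refit this model leaving out one observation at a time and examine the spread of the resulting coefficient estimates.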
The production of power by pressure-retarded osmosis (PRO) has been studied both experimentally and theoretically for simulated seawater versus river water and deionized water under two cases: the first simulating real seawater and river water conditions, and the second using a low brine-solution concentration to examine the full power–pressure profile. The influence of concentration polarization (CP) on water flux has been examined as well.
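A minimal sketch of the power–pressure profile mentioned above, assuming the common idealised PRO relations Jw = A(Δπ − ΔP) for water flux and W = Jw·ΔP for power density (which peaks at ΔP = Δπ/2); the permeability and osmotic pressure values are assumed round numbers, not the study's measurements, and concentration polarization is ignored.

```python
# Assumed example values, not measured data:
A = 1.0e-12      # water permeability, m/(s*Pa)
dpi = 2.5e6      # osmotic pressure difference, Pa (roughly seawater vs river)

def power_density(dP):
    Jw = A * (dpi - dP)          # idealised water flux, m/s
    return Jw * dP               # power per membrane area, W/m^2

# Scan the full power-pressure profile to locate the maximum
best_dP = max(range(0, int(dpi), 10_000), key=power_density)
print(round(best_dP / dpi, 2))   # → 0.5 (peak at half the osmotic pressure)
```

Concentration polarization would reduce the effective Δπ at the membrane surface, flattening this profile, which is why the abstract examines its influence on flux separately.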
The water quality index is the most common mathematical tool for monitoring water characteristics, because it combines water parameters to identify the type of water and its suitability for drinking, agricultural, or industrial use. The weighted arithmetic index method was used to evaluate the drinking water of the Al-Muthana project, whose design capacity is 40,000 m³/day and which consists of conventional units for treating raw water. Based on the water parameters (Turb, TDS, TH, SO4, NO2, NO3, Cl, Mg, and Ca), the evaluation showed that the drinking water quality falls within the second category of the WHO requirements (86.658%), while the first category of the standard has not been met du…
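The weighted arithmetic index named above can be sketched as follows. The measured values and WHO-style limits below are assumed placeholders for illustration, NOT the measurements reported for the Al-Muthana project; the ideal value is taken as zero for each parameter.

```python
# Assumed illustrative values (mg/L) for a subset of the listed parameters:
measured = {"TDS": 450.0, "TH": 300.0, "Cl": 180.0, "NO3": 30.0}
standard = {"TDS": 500.0, "TH": 500.0, "Cl": 250.0, "NO3": 45.0}

# Unit weight w_i is inversely proportional to each standard S_i,
# with k chosen so the weights sum to 1.
k = 1.0 / sum(1.0 / s for s in standard.values())
weights = {p: k / s for p, s in standard.items()}

# Quality rating q_i = 100 * V_i / S_i (ideal value taken as 0)
ratings = {p: 100.0 * measured[p] / standard[p] for p in measured}

# WQI = sum(w_i * q_i) / sum(w_i); values below 100 indicate the
# parameters are within their limits.
wqi = sum(weights[p] * ratings[p] for p in measured) / sum(weights.values())
print(round(wqi, 1))
```

Parameters with stricter limits (here NO3) receive larger weights, so an exceedance there moves the index more than the same relative exceedance in TDS.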
Loanwords are words transferred from one language to another that become an essential part of the borrowing language. Loanwords move from the source language to the recipient language for many reasons. Detecting them is a complicated task because there are no standard specifications for transferring words between languages, which leads to low accuracy. This work tries to improve the accuracy of detecting loanwords between Turkish and Arabic as a case study. The proposed system contributes by finding all possible loanwords using any set of characters, whether alphabetically or randomly arranged. It then handles distortion in pronunciation and solves the problem of the missing lette…
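One common baseline for this kind of detection, sketched here as an illustration rather than the paper's algorithm, is a normalised edit distance over transliterated word forms: pairs whose distance falls under a threshold are flagged as loanword candidates. The threshold and word pair below are assumptions.

```python
def edit_distance(a: str, b: str) -> int:
    # Classic dynamic-programming Levenshtein distance
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def is_loanword_candidate(w1: str, w2: str, threshold: float = 0.34) -> bool:
    # Normalise by the longer word so pairs of different lengths compare fairly
    return edit_distance(w1, w2) / max(len(w1), len(w2)) <= threshold

print(is_loanword_candidate("kitap", "kitab"))   # Turkish/Arabic "book" → True
```

Pronunciation distortion and missing letters, as the abstract mentions, would be handled on top of this by mapping phonetically similar characters to a common form before measuring the distance.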
Digital image manipulation has become increasingly prevalent due to the widespread availability of sophisticated image-editing tools. In copy-move forgery, a portion of an image is copied and pasted into another area within the same image. The proposed methodology begins by extracting the image's features with the Local Binary Pattern (LBP) algorithm. Two main statistical functions, Standard Deviation (STD) and Angular Second Moment (ASM), are computed for each LBP feature, capturing additional statistical information about the local textures. Next, a multi-level LBP feature selection is applied to select the most relevant features. This process involves performing LBP computation at multiple scales or levels, capturing textures at different…
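The LBP-plus-statistics step described above can be sketched as follows; this is a simplified re-implementation for illustration, not the paper's exact pipeline, and the image is random placeholder data.

```python
import numpy as np

def lbp_map(img: np.ndarray) -> np.ndarray:
    # Basic 8-neighbour LBP: each pixel gets a byte whose bits say whether
    # each neighbour is >= the centre pixel.
    h, w = img.shape
    out = np.zeros((h - 2, w - 2), dtype=np.uint8)
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]     # clockwise neighbours
    center = img[1:-1, 1:-1]
    for bit, (dy, dx) in enumerate(offsets):
        neigh = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        out |= (neigh >= center).astype(np.uint8) << bit
    return out

def lbp_stats(img: np.ndarray):
    codes = lbp_map(img)
    hist = np.bincount(codes.ravel(), minlength=256) / codes.size
    std = float(codes.std())          # spread of the LBP codes
    asm = float((hist ** 2).sum())    # angular second moment: texture uniformity
    return std, asm

rng = np.random.default_rng(1)
std, asm = lbp_stats(rng.integers(0, 256, (32, 32), dtype=np.uint8))
```

A multi-level variant would repeat `lbp_stats` on downscaled copies of the image and concatenate the (STD, ASM) pairs into one feature vector before matching candidate forged regions.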
The support vector machine (SVM) is a supervised learning model that can be used for classification or regression, depending on the dataset. SVM classifies data points by determining the best separating hyperplane between two or more groups. Working with enormous datasets, however, can cause a variety of issues, including low accuracy and long computation times. In this research, SVM was extended by applying several kernel transformations: linear, polynomial, radial basis, and multi-layer kernels. The non-linear SVM classification model was illustrated and summarized in an algorithm using the kernel trick. The proposed method was examined on three simulated datasets with different sample…
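A hedged sketch of the comparison this abstract describes: the same SVM fitted with several kernels on one simulated dataset with a non-linear class boundary. The data are synthetic, and scikit-learn's stock `kernel="sigmoid"` is used here as the closest stand-in for the "multi-layer" (multilayer-perceptron-style) kernel the abstract names; that mapping is an assumption.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

# Simulated data: a circular boundary that no linear hyperplane can separate
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 2))
y = (X[:, 0] ** 2 + X[:, 1] ** 2 > 1.0).astype(int)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

# Fit the same classifier with each kernel and record test accuracy
scores = {}
for kernel in ("linear", "poly", "rbf", "sigmoid"):
    clf = SVC(kernel=kernel).fit(Xtr, ytr)
    scores[kernel] = clf.score(Xte, yte)

print(scores["rbf"] > scores["linear"])   # non-linear kernel wins here
```

On a boundary like this the radial basis kernel comfortably beats the linear one, which is the point of the kernel trick: the data become separable in the implicitly mapped feature space without ever computing that mapping explicitly.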
Ground Penetrating Radar (GPR) is a nondestructive geophysical technique that uses electromagnetic waves to evaluate subsurface information. A GPR unit emits a short pulse of electromagnetic energy and determines the presence or absence of a target by examining the energy reflected from that pulse. GPR is a geophysical approach that uses a band of the radio spectrum. In this research, GPR was used to survey different buried objects (iron, plastic (PVC), and aluminum) at a specified depth of about 0.5 m using a 250 MHz antenna; the response of each object can be recognized by its shape, and this recognition has been performed using image processing…
Wildfire risk has increased globally during the past few years due to several factors. An efficient and fast response to wildfires is extremely important to reduce their damaging effect on humans and wildlife. This work introduces a methodology for designing an efficient machine learning system to detect wildfires using satellite imagery. A convolutional neural network (CNN) model is optimized to reduce the required computational resources. Because images containing fire are limited and conditions vary seasonally, an image augmentation process is used to develop adequate training samples covering changes in the forest's visual features and the seasonal wind direction at the study area during the fire season. The selected CNN model (Mob…
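The augmentation step described above can be sketched minimally as below: flips, right-angle rotations, and brightness shifts stand in for wind-direction and seasonal variation. This is an illustrative NumPy sketch, not the study's pipeline; a framework-specific augmentation layer would replace it in practice.

```python
import numpy as np

def augment(img: np.ndarray, rng: np.random.Generator) -> np.ndarray:
    out = img.copy()
    if rng.random() < 0.5:
        out = out[:, ::-1]                     # horizontal flip (wind direction)
    out = np.rot90(out, k=rng.integers(0, 4))  # random 90-degree rotation
    gain = rng.uniform(0.8, 1.2)               # seasonal brightness change
    return np.clip(out * gain, 0.0, 1.0)

rng = np.random.default_rng(42)
base = rng.random((64, 64, 3))                 # stand-in for one satellite tile
batch = np.stack([augment(base, rng) for _ in range(8)])
print(batch.shape)                             # → (8, 64, 64, 3)
```

Each pass over the dataset then sees a differently transformed copy of every scarce fire image, which is what lets a small CNN train without overfitting to one season's appearance.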
Many authors have investigated the problem of the early visibility of the new crescent moon after conjunction and proposed many criteria addressing this issue in the literature. This article presents a proposed criterion for early crescent-moon sighting based on the performance of a deep-learned pattern-recognizing artificial neural network (ANN). Moon-sighting datasets were collected from various sources and used to train the ANN. The new criterion relies on the crescent width and the arc of vision from the edge of the crescent's bright limb. The result of the criterion is a control value indicating the moon's visibility condition, which separates the datasets into four regions: invisible, visible by telescope only, probably visible, and certainly…
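The shape of such a criterion can be sketched as a small ANN mapping the two named features (crescent width, arc of vision) to a four-class visibility label. The labelling rule, thresholds, and data below are entirely invented for illustration and are NOT the paper's fitted criterion or its datasets.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in data: random feature values plus an invented scoring
# rule that assigns the four visibility regions named in the abstract.
rng = np.random.default_rng(0)
width = rng.uniform(0.0, 1.0, 1000)    # crescent width (arbitrary units)
arcv = rng.uniform(0.0, 12.0, 1000)    # arc of vision, degrees
X = np.column_stack([width, arcv])
score = 6 * width + arcv               # invented control value
# 0 = invisible, 1 = telescope only, 2 = probably visible, 3 = certainly
y = np.digitize(score, [3.0, 6.0, 9.0])

# A small multilayer perceptron learns to reproduce the regions
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                    random_state=0).fit(X, y)
```

With real sighting records in place of the synthetic labels, the trained network's decision boundaries themselves become the published criterion separating the four regions.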