In this study, DFT calculations on the cyclopropanone, cyclopropanedione and cyclopropanetrione molecules were performed using the 6-31G** basis set with MP2 and the B3-LYP exchange-correlation functional. The results showed that the ground-state geometries of all molecules belong to the point group C2v; in the C3O3 molecule, vibronic coupling between the vibrational motion and the electronic ground state reduces the symmetry of the molecule from D3h to C2v, and the driving force of this process is access to an electronic configuration that complies with Hückel aromaticity for two electrons. The normal modes of vibration, their frequencies, intensities and symmetry species were also determined in this study. Finally, the angular strain energy of the molecules was calculated by means of an isodesmic reaction.
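The abstract does not state which isodesmic reaction was used; as a purely illustrative sketch, a bond-separation scheme of the kind commonly applied to cyclopropanone, with the strain energy roughly identified with the negative of the computed reaction energy, could be written as follows:

```latex
% Illustrative bond-separation (isodesmic) scheme for cyclopropanone;
% the abstract does not specify the reaction actually used.
\begin{align}
  \mathrm{c\text{-}C_3H_4O} + 4\,\mathrm{CH_4} &\longrightarrow 3\,\mathrm{C_2H_6} + \mathrm{H_2CO} \\
  E_{\text{strain}} &\approx -\Delta E_{\text{rxn}}
    = \sum E(\text{reactants}) - \sum E(\text{products})
\end{align}
```

Both sides contain three C–C bonds, one C=O bond and twenty C–H bonds, so bond types are conserved as the isodesmic definition requires.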
Seawater might serve as a fresh-water supply for future generations, helping to meet the growing need for clean drinking water. Desalination and waste management based on newer, more energy-intensive processes are not viable options in the long term. Thus, an integrated and sustainable strategy is required to accomplish cost-effective desalination via wastewater treatment. A microbial desalination cell (MDC) is a new technology that can treat wastewater, desalinate saltwater, and produce green energy simultaneously. In this method, bio-electrochemical oxidation of wastewater organics generates power. Desalination and the creation of value-added by-products are expected because of this ionic movement
This research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (uncontaminated) data, while the second assumed data contamination. Simulation experiments were conducted for different sample sizes, different initial parameter values, and different levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution for both natural and contaminated data.
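As a hedged illustration of the Downhill Simplex (Nelder-Mead) estimation step, the sketch below fits parameters by minimizing a negative log-likelihood with SciPy. Since the compound exponential Weibull-Poisson density is not given in the abstract, an ordinary two-parameter Weibull likelihood stands in as a placeholder, and the data are synthetic.

```python
# Hedged sketch: Downhill Simplex (Nelder-Mead) maximum-likelihood fitting.
# The compound exponential Weibull-Poisson density is not given in the abstract,
# so a plain two-parameter Weibull negative log-likelihood is used as a stand-in.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

def neg_log_likelihood(params, data):
    shape, scale = params
    if shape <= 0 or scale <= 0:          # keep the simplex inside the valid region
        return np.inf
    return -np.sum(weibull_min.logpdf(data, c=shape, scale=scale))

rng = np.random.default_rng(0)
sample = weibull_min.rvs(c=1.5, scale=2.0, size=200, random_state=rng)  # synthetic "clean" data

result = minimize(neg_log_likelihood, x0=[1.0, 1.0], args=(sample,), method="Nelder-Mead")
shape_hat, scale_hat = result.x
print(f"estimated shape={shape_hat:.3f}, scale={scale_hat:.3f}")

# Reliability (survival) function at time t from the fitted parameters
t = 1.0
print("R(t) =", weibull_min.sf(t, c=shape_hat, scale=scale_hat))
```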
In this paper, two local search algorithms, the genetic algorithm and particle swarm optimization, are used to schedule a number of products (n jobs) on a single machine to minimize a multi-objective function comprising total completion time, total tardiness, total earliness, and total late work. A branch and bound (BAB) method is used to compare the results for n jobs ranging from 5 to 18. The results show that the two algorithms find optimal and near-optimal solutions in appropriate time.
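A minimal sketch of how the combined objective might be evaluated for one job sequence is shown below; the processing times and due dates are made-up values, and summing the four totals with equal weights is an assumption, not necessarily the weighting used in the paper.

```python
# Hedged sketch: evaluating the four-term objective for one job order on a single machine.
# Job data (processing times p, due dates d) are illustrative values only.

def schedule_cost(order, p, d):
    """Return the sum of total completion time, tardiness, earliness and late work."""
    t = 0
    total_C = total_T = total_E = total_V = 0
    for j in order:
        t += p[j]                               # completion time C_j of job j
        total_C += t
        total_T += max(0, t - d[j])             # tardiness T_j
        total_E += max(0, d[j] - t)             # earliness E_j
        total_V += min(p[j], max(0, t - d[j]))  # late work V_j = min(p_j, T_j)
    return total_C + total_T + total_E + total_V

p = [4, 2, 5, 3]          # processing times (illustrative)
d = [5, 3, 12, 7]         # due dates (illustrative)
print(schedule_cost([1, 0, 3, 2], p, d))
```

A metaheuristic such as the genetic algorithm or particle swarm optimization would then search over permutations of the jobs to minimize this cost.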
There is evidence that channel estimation in communication systems plays a crucial role in recovering the transmitted data. In recent years, there has been increasing interest in solving the problems of channel estimation and equalization, especially when the channel impulse response follows a fast time-varying Rician fading distribution, meaning that the channel impulse response changes rapidly. Optimal channel estimation and equalization are therefore required to recover the transmitted data. This paper compares the epsilon normalized least mean square (ε-NLMS) and recursive least squares (RLS) algorithms by evaluating their ability to track multiple fast time-varying Rician fading channels with different values of Doppler frequency
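A minimal sketch of the ε-NLMS tap update used for this kind of channel tracking is given below; the filter length, step size, regularization constant, and the synthetic static test channel are illustrative assumptions (a Rician fading simulator and the RLS counterpart are omitted).

```python
# Hedged sketch: epsilon-NLMS adaptation of a channel estimate from pilot symbols.
# Filter length, step size mu, regularizer eps and the test channel are illustrative.
import numpy as np

def epsilon_nlms(x, d, taps=4, mu=0.5, eps=1e-3):
    """Track channel taps w from input symbols x and received samples d."""
    w = np.zeros(taps, dtype=complex)
    for n in range(taps - 1, len(x)):
        u = x[n - taps + 1:n + 1][::-1]       # [x[n], x[n-1], ..., x[n-taps+1]]
        e = d[n] - w @ u                       # a-priori estimation error
        w = w + mu * e * np.conj(u) / (eps + np.real(u @ np.conj(u)))  # normalized update
    return w

rng = np.random.default_rng(1)
h = (rng.normal(size=4) + 1j * rng.normal(size=4)) / np.sqrt(2)   # static stand-in channel
x = rng.choice([-1.0, 1.0], size=500).astype(complex)             # BPSK pilot symbols
d = np.convolve(x, h)[:len(x)] + 0.01 * (rng.normal(size=len(x)) + 1j * rng.normal(size=len(x)))
print(np.round(epsilon_nlms(x, d), 3))   # estimated taps
print(np.round(h, 3))                    # true taps for comparison
```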
Immune-mediated hepatitis is a severe threat to human health, and no effective treatment is currently available. Therefore, new, safe, low-cost therapies are desperately required. Berbamine (BE), a natural substance obtained primarily from
The Tanuma and Zubair formations are known as the most problematic intervals in the Zubair Oilfield, causing wellbore instability due to possible shale-fluid interaction. This results in a vast loss of time dealing with various downhole problems (e.g., stuck pipe) and increases the overall well cost through their consequences (e.g., fishing and sidetracking). This paper aims to test shale samples with various laboratory tests for shale evaluation and drilling mud development. The physical properties of the shale are described using a stereomicroscope and its structures are observed with a scanning electron microscope. Shale reactivity and behavior are analyzed using cation exchange capacity testing, and the capillary suction test is
The evolution of the Internet of Things (IoT) has led to the connection of billions of heterogeneous physical devices to improve the quality of human life by collecting data from their environment. However, this creates a need for large storage and high computational capabilities to handle the resulting huge volumes of data. Cloud computing can be used to store big data. The data of IoT devices are transferred using two types of protocols: Message Queuing Telemetry Transport (MQTT) and Hypertext Transfer Protocol (HTTP). This paper aims to build a high-performance and more reliable system through efficient use of resources. Thus, load balancing in cloud computing is used to dynamically distribute the workload across nodes to avoid overloading any individual resource
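As a hedged illustration of dynamic workload distribution, the sketch below assigns incoming requests to the currently least-loaded node; the node names, request costs, and the least-loaded policy itself are assumptions for illustration, not the scheme proposed in the paper.

```python
# Hedged sketch: least-loaded dispatch of incoming IoT messages across cloud nodes.
# Node names, request costs, and the policy are illustrative assumptions.
import heapq

def dispatch(requests, nodes):
    """Assign each (request_id, cost) pair to the node with the smallest current load."""
    heap = [(0.0, name) for name in nodes]    # (current load, node name)
    heapq.heapify(heap)
    assignment = {}
    for req_id, cost in requests:
        load, node = heapq.heappop(heap)      # node with the least accumulated load
        assignment[req_id] = node
        heapq.heappush(heap, (load + cost, node))
    return assignment

reqs = [("mqtt-01", 2.0), ("http-01", 5.0), ("mqtt-02", 1.0), ("mqtt-03", 2.5)]
print(dispatch(reqs, ["node-a", "node-b", "node-c"]))
```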
Ahmed Dheyaa Al-Obaidi, Ali Tarik Abdulwahid, Mustafa Najah Al-Obaidi, Abeer Mundher Ali, eNeurologicalSci, 2023
This study examined the state of the groundwater before sulfur production in the Al-Mishraq-1 field and after production in the field stopped, by measuring the groundwater table in 44 wells in 2021 and comparing it with the groundwater table measured by the Polish company Centrozap in 1971. The groundwater table ranged between 187.71 m and 205.8 m in 1971, whereas in 2021 it ranged between 188.51 m and 196.55 m.
Maps of the groundwater movement and water table were created using these data. There was little change in the direction of groundwater flow; in both cases, the flow is from the west and northwest towards the east, with a slight slope toward the southeast and the Tigris River. As for the hydraulic properties, it w
Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance. Unfortunately, many applications have too little data to train DL frameworks. Manual labeling is usually needed to provide labeled data, which typically involves human annotators with extensive background knowledge. This annotation process is costly, time-consuming, and error-prone. Every DL framework generally requires a significant amount of labeled data to learn representations automatically. Ultimately, a larger amount of data yields a better DL model, and performance is also application-dependent. This issue is the main barrier for