Gas chromatography (GC) is a quick and accurate analytical method for detecting volatile components such as ethanol. A headspace gas chromatography (HS-GC) method, combined with an internal standard method (ISM), was developed to determine ethanol in fermented broth in the laboratory. The aim of this research is to determine the concentration of ethanol in fermented broth using a capillary column (ZB-1). The method can analyze ethanol concentrations in the fermented medium broth ranging from 10 to 200 g/L. The method was validated to confirm its precision and significance: precision, expressed as the relative standard deviation (RSD), was less than 5%; accuracy was within 4%; and the significance level was p ≤ 0.05. The method exhibited good reproducibility.
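The internal standard calculation and the RSD precision criterion described above can be sketched in a few lines. This is a minimal illustrative sketch, not the paper's procedure: the peak areas, internal standard concentration, and response factor below are hypothetical values chosen for demonstration.

```python
import statistics

def conc_internal_standard(area_analyte, area_istd, conc_istd, rf):
    """Ethanol concentration (g/L) via the internal standard method.

    rf is the response factor obtained from calibration:
    (A_analyte / A_istd) / (C_analyte / C_istd).
    """
    return (area_analyte / area_istd) * conc_istd / rf

# Hypothetical peak areas from three replicate HS-GC injections
replicates = [conc_internal_standard(a, i, 50.0, 1.1)
              for a, i in [(2.10, 1.00), (2.05, 0.98), (2.12, 1.01)]]

mean_c = statistics.mean(replicates)
rsd = 100 * statistics.stdev(replicates) / mean_c  # relative standard deviation, %
print(f"mean = {mean_c:.1f} g/L, RSD = {rsd:.2f}%")
```

Precision is then judged by checking that the RSD across replicates stays below the 5% acceptance limit quoted in the abstract.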
In recent years, literary studies have witnessed a remarkable shift toward employing digital technologies, particularly artificial intelligence tools, in analyzing literary texts and exploring their linguistic and semantic structures. This trend has given researchers new possibilities for understanding texts quantitatively and qualitatively, in ways that go beyond traditional methods based solely on critical reading. The current research aims to introduce professors and students of Arabic to artificial intelligence tools that contribute to the analysis of literary texts, focusing on their mechanisms for studying style, meaning, structure, and emotion. It also seeks to highlight the most prominent challenges facing researchers…
In this paper, the problem of resource allocation at Al-Raji Company for soft drinks and juices was studied. The company carries out several types of tasks to produce juices and soft drinks, and these tasks require machines: it has 6 machines to be allocated to 4 different tasks. The machines assigned to each task are subject to failure, and failed machines are repaired so that they can rejoin the production process. From the company's past records, the probability of machine failure at each task was calculated, and the time required for each machine to complete each task was recorded. The aim of this paper is to determine the minimum expected time…
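The allocation problem described above can be sketched as a small assignment model. This is an illustrative sketch only: the times and failure probabilities below are hypothetical, not the company's data, and the expected-time model (a failed machine is repaired and retries, so the expected number of attempts is 1/(1-p)) is one plausible assumption, not necessarily the paper's.

```python
from itertools import permutations

# Hypothetical data: time[i][j] = hours machine i needs for task j,
# p_fail[i][j] = probability machine i fails while performing task j.
time = [[4, 7, 3, 6],
        [5, 4, 6, 5],
        [6, 5, 4, 7],
        [4, 6, 5, 4],
        [7, 5, 6, 3],
        [5, 6, 4, 5]]
p_fail = [[0.10, 0.20, 0.05, 0.15],
          [0.12, 0.08, 0.20, 0.10],
          [0.15, 0.10, 0.08, 0.25],
          [0.05, 0.18, 0.12, 0.06],
          [0.20, 0.10, 0.15, 0.04],
          [0.10, 0.12, 0.06, 0.14]]

def expected(i, j):
    # Geometric-retry assumption: expected attempts = 1/(1-p),
    # so expected time for (machine i, task j) is t / (1 - p).
    return time[i][j] / (1 - p_fail[i][j])

# Brute force over all ways to assign 4 of the 6 machines to the 4 tasks
# (small enough here: 6*5*4*3 = 360 candidate assignments).
best = min(permutations(range(6), 4),
           key=lambda perm: sum(expected(m, j) for j, m in enumerate(perm)))
best_cost = sum(expected(m, j) for j, m in enumerate(best))
print("machines per task:", best, "expected total time:", round(best_cost, 2))
```

For larger instances the brute-force search would be replaced by the Hungarian algorithm, but the objective (minimizing total expected time under failure) is the same.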
Survival analysis is widely applied to data describing the lifetime of an item until the occurrence of an event of interest, such as death or another event under study. The purpose of this paper is to use a dynamic approach within the deep learning neural network method: a dynamic neural network suited to the nature of discrete survival data with time-varying effects. This neural network is trained with the Levenberg-Marquardt (L-M) algorithm, and the method is called the Proposed Dynamic Artificial Neural Network (PDANN). A comparison was then made with another method that relies entirely on Bayesian methodology, called the Maximum A Posteriori (MAP) method. This method was carried out using numerical algorithms…
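The key modeling idea above is that discrete survival data reduce to one binary outcome per person-period, with a hazard that may vary over time. The following is a minimal sketch of that discrete-time hazard likelihood, not the paper's PDANN: the logistic hazard, the per-period weights, and the data are all hypothetical, and training (e.g. by Levenberg-Marquardt) is omitted.

```python
import math

def hazard(w, b, t, x):
    # A simple logistic discrete-time hazard with a time-varying effect:
    # the covariate weight w[t] depends on the period t.
    z = w[t] * x + b
    return 1 / (1 + math.exp(-z))

def log_likelihood(w, b, data):
    # data: tuples (x, T, event) where T is the observed discrete period
    # (0-indexed) and event = 1 for failure, 0 for censoring at T.
    ll = 0.0
    for x, T, event in data:
        for t in range(T):
            ll += math.log(1 - hazard(w, b, t, x))   # survived period t
        if event:
            ll += math.log(hazard(w, b, T, x))       # event occurred at T
    return ll

w = [0.5, 0.3, 0.1]            # one weight per period: a time-varying effect
data = [(1.0, 2, 1), (0.5, 1, 0)]  # hypothetical subjects
print(log_likelihood(w, -1.0, data))
```

A network-based method would replace the linear term `w[t] * x` with a learned nonlinear function of the covariates and time; the likelihood being maximized has the same per-period structure.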
Abstract
The use of modern scientific methods and techniques is considered among the important topics for solving many of the problems facing various sectors, including the industrial, service, and health sectors. Researchers always seek modern methods characterized by accuracy, clarity, and speed in reaching the optimal solution, while remaining easy to understand and apply.
This research presents a comparison between two solution methods for linear fractional programming models: the Charnes & Cooper linear transformation and the denominator function restriction method, applied to the oil heaters and gas cookers plant…
The 3-parameter Weibull distribution is used as a failure model, since this distribution is appropriate when the failure rate is somewhat high at the start of operation and decreases with increasing time.
On the practical side, a comparison was made between shrinkage and maximum likelihood estimators for the parameters and the reliability function using simulation. We conclude that the shrinkage estimators for the parameters are better than the maximum likelihood estimators, but the maximum likelihood estimator of the reliability function is better, based on the statistical measures MAPE and MSE for different sample sizes.
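The shrinkage-versus-MLE comparison above can be illustrated with a small Monte Carlo sketch. This is a simplified, hypothetical setup, not the paper's study: the shape is taken as known so the scale MLE has a closed form, the shrinkage estimator pulls the MLE toward a prior guess with a fixed weight, and all parameter values are invented for demonstration.

```python
import random

random.seed(1)

c, lam_true, lam0 = 1.5, 2.0, 2.2   # known shape, true scale, prior guess
k = 0.3                              # shrinkage weight toward the prior
n, reps = 10, 2000                   # small sample size, Monte Carlo replications

def mle_scale(sample):
    # MLE of the Weibull scale when the shape c is known:
    # lambda_hat = (mean of x^c) ** (1/c)
    return (sum(x**c for x in sample) / len(sample)) ** (1 / c)

mse_mle = mse_shr = 0.0
for _ in range(reps):
    sample = [random.weibullvariate(lam_true, c) for _ in range(n)]
    m = mle_scale(sample)
    s = k * lam0 + (1 - k) * m      # shrinkage estimator: blend of prior and MLE
    mse_mle += (m - lam_true) ** 2
    mse_shr += (s - lam_true) ** 2

mse_mle /= reps
mse_shr /= reps
print(f"MSE(MLE) = {mse_mle:.4f}   MSE(shrinkage) = {mse_shr:.4f}")
```

When the prior guess is close to the true value, the shrinkage estimator trades a small bias for a large variance reduction, which is the mechanism behind the abstract's finding that shrinkage estimators of the parameters outperform the MLE in small samples.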
Note: ns = small sample; nm = medium sample.
The charge density distributions of the 10B nucleus are calculated using harmonic oscillator wave functions. Elastic and inelastic longitudinal electron scattering form factors have been calculated for the same-parity states of 10B, where a 4He core is assumed and the remaining particles are distributed over the 1p3/2 and 1p1/2 orbits, which form the model space. Core-polarization effects are taken into account, calculated using the Tassie model, and give good agreement with the measured data.