This study's objective is to assess how well UV spectrophotometry, in conjunction with multivariate calibration based on partial least squares (PLS) regression, can be used for the concurrent quantitative analysis of an antibacterial mixture of levofloxacin (LEV), metronidazole (MET), rifampicin (RIF), and sulfamethoxazole (SUL) in artificial mixtures and pharmaceutical formulations. The experimental calibration and validation matrices were created using 42 and 39 samples, respectively. The concentration range considered was 0–17 μg/mL for all components. Absorbance measurements of the calibration standards were made between 210 and 350 nm at 0.2 nm intervals. The associated parameters were examined in order to develop the optimal calibration model, and cross-validation was used to determine the ideal number of components. The calibration model was evaluated using the coefficient of determination (R²) and the root mean square error of calibration (RMSEC). The relation between the actual and predicted values of LEV, MET, RIF, and SUL had a coefficient of determination higher than 0.997, showing very good accuracy of the devised approach. The obtained RMSEC values, 0.1811 (LEV), 0.1804 (MET), 0.1428 (RIF), and 0.1716 (SUL), indicate an analytical procedure with adequate precision. The suggested technique for quantitative analysis of the quaternary mixture of LEV, MET, RIF, and SUL has been applied successfully to different pharmaceutical preparations. UV spectrophotometry assisted by chemometric PLS can thus be used, without prior separation, to resolve multicomponent mixtures successfully.
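As a rough illustration of the PLS workflow described above (synthetic spectra standing in for the measured data; the noise level and the 1–10 component search range are assumptions, not the study's settings), scikit-learn's PLSRegression with leave-one-out cross-validation might look like this:

```python
# Minimal sketch of PLS calibration as described above (hypothetical data;
# the study's actual spectra and preprocessing are not reproduced here).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(0)
wavelengths = np.arange(210.0, 350.0, 0.2)          # 210-350 nm at 0.2 nm steps
Y = rng.uniform(0, 17, size=(42, 4))                # 42 standards x 4 analytes, 0-17 ug/mL
pure = rng.random((4, wavelengths.size))            # stand-in pure-component spectra
X = Y @ pure + rng.normal(0, 0.01, (42, wavelengths.size))  # Beer-Lambert mixing + noise

# Choose the number of latent variables by leave-one-out cross-validation.
rmsecv = []
for n in range(1, 11):
    y_cv = cross_val_predict(PLSRegression(n_components=n), X, Y, cv=42)
    rmsecv.append(np.sqrt(mean_squared_error(Y, y_cv)))
best = int(np.argmin(rmsecv)) + 1

# Refit on all standards and report per-analyte RMSEC and R².
model = PLSRegression(n_components=best).fit(X, Y)
Y_fit = model.predict(X)
for i, name in enumerate(["LEV", "MET", "RIF", "SUL"]):
    rmsec = np.sqrt(mean_squared_error(Y[:, i], Y_fit[:, i]))
    print(f"{name}: RMSEC={rmsec:.4f}, R2={r2_score(Y[:, i], Y_fit[:, i]):.4f}")
```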
Some nonlinear differential equations of fractional order are solved using a novel approach, the Sumudu Transform Adomian Decomposition Method (STADM). The Sumudu transform and an iterative decomposition technique are employed to obtain the solutions of the given model. The suggested method has an advantage over alternative strategies in that it does not require additional resources or calculations. The approach is effective, easy to use, and yields good results. The solution graphs are plotted using MATLAB. The exact solution of the fractional Newell-Whitehead equation is also shown together with the approximate STADM solutions. The results show that our approach is a reliable and straightforward way to deal with such problems.
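For context, the Sumudu transform, its action on the Caputo fractional derivative, and a common form of the fractional Newell-Whitehead equation are stated below; these are standard textbook definitions, not expressions quoted from the paper itself:

```latex
% Sumudu transform of f(t):
S[f](u) = \int_0^{\infty} f(ut)\, e^{-t}\, dt
% Action on the Caputo derivative of order \alpha, with m-1 < \alpha \le m:
S\!\left[D_t^{\alpha} f\right](u)
  = u^{-\alpha} S[f](u) - \sum_{k=0}^{m-1} u^{k-\alpha} f^{(k)}(0)
% A common form of the time-fractional Newell-Whitehead equation:
D_t^{\alpha} u = u_{xx} + a\,u - b\,u^{3}, \qquad 0 < \alpha \le 1
```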
In the field of data security, the critical challenge of preserving sensitive information during its transmission over public channels takes centre stage. Steganography, a method of concealing data within carrier objects such as text, can be proposed to address these security challenges. Owing to its extensive usage and constrained bandwidth, text stands out as an optimal medium for this purpose. Despite the richness of the Arabic language in linguistic features, only a small number of studies have explored Arabic text steganography. Arabic text, characterized by its distinctive script and linguistic features, has gained notable attention as a promising domain for steganographic work. Arabic text steganography harnesses …
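The abstract is truncated before the paper's technique is described, so purely as a generic illustration of Arabic text steganography (not the paper's method), kashida (tatweel) insertion hides bits by optionally elongating letters:

```python
# Generic illustration of kashida (tatweel) Arabic text steganography.
# This is NOT the paper's method (the abstract is truncated); it only shows
# the general idea: hide bits by optionally inserting U+0640 after letters
# that accept elongation.
KASHIDA = "\u0640"
EXTENDABLE = set("بتثسشصضطظعغفقكلمنهي")  # simplified subset of extendable letters

def embed(cover: str, bits: str) -> str:
    out, i = [], 0
    for ch in cover:
        out.append(ch)
        if i < len(bits) and ch in EXTENDABLE:
            if bits[i] == "1":
                out.append(KASHIDA)    # '1' -> insert elongation
            i += 1                     # '0' -> leave the letter as-is
    if i < len(bits):
        raise ValueError("cover text too short for payload")
    return "".join(out)

def extract(stego: str) -> str:
    bits, chars = [], list(stego)
    for j, ch in enumerate(chars):
        if ch in EXTENDABLE:
            nxt = chars[j + 1] if j + 1 < len(chars) else ""
            bits.append("1" if nxt == KASHIDA else "0")
    return "".join(bits)

msg_bits = "1011"
stego = embed("بسم الله الرحمن الرحيم", msg_bits)
print(extract(stego)[: len(msg_bits)])  # -> 1011
```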
Deep learning convolutional neural networks have been widely used to recognize and classify voice. Various techniques have been used together with convolutional neural networks to prepare voice data before the training process when developing a classification model. However, not every model produces good classification accuracy, as there are many types of voice and speech. Classification of Arabic alphabet pronunciation is one such type, and accurate pronunciation is required when learning to read the Qur'an. Thus, processing the pronunciation data and training on the processed data require a specific approach. To overcome this issue, a method based on padding and a deep learning convolutional neural network is proposed to …
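Since the abstract is cut off before the method details, the sketch below only illustrates the general padding-plus-CNN idea with Keras; the class count, feature type (MFCC), frame limit, and architecture are all assumptions, not the paper's model:

```python
# Minimal sketch of the padding + CNN idea (hypothetical shapes; the paper's
# actual features and architecture are not given in the abstract).
import numpy as np
import tensorflow as tf

N_CLASSES = 28          # Arabic alphabet letters (assumption)
MAX_FRAMES = 100        # fixed time length after padding (assumption)
N_MFCC = 13             # number of MFCC coefficients per frame (assumption)

def pad_features(mfcc: np.ndarray) -> np.ndarray:
    """Zero-pad (or truncate) an (n_frames, N_MFCC) matrix to MAX_FRAMES rows."""
    out = np.zeros((MAX_FRAMES, N_MFCC), dtype=np.float32)
    n = min(len(mfcc), MAX_FRAMES)
    out[:n] = mfcc[:n]
    return out             # add a channel axis before feeding: out[..., None]

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(MAX_FRAMES, N_MFCC, 1)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(N_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```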
Web testing is an important practice for users and developers because it makes it possible to detect errors in applications and check their quality, covering performance, user interface, security, and the other types of testing a web application may require. This paper focuses on a major branch of performance testing called load testing. Load testing depends on two key measurements, request time and response time; from these, it can be decided whether the performance of a web application is acceptable. In the experiments, load testing was applied to the website (http://ihcoedu.uobaghdad.edu.iq), covering the main home page and all the science department pages. …
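A minimal load-test sketch in Python, measuring per-request response times against the site named above; the number of simulated users and the use of the requests library are assumptions, not the paper's actual test setup:

```python
# Illustrative load test: fire concurrent requests and time each response.
import time
import concurrent.futures
import requests

URL = "http://ihcoedu.uobaghdad.edu.iq"   # site named in the abstract
N_USERS = 20                              # simulated concurrent users (assumption)

def timed_get(url: str) -> float:
    """Return the elapsed time of one GET request in seconds."""
    start = time.perf_counter()
    requests.get(url, timeout=30)
    return time.perf_counter() - start

with concurrent.futures.ThreadPoolExecutor(max_workers=N_USERS) as pool:
    times = list(pool.map(timed_get, [URL] * N_USERS))

print(f"min={min(times):.3f}s  max={max(times):.3f}s  "
      f"mean={sum(times)/len(times):.3f}s")
```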
In this paper, we provide a proposed method for estimating missing values of the explanatory variables in a nonparametric multiple regression model and compare it with arithmetic-mean imputation. The idea behind the method is to exploit the causal relationship between the variables to obtain an efficient estimate of the missing value. We rely on the kernel estimate given by the Nadaraya–Watson estimator, use least squares cross-validation (LSCV) to select the bandwidth, and run a simulation study to compare the two methods.
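A compact sketch of the two ingredients named above, the Nadaraya–Watson estimator and LSCV bandwidth selection, applied to imputing one missing covariate value (the data-generating process and bandwidth grid here are invented for illustration):

```python
# Nadaraya-Watson imputation with LSCV bandwidth selection (illustrative;
# the paper's simulation design is not reproduced).
import numpy as np

def nw_estimate(x0, x, y, h):
    """Nadaraya-Watson estimator with a Gaussian kernel."""
    w = np.exp(-0.5 * ((x0 - x) / h) ** 2)
    return np.sum(w * y) / np.sum(w)

def lscv_bandwidth(x, y, grid):
    """Pick h minimizing the leave-one-out least-squares CV criterion."""
    best_h, best_cv = None, np.inf
    for h in grid:
        cv = 0.0
        for i in range(len(x)):
            mask = np.arange(len(x)) != i
            cv += (y[i] - nw_estimate(x[i], x[mask], y[mask], h)) ** 2
        if cv < best_cv:
            best_h, best_cv = h, cv
    return best_h

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 100)                 # fully observed covariate
z = np.sin(x) + rng.normal(0, 0.2, 100)     # covariate with a missing entry
z[10] = np.nan                              # the value to impute

obs = ~np.isnan(z)
h = lscv_bandwidth(x[obs], z[obs], np.linspace(0.1, 2.0, 20))
z_hat = nw_estimate(x[10], x[obs], z[obs], h)
print(f"h={h:.2f}, imputed={z_hat:.3f}, truth={np.sin(x[10]):.3f}")
```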
In this study, the mean free path and positron elastic and inelastic scattering are modeled for the elements hydrogen (H), carbon (C), nitrogen (N), oxygen (O), phosphorus (P), sulfur (S), chlorine (Cl), potassium (K), and iodine (I). Despite the enormous amounts of data required, the Monte Carlo (MC) method was applied, allowing a very accurate simulation of positron interaction collisions in living cells. The MC simulation of positron interactions with breast, liver, and thyroid tissue at normal incidence angles is reported for energies ranging from 45 eV to 0.2 MeV. The model provides a straightforward analytic formula for the random sampling of positron scattering. The elemental composition data were compiled from ICRU Report 44. In this …
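For reference, the standard Monte Carlo relation for sampling a free path from the mean free path is shown below; this is the generic textbook form, not necessarily the exact analytic formula the study derives:

```latex
% Mean free path from atomic number density N and total cross section \sigma_t:
\lambda = \frac{1}{N\,\sigma_t}
% Random sampling of the path length s from a uniform deviate \xi \in (0,1):
s = -\lambda\,\ln(1 - \xi)
```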
Today in the digital realm, images constitute a massive share of social media content but unfortunately suffer from two issues, size and transmission, for which compression is the ideal solution. Pixel-based techniques are among the modern spatially optimized modeling techniques, with deterministic and probabilistic bases that involve a mean, an index, and a residual. This paper introduces adaptive pixel-based coding techniques for the probabilistic part of a lossy scheme by incorporating the MMSA of the C321 base, while the deterministic part is utilized losslessly. The tested results achieved higher size-reduction performance than traditional pixel-based techniques and standard JPEG, by about 40% and 50% respectively, …
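Because the MMSA/C321 details are not given in the abstract, the sketch below only illustrates the generic mean/residual pixel-base decomposition it refers to; the block size and data are invented:

```python
# Generic illustration of the mean/residual pixel-base decomposition mentioned
# above (NOT the paper's MMSA/C321 scheme, whose details are not in the abstract).
import numpy as np

def encode_block(block: np.ndarray):
    """Split a pixel block into a deterministic mean and a residual part."""
    mean = int(round(block.mean()))           # deterministic part, kept lossless
    residual = block.astype(np.int16) - mean  # probabilistic part, to be modeled
    return mean, residual

def decode_block(mean: int, residual: np.ndarray) -> np.ndarray:
    return (residual + mean).astype(np.uint8)

block = np.random.default_rng(2).integers(0, 256, (4, 4), dtype=np.uint8)
m, r = encode_block(block)
assert np.array_equal(decode_block(m, r), block)
print("mean:", m, "residual range:", r.min(), r.max())
```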