PvcABCD is a cluster of genes found in Pseudomonas aeruginosa. This research was designed to examine the relationship between pvc gene expression and cupB, a gene that plays a crucial role in biofilm development, and rhlR, which regulates the expression of biofilm-related genes, and to investigate whether the pvc genes form one or two operons. These aims were achieved by employing the qRT-PCR technique to measure the expression of the genes of interest. Out of 25 clinical isolates, 21 were confirmed as P. aeruginosa. Among these, 18 (85.7%) were evaluated as biofilm producers: 10 (47.6%), 5 (23.8%), and 3 (14.2%) were classified as strong, moderate, and weak producers, respectively, while 3 (14.2%) were considered non-biofilm-forming isolates. pvcA and pvcB were shown to be over-expressed (>2-fold) in all biofilm-producing isolates, similar to what was observed for cupB and rhlR, while pvcC and pvcD were down-regulated (<0.5-fold) in these isolates. These findings imply that the pvc genes are organized into two operons, pvcAB and pvcCD, and that genes involved in biofilm formation are regulated by the pvcAB operon. This is the first study in Iraq to investigate these genes.
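The >2-fold and <0.5-fold cut-offs above are the conventional thresholds used with relative quantification of qRT-PCR data, typically computed by the 2^-ΔΔCt (Livak) method. A minimal sketch, assuming the standard Livak calculation (the function names and example Ct values are illustrative, not taken from the study):

```python
def fold_change(ct_target_sample, ct_ref_sample, ct_target_control, ct_ref_control):
    """Relative expression by the 2^-ddCt (Livak) method.

    Ct values: target and reference (housekeeping) gene, for the sample
    of interest and for the control/calibrator condition.
    """
    d_ct_sample = ct_target_sample - ct_ref_sample      # normalize sample to reference gene
    d_ct_control = ct_target_control - ct_ref_control   # normalize control to reference gene
    dd_ct = d_ct_sample - d_ct_control                  # sample relative to control
    return 2 ** (-dd_ct)

def classify(fc):
    """Apply the conventional fold-change cut-offs used in the abstract."""
    if fc > 2.0:
        return "over-expressed"
    if fc < 0.5:
        return "down-regulated"
    return "unchanged"

# Illustrative Ct values: a lower Ct in the sample means more transcript.
fc = fold_change(20.0, 18.0, 23.0, 18.0)  # ddCt = 2 - 5 = -3, so fc = 2**3 = 8
```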
Artificial intelligence algorithms have been used in many scientific fields in recent years. We suggest employing the flower pollination algorithm in the environmental field to find the best estimate of the semi-parametric regression function when there are measurement errors in both the explanatory variables and the dependent variable; such measurement errors, rather than exact measurements, appear frequently in fields such as chemistry, the biological sciences, medicine, and epidemiological studies. We estimate the regression function of the semi-parametric model by estimating its parametric and non-parametric components; the parametric component is estimated using instrumental-variables methods (Wald method, Bartlett’s method, and Durbin
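The Wald method mentioned above is a classical grouping-based instrumental-variables estimator for regression with measurement error: observations are split into two groups by the error-prone regressor, and the slope is taken from the group means. A minimal sketch under that textbook formulation (the data and function name are illustrative):

```python
import numpy as np

def wald_estimator(x, y):
    """Wald's grouping IV estimator.

    Split the data at the median of x, then estimate the slope from the
    difference of group means; the grouping acts as the instrument.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    order = np.argsort(x)
    n = len(x) // 2
    lo, hi = order[:n], order[-n:]          # lower and upper halves by x
    slope = (y[hi].mean() - y[lo].mean()) / (x[hi].mean() - x[lo].mean())
    intercept = y.mean() - slope * x.mean()
    return slope, intercept

# Illustrative noiseless data on the line y = 2x + 1.
slope, intercept = wald_estimator([1, 2, 3, 4], [3, 5, 7, 9])
```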
Recent studies have revealed conflicting results about the health effects of caffeine. These studies are inconsistent in terms of design, population, and source of consumed caffeine. In the current study, we aimed to evaluate the possible health effects of dietary caffeine intake among overweight and obese individuals.
In this cross-sectional study, 488 apparently healthy individuals with overweight or obesity participated. Dietary intake was assessed by a Food Frequency Questionnaire (FFQ) and
Most studies on deep beams have focused on reinforced concrete deep beams; only a few investigate the response of prestressed deep beams, and, to the best of our knowledge, no study has investigated the response of full-scale (T-section) prestressed deep beams with large web openings. An experimental and numerical study was conducted to investigate the shear strength of ordinary reinforced and partially prestressed full-scale (T-section) deep beams containing large web openings, in order to examine the effect of the presence of prestressing on deep beam response and to better understand the effects of prestressing location and opening depth to beam depth ratio on deep beam performance and b
The electrocardiogram (ECG) is the recording of the electrical potential of the heart versus time. The analysis of ECG signals has been widely used in cardiac pathology to detect heart disease. ECGs are non-stationary signals that are often contaminated by different types of noise from different sources. In this study, simulated noise models were proposed for power-line interference (PLI), electromyogram (EMG) noise, baseline wander (BW), white Gaussian noise (WGN), and composite noise. Various processing techniques have recently been proposed for suppressing noise and extracting the efficient morphology of an ECG signal. In this paper, the wavelet transform (WT) is applied to noisy ECG signals. The graphical user interface (GUI)
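Wavelet denoising of the kind described above decomposes the signal, shrinks the detail (high-frequency) coefficients, and reconstructs. A minimal sketch using a single-level Haar transform with soft thresholding (the abstract does not specify the wavelet family or thresholding rule, so this is an illustrative simplification, not the paper's pipeline):

```python
import numpy as np

def haar_denoise(signal, threshold):
    """One-level Haar wavelet denoising with soft-thresholded detail coefficients.

    Assumes an even-length signal. Sums/differences of adjacent samples give
    the approximation and detail bands; small details (noise) are shrunk to zero.
    """
    x = np.asarray(signal, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)   # low-frequency content
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)   # high-frequency content
    # Soft thresholding: shrink magnitudes toward zero by `threshold`.
    detail = np.sign(detail) * np.maximum(np.abs(detail) - threshold, 0.0)
    # Inverse Haar transform.
    out = np.empty_like(x)
    out[0::2] = (approx + detail) / np.sqrt(2)
    out[1::2] = (approx - detail) / np.sqrt(2)
    return out
```

In practice a library such as PyWavelets with a multi-level decomposition and a data-driven threshold (e.g. the universal threshold) would replace this hand-rolled transform.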
The prediction of time series for time-related phenomena, in particular with autoregressive integrated moving average (ARIMA) models, is one of the important topics in the theory of time series analysis in applied statistics. Its importance lies in the basic stages of analyzing the structure, or modeling, and the conditions that must be satisfied by the stochastic process. This paper deals with two prediction methods: the first is a special case of the autoregressive integrated moving average model, ARIMA(0,1,1), which is called the random walk model when the value of the parameter equals zero; the second is the exponentially weighted moving average (EWMA). They were implemented on data of the monthly traff
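The two methods above are closely related: the one-step forecast of an ARIMA(0,1,1) model is algebraically equivalent to exponential smoothing, and when the MA parameter is zero the forecast collapses to the last observation (the random walk forecast). A minimal EWMA sketch (the smoothing constant and data are illustrative):

```python
def ewma_forecast(series, alpha):
    """One-step-ahead forecast by exponentially weighted moving average.

    s_t = alpha * x_t + (1 - alpha) * s_{t-1}, initialized at the first value.
    With alpha = 1 this reduces to the random walk forecast (last observation).
    """
    s = series[0]
    for x in series[1:]:
        s = alpha * x + (1 - alpha) * s
    return s
```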
Ahmed Dheyaa Al-Obaidi, Ali Tarik Abdulwahid, Mustafa Najah Al-Obaidi, Abeer Mundher Ali, eNeurologicalSci, 2023
Image segmentation using bi-level thresholds works well for straightforward scenarios; however, dealing with complex images that contain multiple objects or colors presents considerable computational difficulties. Multi-level thresholding is crucial for these situations, but it also introduces a challenging optimization problem. This paper presents an improved Reptile Search Algorithm (RSA) that includes a Gbest operator to enhance its performance. The proposed method determines optimal threshold values for both grayscale and color images, utilizing entropy-based objective functions derived from the Otsu and Kapur techniques. Experiments were carried out on 16 benchmark images, which inclu
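Otsu-style multi-level thresholding, as used in the objective functions above, searches for the threshold set that maximizes the between-class variance of the image histogram. A minimal brute-force sketch for two thresholds on a small grayscale histogram (real optimizers like the RSA exist precisely because this exhaustive search explodes for many thresholds and 256 levels; the histogram here is illustrative):

```python
import itertools
import numpy as np

def otsu_multilevel(hist, n_thresholds=2):
    """Exhaustive search for thresholds maximizing Otsu's between-class variance.

    `hist` is the grayscale histogram; thresholds partition the levels into
    n_thresholds + 1 classes, and we score each partition by the weighted
    squared distance of class means from the global mean.
    """
    levels = len(hist)
    p = hist / hist.sum()                      # normalized histogram
    mu_total = (np.arange(levels) * p).sum()   # global mean level
    best_score, best_t = -1.0, None
    for t in itertools.combinations(range(1, levels), n_thresholds):
        bounds = (0,) + t + (levels,)
        score = 0.0
        for lo, hi in zip(bounds[:-1], bounds[1:]):
            w = p[lo:hi].sum()                 # class probability
            if w == 0:
                continue
            mu = (np.arange(lo, hi) * p[lo:hi]).sum() / w
            score += w * (mu - mu_total) ** 2  # between-class variance term
        if score > best_score:
            best_score, best_t = score, t
    return best_t

# Illustrative 10-level histogram with three well-separated clusters.
hist = np.zeros(10)
hist[[1, 2]] = 100   # dark cluster
hist[[5, 6]] = 100   # mid cluster
hist[[8, 9]] = 100   # bright cluster
t1, t2 = otsu_multilevel(hist, n_thresholds=2)
```

A metaheuristic such as the RSA replaces the `itertools.combinations` loop with a guided stochastic search over the same objective.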
In information security, fingerprint verification is one of the most common recent approaches for verifying human identity through a distinctive pattern. The verification process works by comparing a pair of fingerprint templates and identifying the similarity/matching between them. Several research studies have utilized different techniques for the matching process, such as fuzzy vault and image filtering approaches. Yet, these approaches still suffer from imprecise articulation of the biometrics’ interesting patterns. Deep learning architectures such as the Convolutional Neural Network (CNN) have been extensively used for image processing and object detection tasks and have shown an outstanding performance compare
Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance. Unfortunately, many applications have too little or inadequate data to train DL frameworks. Usually, manual labeling is needed to provide labeled data, which typically involves human annotators with a vast background of knowledge. This annotation process is costly, time-consuming, and error-prone. Every DL framework must be fed a significant amount of labeled data to automatically learn representations. Ultimately, a larger amount of data generates a better DL model, and its performance is also application-dependent. This issue is the main barrier for