The red pigment undecylprodigiosin produced by Streptomyces coelicolor A3(2) is a promising drug candidate owing to its antibacterial, antifungal, immunosuppressive, and anticancer activities. Cultures of S. coelicolor in liquid medium produce mainly the blue pigment actinorhodin and only low quantities of undecylprodigiosin, so from an industrial point of view a strategy to improve undecylprodigiosin production is needed. The present study provides evidence that cultivating S. coelicolor on solid substrate reverses this pattern of antibiotic production: undecylprodigiosin production increased significantly while actinorhodin was completely suppressed. Four different solid substrates (wheat bran, ground soybean, rice husk, and ground corn) were tested for their ability to support maximal undecylprodigiosin production in solid-state fermentation. Wheat bran showed the highest production of undecylprodigiosin, starting from the first day of incubation at a moisture level of 1:1 (weight:volume) and reaching a maximum of 16 mg per gram of dry substrate (mg/gds) on the fourth day. In addition, we report the exploitation of interspecies interactions to enhance undecylprodigiosin production by introducing live or dead cells of E. coli, Bacillus subtilis, and Saccharomyces cerevisiae, separately, into the Streptomyces coelicolor solid-substrate fermentation. Our results revealed a significant increase in undecylprodigiosin production in the elicited cultures compared with the control, with the maximum enhancement, a 2-fold increase over the control, occurring in the culture elicited with live cells of B. subtilis.
Machine learning offers significant advantages for many problems in the oil and gas industry, especially for resolving complex challenges in reservoir characterization. Permeability is one of the most difficult petrophysical parameters to predict using conventional logging techniques. This study presents the workflow methodology alongside comprehensive models. Its purpose is to provide a more robust technique for predicting permeability; previous studies on the Bazirgan field have attempted this, but their estimates were vague, and the methods they present are obsolete and fall short of a rigorous solution to the permeability computation. To …
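As a hedged illustration of the kind of workflow the abstract describes, the sketch below fits a tree-ensemble regressor to synthetic stand-ins for conventional log features; the model choice (random forest), the feature set, and all data are illustrative assumptions, since the abstract names none of them.

```python
# Hypothetical sketch: predicting permeability from conventional well-log
# features with a tree-ensemble regressor. The abstract does not name the
# model or the log suite; the four synthetic features (stand-ins for, e.g.,
# gamma ray, bulk density, neutron porosity, sonic) and the RandomForest
# choice are illustrative assumptions only.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n = 500
X = rng.normal(size=(n, 4))                      # synthetic log features
# Synthetic log-permeability target (placeholder for core measurements).
y = 0.8 * X[:, 2] - 0.5 * X[:, 1] + 0.1 * rng.normal(size=n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_tr, y_tr)
print("R^2 on held-out logs:", r2_score(y_te, model.predict(X_te)))
```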
Cryptography is the process of transforming a message to prevent unauthorized access to the data. One of the main problems, and an important element of secret-key cryptography, is the key: for a higher level of secure communication the key plays an essential role, and both parties must hold a copy of the secret key, which, unfortunately, is not easy to achieve. The Triple Data Encryption Standard (3DES) algorithm is weak owing to its weak key generation, so the key must be reconfigured to make the algorithm more secure, effective, and strong; an enhanced encryption key strengthens 3DES security. This paper proposes a combination of two efficient encryption algorithms …
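A minimal sketch of the key-strengthening idea, assuming a standard key-derivation function (PBKDF2) as a stand-in for the paper's unspecified key-enhancement scheme; it uses the pycryptodome package's DES3 cipher.

```python
# Hypothetical sketch: hardening a 3DES key by deriving it from a shared
# secret with PBKDF2 before encryption. The paper's actual key-enhancement
# scheme is not given in the abstract; PBKDF2 is a stand-in illustration.
# Requires the pycryptodome package.
from Crypto.Cipher import DES3
from Crypto.Protocol.KDF import PBKDF2
from Crypto.Random import get_random_bytes
from Crypto.Util.Padding import pad, unpad

secret = b"shared secret between the two parties"
salt = get_random_bytes(16)
# Derive 24 key bytes (three DES subkeys), then fix odd parity as DES3 expects.
raw = PBKDF2(secret, salt, dkLen=24, count=100_000)
key = DES3.adjust_key_parity(raw)

iv = get_random_bytes(8)                     # DES3 block size is 8 bytes
enc = DES3.new(key, DES3.MODE_CBC, iv)
ciphertext = enc.encrypt(pad(b"confidential message", DES3.block_size))

# Receiver side: the same secret and salt reproduce the derived key.
dec = DES3.new(key, DES3.MODE_CBC, iv)
assert unpad(dec.decrypt(ciphertext), DES3.block_size) == b"confidential message"
```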
In this research paper, a new blind and robust fingerprint-image watermarking scheme based on a combination of the dual-tree complex wavelet transform (DTCWT) and discrete cosine transform (DCT) domains is demonstrated. The major concern is to provide a solution that reduces the consequences of geometric attacks, since fingerprint features may be affected by the embedded watermark, and fingerprint rotations and displacements can result in multiple feature sets. To integrate the bits of the watermark sequence into a differential process, two DCT-transformed sub-vectors are used. The initial sub-vectors were obtained by sub-sampling both the real and imaginary parts of the DTCWT wavelet coefficients of the host fingerprint image …
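The differential embedding step can be illustrated with a hedged sketch: each watermark bit is written into the sign of the difference between paired DCT coefficients of two sub-vectors. The DTCWT stage is elided here, so the random vectors v1 and v2 merely stand in for its sub-sampled real and imaginary parts, and the strength parameter alpha is an assumption.

```python
# Hypothetical sketch of differential embedding: each watermark bit is
# encoded in the sign of the difference between paired DCT coefficients
# of two sub-vectors. The DTCWT stage described in the abstract is elided;
# v1/v2 stand in for its real/imaginary parts, and `alpha` is assumed.
import numpy as np
from scipy.fft import dct, idct

def embed(v1, v2, bits, alpha=2.0):
    """Force DCT-coefficient differences to encode the watermark bits."""
    c1, c2 = dct(v1, norm="ortho"), dct(v2, norm="ortho")
    for i, b in enumerate(bits):
        d = c1[i] - c2[i]
        want = alpha if b else -alpha        # bit 1 -> positive gap
        shift = (want - d) / 2.0
        c1[i] += shift
        c2[i] -= shift
    return idct(c1, norm="ortho"), idct(c2, norm="ortho")

def extract(v1, v2, n_bits):
    c1, c2 = dct(v1, norm="ortho"), dct(v2, norm="ortho")
    return [int(c1[i] - c2[i] > 0) for i in range(n_bits)]

rng = np.random.default_rng(1)
v1, v2 = rng.normal(size=64), rng.normal(size=64)
bits = [1, 0, 1, 1, 0, 0, 1, 0]
w1, w2 = embed(v1, v2, bits)
assert extract(w1, w2, len(bits)) == bits    # blind extraction recovers bits
```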
A database is an arrangement of data organized and distributed in a way that allows the client to access the stored data simply and conveniently. In the era of big data, however, traditional data-analytics methods may not be able to manage and process such large amounts of data. In order to develop an efficient way of handling big data, this work studies the use of the MapReduce technique to process big data distributed on the cloud. The approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r…
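A minimal sketch of the MapReduce pattern in the Hadoop Streaming style, assuming a hypothetical "channel TAB value" record layout for the EEG data (the paper's actual format is not given):

```python
# Hypothetical sketch of the MapReduce pattern: the mapper emits
# (channel, sample) pairs from EEG records and the reducer averages per
# channel. The "channel<TAB>value" layout is an assumed illustration.
from itertools import groupby

def mapper(lines):
    for line in lines:
        channel, value = line.rstrip("\n").split("\t")
        yield channel, float(value)

def reducer(pairs):
    # Pairs arrive grouped by key, as the Hadoop shuffle guarantees.
    for channel, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        values = [v for _, v in group]
        yield channel, sum(values) / len(values)

if __name__ == "__main__":
    records = ["Fp1\t4.2", "Fp1\t3.8", "Cz\t1.0", "Cz\t3.0"]
    for channel, mean in reducer(mapper(records)):
        print(f"{channel}\t{mean:.2f}")      # Cz 2.00, Fp1 4.00
```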
Human beings are greatly inspired by nature, which has the ability to solve very complex problems in its own distinctive way. The problems around us are becoming ever more complex, and at the same time nature guides us toward solving them, offering logical and effective ways to find solutions; it acts as an optimized source for solving complex problems. Decomposition is a basic strategy in traditional multi-objective optimization, but it has not yet been widely used in multi-objective evolutionary optimization. Although computational strategies for handling Multi-objective Optimization Problems (MOPs) h…
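For readers unfamiliar with decomposition, the sketch below shows the standard Tchebycheff approach, in which a multi-objective problem is split into scalar subproblems, one per weight vector (as popularized by MOEA/D); the toy objectives and weights are illustrative assumptions, not the paper's benchmarks.

```python
# Hypothetical sketch of decomposition via the Tchebycheff approach: a
# two-objective problem becomes a set of scalar subproblems, one per
# weight vector. The objectives and weights below are illustrative only.
import numpy as np

def objectives(x):
    # Simple convex bi-objective toy problem on x in [0, 1].
    return np.array([x ** 2, (x - 1.0) ** 2])

def tchebycheff(f, weights, z_star):
    # g(x | w, z*) = max_i w_i * |f_i(x) - z*_i|
    return np.max(weights * np.abs(f - z_star), axis=-1)

z_star = np.array([0.0, 0.0])                # ideal point
weights = [np.array([w, 1.0 - w]) for w in np.linspace(0.05, 0.95, 5)]

xs = np.linspace(0.0, 1.0, 1001)
fs = np.stack([objectives(x) for x in xs])
for w in weights:
    best = xs[np.argmin(tchebycheff(fs, w, z_star))]
    print(f"weight {w}: best x = {best:.3f}")   # sweeps the Pareto front
```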
This paper describes a newly modified wind-turbine ventilator that can achieve highly efficient ventilation. The modification of the conventional wind-turbine ventilator is achieved by adding a Savonius wind turbine above the conventional turbine to make it work more efficiently and help it spin faster. Three Savonius wind-turbine models with 2, 3, and 4 semicircular-arc blades are proposed for placement above the conventional wind-ventilator turbine to build a hybrid ventilation turbine. A prototype room model was constructed, and the hybrid turbine was placed at the top of the room's roof. Performance tests for the hybrid turbine with different numbers of blades and different values o…
This work implements a face recognition system based on two stages: a feature-extraction stage and a classification stage. The feature-extraction stage consists of Self-Organizing Maps (SOMs) in a hierarchical format in conjunction with Gabor filters and local image sampling; different types of SOMs were used, and a comparison between their results is given. The classification stage consists of a self-organizing-map neural network whose goal is to find the image most similar to the input image. The proposed algorithm was implemented using C++ packages, and the system is a successful classifier for a face database consisting of 20 …
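A minimal sketch of the core SOM update rule that underlies both stages; the actual system is hierarchical, uses Gabor-filtered local image samples, and was implemented in C++, so the random vectors and map size here are illustrative assumptions only.

```python
# Minimal sketch of SOM training: a best-matching unit is found for each
# sample and a Gaussian neighborhood of units is pulled toward it. Random
# vectors stand in for the paper's Gabor-filtered local image samples.
import numpy as np

rng = np.random.default_rng(0)
grid_h, grid_w, dim = 8, 8, 16
weights = rng.normal(size=(grid_h, grid_w, dim))
coords = np.stack(np.meshgrid(np.arange(grid_h), np.arange(grid_w),
                              indexing="ij"), axis=-1)

def best_matching_unit(x):
    d = np.linalg.norm(weights - x, axis=-1)
    return np.unravel_index(np.argmin(d), d.shape)

def train(samples, epochs=20, lr0=0.5, sigma0=3.0):
    global weights
    t, t_max = 0, epochs * len(samples)
    for _ in range(epochs):
        for x in samples:
            lr = lr0 * (1.0 - t / t_max)                 # decaying rate
            sigma = max(sigma0 * (1.0 - t / t_max), 0.5) # shrinking radius
            bmu = np.array(best_matching_unit(x))
            dist2 = np.sum((coords - bmu) ** 2, axis=-1)
            h = np.exp(-dist2 / (2.0 * sigma ** 2))[..., None]
            weights += lr * h * (x - weights)
            t += 1

samples = rng.normal(size=(100, dim))    # stand-ins for Gabor features
train(samples)
print("closest unit for first sample:", best_matching_unit(samples[0]))
```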
The paired-sample t-test is a classical statistical test used to test the difference between two means in paired data, but it is not robust against violation of the normality assumption. In this paper, alternative robust tests are suggested by combining jackknife resampling with the Wilcoxon signed-rank test for small sample sizes and with the Wilcoxon signed-rank test under the normal approximation for large sample sizes. Monte Carlo simulation experiments were employed to study the performance of these tests in terms of their type I error rates and power. All these tests were applied to different sa…
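A hedged sketch of one way jackknife resampling can be combined with the Wilcoxon signed-rank test on paired data, using scipy.stats.wilcoxon; the paper's exact combination rule is not given in the abstract, so summarizing the leave-one-out statistics by their mean is an illustrative assumption.

```python
# Hypothetical sketch: jackknife (leave-one-out) resampling combined with
# the Wilcoxon signed-rank test on paired differences. Summarizing the
# leave-one-out statistics by their mean is an illustrative assumption.
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(42)
before = rng.normal(10.0, 2.0, size=15)
after = before + rng.normal(0.5, 1.0, size=15)   # paired measurements
diffs = after - before

stats = []
for i in range(len(diffs)):
    loo = np.delete(diffs, i)                    # leave observation i out
    stats.append(wilcoxon(loo).statistic)

print("jackknife mean of Wilcoxon statistics:", np.mean(stats))
print("full-sample Wilcoxon p-value:", wilcoxon(diffs).pvalue)
```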
In this paper, we employ three techniques to reduce the computational complexity and bit rate of compressed images: bit-plane coding based on two absolute values, a vector quantization (VQ) technique using a cache codebook, and a Weber's-law condition. The experimental results show that the proposed techniques reduce the storage size of the bit planes with low computational complexity.
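A minimal sketch of the bit-plane decomposition these techniques operate on, assuming an 8-bit image block; the paper's refinements (the two representative absolute values, the cache-codebook VQ, and the Weber's-law condition) are not reproduced here.

```python
# Minimal sketch of bit-plane decomposition for an 8-bit image block:
# plane k holds bit k of every pixel, with plane 7 the most significant.
# The paper's two-absolute-value coding and cache-codebook VQ are elided.
import numpy as np

rng = np.random.default_rng(7)
block = rng.integers(0, 256, size=(8, 8), dtype=np.uint8)

planes = [(block >> k) & 1 for k in range(8)]

# Perfect reconstruction from the planes confirms nothing was lost.
rebuilt = sum((p.astype(np.uint8) << k) for k, p in enumerate(planes))
assert np.array_equal(rebuilt, block)
print("ones per plane (LSB..MSB):", [int(p.sum()) for p in planes])
```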