Sphingolipids are key components of eukaryotic membranes, particularly the plasma membrane. The biosynthetic pathway for the formation of these lipid species is largely conserved. However, in contrast to mammals, which produce sphingomyelin, organisms such as the pathogenic fungi and protozoa synthesize inositol phosphorylceramide (IPC) as the primary phosphosphingolipid. The key step involves the reaction of ceramide and phosphatidylinositol, catalysed by IPC synthase, an essential enzyme with no mammalian equivalent that is encoded by the AUR1 gene in yeast and by recently identified functional orthologues in the pathogenic kinetoplastid protozoa. As such, this enzyme represents a promising target for novel anti-fungal and anti-protozoal drugs. Given the paucity of effective treatments for kinetoplastid diseases such as leishmaniasis, there is a need to characterize the protozoan enzyme. To this end, a fluorescence-based cell-free assay protocol in a 96-well plate format has been established for the Leishmania major IPC synthase. Using this system, the kinetic parameters of the enzyme have been determined; it obeys the double displacement (ping-pong) model with an apparent Vmax of 2.31 pmol min⁻¹ U⁻¹. Furthermore, inhibitory substrate analogues have been identified. Importantly, this assay is amenable to development for high-throughput screening of lead inhibitors and as such may prove to be a pivotal tool in drug discovery.
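For context, the double displacement (ping-pong bi-bi) mechanism referred to above is typically fitted with a rate equation of the following general form; this is a standard textbook expression, not one reproduced from the study, and the Michaelis constants for the two substrates are not reported in the abstract:

```latex
% Ping-pong bi-bi rate law (general form)
% A = ceramide, B = phosphatidylinositol
v = \frac{V_{\max}\,[A]\,[B]}{K_{m}^{B}\,[A] + K_{m}^{A}\,[B] + [A]\,[B]}
```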
This research deals with the qualitative and quantitative interpretation of Bouguer gravity anomaly data for a region located to the SW of Qa’im City within Anbar province, using 2D mapping methods. The residual gravity field was obtained graphically by subtracting the regional gravity values from the values of the total Bouguer anomaly. The residual field was then processed to reduce noise by applying the gradient operator and first directional-derivative filtering. This was helpful in locating sudden variations in gravity values, which may be produced by subsurface faults, fractures, cavities, or the limits of lateral facies variations in the subsurface. A major fault was predicted to extend in the direction NE-…
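A minimal sketch of the residual-and-gradient workflow described above, assuming gridded Bouguer and regional anomaly arrays; the file names, grid spacing, and threshold are hypothetical placeholders, not values from the study:

```python
import numpy as np

# Hypothetical gridded anomaly data (mGal) on a regular grid; dx, dy in metres.
bouguer = np.loadtxt("bouguer_grid.txt")    # placeholder file name
regional = np.loadtxt("regional_grid.txt")  # placeholder file name
dx = dy = 500.0

# Residual field: total Bouguer anomaly minus the regional trend.
residual = bouguer - regional

# First directional derivatives and total horizontal gradient of the residual.
d_dy, d_dx = np.gradient(residual, dy, dx)
horizontal_gradient = np.hypot(d_dx, d_dy)

# Cells with an unusually steep gradient flag possible faults, fractures,
# cavities, or lateral facies changes.
threshold = horizontal_gradient.mean() + 2 * horizontal_gradient.std()
anomalous = np.argwhere(horizontal_gradient > threshold)
print(f"{len(anomalous)} grid cells exceed the gradient threshold")
```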
The use of composite materials has vastly increased in recent years. There is therefore great interest in damage detection in composites using non-destructive test methods. Several approaches have been applied to obtain information about the existence and location of faults. This paper uses the vibration response of a composite plate to detect and localize delamination defects based on modal analysis. Experiments are conducted to validate the developed model. A two-dimensional finite element model for multi-layered composites with internal delamination is established, and FEM programs are built for plates under different boundary conditions. Natural frequencies and modal displacements of the intact and damaged …
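As a hedged illustration of the frequency-based comparison implied above (the actual FEM model, plate properties, and measured frequencies are not given in this excerpt; the numbers below are placeholders), a simple damage indicator can be formed from the relative shift in natural frequencies between the intact and delaminated plate:

```python
import numpy as np

# Placeholder natural frequencies (Hz) for the first few modes of the plate.
freq_intact = np.array([112.4, 245.8, 310.2, 478.9])   # hypothetical values
freq_damaged = np.array([110.9, 239.1, 309.8, 470.2])  # hypothetical values

# Relative frequency drop per mode; larger drops suggest modes whose strain
# energy is concentrated near the delamination.
rel_shift = (freq_intact - freq_damaged) / freq_intact

for mode, shift in enumerate(rel_shift, start=1):
    print(f"mode {mode}: frequency drop {shift * 100:.2f} %")
```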
Pattern matching algorithms are usually used as the detection process in intrusion detection systems. The efficiency of these algorithms affects the performance of the intrusion detection system, which calls for further investigation in this field. Four matching algorithms, and a combination of two algorithms, for an intrusion detection system based on a new DNA encoding are applied and evaluated. These algorithms are the brute-force algorithm, the Boyer-Moore algorithm, the Horspool algorithm, the Knuth-Morris-Pratt algorithm, and the combination of the Boyer-Moore and Knuth-Morris-Pratt algorithms. The performance of the proposed approach is calculated based on execution time, where these algorithms are applied o…
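A rough sketch of how such a timing comparison might be set up, with a naive (brute-force) matcher against Python's built-in `str.find` as a fast baseline; the Boyer-Moore, Horspool, and KMP implementations from the paper are not reproduced here, and the DNA encoding of network traffic is replaced by a random A/C/G/T string:

```python
import random
import time

def brute_force(text: str, pattern: str) -> int:
    """Count occurrences of pattern in text by checking every alignment."""
    count = 0
    for i in range(len(text) - len(pattern) + 1):
        if text[i:i + len(pattern)] == pattern:
            count += 1
    return count

def builtin_find(text: str, pattern: str) -> int:
    """Count occurrences using str.find as a faster reference matcher."""
    count, pos = 0, text.find(pattern)
    while pos != -1:
        count += 1
        pos = text.find(pattern, pos + 1)
    return count

# Stand-in for DNA-encoded traffic: a random nucleotide string and a signature.
random.seed(0)
traffic = "".join(random.choice("ACGT") for _ in range(200_000))
signature = "ACGTACGTAC"

for matcher in (brute_force, builtin_find):
    start = time.perf_counter()
    hits = matcher(traffic, signature)
    elapsed = time.perf_counter() - start
    print(f"{matcher.__name__}: {hits} hits in {elapsed * 1000:.1f} ms")
```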
Intrusion detection systems detect attacks within computers and networks, where attacks must be detected quickly and with a high detection rate. Various proposed methods have achieved high detection rates, either by improving the algorithm or by hybridizing it with another algorithm. However, they suffer in terms of processing time, especially after the algorithm has been improved and when dealing with large volumes of traffic data. On the other hand, DNA sequence detection approaches have previously been applied successfully to intrusion detection systems; the achieved detection rates were very low, although the processing time was fast. Also, feature selection is used to reduce computation and complexity, leading to a speed-up of the system…
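As a hedged illustration of the feature-selection step mentioned above (the actual selection criterion and traffic features used are not stated in this excerpt), one simple option is to rank features by a class-separation score and keep only the top few:

```python
import numpy as np

def fisher_score(X: np.ndarray, y: np.ndarray) -> np.ndarray:
    """Per-feature Fisher score: between-class spread over within-class spread."""
    classes = np.unique(y)
    overall_mean = X.mean(axis=0)
    between = np.zeros(X.shape[1])
    within = np.zeros(X.shape[1])
    for c in classes:
        Xc = X[y == c]
        between += len(Xc) * (Xc.mean(axis=0) - overall_mean) ** 2
        within += len(Xc) * Xc.var(axis=0)
    return between / (within + 1e-12)

# Hypothetical traffic-feature matrix (rows: connections, columns: features).
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))
y = rng.integers(0, 2, size=1000)   # 0 = normal, 1 = attack (placeholder labels)

scores = fisher_score(X, y)
top_features = np.argsort(scores)[::-1][:5]   # keep the 5 highest-scoring features
X_reduced = X[:, top_features]
print("selected feature indices:", top_features)
```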
Speech is the earliest form of communication, used by humans ages before the invention of writing. In this paper, a method is proposed for speech analysis that extracts features using the multiwavelet transform (Repeated Row Preprocessing). The proposed system depends on the Euclidean distances between the coefficients of the multiwavelet transform to determine the best features for speech recognition. Each sample value in the reference file is computed by taking the average of four samples of the same data (four speakers for the same phoneme). The input data are compared with every frame value in the reference file using the Euclidean distance, and the frame with the minimum distance is said to be the "best match". Simulatio…
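A minimal sketch of the minimum-distance matching step described above, assuming the multiwavelet coefficients have already been reduced to fixed-length feature vectors; the reference templates here are random placeholders, not real averaged phoneme data:

```python
import numpy as np

def best_match(frame: np.ndarray, reference: np.ndarray) -> int:
    """Return the index of the reference template closest in Euclidean distance."""
    distances = np.linalg.norm(reference - frame, axis=1)
    return int(np.argmin(distances))

# Hypothetical reference file: one averaged feature vector per phoneme,
# each the mean of four speakers' multiwavelet coefficients.
rng = np.random.default_rng(1)
reference_templates = rng.normal(size=(40, 64))   # 40 phonemes x 64 coefficients
input_frame = rng.normal(size=64)                 # one frame of the input speech

idx = best_match(input_frame, reference_templates)
print(f"best match: reference frame {idx}")
```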
A Bayesian formulation of the ridge regression problem is considered, which derives from a direct specification of prior information about the parameters of the general linear regression model when the data suffer from a high degree of multicollinearity. A new approach for deriving the conventional estimator of the ridge parameter proposed by Hoerl and Kennard (1970), as well as a Bayesian estimator, is presented. A numerical example is studied in order to compare the performance of these estimators.
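A brief sketch of the conventional Hoerl-Kennard (1970) ridge estimate referred to above, with the ridge parameter k = p·σ̂² / (β̂'β̂) computed from the ordinary least-squares fit; the simulated multicollinear data below are a stand-in, not the paper's numerical example:

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated multicollinear design: the second column nearly duplicates the first.
n, p = 50, 3
x1 = rng.normal(size=n)
X = np.column_stack([x1, x1 + 0.01 * rng.normal(size=n), rng.normal(size=n)])
beta_true = np.array([2.0, -1.0, 0.5])
y = X @ beta_true + rng.normal(scale=0.5, size=n)

# Ordinary least squares and the residual variance estimate.
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
sigma2 = np.sum((y - X @ beta_ols) ** 2) / (n - p)

# Hoerl-Kennard ridge parameter and the corresponding ridge estimate.
k = p * sigma2 / (beta_ols @ beta_ols)
beta_ridge = np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

print("k =", k)
print("OLS:  ", beta_ols)
print("Ridge:", beta_ridge)
```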
Medicine is one of the fields in which the advancement of computer science is making significant progress. Some diseases require an immediate diagnosis in order to improve patient outcomes. The use of computers in medicine improves precision and accelerates data processing and diagnosis. In this research, hybrid machine learning, a combination of various deep learning approaches, was utilized to categorize biological images, and a meta-heuristic algorithm was provided. In addition, two different medical datasets were introduced: one covering magnetic resonance imaging (MRI) of brain tumors and the other dealing with chest X-rays (CXRs) of COVID-19. These datasets were introduced to the combination network that contained deep lea…
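The combination is described only at a high level in this excerpt; as a purely illustrative stand-in, a meta-heuristic can be pictured as a search over candidate model configurations scored by validation accuracy. The search space and the `evaluate` function below are hypothetical placeholders, not the paper's networks or datasets:

```python
import random

# Hypothetical search space: backbone network and classifier hyper-parameters.
SEARCH_SPACE = {
    "backbone": ["cnn_a", "cnn_b", "cnn_c"],
    "learning_rate": [1e-2, 1e-3, 1e-4],
    "batch_size": [16, 32, 64],
}

def evaluate(config: dict) -> float:
    """Placeholder fitness: would normally train/validate the hybrid model."""
    random.seed(hash(tuple(sorted(config.items()))) % (2**32))
    return random.uniform(0.7, 0.99)

def random_search(iterations: int = 20) -> tuple[dict, float]:
    """A very simple meta-heuristic loop: sample, score, keep the best."""
    best_cfg, best_score = None, -1.0
    for _ in range(iterations):
        cfg = {key: random.choice(values) for key, values in SEARCH_SPACE.items()}
        score = evaluate(cfg)
        if score > best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score

cfg, score = random_search()
print("best configuration:", cfg, "score:", round(score, 3))
```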
As cities across the world grow and the mobility of populations increases, there has been a corresponding increase in the number of vehicles on the roads. This has created a proliferation of challenges for authorities with regard to road traffic management, resulting in traffic congestion, more accidents, and pollution. Accidents are still a major cause of death, despite the development of sophisticated traffic-management systems and other vehicle-related technologies. Hence, a common system for accident management needs to be developed. For instance, traffic congestion in most urban areas can be alleviated by the real-time planning of routes. However, the designing of an efficie…
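As a hedged illustration of the real-time route-planning idea mentioned above (the road network and congestion factors below are invented placeholders), a shortest-path search over congestion-adjusted travel times is the usual starting point:

```python
import heapq

# Hypothetical road graph: node -> list of (neighbour, base travel time in minutes).
ROADS = {
    "A": [("B", 4), ("C", 2)],
    "B": [("D", 5)],
    "C": [("B", 1), ("D", 8)],
    "D": [],
}
# Hypothetical live congestion factors per road segment (1.0 = free-flowing).
CONGESTION = {("A", "B"): 2.0, ("C", "B"): 1.2}

def fastest_route(start: str, goal: str) -> tuple[float, list[str]]:
    """Dijkstra's algorithm over congestion-adjusted travel times."""
    queue = [(0.0, start, [start])]
    seen = set()
    while queue:
        time, node, path = heapq.heappop(queue)
        if node == goal:
            return time, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, base in ROADS[node]:
            cost = base * CONGESTION.get((node, nxt), 1.0)
            heapq.heappush(queue, (time + cost, nxt, path + [nxt]))
    return float("inf"), []

print(fastest_route("A", "D"))   # avoids the congested A-B segment
```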