Sphingolipids are key components of eukaryotic membranes, particularly the plasma membrane. The biosynthetic pathway for these lipid species is largely conserved. However, in contrast to mammals, which produce sphingomyelin, organisms such as the pathogenic fungi and protozoa synthesize inositol phosphorylceramide (IPC) as the primary phosphosphingolipid. The key step is the reaction of ceramide and phosphatidylinositol, catalysed by IPC synthase, an essential enzyme with no mammalian equivalent that is encoded by the AUR1 gene in yeast and by recently identified functional orthologues in the pathogenic kinetoplastid protozoa. As such, this enzyme represents a promising target for novel anti-fungal and anti-protozoal drugs. Given the paucity of effective treatments for kinetoplastid diseases such as leishmaniasis, there is a need to characterize the protozoan enzyme. To this end, a fluorescence-based cell-free assay protocol in a 96-well plate format has been established for the Leishmania major IPC synthase. Using this system, the kinetic parameters of the enzyme have been determined: it obeys the double-displacement (ping-pong) model with an apparent Vmax = 2.31 pmol min^-1 U^-1. Furthermore, inhibitory substrate analogues have been identified. Importantly, this assay is amenable to development for high-throughput screening for lead inhibitors, and as such may prove a pivotal tool in drug discovery.
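As an illustration of the double-displacement (ping-pong bi-bi) kinetics the abstract reports, the sketch below fits the standard ping-pong rate law to synthetic initial-rate data. All substrate concentrations and the Ka/Kb values are invented for the demo; only the Vmax of 2.31 comes from the abstract, used here as the assumed true value.

```python
# Illustrative only: fitting the ping-pong bi-bi rate law
#   v = Vmax*A*B / (Kb*A + Ka*B + A*B)
# to synthetic initial-rate data; not the paper's assay data.
import numpy as np
from scipy.optimize import curve_fit

def ping_pong(X, vmax, ka, kb):
    """Ping-pong bi-bi rate law for substrates A and B."""
    a, b = X
    return vmax * a * b / (kb * a + ka * b + a * b)

# Synthetic 4x4 substrate-concentration grid (arbitrary units)
a = np.tile([5.0, 10.0, 20.0, 40.0], 4)
b = np.repeat([5.0, 10.0, 20.0, 40.0], 4)
true_params = (2.31, 8.0, 12.0)        # assumed Vmax, Ka, Kb for the demo
rng = np.random.default_rng(0)
v = ping_pong((a, b), *true_params) * (1 + 0.02 * rng.standard_normal(a.size))

popt, _ = curve_fit(ping_pong, (a, b), v,
                    p0=(1.0, 5.0, 5.0), bounds=(0, np.inf))
print(popt)  # recovered (Vmax, Ka, Kb), close to the assumed values
```

A Hanes or Lineweaver-Burk replot of such data gives the parallel-line pattern diagnostic of a ping-pong mechanism.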
This paper presents a comparison of denoising techniques based on a statistical approach, principal component analysis with local pixel grouping (PCA-LPG), in which the procedure is iterated a second time to further improve denoising performance, against several enhancement filters. These include an adaptive Wiener low-pass filter applied to a grayscale image degraded by constant-power additive noise, based on statistics estimated from a local neighbourhood of each pixel; a median filter, where each output pixel contains the median value in the M-by-N neighbourhood around the corresponding pixel of the noisy input image; a Gaussian low-pass filter; and an order-statistic filter. Experimental results show that the LPG-PCA method …
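The baseline filters in this comparison can be sketched with standard SciPy calls; PCA-LPG itself is more involved and is not shown. The test image and noise level below are synthetic, for illustration only.

```python
# Hedged sketch of the comparison's baseline filters (adaptive Wiener,
# median, Gaussian) on a synthetic image with additive Gaussian noise.
import numpy as np
from scipy.ndimage import median_filter, gaussian_filter
from scipy.signal import wiener

rng = np.random.default_rng(0)
clean = np.outer(np.linspace(0, 1, 64), np.linspace(0, 1, 64))
noisy = clean + 0.1 * rng.standard_normal(clean.shape)  # MSE vs clean = 0.01

den = {
    "wiener": wiener(noisy, mysize=5),            # local-statistics adaptive filter
    "median": median_filter(noisy, size=(3, 3)),  # M-by-N neighbourhood median
    "gaussian": gaussian_filter(noisy, sigma=1.0),
}
for name, img in den.items():
    mse = float(np.mean((img - clean) ** 2))
    print(name, round(mse, 4))  # each filter should reduce the noise MSE
```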
Internet image retrieval is an interesting task that requires effort from both image processing and relationship-structure analysis. In this paper, a compression method is proposed for sending more than one photo via the internet, based on image retrieval. First, face detection is implemented based on local binary patterns. The background is identified by matching global self-similarities and is compared with the backgrounds of the remaining images. The proposed algorithm bridges the gap between present image-indexing technology, developed in the pixel domain, and the fact that an increasing number of images stored on computers are already compressed by JPEG at the source. The similar images are found, and only a few images are sent instead …
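The local binary pattern operator used for face detection here can be sketched in a few lines. This is a minimal 8-neighbour, 3x3 LBP, assuming a grayscale array; real LBP face detectors histogram these codes over image blocks.

```python
# Minimal sketch of the 3x3 local binary pattern (LBP): each interior
# pixel gets an 8-bit code, one bit per neighbour >= centre.
import numpy as np

def lbp(img):
    """Return the 8-neighbour LBP code for each interior pixel."""
    c = img[1:-1, 1:-1]
    # Neighbour offsets clockwise from top-left; each sets one bit.
    shifts = [(0, 0), (0, 1), (0, 2), (1, 2),
              (2, 2), (2, 1), (2, 0), (1, 0)]
    code = np.zeros_like(c, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(shifts):
        nb = img[dy:dy + c.shape[0], dx:dx + c.shape[1]]
        code |= (nb >= c).astype(np.uint8) << bit
    return code

img = np.array([[10, 20, 30],
                [40, 50, 60],
                [70, 80, 90]], dtype=np.uint8)
print(lbp(img))  # a single interior pixel yields one 8-bit code
```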
In this paper, a design for a broadband thin metamaterial absorber (MMA) is presented. Compared with previously reported metamaterial absorbers, the proposed structure provides a wide bandwidth with a compact overall size. The designed absorber consists of a combination of an octagonal disk and a split octagonal resonator, providing a wide bandwidth over the Ku- and K-band frequency range. Inexpensive FR-4 material is chosen as the substrate of the proposed absorber, with a 1.6 mm thickness and a 6.5×6.5 mm² overall unit-cell size. CST Studio Suite was used for the simulation of the proposed absorber. The proposed absorber provides a wide absorption bandwidth of 14.4 GHz over the frequency range 12.8-27.5 GHz with more than 90% absorption …
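The 90% absorption criterion quoted above is conventionally evaluated from simulated S-parameters as A(f) = 1 - |S11|² - |S21|², with S21 ≈ 0 for a metal-backed absorber. The reflection coefficients below are invented for illustration, not taken from the paper's CST results.

```python
# Hedged sketch: absorptance from S-parameters for a grounded absorber,
#   A = 1 - |S11|^2 - |S21|^2   (S21 ~ 0 with a full ground plane).
import numpy as np

def absorptance(s11, s21=0.0):
    return 1.0 - np.abs(s11) ** 2 - np.abs(s21) ** 2

s11 = np.array([0.9, 0.3, 0.1, 0.25])  # hypothetical reflection coefficients
a = absorptance(s11)
print(a >= 0.9)  # which sample points meet the 90% absorption criterion
```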
Luminescent sensor membranes and sensor microplates are presented for continuous or high-throughput wide-range measurement of pH based on a europium probe.
Cyber-attacks keep growing, and stronger ways to protect images are needed. This paper presents DGEN, a Dynamic Generative Encryption Network that combines generative adversarial networks with a context-adaptive key system. Unlike a fixed scheme such as AES, the method can potentially adjust itself as new threats appear, aiming to resist brute-force, statistical, and quantum attacks. The design introduces randomness, uses learning, and derives keys that depend on each individual image, which should give strong security and flexibility while keeping computational cost low. Tests were run on several public image sets, and the results show DGEN outperforming AES, chaos-based schemes, and other GAN-based approaches. Entropy reached 7.99 bits per pixel …
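The entropy figure quoted for DGEN is the standard per-pixel Shannon entropy of the cipher image, which approaches 8 bits/pixel for an ideal 8-bit cipher. The sketch below computes it on uniformly random data standing in for a real cipher image.

```python
# Sketch of the Shannon-entropy metric for an 8-bit cipher image;
# 'img' here is random data, not DGEN output.
import numpy as np

def shannon_entropy(img):
    """Entropy in bits per pixel of an 8-bit image."""
    counts = np.bincount(img.ravel(), minlength=256)
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)
e = shannon_entropy(img)
print(round(e, 2))  # close to the ideal 8 bits/pixel
```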
The study aims to identify the impact of competency-based training, in its dimensions (skills, cognitive abilities, and attitudes), on improving employee performance (achievement, strategic thinking, and problem solving) in Jordanian university hospitals.
The study is based on the analytical descriptive method. The study population consisted of the Jordanian university hospitals, with the University Hospital of Jordan and the King Abdullah Hospital taken as applied case studies. The study sample consists of all upper- and middle-level administrative employees of these hospitals; a questionnaire was distributed to all of them, and 182 questionnaires were valid for analysis.
Research on the automated extraction of essential data from electrocardiography (ECG) recordings has been a significant topic for a long time. The main focus of digital processing is to locate the fiducial points that determine the beginning and end of the P, QRS, and T waves based on their waveform properties. The presence of unavoidable noise during ECG data collection, together with inherent physiological differences among individuals, makes it challenging to identify these reference points accurately, resulting in suboptimal performance. This is done through several primary stages that rely on preliminary processing of the ECG electrical signal through a set of steps (preparing raw data and converting it into files that …
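One step implied by such a pipeline, locating R-peaks as anchors for the QRS fiducial points, can be sketched with a simple prominence threshold. The waveform, sampling rate, and thresholds below are all synthetic assumptions, not the paper's method.

```python
# Hedged sketch: R-peak detection on a synthetic ECG-like signal using
# a prominence threshold and a refractory-period constraint.
import numpy as np
from scipy.signal import find_peaks

fs = 250                                   # assumed sampling rate, Hz
t = np.arange(0, 4, 1 / fs)
beats = np.arange(0.5, 4, 1.0)             # one beat per second
# Narrow Gaussian bumps stand in for QRS complexes, plus noise.
ecg = sum(np.exp(-((t - b) ** 2) / (2 * 0.01 ** 2)) for b in beats)
rng = np.random.default_rng(1)
ecg += 0.05 * rng.standard_normal(t.size)

peaks, _ = find_peaks(ecg, prominence=0.5, distance=int(0.4 * fs))
print(len(peaks), np.round(t[peaks], 2))  # 4 beats near 0.5, 1.5, 2.5, 3.5 s
```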
The research aims to determine the effectiveness of a training program based on multiple-intelligence theory in developing literary thinking among students of the Arabic Language Department at the Ibn Rushd School of Humanities. To achieve the goal of the research, the research population was defined as the Arabic-language students of the Faculty of Education, third class of the Arabic Language Department. The research sample consists of (71) students, divided into (35) students in the experimental group and (36) students in the control group. The researcher balanced the two groups on the variables (intelligence, a pre-test of literary thinking, and chronological age in months), and after using the t-test for two independent samples, the …
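The independent-samples t-test used to compare the two groups can be sketched as follows. The score distributions are hypothetical; only the group sizes (35 and 36) follow the abstract.

```python
# Illustrative independent-samples t-test on synthetic post-test scores;
# the study's real data are not reproduced here.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(42)
experimental = rng.normal(78, 8, size=35)  # hypothetical score distributions
control = rng.normal(70, 8, size=36)

t_stat, p_value = ttest_ind(experimental, control)
print(t_stat > 0, p_value < 0.05)  # expect a significant positive difference
```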