The development of a meaningful dissolution procedure for drug products with limited water solubility has been a challenge to both the pharmaceutical industry and the agencies that regulate it. Natural surfactants aid in the dissolution and subsequent absorption of drugs with limited aqueous solubility. In vitro, various techniques have been used to achieve adequate dissolution of sparingly water-soluble or water-insoluble drug products, such as mechanical methods (e.g., increased agitation or the disintegration method), hydroalcoholic media, or large volumes of medium. The necessity of assuring the quality of drugs, especially those with low aqueous solubility and in vivo absorption, has led to the development and evaluation of new techniques that can reduce the time and cost of analysis. This study examines the efficiency and accuracy of an automated dissolution system with a simple, integrated design for the analysis of generic drugs. Sodium Selenite 200 µg tablets were chosen as the model drug for this study, and a comparison was made with a conventional analysis based on flameless atomic absorption spectrophotometry (AAS). The analytical system under study gave reproducible and accurate results at low instrumentation cost, providing elemental drug analysis to a standard at least as good as that achieved using AAS.
In this research, a method for estimating the velocity of a moving airplane from its recorded digital sound is introduced. The sound file is sliced into several frames using overlapping partitions, and each frame is then transformed from the time domain to the frequency domain using the Fourier Transform (FT). To determine the characteristic frequency of the sound, a moving-window mechanism is used, with the window size made linearly proportional to the value of the tracked frequency. This proportionality follows from the linear relationship between a frequency and its Doppler shift. An algorithm was introduced to select the characteristic frequencies; it retains the frequencies that satisfy the Doppler relation, and besides that the tra…
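The core of the approach above can be illustrated with a minimal Python sketch: take the FFT peak of a windowed frame, then convert the approach/recede frequencies to a source speed via the classical Doppler relation. All numbers here are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Doppler relation for a moving source heard by a stationary observer:
#   f_approach = f0 * c / (c - v),  f_recede = f0 * c / (c + v)
#   =>  v = c * (f_approach - f_recede) / (f_approach + f_recede)
C_SOUND = 343.0  # speed of sound in air (m/s), assumed ~20 degC

def peak_freq(frame, fs):
    """Dominant frequency (Hz) of one Hann-windowed frame via the FFT."""
    spec = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    return np.fft.rfftfreq(len(frame), 1.0 / fs)[np.argmax(spec)]

def speed_from_doppler(f_approach, f_recede):
    """Source speed (m/s) from the shifted frequencies before and after the pass."""
    return C_SOUND * (f_approach - f_recede) / (f_approach + f_recede)
```

For example, a 1000 Hz engine tone heard at about 1100 Hz on approach and 917 Hz on recession corresponds to roughly 31 m/s.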
This study presents a control system for regulating the electron lens resistance in order to obtain stabilized electron lens power. It lays out the fundamental challenges, the theoretical design arrangements, and the development conditions for the Integrable Optics Test Accelerator (IOTA) in progress at Fermilab. An effective automatic gain control (AGC) unit is introduced that prevents fluctuations in the internal resistance of the electron lens, caused by environmental influences, from affecting the system's current and power values, keeping them at stable levels. Utilizing this unit yields a well-balanced system unaffected by the environmental variations around the electron lens.
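The stabilization idea can be sketched as a simple feedback loop: re-adjust the drive voltage V so that the lens power P = V²/R stays at a setpoint while the resistance R drifts. This is a hedged illustration with made-up numbers, not the IOTA hardware controller.

```python
# Hypothetical AGC sketch: hold P = V^2 / R at a setpoint as R drifts.
def agc_step(v, r, p_setpoint):
    """One Newton-style correction of the drive voltage V toward P_setpoint."""
    p = v * v / r            # measured power at the current voltage
    dp_dv = 2.0 * v / r      # local sensitivity of power to voltage
    return v + (p_setpoint - p) / dp_dv

v, p_set = 10.0, 40.0
for r in (2.0, 2.2, 1.9, 2.5):   # environmental drift of the lens resistance (ohms)
    for _ in range(8):            # the loop settles after each drift step
        v = agc_step(v, r, p_set)
    # after settling, v * v / r is back at p_set despite the change in r
```

The damping and step rule are illustrative; a real AGC unit would act on measured current with hardware-appropriate gains.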
In this paper, a suggested formula, as well as a conventional method, for estimating the two parameters (shape and scale) of the Generalized Rayleigh Distribution is proposed. A percentile estimator was used for different sample sizes (small, medium, and large) and for several assumed contrasts of the two parameters. Mean Square Error (MSE) was implemented as the performance indicator, and comparisons between the suggested formula and the studied formula were carried out through data analysis and computer simulation according to this indicator. The results showed that the suggested method, performed here for the first time (as far as we know), had a clear advantage over t…
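A minimal Monte Carlo sketch of the evaluation setup described above: draw Generalized Rayleigh samples by inverse-CDF sampling, apply a simple percentile estimator, and score it by Mean Square Error. The specific estimator shown (median matching with the shape parameter assumed known) and all parameter values are illustrative assumptions, not the paper's suggested formula.

```python
import numpy as np

rng = np.random.default_rng(0)

def grd_sample(alpha, lam, n):
    """Inverse-CDF sampling from the Generalized Rayleigh Distribution,
    F(x) = (1 - exp(-(lam*x)**2))**alpha, with shape alpha and scale lam."""
    u = rng.random(n)
    return np.sqrt(-np.log(1.0 - u ** (1.0 / alpha))) / lam

def lam_percentile_estimator(sample, alpha, p=0.5):
    """Illustrative percentile estimator of the scale: match the sample
    p-quantile to the theoretical one (alpha assumed known here)."""
    q_unit = np.sqrt(-np.log(1.0 - p ** (1.0 / alpha)))  # p-quantile when lam = 1
    return q_unit / np.quantile(sample, p)

# Monte Carlo MSE for alpha = 2, lam = 1, n = 200, over 200 replications
alpha, lam, n, reps = 2.0, 1.0, 200, 200
est = [lam_percentile_estimator(grd_sample(alpha, lam, n), alpha) for _ in range(reps)]
mse = np.mean((np.asarray(est) - lam) ** 2)
```

Competing estimators would be compared by computing this MSE for each formula under the same simulated samples.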
This investigation proposed an offline signature identification system utilizing rotation compensation based on features saved in a database. The proposed system contains five principal stages: (1) data acquisition, (2) signature data file loading, (3) signature preprocessing, (4) feature extraction, and (5) feature matching. Feature extraction includes determining the center-point coordinates and the rotation-compensation angle (θ), applying the rotation compensation, and determining the discriminating features and statistical conditions. In this work, seven essential collections of features are utilized to acquire the characteristics: (i) density (D), (ii) average (A), (iii) s…
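One common way to realize the rotation-compensation step is principal-axis alignment: translate the signature pixels to their centroid, estimate the dominant direction from the covariance eigenvectors, and rotate so that this axis lies along x. This is a hedged sketch of that standard technique, not necessarily the paper's exact procedure for computing θ.

```python
import numpy as np

def rotation_compensate(points):
    """Normalize (x, y) signature pixel coordinates: shift to the centroid,
    then rotate so the principal axis lies along the x-axis.
    Returns the compensated points and the estimated angle theta (radians)."""
    pts = np.asarray(points, dtype=float)
    centered = pts - pts.mean(axis=0)            # center point = centroid
    cov = np.cov(centered.T)                     # 2x2 covariance of the cloud
    eigvals, eigvecs = np.linalg.eigh(cov)
    major = eigvecs[:, np.argmax(eigvals)]       # principal direction
    theta = np.arctan2(major[1], major[0])       # rotation-compensation angle
    c, s = np.cos(-theta), np.sin(-theta)
    rot = np.array([[c, -s], [s, c]])
    return centered @ rot.T, theta
```

After this normalization, features such as density and averages can be extracted from a pose-independent version of the signature.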
In this paper, an experimental study of the temperature distribution in a space conditioned with a Ventilated Hollow Core Slab (TermoDeck) system is presented. The experiments were carried out on a model room with dimensions of 1 m × 1.2 m × 1 m, built according to a suitable scale factor of 1/4. The temperature distribution was measured by 59 thermocouples fixed at several locations in the test room. Two cases were considered in this work: the first during an unoccupied period at night (without external load), and the other during the day with an external load of 800 W/m² according to solar heat gain calculations for the summer season in Iraq. All results support the use of the TermoDeck system for ventilation and cooling/heat…
Longitudinal data is becoming increasingly common, especially in the medical and economic fields, and various methods have been developed to analyze this type of data.
In this research, the focus was on grouping and analyzing these data: cluster analysis plays an important role in identifying and grouping co-expressed profiles over time, which are then employed in a nonparametric smoothing cubic B-spline model. This model provides continuous first and second derivatives, resulting in a smoother curve with fewer abrupt changes in slope; it is also more flexible and can capture more complex patterns and fluctuations in the data.
The longitudinal balanced data profile was compiled into subgroups…
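The smoothing step described above can be sketched with a cubic smoothing spline, which has exactly the continuity properties the abstract mentions (continuous first and second derivatives). The data, smoothing factor, and use of SciPy's `UnivariateSpline` are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(1)

# Illustrative longitudinal profile: a smooth trend observed with noise
t = np.linspace(0.0, 10.0, 50)
y = np.sin(t) + rng.normal(scale=0.2, size=t.size)

# k=3 gives a cubic B-spline basis; s trades fidelity against smoothness
# (here s ~ n * sigma^2, a common starting point)
spline = UnivariateSpline(t, y, k=3, s=2.0)

y_hat = spline(t)                 # smoothed profile
d1 = spline.derivative(1)(t)      # continuous first derivative
d2 = spline.derivative(2)(t)      # continuous second derivative
```

In a clustering pipeline, each subgroup's profiles would be smoothed this way before (or after) grouping co-expressed trajectories.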
Abstract
The human mind has known philosophy and logic since ancient times, and history thereafter, while the concept of semiotics appeared in the modern era and became a new field of knowledge alongside the others. In its different concepts and references, it deals with the processes that lead to and reveal meaning, through what is hidden as well as what is disclosed. It is the result of human activity in its pragmatic and cognitive dimensions together. The concept of the semiotic token became a knowledge key for accessing all fields of study, research, and investigation, owing to its capacity for description, explanation, and deconstruction. The paper is divided into two sections preceded by a the…