The aim of the current study was to develop a nanostructured double-layer delivery system for hydrophobic molecules. The developed double layer consisted of a polyethylene glycol (PEG)-based polymeric coating followed by a gelatin sub-coating of the core containing the hydrophobic molecules and sodium citrate. The polymeric composition ratio of PEG and the amount of gelatin in the sub-coating were optimized using a two-level fractional method. The nanoparticles were characterized using AFM and FT-IR techniques. The size of the nanocapsules was in the range of 39-76 nm, depending on the drug-loading concentration. The drug was effectively loaded into the PEG-gelatin nanoparticles (≈47%). The release characteristics of the hydrophobic molecules, in terms of controlled-release duration and dissolution efficiency, were examined in various dissolution media, namely physiological pH (7.4) and simulated stomach fluid (pH 3.4). The optimized double-layer system showed a gradual release of the hydrophobic molecules in the simulated stomach fluid and at physiological pH, indicating its potential as a platform for hydrophobic-molecule delivery.
Correlation equations expressing the boiling temperature as a direct function of liquid composition have been tested successfully and applied to predict the azeotropic behavior of multicomponent mixtures and the type of azeotrope (minimum, maximum, or saddle) using a modified correlation based on the Gibbs-Konovalov theorem. The binary and ternary azeotropic points have also been determined experimentally by graphical determination from the experimental binary and ternary vapor-liquid equilibrium data.
In this study, isobaric vapor-liquid equilibrium for two ternary systems: “1-Propanol – Hexane – Benzene” and its binaries “1-Propanol –
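As a rough illustration of the correlation-based approach described above, the sketch below fits a polynomial T(x1) to hypothetical isobaric binary T-x data and applies the Gibbs-Konovalov condition dT/dx1 = 0 to locate an azeotrope and classify it from the curvature; the composition and temperature values are invented placeholders, not the measured VLE data from the study.

```python
import numpy as np

# Hypothetical isobaric T-x1 data for a minimum-boiling binary
# (x1 = liquid mole fraction of the lighter component, T in deg C).
x1 = np.array([0.0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0])
T = np.array([80.1, 75.4, 72.0, 69.8, 68.5, 68.0, 68.2, 69.1, 70.8, 73.5, 78.3])

# Correlate the boiling temperature as a direct function of liquid composition.
poly = np.poly1d(np.polyfit(x1, T, deg=4))
dT_dx = poly.deriv()

# Gibbs-Konovalov condition for an isobaric azeotrope: dT/dx1 = 0.
candidates = [r.real for r in dT_dx.roots
              if abs(r.imag) < 1e-9 and 0.0 < r.real < 1.0]

for xa in candidates:
    # Positive curvature at the stationary point -> temperature minimum.
    kind = "minimum-boiling" if dT_dx.deriv()(xa) > 0 else "maximum-boiling"
    print(f"azeotrope near x1 = {xa:.3f}, T = {poly(xa):.2f} degC ({kind})")
```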
The research aims to estimate missing values using the Coons method of covariance analysis for the response (dependent) variable, which represents the main character studied, in a type of multi-factor experimental design called the split-block design (SBED), so as to increase the accuracy of the analysis results and of the statistical tests based on this type of design. The theoretical part covers the split-block design and its statistical analysis, including the analysis of variance for an SBED experiment and the use of the Coons covariance-analysis approach, with two methods for estimating the missing value. In the practical part, a field experiment on a wheat crop was implemented in
The current research presents an overall comparative analysis of the estimation of Meixner process parameters via the wavelet packet transform. Of particular relevance, it compares the moment method and the wavelet packet estimator for the four parameters of the Meixner process. The research focuses on finding the best threshold value using the square-root-log and modified square-root-log methods with wavelet packets in the presence of noise, to enhance the efficiency and effectiveness of denoising the financial asset market signal. In this regard, a simulation study compares the performance of moment estimation and wavelet packets for different sample sizes. The results show that wavelet p
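A minimal sketch of the wavelet-packet denoising step with the square-root-log (universal) threshold is given below, assuming the PyWavelets library and a synthetic stand-in for the financial asset market signal; the modified-threshold variant and the Meixner moment estimators from the study are not reproduced here.

```python
import numpy as np
import pywt

rng = np.random.default_rng(0)

# Synthetic stand-in for the noisy market signal.
t = np.linspace(0, 1, 2048)
clean = np.sin(8 * np.pi * t) + 0.5 * np.sin(30 * np.pi * t)
noisy = clean + 0.3 * rng.normal(size=t.size)

# Noise level from the finest detail coefficients (median absolute deviation).
_, cD = pywt.dwt(noisy, 'db4')
sigma = np.median(np.abs(cD)) / 0.6745

# Square-root-log (universal) threshold: sigma * sqrt(2 * log n).
threshold = sigma * np.sqrt(2 * np.log(noisy.size))

# Decompose into wavelet packets, soft-threshold every terminal node, reconstruct.
wp = pywt.WaveletPacket(data=noisy, wavelet='db4', mode='symmetric', maxlevel=4)
for node in wp.get_level(4, order='natural'):
    node.data = pywt.threshold(node.data, threshold, mode='soft')
denoised = wp.reconstruct(update=True)

print("RMSE before:", np.sqrt(np.mean((noisy - clean) ** 2)))
print("RMSE after :", np.sqrt(np.mean((denoised[:clean.size] - clean) ** 2)))
```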
Many approaches of different complexity already exist for edge detection in color images. Nevertheless, the question remains of how different the results are when employing computationally costly techniques instead of simple ones. This paper presents a comparative study of two approaches to color edge detection aimed at reducing noise in images. The approaches are based on the Sobel operator and the Laplace operator. Furthermore, an efficient algorithm for implementing the two operators is presented. The operators have been applied to real images, and the results are presented in this paper. It is shown that the quality of the results increases when using the second-derivative operator (Laplace operator), and noise is reduced in a good
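A hedged sketch of the two operators on a color image is shown below, assuming OpenCV and a hypothetical input file; each operator is applied per channel and the strongest response is kept. This is the standard Sobel/Laplace formulation, not necessarily the paper's exact algorithm.

```python
import cv2
import numpy as np

# Hypothetical input path; any 3-channel BGR image works.
img = cv2.imread('input.png')
img = cv2.GaussianBlur(img, (3, 3), 0)   # mild pre-smoothing to suppress noise

def sobel_edges(channel):
    # First-derivative gradient magnitude.
    gx = cv2.Sobel(channel, cv2.CV_64F, 1, 0, ksize=3)
    gy = cv2.Sobel(channel, cv2.CV_64F, 0, 1, ksize=3)
    return np.sqrt(gx ** 2 + gy ** 2)

def laplace_edges(channel):
    # Second-derivative response.
    return np.abs(cv2.Laplacian(channel, cv2.CV_64F, ksize=3))

# Apply each operator per color channel and keep the strongest response.
sobel_map = np.max([sobel_edges(c) for c in cv2.split(img)], axis=0)
laplace_map = np.max([laplace_edges(c) for c in cv2.split(img)], axis=0)

cv2.imwrite('sobel_edges.png', cv2.convertScaleAbs(sobel_map))
cv2.imwrite('laplace_edges.png', cv2.convertScaleAbs(laplace_map))
```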
In this research, the velocity of a moving airplane is estimated from its recorded digital sound. The sound-file data is sliced into several frames using overlapping partitions, and the array of each frame is transformed from the time domain to the frequency domain using the Fourier Transform (FT). To determine the characteristic frequency of the sound, a moving-window mechanism is used, with the window size made linearly proportional to the value of the tracked frequency. This proportionality follows from the linear relationship between the frequency and its Doppler shift. An algorithm was introduced to select the characteristic frequencies; this algorithm allocates the frequencies that satisfy the Doppler relation, and besides that the tra
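The frame-slicing and Doppler steps can be sketched as follows, assuming a hypothetical recording airplane.wav and the standard fly-past relation v = c (f_approach - f_recede) / (f_approach + f_recede); the paper's frequency-selection algorithm is simplified here to a spectral-peak track with plateau medians.

```python
import numpy as np
from scipy.io import wavfile

C_SOUND = 343.0  # speed of sound in air, m/s

def dominant_frequencies(signal, fs, frame_len=4096, hop=1024):
    """FFT each overlapping frame and return its peak (dominant) frequency."""
    freqs = np.fft.rfftfreq(frame_len, d=1.0 / fs)
    peaks = []
    for start in range(0, len(signal) - frame_len, hop):
        frame = signal[start:start + frame_len] * np.hanning(frame_len)
        spectrum = np.abs(np.fft.rfft(frame))
        peaks.append(freqs[np.argmax(spectrum)])
    return np.array(peaks)

# 'airplane.wav' is a hypothetical recording of a fly-over.
fs, data = wavfile.read('airplane.wav')
if data.ndim > 1:                     # mix stereo down to mono
    data = data.mean(axis=1)

track = dominant_frequencies(data, fs)
f_approach = np.median(track[: len(track) // 4])   # plateau before the pass
f_recede = np.median(track[-len(track) // 4:])     # plateau after the pass

# Doppler relation for a source moving past a stationary microphone.
v = C_SOUND * (f_approach - f_recede) / (f_approach + f_recede)
print(f"Estimated speed: {v:.1f} m/s")
```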
This research applies statistical methods to improve the quality of the plastic cans produced at the State Company for Vegetable Oils (Almaamon factory) by using the fraction-defective control chart (p-chart) with a fixed sample size. A sample of 450 cans per day for 30 days was inspected to determine the rejected product. Operations research with the WinQSB package was used for the p-chart to determine the test quality level required by the product specification and to verify that the process is statistically controlled.
The results show a high degree of accuracy from using the program and the mathematical operations (primary and secondary) used to draw the control-limit charts and to reject the statistically uncontr
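For reference, the p-chart limits for a fixed sample of n = 450 follow the usual three-sigma formula, sketched below with hypothetical daily defect counts standing in for the factory data (the WinQSB workflow itself is not reproduced).

```python
import numpy as np

# Hypothetical daily counts of defective cans out of n = 450 inspected for 30 days.
n = 450
defectives = np.array([12, 9, 15, 11, 8, 14, 10, 13, 7, 16,
                       12, 11, 9, 10, 15, 13, 8, 12, 14, 9,
                       11, 10, 13, 12, 8, 15, 9, 11, 10, 14])

p = defectives / n                 # daily fraction defective
p_bar = p.mean()                   # centre line
sigma_p = np.sqrt(p_bar * (1 - p_bar) / n)

ucl = p_bar + 3 * sigma_p          # upper control limit
lcl = max(p_bar - 3 * sigma_p, 0)  # lower control limit (floored at zero)

out_of_control = np.where((p > ucl) | (p < lcl))[0]
print(f"CL={p_bar:.4f}, UCL={ucl:.4f}, LCL={lcl:.4f}")
print("Out-of-control days:", out_of_control + 1)
```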
Metal cutting processes still represent the largest class of manufacturing operations, and turning is the most commonly employed material-removal process. This research focuses on analysis of the thermal field of the oblique machining process. The finite element method (FEM) software DEFORM 3D V10.2 was used together with experimental work carried out using infrared imaging equipment, covering both the hardware and software sides of the simulations. The thermal experiments were conducted on AA6063-T6 using different tool obliquities, cutting speeds, and feed rates. The results show that the temperature decreases as tool obliquity increases at the different cutting speeds and feed rates; also, it
Machine learning methods, one of the most important branches of promising artificial intelligence, are of great importance in all sciences, such as engineering and medicine, and have recently become widely involved in the statistical sciences and their various branches, including survival analysis, where they can be considered a new approach for estimating survival alongside the parametric, nonparametric, and semi-parametric methods widely used for this purpose in statistical research. In this paper, the estimation of survival based on medical images of patients with breast cancer who receive their treatment in Iraqi hospitals is discussed. Three algorithms for feature extraction are explained: the first, principal compone
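A rough sketch of the feature-extraction-plus-survival pipeline is given below, assuming scikit-learn for the principal component analysis step and a lifelines Cox proportional-hazards model as a stand-in for whichever survival estimator the study pairs with the extracted features; the image matrix, follow-up times, and event indicators are simulated placeholders rather than the hospital data.

```python
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)

# Simulated stand-ins: 100 patients, 4096 flattened pixel features each,
# follow-up in months, and an event indicator (1 = death observed, 0 = censored).
images = rng.normal(size=(100, 4096))
durations = rng.exponential(scale=24, size=100)
events = rng.integers(0, 2, size=100)

# Step 1: feature extraction with principal component analysis.
features = PCA(n_components=10).fit_transform(images)

# Step 2: fit a survival model on the extracted features.
df = pd.DataFrame(features, columns=[f"pc{i}" for i in range(10)])
df["duration"] = durations
df["event"] = events

cph = CoxPHFitter()
cph.fit(df, duration_col="duration", event_col="event")
print(cph.summary[["coef", "p"]])
```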
In this research, the effect of reinforcing epoxy resin composites with a filler derived from chopped agricultural waste from oil palm (OP) was studied. Epoxy/OP composites were formed by dispersing (1, 3, 5, and 10 wt%) OP filler using a high-speed mechanical stirrer and a hand lay-up method. The effect of adding zinc oxide (ZnO) nanoparticles, with an average size of 10-30 nm, at different loadings (1, 2, 3, and 5 wt%) on the behavior of the epoxy/oil palm composite was then studied. Fourier Transform Infrared (FTIR) spectrometry and mechanical tests (tensile, impact, hardness, and wear rate) were used to examine the composites. The FTIR
In this paper, point estimation of the parameter of the Maxwell-Boltzmann distribution has been investigated using a simulation technique. The parameter is estimated with two groups of methods: the first includes non-Bayesian estimation methods (the maximum likelihood estimator and the moment estimator), while the second includes standard Bayesian estimation using two different priors (inverse chi-square and Jeffreys), namely the standard Bayes estimator and the Bayes estimator based on Jeffreys' prior. Comparisons among these methods were made using the mean square error measure, with the simulation carried out for different sample sizes.
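A minimal simulation sketch of the non-Bayesian part of this comparison is shown below, using SciPy's Maxwell distribution: the maximum likelihood and moment estimators of the scale parameter are compared by mean square error over repeated samples. The Bayesian estimators and the priors used in the paper are omitted, and the true scale value is an arbitrary placeholder.

```python
import numpy as np
from scipy.stats import maxwell

rng = np.random.default_rng(42)
true_scale = 2.0                      # placeholder value of the scale parameter
n_reps, sample_sizes = 2000, [10, 30, 50, 100]

for n in sample_sizes:
    mle, mom = [], []
    for _ in range(n_reps):
        x = maxwell.rvs(scale=true_scale, size=n, random_state=rng)
        # Maximum likelihood: scale^2 = sum(x^2) / (3n)
        mle.append(np.sqrt(np.sum(x ** 2) / (3 * n)))
        # Method of moments: E[X] = 2*scale*sqrt(2/pi)  =>  scale = mean * sqrt(pi/8)
        mom.append(np.mean(x) * np.sqrt(np.pi / 8))
    mse_mle = np.mean((np.array(mle) - true_scale) ** 2)
    mse_mom = np.mean((np.array(mom) - true_scale) ** 2)
    print(f"n={n:4d}  MSE(MLE)={mse_mle:.5f}  MSE(Moment)={mse_mom:.5f}")
```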