The aim of this study was to develop a nanostructured double-layer delivery system for hydrophobic molecules. The double layer consisted of a polyethylene glycol (PEG)-based polymeric coating followed by a gelatin sub-coating of the hydrophobic-molecule core containing sodium citrate. The PEG polymeric composition ratio and the amount of gelatin in the sub-coating were optimized using the two-level fractional method. The nanoparticles were characterized using AFM and FT-IR techniques. The size of the nanocapsules ranged from 39 to 76 nm, depending on the drug loading concentration, and the drug was effectively loaded into the PEG-gelatin nanoparticles (≈47%). The release characteristics of the hydrophobic molecules, in terms of controlled-release duration and dissolution efficiency, were examined in various dissolution media, namely physiological pH (7.4) and simulated stomach fluid (pH 3.4). The optimized double-layer delivery system showed a gradual release of the hydrophobic molecules in simulated stomach fluid and at physiological pH, indicating its potential as a novel platform for hydrophobic-molecule delivery.
This research aims to analyze and simulate real biochemical test data to uncover the relationships among the tests and how each one affects the others. The data were acquired from an Iraqi private biochemical laboratory. These data, however, have many dimensions, a high rate of null values, and a large number of patients. Several experiments were applied to the data, beginning with unsupervised techniques such as hierarchical clustering and k-means, but the results were not clear. A preprocessing step was then performed to make the dataset analyzable by supervised techniques such as Linear Discriminant Analysis (LDA), Classification and Regression Tree (CART), Logistic Regression (LR), K-Nearest Neighbor (K-NN), and Naïve Bayes (NB) …
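As a rough illustration of this kind of workflow (not the paper's actual pipeline), the sketch below imputes missing values and compares the five listed classifiers with scikit-learn. The synthetic data, column names, and target label are assumptions introduced here for illustration only.

```python
# Hypothetical sketch: the laboratory dataset is not available, so a synthetic
# table with missing values stands in for it. Column names and the target label
# are assumptions, not from the source.
import numpy as np
import pandas as pd
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)
X = pd.DataFrame(rng.normal(size=(300, 8)),
                 columns=[f"test_{i}" for i in range(8)])
X[X > 2.2] = np.nan                              # emulate the high rate of null values
y = (X.fillna(0).sum(axis=1) > 0).astype(int)    # stand-in target label

models = {
    "LDA": LinearDiscriminantAnalysis(),
    "CART": DecisionTreeClassifier(),
    "LR": LogisticRegression(max_iter=1000),
    "K-NN": KNeighborsClassifier(),
    "NB": GaussianNB(),
}
for name, clf in models.items():
    # Preprocessing (imputation + scaling) makes the data analyzable, as in the text.
    pipe = make_pipeline(SimpleImputer(strategy="mean"), StandardScaler(), clf)
    scores = cross_val_score(pipe, X, y, cv=5)
    print(f"{name:5s} mean accuracy: {scores.mean():.3f}")
```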
The Artificial Neural Network methodology is an important and relatively new subject that builds models for analysis, data evaluation, forecasting, and control without depending on a pre-existing model or a classical statistical method describing the behavior of the statistical phenomenon. The methodology works by simulating the data to reach a robust optimal model that represents the statistical phenomenon and can be used at any time and under any conditions. The Box-Jenkins (ARMAX) approach was used for comparison. This paper relies on the received power to build a robust model for forecasting, analyzing, and controlling that power; the received power comes from …
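For the comparison side of this abstract, a minimal ARMAX fit can be set up with statsmodels as below. The simulated series stands in for the actual received-power measurements, and the model order (2, 0, 1) and the single exogenous regressor are assumptions, not the paper's settings.

```python
# Minimal ARMAX sketch with statsmodels; data and model order are illustrative.
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(1)
n = 500
exog = rng.normal(size=(n, 1))                 # stand-in explanatory input
noise = rng.normal(scale=0.5, size=n)
power = np.zeros(n)
for t in range(2, n):                          # simulate an AR(2)-like received power
    power[t] = 0.6 * power[t - 1] - 0.2 * power[t - 2] + 1.5 * exog[t, 0] + noise[t]

model = SARIMAX(power, exog=exog, order=(2, 0, 1))   # ARMAX(2, 1) via SARIMAX
fit = model.fit(disp=False)
print(fit.summary().tables[1])

# Out-of-sample forecasts for monitoring/controlling the received power
# (the future exog values here are placeholders).
forecast = fit.forecast(steps=5, exog=exog[-5:])
print(forecast)
```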
The great importance of factorial experiments has made them desirable for use and application in many fields, particularly agriculture, which is considered the broadest area for applying experimental designs.
The second case of the factorial experiment, which researchers find very difficult to handle, is the unbalanced case, meaning that the frequencies of the factorial treatments are not equal (that is, an unequal number of blocks or experimental units is allocated per treatment) …
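As a brief illustration of what an unbalanced two-factor factorial looks like in practice, the following statsmodels sketch fits such a design with unequal cell frequencies. The factors, levels, and response values are invented for illustration and are not from the paper, which may use a different analysis method.

```python
# Sketch of analyzing an unbalanced two-factor factorial experiment.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Unequal numbers of experimental units per treatment combination (unbalanced).
data = pd.DataFrame({
    "A":        ["a1"]*5 + ["a1"]*3 + ["a2"]*4 + ["a2"]*6,
    "B":        ["b1"]*5 + ["b2"]*3 + ["b1"]*4 + ["b2"]*6,
    "response": [12, 14, 13, 15, 14,  18, 19, 17,
                 11, 10, 12, 11,      20, 22, 21, 19, 23, 20],
})

model = smf.ols("response ~ C(A) * C(B)", data=data).fit()
# With unequal cell frequencies, Type III sums of squares are a common choice.
print(sm.stats.anova_lm(model, typ=3))
```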
Non-orthogonal Multiple Access (NOMA) is a multiple-access technique that allows multiple users to share the same communication resources, increasing spectral efficiency and throughput. NOMA has been shown to provide significant performance gains over orthogonal multiple access (OMA) in terms of spectral efficiency and throughput. In this paper, two NOMA scenarios are analyzed and simulated, involving two users and multiple users (four users), to evaluate NOMA's performance. The simulation results indicate that the achievable sum rate for the two-user scenario is 16.7 bps/Hz, while that of the multi-user scenario is 20.69 bps/Hz at a transmitted power of 25 dBm. The BER for the two-user scenario is 0.004202 and 0.001564 for …
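A back-of-the-envelope sketch of the two-user downlink NOMA sum rate with superposition coding and successive interference cancellation (SIC) is shown below. The channel gains, power-allocation coefficients, and noise power are illustrative assumptions, not the paper's simulation parameters, so the resulting numbers will not match the reported 16.7 bps/Hz.

```python
# Two-user downlink NOMA sum rate with SIC; all parameters are assumed values.
import numpy as np

P_dBm = 25.0
P = 10 ** ((P_dBm - 30) / 10)        # total transmit power in watts
N0 = 1e-9                            # noise power (assumed)

h_far, h_near = 1e-4, 1e-3           # channel gains |h|^2 for the far (weak) and near (strong) user
a_far, a_near = 0.8, 0.2             # power allocation coefficients (a_far + a_near = 1)

# Far (weak) user decodes its signal treating the near user's signal as interference.
R_far = np.log2(1 + a_far * P * h_far / (a_near * P * h_far + N0))
# Near (strong) user removes the far user's signal via SIC, then decodes its own.
R_near = np.log2(1 + a_near * P * h_near / N0)

print(f"R_far    = {R_far:.2f} bps/Hz")
print(f"R_near   = {R_near:.2f} bps/Hz")
print(f"Sum rate = {R_far + R_near:.2f} bps/Hz")
```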
Krawtchouk polynomials (KPs) and their moments are promising techniques for applications in information theory, coding theory, and signal processing. This is due to the special capabilities of KPs in feature extraction and classification. The main challenge in existing KP recurrence algorithms is numerical error, which arises during the computation of the coefficients for large polynomial sizes, particularly when the KP parameter (p) deviates from 0.5 toward 0 or 1. To this end, this paper proposes a new recurrence relation for computing the coefficients of high-order KPs. In particular, the paper discusses the development of a new algorithm and presents a new mathematical model for computing the …
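For context, the classical three-term recurrence in the order n for K_n(x; p, N) can be evaluated as below. This is the textbook recurrence, not the new relation proposed in the paper; for large N and p far from 0.5 it exhibits exactly the kind of numerical growth the paper targets.

```python
# Illustrative evaluation of Krawtchouk polynomials via the classical recurrence.
import numpy as np

def krawtchouk(n_max, p, N):
    """Return a (n_max+1, N+1) array K[n, x] of K_n(x; p, N) for x = 0..N."""
    x = np.arange(N + 1, dtype=float)
    K = np.zeros((n_max + 1, N + 1))
    K[0] = 1.0
    if n_max >= 1:
        K[1] = 1.0 - x / (p * N)
    for n in range(1, n_max):
        # p(N-n) K_{n+1}(x) = [p(N-n) + n(1-p) - x] K_n(x) - n(1-p) K_{n-1}(x)
        K[n + 1] = ((p * (N - n) + n * (1 - p) - x) * K[n]
                    - n * (1 - p) * K[n - 1]) / (p * (N - n))
    return K

K = krawtchouk(n_max=8, p=0.3, N=20)
print(K[:3, :5])   # low orders stay modest; coefficient growth appears at higher orders
```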
Visual analytics has become an important approach for discovering patterns in big data. While visualization already struggles with the high dimensionality of data, issues such as a concept hierarchy on each dimension add further difficulty and make visualization a prohibitive task. A data cube offers multi-perspective aggregated views of large data sets and has important applications in business and many other areas. It has high dimensionality, concept hierarchies, a vast number of cells, and comes with special exploration operations such as roll-up, drill-down, slicing, and dicing. All of these issues make data cubes very difficult to explore visually. Most existing approaches visualize a data cube in 2D space and require preprocessing steps. In this paper, we propose a visualization …
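To make the cube operations concrete, here is a toy data-cube sketch in pandas. The dimensions (region, product, quarter) and the sales measure are invented for illustration; they are not the data cubes or the visualization technique used in the paper.

```python
# Toy data cube: roll-up, drill-down, slice, and dice on a small fact table.
import pandas as pd

facts = pd.DataFrame({
    "region":  ["North", "North", "South", "South", "South", "North"],
    "product": ["A", "B", "A", "B", "A", "A"],
    "quarter": ["Q1", "Q1", "Q1", "Q2", "Q2", "Q2"],
    "sales":   [100, 80, 120, 60, 90, 110],
})

# Base cuboid: aggregate the measure over all three dimensions.
cube = facts.groupby(["region", "product", "quarter"])["sales"].sum()

# Roll-up: climb the concept hierarchy by dropping the "quarter" dimension.
rollup = cube.groupby(level=["region", "product"]).sum()
# Drill-down is the reverse: from `rollup` back to the finer-grained `cube`.

# Slice: fix one dimension (quarter == "Q1").
slice_q1 = cube.xs("Q1", level="quarter")

# Dice: select a sub-cube on two or more dimensions.
dice = facts[(facts["region"] == "North") & (facts["quarter"].isin(["Q1", "Q2"]))]

print(rollup, slice_q1, sep="\n\n")
```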