Letrozole (LZL) is a non-steroidal competitive inhibitor of the aromatase enzyme system. The aim of this study is to improve the permeation of LZL through the skin by formulating it as a nanoemulsion using various oils, surfactants and co-surfactants with deionized water. Based on solubility studies, mixtures of oleic acid as the oil and Tween 80/Transcutol P as the surfactant/co-surfactant (Smix) in different percentages were used to prepare nanoemulsions (NS). Nine oil-in-water (o/w) LZL NS formulae were prepared, and pseudo-ternary phase diagrams were used as a tool to evaluate the NS domain at Smix ratios of 1:1, 2:1 and 3:1.
New data for fusion power density have been obtained for the T-3He and T-T fusion reactions. Power density is a substantial term in research related to fusion energy generation and ignition calculations of magnetically confined systems. In the current work, thermal nuclear reactivities, fusion reactor power densities and the ignition condition are evaluated using a new and accurate cross-section formula. The maximum values of fusion power density for the T-3He and T-T reactions are 1.1×10^7 W/m^3 at T = 700 keV and 4.7×10^6 W/m^3 at T = 500 keV respectively, while Zeff is suggested to be 1.44 for the two reactions. Bremsstrahlung radiation has also been determined toward reaching self-sustaining reactors; the Bremsstrahlung values are 4.5×
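The volumetric power density quoted above follows from the standard relation P = n1·n2·⟨σv⟩·E_fus. A minimal sketch of that bookkeeping is below; the density, reactivity and per-reaction energy values are illustrative placeholders, not figures taken from this work:

```python
# Illustrative sketch of the fusion power density relation
# P = n1 * n2 * <sigma v> * E_fus, with a factor 1/2 when both
# reactants are the same species (e.g. T-T) to avoid double-counting pairs.
def fusion_power_density(n1, n2, sigma_v, e_fus_joule, same_species=False):
    """Return fusion power density in W/m^3."""
    factor = 0.5 if same_species else 1.0
    return factor * n1 * n2 * sigma_v * e_fus_joule

EV = 1.602176634e-19   # J per eV (exact, SI definition)
n = 1.0e20             # particle density, m^-3 (typical magnetic-confinement scale)
sigma_v = 1.0e-24      # reactivity <sigma v> in m^3/s, placeholder value
e_fus = 12.0e6 * EV    # energy per reaction, placeholder (not from the paper)

p = fusion_power_density(n, n, sigma_v, e_fus)
print(f"P = {p:.3e} W/m^3")
```

For a same-species fuel such as T-T, passing `same_species=True` halves the result, since each reacting pair is otherwise counted twice.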
The research aims to explain the sources of receiving the explicit and implicit knowledge mentioned in the Noble Qur'an. The two researchers adopted the documentary and inductive approach to study the topic. Among the conclusions of the research: the Noble Qur'an dealt with many terms and concepts that refer to the sources of making explicit knowledge available, including books, which were represented by the divine books (the Qur'an, the Torah, the Zabur and the Gospel) and their concepts (the book, the Qur'an, the guidance, the remembrance, the revelation, the light, the newspapers, the plates). It also dealt with many concepts that refer to the sources of providing tacit knowledge, which was represented by the communication between two
In some cases, surgeons need to navigate through the computer system to reconfirm patients' details, and unfortunately surgeons are unable to manage both the computer system and the operation at the same time. In this paper we propose a solution to this problem, designed especially for heart surgeons, by introducing a voice-activation system with 3D visualization of angiographic images, 2D visualization of processed echocardiography video and selected patient details. In this study, the processing and approximation of the 3D angiography and the visualization of the 2D echocardiography video with voice-recognition control are the most challenging work. The work involves predicting a 3D coronary tree from 2D angiography images and also image enhancement
Copula modeling is widely used in modern statistics. The boundary bias problem is one of the problems faced in nonparametric estimation, as kernel estimators are the most common nonparametric estimators. In this paper, the copula density function was estimated using the probit-transformation nonparametric method in order to eliminate the boundary bias problem that kernel estimators suffer from. A simulation study compared three nonparametric methods for estimating the copula density function, and we proposed a new method that outperformed the others, across five types of copulas with different sample sizes, different levels of correlation between the copula variables and different function parameters.
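The probit-transformation idea can be sketched briefly: pseudo-observations on (0,1)^2 are mapped to the real plane with the normal quantile function, a plain Gaussian kernel estimate is formed there (where there is no boundary), and dividing by the normal densities maps the estimate back to the unit square. The function below is a minimal illustration of that scheme, not the estimator or bandwidth choice used in the paper:

```python
from math import exp, pi
from statistics import NormalDist

N01 = NormalDist()  # standard normal: provides pdf and inv_cdf (the probit)

def probit_copula_density(u, v, samples, h=0.3):
    """Probit-transformation kernel estimate of a copula density c(u, v).

    samples: iterable of (U, V) pseudo-observations strictly inside (0, 1).
    h: kernel bandwidth (illustrative default, not a data-driven choice).
    """
    s, t = N01.inv_cdf(u), N01.inv_cdf(v)
    g = 0.0
    for ui, vi in samples:
        si, ti = N01.inv_cdf(ui), N01.inv_cdf(vi)
        # Gaussian product kernel on the transformed (unbounded) scale
        g += exp(-((s - si) ** 2 + (t - ti) ** 2) / (2 * h * h))
    g /= len(samples) * 2 * pi * h * h
    # Jacobian of the inverse transform maps the estimate back to [0,1]^2
    return g / (N01.pdf(s) * N01.pdf(t))
```

Because the kernel smoothing happens on the unbounded probit scale, no kernel mass leaks outside the support, which is exactly the boundary bias the abstract refers to.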
Graphite nanoparticles were successfully synthesized using a mixture of H2O2/NH4OH with three oxidation steps. The oxidation process was analyzed by XRD and optical microscope images, which show a clear change in the particle size of the graphite after every oxidation step. The method depends on treating the graphite with H2O2 in two steps and then completing the last step by reacting with H2O2/NH4OH in equal quantities. The process does not reduce the number of graphite sheets but disperses the aggregates of multi-sheet carbon by removing the Van der Waals forces through the oxidation process.
Metaheuristics form one of the most well-known fields of research used to find optimal solutions for non-deterministic polynomial-time hard (NP-hard) problems, for which it is difficult to find an optimal solution in polynomial time. This paper introduces metaheuristic-based algorithms, their classifications and NP-hard problems. It also compares the performance of two metaheuristic-based algorithms (the Elephant Herding Optimization algorithm and Tabu Search) in solving the Traveling Salesman Problem (TSP), which is one of the best-known NP-hard problems and is widely used in performance evaluations of different metaheuristic-based optimization algorithms. The experimental results of Elephant Herding Optimization
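As a concrete reference point for one of the two compared methods, the following is a minimal Tabu Search for the TSP: a swap neighborhood, a short-term tabu list of recently swapped city pairs, and an aspiration rule that admits a tabu move if it improves on the best tour found. This is a generic textbook sketch, not the configuration evaluated in the paper:

```python
import random

def tour_length(tour, dist):
    """Total length of a closed tour over a distance matrix."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def tabu_search_tsp(dist, iters=100, tabu_len=10, seed=0):
    """Minimal Tabu Search for TSP with a city-swap neighborhood (illustrative)."""
    rng = random.Random(seed)
    n = len(dist)
    tour = list(range(n))
    rng.shuffle(tour)
    best = tour[:]
    tabu = []  # recently swapped (city, city) pairs
    for _ in range(iters):
        candidates = []
        for a in range(n - 1):
            for b in range(a + 1, n):
                move = (min(tour[a], tour[b]), max(tour[a], tour[b]))
                nxt = tour[:]
                nxt[a], nxt[b] = nxt[b], nxt[a]
                length = tour_length(nxt, dist)
                # aspiration: accept a tabu move only if it beats the best tour
                if move not in tabu or length < tour_length(best, dist):
                    candidates.append((length, nxt, move))
        if not candidates:
            break
        length, tour, move = min(candidates, key=lambda c: c[0])
        tabu.append(move)
        if len(tabu) > tabu_len:
            tabu.pop(0)  # short-term memory: forget the oldest move
        if length < tour_length(best, dist):
            best = tour[:]
    return best, tour_length(best, dist)
```

The tabu list prevents the search from immediately undoing a swap and cycling, which is the mechanism that lets it escape local optima that a plain hill climber would get stuck in.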
In this research, we built a program to assess Weibull parameters and wind power at three separate locations in Iraq: Baghdad, Basrah and Dhi-Qar, for the two years 2009 and 2010, after collecting and preparing the data available from the website "Weather Underground" for each of the Baghdad, Basrah and Dhi-Qar stations. The Weibull parameters (shape parameter and scale parameter) were estimated using the maximum likelihood estimation method (MLE) and the least squares method (LSM). The annual wind-speed frequencies were also calculated, noting the most frequently occurring speed over the above two years. Then, we plotted the Weibull distribution function and calculated the most significant quantities, represented by the mean wind speed, the standard deviation of the value
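The MLE step described above can be sketched compactly: the shape parameter k solves a fixed-point equation in the log speeds, and the scale parameter c then follows in closed form. The pure-Python sketch below assumes the raw wind speeds are available as a positive list; it is an illustration of the standard Weibull MLE, not the authors' program:

```python
from math import log, gamma

def weibull_mle(speeds, iters=200):
    """Maximum-likelihood estimates of Weibull shape k and scale c.

    Solves the MLE shape equation
        1/k = sum(v^k ln v)/sum(v^k) - mean(ln v)
    by damped fixed-point iteration, then computes
        c = (mean(v^k))^(1/k).
    speeds must all be strictly positive.
    """
    n = len(speeds)
    lnv = [log(v) for v in speeds]
    mean_lnv = sum(lnv) / n
    k = 2.0  # a typical starting value for wind-speed data
    for _ in range(iters):
        vk = [v ** k for v in speeds]
        a = sum(w * l for w, l in zip(vk, lnv)) / sum(vk)
        k = 0.5 * k + 0.5 / (a - mean_lnv)  # damping stabilizes the iteration
    c = (sum(v ** k for v in speeds) / n) ** (1.0 / k)
    return k, c

def weibull_mean_speed(k, c):
    """Mean of a Weibull(k, c) distribution: c * Gamma(1 + 1/k)."""
    return c * gamma(1.0 + 1.0 / k)
```

The mean wind speed mentioned in the abstract is then `weibull_mean_speed(k, c)`, i.e. c·Γ(1 + 1/k), rather than the raw sample average.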
The recent emergence of sophisticated Large Language Models (LLMs) such as GPT-4, Bard, and Bing has revolutionized the domain of scientific inquiry, particularly in the realm of large pre-trained vision-language models. This pivotal transformation is driving new frontiers in various fields, including image processing and digital media verification. At the heart of this evolution, our research focuses on the rapidly growing area of image authenticity verification, a field gaining immense relevance in the digital era. The study is specifically geared towards addressing the emerging challenge of distinguishing between authentic images and deep fakes, a task that has become critically important in a world increasingly reliant on digital media