Background: Radiopacity is one of the prerequisites for dental materials, especially for composite restorations. It is essential for the easy detection of secondary dental caries as well as for observation of the radiographic interface between the material and the tooth structure. The aim of this study was to assess the difference in radiopacity of different resin composites using a digital X-ray system. Materials and methods: Ten specimens (6 mm diameter and 1 mm thickness) of three types of composite resin (Evetric, Estelite Sigma Quick, and G-aenial) were fabricated using a Teflon mold. Radiopacity was assessed using dental radiography equipment in combination with a phosphor-plate digital system and an aluminum step wedge, with thickness varying from 1 mm to 10 mm in 1 mm increments, as a grey-scale reference. The tested materials were radiographed, and ImageJ software was used on a computer screen to evaluate the degree of radiopacity of each individual material and compare it with the aluminum step wedge. Radiopacity was expressed in millimeters of equivalent aluminum. Analysis of variance (ANOVA) and the Least Significant Difference (LSD) test were used to investigate the significance of differences among the tested groups. Results: Statistical analysis showed a highly significant difference among the tested groups (p≤0.01). G-aenial was the most radiopaque composite, with a value above or equivalent to that of enamel, while Estelite Sigma Quick had the lowest radiopacity value, equivalent to that of dentin. Conclusion: In line with previous studies, and within the limitations of our study, considerable variation in radiopacity values was found among materials, depending on the radiopaque elements incorporated into the matrix. All composite materials tested complied with the ISO 4049 standard.
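As an illustration of the step-wedge approach described above, the following Python sketch interpolates a specimen's mean grey value (such as one exported from ImageJ) onto the 1-10 mm aluminum calibration curve to obtain an equivalent-aluminum thickness. All grey values and the helper function are hypothetical placeholders, not the study's data or script.

```python
# Minimal sketch (not the study's actual workflow): convert mean grey values
# into equivalent aluminum thickness by interpolating against the step-wedge
# calibration. All numeric values below are hypothetical placeholders.
import numpy as np

# Step wedge: 1-10 mm aluminum and the mean grey value measured for each step
al_thickness_mm = np.arange(1, 11)                                    # 1 mm increments
al_grey = np.array([52, 71, 88, 103, 116, 128, 138, 147, 155, 162])   # hypothetical

def equivalent_aluminium(grey_value: float) -> float:
    """Interpolate a specimen's mean grey value onto the Al calibration curve."""
    return float(np.interp(grey_value, al_grey, al_thickness_mm))

# Hypothetical mean grey values for one specimen of each composite
specimens = {"Evetric": 120.0, "Estelite Sigma Quick": 95.0, "G-aenial": 150.0}
for name, grey in specimens.items():
    print(f"{name}: ~{equivalent_aluminium(grey):.2f} mm Al equivalent")
```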
Reservoir characterization is an important component of hydrocarbon exploration and production and requires the integration of different disciplines for accurate subsurface modeling. This research paper examines the complex interplay of rock properties, rock classification techniques, and geological modeling for improving reservoir-quality assessment. The study emphasizes petrophysical parameters such as porosity, shale volume, water saturation, and permeability as key indicators of reservoir properties, fluid behavior, and hydrocarbon potential. It examines various rock classification techniques, focusing on clustering methods and self-organizing maps (SOMs) to identify specific and
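As a rough illustration of the SOM-based rock classification mentioned above, the sketch below trains a tiny self-organizing map from scratch on synthetic petrophysical samples (porosity, shale volume, water saturation, log-permeability) and assigns each sample to a winning node as a candidate rock type. The grid size, training schedule, and data are assumptions, not taken from the study.

```python
# Illustrative sketch only: a tiny self-organizing map (SOM) clustering
# synthetic petrophysical samples into rock types. All data are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic log samples: porosity, Vshale, Sw, log10(permeability in mD)
porosity = rng.uniform(0.05, 0.30, 500)
vshale   = rng.uniform(0.00, 0.60, 500)
sw       = rng.uniform(0.20, 1.00, 500)
log_perm = rng.uniform(-1.0, 3.00, 500)
X = np.column_stack([porosity, vshale, sw, log_perm])

# Normalize each feature to [0, 1] so no single log dominates the distance
X = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))

grid_h, grid_w, n_feat = 4, 4, X.shape[1]
weights = rng.random((grid_h, grid_w, n_feat))        # one prototype per node

def bmu(x, w):
    """Best-matching unit: node whose prototype is closest to sample x."""
    d = np.linalg.norm(w - x, axis=2)
    return np.unravel_index(np.argmin(d), d.shape)

n_iter, lr0, sigma0 = 2000, 0.5, 2.0
for t in range(n_iter):
    x = X[rng.integers(len(X))]
    lr = lr0 * np.exp(-t / n_iter)                    # decaying learning rate
    sigma = sigma0 * np.exp(-t / n_iter)              # shrinking neighborhood
    bi, bj = bmu(x, weights)
    ii, jj = np.meshgrid(np.arange(grid_h), np.arange(grid_w), indexing="ij")
    dist2 = (ii - bi) ** 2 + (jj - bj) ** 2
    h = np.exp(-dist2 / (2 * sigma ** 2))[..., None]  # Gaussian neighborhood
    weights += lr * h * (x - weights)                 # pull prototypes toward x

# Each sample's winning node serves as its candidate rock type
labels = np.array([np.ravel_multi_index(bmu(x, weights), (grid_h, grid_w)) for x in X])
print("samples per SOM node (rock type):", np.bincount(labels, minlength=grid_h * grid_w))
```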
Objective: The goal of this research is to load Doxorubicin (DOX) onto silver nanoparticles coupled with folic acid and test their anticancer properties against breast cancer. Methods: Chitosan-capped silver nanoparticles (CS-AgNPs) were manufactured and loaded with folic acid as well as the anticancer drug Doxorubicin to form the CS-AgNPs-DOX-FA conjugate. AFM, FTIR, and SEM techniques were used to characterize the samples. The produced multifunctional nano-formulation served as an intrinsic drug delivery system, allowing for effective loading and targeting of chemotherapeutics on the breast cancer (AMJ 13) cell line. Flow cytometry was used to assess therapy efficacy by measuring apoptotic induction. Results: DOX and CS-Ag
A cut-off low is a closed low with a low value of geopotential height at upper atmospheric levels that has become fully detached (cut off) from the westerly flow and moves independently. Cut-off lows cause extreme rainfall events in mid-latitude regions. The main aim of this paper is to investigate the cut-off low at 500 hPa over Iraq from a synoptic point of view and the behavior of the geopotential height at 500 hPa. To examine the association of the 500 hPa cut-off low with rainfall events across Iraq, two case studies of heavy rainfall events from different times were conducted. The results showed that a cut-off low at 500 hPa with a low value of geopotential height strengthens the low-pressure system at the surface, lea
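The following sketch is only a crude stand-in for the synoptic analysis described above: it flags candidate closed lows in a 500 hPa geopotential-height grid as points that are the minimum of a local window and lie well below the window's rim. The field, window size, and depth threshold are assumptions, not the paper's criteria.

```python
# Hedged illustration (not the paper's method): flag candidate closed lows in a
# synthetic 500 hPa geopotential-height grid over the region around Iraq.
import numpy as np

rng = np.random.default_rng(1)
lat = np.linspace(25, 40, 61)                 # approximate latitude range over Iraq
lon = np.linspace(38, 50, 49)
LON, LAT = np.meshgrid(lon, lat)

# Synthetic 500 hPa geopotential height (gpm) with an artificial closed low
z500 = 5800 - 120 * np.exp(-(((LAT - 33) / 2.0) ** 2 + ((LON - 44) / 2.0) ** 2))
z500 += rng.normal(0, 1.0, z500.shape)

def closed_low_candidates(z, window=11, min_depth=20.0):
    """Return grid indices where z is the window minimum and sits at least
    min_depth gpm below the mean of the window's rim (a crude closed-contour test)."""
    half = window // 2
    hits = []
    for i in range(half, z.shape[0] - half):
        for j in range(half, z.shape[1] - half):
            patch = z[i - half:i + half + 1, j - half:j + half + 1]
            inner = patch[1:-1, 1:-1]
            ring_mean = (patch.sum() - inner.sum()) / (patch.size - inner.size)
            if z[i, j] == patch.min() and ring_mean - z[i, j] >= min_depth:
                hits.append((i, j))
    return hits

for i, j in closed_low_candidates(z500):
    print(f"candidate cut-off low near {LAT[i, j]:.1f}N, {LON[i, j]:.1f}E, z500 = {z500[i, j]:.0f} gpm")
```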
The objective of an Optimal Power Flow (OPF) algorithm is to find a steady-state operating point that minimizes generation cost, losses, etc., while maintaining acceptable system performance in terms of limits on generators' real and reactive powers, line flow limits, etc. The OPF solution includes an objective function; a common objective function concerns the active power generation cost. A linear programming method is proposed to solve the OPF problem. The Linear Programming (LP) approach transforms the nonlinear optimization problem into an iterative algorithm that, in each iteration, solves a linear optimization problem obtained by linearizing both the objective function and the constraints. A computer program, written in the MATLAB environme
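To make the linearization idea concrete, the sketch below solves one LP step of a simplified OPF with scipy.optimize.linprog (a Python stand-in for the MATLAB program mentioned above): linear generation cost, a power-balance equality, generator limits, and one linearized line-flow limit built from hypothetical sensitivity (shift) factors. In the full iterative scheme, this LP would be re-built and re-solved around each new operating point.

```python
# Hedged sketch of a single linearized OPF (economic dispatch) step.
# Costs, limits, demand, and shift factors are hypothetical placeholders.
from scipy.optimize import linprog

cost = [20.0, 25.0, 40.0]                 # $/MWh for generators G1..G3
demand = 450.0                            # MW, total system load

# Power balance: P1 + P2 + P3 = demand
A_eq = [[1.0, 1.0, 1.0]]
b_eq = [demand]

# One transmission line, flow ~ sum(shift_factor * Pg), limited to 150 MW both ways
shift = [0.3, -0.2, 0.1]                  # hypothetical linearized sensitivity factors
A_ub = [shift, [-s for s in shift]]
b_ub = [150.0, 150.0]

bounds = [(50.0, 250.0), (50.0, 200.0), (20.0, 150.0)]   # Pmin/Pmax per generator

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=bounds, method="highs")
print("optimal dispatch (MW):", res.x, "cost ($/h):", res.fun)
```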
This paper introduces a non-conventional approach with multi-dimensional random sampling to solve a cocaine abuse model with statistical probability. The mean Latin hypercube finite difference (MLHFD) method is proposed for the first time via a hybrid integration of the classical numerical finite difference (FD) formula with the Latin hypercube sampling (LHS) technique, to create a random distribution for the model parameters, which are dependent on time t. The LHS technique gives the MLHFD method the advantage of producing fast variation of the parameter values via a number of multidimensional simulations (100, 1000, and 5000). The generated Latin hypercube sample, which is random or non-deterministic in nature, is further integ
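A minimal illustration of the MLHFD idea, applied here to a toy logistic ODE rather than the paper's cocaine-abuse model: parameters are drawn with Latin hypercube sampling, each realization is solved with an explicit finite-difference (forward Euler) scheme, and the trajectories are averaged. The parameter ranges, sample size, and step size are assumptions.

```python
# Hedged illustration of the mean Latin hypercube finite difference (MLHFD)
# idea on a toy ODE; not the paper's model or parameter ranges.
import numpy as np
from scipy.stats import qmc

n_samples, n_steps, dt = 1000, 200, 0.05
t = np.arange(n_steps + 1) * dt

# LHS over two uncertain parameters (growth rate r, saturation level K)
sampler = qmc.LatinHypercube(d=2, seed=0)
unit = sampler.random(n=n_samples)
params = qmc.scale(unit, l_bounds=[0.5, 0.8], u_bounds=[1.5, 1.2])

trajectories = np.empty((n_samples, n_steps + 1))
for k, (r, K) in enumerate(params):
    u = np.empty(n_steps + 1)
    u[0] = 0.1                                   # initial condition
    for n in range(n_steps):                     # forward-Euler finite difference
        u[n + 1] = u[n] + dt * r * u[n] * (1.0 - u[n] / K)
    trajectories[k] = u

mean_solution = trajectories.mean(axis=0)        # the "mean" in MLHFD
print("mean solution at t =", t[-1], ":", round(float(mean_solution[-1]), 4))
```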
Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance. Unfortunately, many applications have too little or inadequate data to train DL frameworks. Usually, manual labeling is needed to provide labeled data, which typically involves human annotators with extensive background knowledge. This annotation process is costly, time-consuming, and error-prone. Usually, every DL framework is fed a significant amount of labeled data to automatically learn representations. Ultimately, a larger amount of data would generate a better DL model, and its performance is also application-dependent. This issue is the main barrier for
The financial structure is one of the most important topics that has attracted the interest of scientific research in the field of financial management, as several theories have emerged concerning the choice of a financial structure appropriate for the firm and the behavior of changes in its financing. Despite this, there is no agreement on a specific theory that answers the various questions in this regard, particularly the issue of the optimal financial structure.
The objective of the research was to identify the most important modern theories of financial structure, with a focus on the financial structure of firms in two different stages of their life cycle, the so-called growth and ma