Background: Many types of instruments and techniques are used in the instrumentation of the root canal system. These instruments and techniques may extrude debris beyond the apical foramen and cause post-instrumentation complications. The aim of this study was to evaluate the amount of apically extruded debris produced by 4 types of nickel-titanium instruments (WaveOne, TRUShape 3D conforming files, Hyflex CM, and One Shape files) during endodontic instrumentation. Materials and methods: Forty freshly extracted human mandibular second premolars with straight canals and a single apex were collected for this study. All teeth were cut to similar lengths. Pre-weighed glass vials were used as collecting containers. Samples were randomly divided into four groups of 10 samples each: Group A was instrumented with the WaveOne reciprocating file, Group B with TRUShape 3D rotating files, Group C with Hyflex CM rotating files, and Group D with the One Shape rotating file. A total volume of 7 ml of sodium hypochlorite was used for irrigation in each sample. Apical patency was confirmed and maintained with a size #15 K-file. All canals were instrumented up to size #25. After completion of endodontic instrumentation, the vials were stored in an incubator for 5 days at 68 °C for drying. The vials were then weighed again, and the pre-weight was subtracted from the post-weight; the weight difference represented the amount of debris extruded from the apical foramen during root canal instrumentation. The data obtained were statistically analysed using ANOVA and LSD tests.
Results: All groups resulted in apical extrusion of debris. The Hyflex CM group (C) showed the statistically significant lowest amount of apically extruded debris compared with the other groups of this study (P ≤ 0.05), and the TRUShape group (B) showed significantly less extruded debris than the One Shape group (D) and the WaveOne group (A), while the WaveOne group (A) showed the highest amount of apically extruded debris (P ≤ 0.01). Significance: Although all systems caused apical extrusion of debris and irrigant, continuous rotary instrumentation was associated with less extrusion than the reciprocating file system.
The monogeneans Gyrodactylus dzhalilovi Ergens & Ashurova, 1984, G. magnus Konovalov, 1967 and G. matovi Ergens & Kakachava-Avramova, 1966 are recorded in this study for the first time in Iraq, from the gills of the common carp Cyprinus carpio Linnaeus, 1758 collected from the Tigris River in Baghdad city. Descriptions, measurements and illustrations of these parasites are given.
The research aims to determine the role of green human resources management practices in meeting the requirements of environmental citizenship in the workplace. The General Company for Vegetable Oils was chosen for the field application of the research; it is one of the important industrial companies in Iraq and suffers from poor application of green human resources management, which has reflected negatively on the development of environmental citizenship among its employees. A questionnaire was used as a tool to collect data and information, together with the field presence of the researcher. The research sample included (30) managers of departments and divisions, and the data were analysed using the statistical program (SPSS).
ABSTRACT: Bacillus cereus and Pseudomonas aeruginosa have the ability to produce a wide range of antimicrobially active compounds (Bacillin and S-Pyocin) against pathogenic microorganisms. In vitro assays with both the crude bacteriocins and the bacteriocins partially purified by precipitation with 75% ammonium sulfate showed that they effectively inhibited the growth of Candida kefyr, Fusarium spp. and Propionibacterium acnes. The results showed that the inhibition zone of Bacillin reached 9-13 mm, while that of Pyocin reached 13-16 mm in solid medium.
Compressing speech reduces data storage requirements, which in turn reduces the time needed to transmit digitized speech over long-haul links such as the internet. To obtain the best performance in speech compression, wavelet transforms require filters that combine a number of desirable properties, such as orthogonality and symmetry. The MCT basis functions are derived from the GHM basis functions using 2D linear convolution. The fast computation algorithms introduced here add desirable features to the current transform. We further assess the performance of the MCT in a speech compression application. This paper discusses the effect of using the DWT and the MCT (one- and two-dimensional) on speech compression. DWT and MCT performances in terms of comp
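As a toy illustration of the wavelet-compression idea behind this abstract (using a one-level Haar DWT rather than the paper's MCT/GHM multiwavelet construction), the sketch below transforms a signal, zeroes the smallest detail coefficients, and reconstructs it; the function names and the `keep` fraction are illustrative assumptions.

```python
import numpy as np

def haar_dwt(x):
    """One-level Haar DWT: returns approximation and detail coefficients."""
    x = np.asarray(x, dtype=float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation (low-pass)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail (high-pass)
    return a, d

def haar_idwt(a, d):
    """Inverse one-level Haar DWT."""
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def compress(x, keep=0.25):
    """Zero out the smallest detail coefficients, keeping a fraction `keep`."""
    a, d = haar_dwt(x)
    thresh = np.quantile(np.abs(d), 1 - keep)
    d_kept = np.where(np.abs(d) >= thresh, d, 0.0)
    return a, d_kept

rng = np.random.default_rng(0)
signal = np.sin(np.linspace(0, 8 * np.pi, 256)) + 0.01 * rng.normal(size=256)
a, d = compress(signal)
rec = haar_idwt(a, d)
print(np.max(np.abs(rec - signal)))  # small reconstruction error
```

Discarding most detail coefficients is what yields compression: only the approximation band and the few retained details need to be stored or transmitted.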
As we live in the era of the fourth technological revolution, it has become necessary to use artificial intelligence to generate electric power from sustainable solar energy, especially in Iraq, which has gone through repeated crises and suffers from a severe shortage of electric power because of the wars and calamities it has experienced. The impact of that period is still evident in all aspects of the daily life of Iraqis: the remnants of wars, siege, terrorism, the wrong policies of earlier and later governments, and regional interventions and their consequences, such as the destruction of electric power stations, together with population growth, which must be followed by an increase in electric power stations,
Eye detection is used in many applications such as pattern recognition, biometrics, surveillance systems and many other systems. In this paper, a new method is presented to detect and extract the overall shape of one eye from an image, depending on two principles: Helmholtz and Gestalt. According to the Helmholtz principle of perception, any observed geometric shape is perceptually "meaningful" if its number of repetitions in an image with random distribution is very small. To achieve this goal, the Gestalt principle states that humans see things either by grouping similar elements or by recognizing patterns. In general, according to the Gestalt principle, humans see things through genera
Shadow detection and removal is an important task when dealing with color outdoor images. Shadows are generated by a local and relative absence of light. Shadows are, first of all, a local decrease in the amount of light that reaches a surface. Secondly, they are a local change in the amount of light reflected by a surface toward the observer. Most shadow detection and segmentation methods are based on image analysis. However, some factors will affect the detection result due to the complexity of the circumstances. In this paper, a segmentation-based method is presented to detect shadows in an image, and a function concept is used to remove the shadow from the image.
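The paper's segmentation method is not reproduced here, but the observation that shadows are a local decrease in light can be sketched as a crude heuristic: flag pixels whose luminance falls well below the image mean, then compensate them with a constant gain. The threshold factor `k` and the gain are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

def shadow_mask(rgb, k=0.6):
    """Flag pixels darker than k * mean luminance as shadow (heuristic only)."""
    gray = rgb.mean(axis=2)            # simple luminance estimate
    return gray < k * gray.mean()

def lighten_shadows(rgb, mask, gain=1.5):
    """Naively compensate shadow pixels by a constant gain."""
    out = rgb.astype(float)
    out[mask] *= gain                  # brighten only the masked region
    return np.clip(out, 0, 255).astype(np.uint8)

img = np.full((4, 4, 3), 200, dtype=np.uint8)
img[2:, :, :] = 60                     # synthetic dark "shadow" region
mask = shadow_mask(img)
print(mask.sum())                      # → 8 (the bottom two rows)
fixed = lighten_shadows(img, mask)
```

A real detector would also have to distinguish shadows from intrinsically dark objects, which is exactly the "complexity of the circumstances" the abstract mentions.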
Neural cryptography deals with the problem of key exchange between two neural networks using the mutual learning concept. The two networks exchange their outputs (in bits), and the key between the two communicating parties is eventually represented in the final learned weights, when the two networks are said to be synchronized. The security of neural synchronization is put at risk if an attacker is capable of synchronizing with either of the two parties during the training process.
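The mutual-learning scheme can be sketched with tree parity machines, the standard model in neural key exchange. The parameters K, N, L and the Hebbian update below are a minimal illustrative choice, not necessarily the configuration studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
K, N, L = 3, 8, 3                      # hidden units, inputs per unit, weight bound

def output(w, x):
    """Hidden-unit signs and overall parity bit tau."""
    sigma = np.sign(np.sum(w * x, axis=1))
    sigma[sigma == 0] = -1
    return sigma, int(np.prod(sigma))

def update(w, x, sigma, tau):
    """Hebbian rule: adjust only hidden units that agree with tau, clip to [-L, L]."""
    for k in range(K):
        if sigma[k] == tau:
            w[k] = np.clip(w[k] + tau * x[k], -L, L)

wA = rng.integers(-L, L + 1, size=(K, N))
wB = rng.integers(-L, L + 1, size=(K, N))

synced = False
for step in range(50_000):
    if np.array_equal(wA, wB):         # weights identical: shared key established
        synced = True
        break
    x = rng.choice([-1, 1], size=(K, N))
    sA, tA = output(wA, x)
    sB, tB = output(wB, x)
    if tA == tB:                       # only the parity bits are exchanged publicly
        update(wA, x, sA, tA)
        update(wB, x, sB, tB)

print(synced, step)                    # synchronization typically needs a few hundred steps
```

Only the one-bit outputs cross the channel; the synchronized weight matrix itself serves as the key, which is why an attacker who manages to synchronize alongside either party breaks the scheme.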
The penalized least squares method is a popular way to deal with high-dimensional data, where the number of explanatory variables is larger than the sample size. Its advantages are high prediction accuracy and performing estimation and variable selection at once: penalized least squares gives a sparse model, that is, a model with few variables, which can be interpreted easily. However, penalized least squares is not robust; it is very sensitive to the presence of outlying observations. To deal with this problem, a robust loss function can be used to obtain a robust penalized least squares method and a robust penalized estimator.
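A minimal sketch of such a robust penalized estimator, assuming the Huber loss as the robust loss function and an L2 penalty (the paper's exact choices may differ), fitted by plain gradient descent:

```python
import numpy as np

def huber_grad(r, delta=1.0):
    """Gradient of the Huber loss w.r.t. the residuals r (bounded for large r)."""
    return np.where(np.abs(r) <= delta, r, delta * np.sign(r))

def robust_penalized_ls(X, y, lam=0.1, delta=1.0, lr=0.01, iters=5000):
    """Huber loss + L2 penalty: outliers have bounded influence,
    while the penalty shrinks the coefficients."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(iters):
        r = X @ beta - y
        grad = X.T @ huber_grad(r, delta) / n + lam * beta
        beta -= lr * grad
    return beta

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))
beta_true = np.array([2.0, 0.0, -1.0])
y = X @ beta_true + 0.1 * rng.normal(size=100)
y[:5] += 20                            # inject a few gross outliers
beta_hat = robust_penalized_ls(X, y)
print(np.round(beta_hat, 2))           # near the true coefficients, slightly shrunk
```

With ordinary squared loss the five outliers would pull the fit far off; the Huber gradient caps each outlier's contribution at `delta`, so the estimate stays close to the true coefficients (up to the shrinkage introduced by the penalty).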
Diabetes is one of the increasingly common chronic diseases, affecting millions of people around the world. Diabetes diagnosis, prediction, proper care and management are essential. Machine learning-based prediction techniques for diabetes data analysis can help in the early detection of the disease and the prediction of its consequences, such as hypo-/hyperglycemia. In this paper, we explored a diabetes dataset collected from the medical records of one thousand Iraqi patients. We applied three classifiers: the multilayer perceptron, KNN and Random Forest. We conducted two experiments: the first used all 12 features of the dataset, where Random Forest outperformed the others with 98.8% accuracy. The second experiment used only five attributes.
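As a self-contained illustration of one of the classifiers, a k-nearest-neighbours majority vote can be written in a few lines of NumPy. The toy two-feature data below is hypothetical, not the Iraqi patient dataset, and the class labels 0/1 stand in for non-diabetic/diabetic.

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=3):
    """Classify each test row by majority vote among its k nearest neighbours."""
    preds = []
    for x in X_test:
        dist = np.linalg.norm(X_train - x, axis=1)   # Euclidean distances
        nearest = np.argsort(dist)[:k]               # indices of k closest points
        votes = np.bincount(y_train[nearest])        # count labels among them
        preds.append(int(np.argmax(votes)))
    return np.array(preds)

# Hypothetical stand-in for patient records: two well-separated clusters
rng = np.random.default_rng(0)
class0 = rng.normal(loc=0.0, scale=0.5, size=(20, 2))
class1 = rng.normal(loc=3.0, scale=0.5, size=(20, 2))
X = np.vstack([class0, class1])
y = np.array([0] * 20 + [1] * 20)

queries = np.array([[0.1, -0.2], [2.9, 3.1]])
print(knn_predict(X, y, queries))  # → [0 1]
```

On real clinical features (glucose, BMI, age, and so on) the same loop applies unchanged, though features would normally be scaled first so that no single attribute dominates the distance.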