Background: Many types of instruments and techniques are used in the instrumentation of the root canal system. These instruments and techniques may extrude debris beyond the apical foramen and cause post-instrumentation complications. The aim of this study was to evaluate the amount of debris extruded apically by four types of nickel-titanium instruments (WaveOne, TRUShape 3D Conforming Files, Hyflex CM, and One Shape files) during endodontic instrumentation. Materials and methods: Forty freshly extracted human mandibular second premolars with straight canals and a single apex were collected for this study. All teeth were cut to similar lengths. Pre-weighed glass vials were used as collecting containers. Samples were randomly divided into four groups of 10 samples each: Group A was instrumented with the WaveOne reciprocating file, Group B with TRUShape 3D rotary files, Group C with Hyflex CM rotary files, and Group D with the One Shape rotary file. A total volume of 7 ml of sodium hypochlorite was used for irrigation in each sample. Apical patency was confirmed and maintained with a size #15 K-file. All canals were instrumented up to size #25. After completion of endodontic instrumentation, the vials were stored in an incubator at 68 °C for 5 days to dry. The vials were then weighed again and the pre-weight was subtracted from the post-weight; the weight difference represented the amount of debris extruded from the apical foramen during root canal instrumentation. The data were statistically analysed using ANOVA and LSD tests.
Results: All groups produced apical extrusion of debris. The Hyflex CM group (C) showed the statistically significant lowest amount of apically extruded debris compared with the other groups (P ≤ 0.05), and the TRUShape group (B) extruded significantly less debris than the One Shape group (D) and the WaveOne group (A), while the WaveOne group (A) showed the highest amount of apically extruded debris (P ≤ 0.01). Significance: Although all systems caused apical extrusion of debris and irrigant, continuous rotary instrumentation was associated with less extrusion than the reciprocating file system.
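The weighing procedure described above reduces to a per-vial subtraction. The sketch below illustrates that arithmetic with hypothetical masses; the numbers are invented for illustration and are not the study's data.

```python
# Hypothetical vial masses (grams); not the study's measurements.
pre_weights  = [10.4213, 10.3980, 10.4105]   # vial mass before instrumentation
post_weights = [10.4221, 10.3991, 10.4117]   # vial mass after 5 days of drying

# Extruded debris = post-weight minus pre-weight for each vial.
debris = [round(post - pre, 4) for pre, post in zip(pre_weights, post_weights)]
print(debris)  # [0.0008, 0.0011, 0.0012]

mean_debris = round(sum(debris) / len(debris), 5)
```

In the study these per-group means would then be compared with ANOVA and LSD post-hoc tests.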
The goal of this research is to develop a numerical model that can simulate the sedimentation process under two scenarios: first, with the flocculation unit in service, and second, with the flocculation unit out of service. The general equations of flow and sediment transport were solved using the finite difference method and coded in Matlab. The removal efficiencies predicted by the coded model were very close to those of the operational model for each particle-size dataset, with a difference of +3.01%, indicating that the model can be used to predict the removal efficiency of a rectangular sedimentation basin. The study also revealed
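A minimal finite-difference sketch in the spirit of the model described above: plug flow through a rectangular basin with a first-order settling sink, marched to steady state with an explicit upwind scheme. The geometry, velocities, concentration, and grid sizes are assumed for illustration and are not taken from the study (which used Matlab and the full flow and sediment-transport equations).

```python
# Assumed basin geometry and flow parameters (illustrative only).
L, H = 30.0, 3.0            # basin length and depth (m)
u, vs = 0.005, 0.0002       # horizontal flow and particle settling velocity (m/s)
nx = 60
dx = L / nx                 # spatial step
dt = 0.8 * dx / u           # time step satisfying the CFL condition

C = [0.0] * nx              # concentration profile along the basin
C_in = 100.0                # assumed inlet concentration (mg/L)

for _ in range(2000):       # march in time until steady state
    new = C[:]
    for i in range(1, nx):
        adv = -u * (C[i] - C[i - 1]) / dx   # upwind advection term
        sink = -(vs / H) * C[i]             # settling removal term
        new[i] = C[i] + dt * (adv + sink)
    new[0] = C_in                           # inlet boundary condition
    C = new

efficiency = 1.0 - C[-1] / C_in
print(f"predicted removal efficiency: {efficiency:.1%}")
```

With these assumed parameters the steady-state result approaches the analytic plug-flow value 1 − exp(−(vs/H)(L/u)), roughly 33% removal, which is a useful sanity check on the discretization.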
To maintain the security and integrity of data as the Internet grows and transmission channels proliferate, it is necessary to strengthen security and develop new algorithms. The Playfair cipher is a substitution scheme. The traditional Playfair scheme uses a small 5×5 matrix containing only uppercase letters, making it vulnerable to attackers and cryptanalysis. In this study, a new encryption and decryption approach is proposed to enhance the resistance of the Playfair cipher. For this purpose, a symmetric cryptosystem based on shared secrets is developed. The proposed Playfair method uses a 5×5 keyword matrix for English and a 6×6 keyword matrix for Arabic to encrypt the alphabets of
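For reference, the traditional 5×5 English Playfair scheme that the proposed method extends can be sketched as follows. This is the classical baseline cipher only, not the paper's enhanced variant, and the keyword in the test is hypothetical.

```python
def playfair_encrypt(plaintext, keyword):
    """Classical 5x5 Playfair encryption (I and J share a cell, as is conventional)."""
    # Build the key square: keyword letters first, then the rest of the alphabet.
    seen, square = set(), []
    for ch in (keyword + "ABCDEFGHIKLMNOPQRSTUVWXYZ").upper():
        ch = "I" if ch == "J" else ch
        if ch not in seen:
            seen.add(ch)
            square.append(ch)
    pos = {ch: (i // 5, i % 5) for i, ch in enumerate(square)}

    # Split plaintext into digraphs, padding duplicates and odd length with X.
    letters = [("I" if c == "J" else c) for c in plaintext.upper() if c.isalpha()]
    pairs, i = [], 0
    while i < len(letters):
        a = letters[i]
        ok = i + 1 < len(letters) and letters[i + 1] != a
        b = letters[i + 1] if ok else "X"
        pairs.append((a, b))
        i += 2 if ok else 1

    out = []
    for a, b in pairs:
        (ra, ca), (rb, cb) = pos[a], pos[b]
        if ra == rb:                 # same row: take the letter to the right
            out += [square[ra * 5 + (ca + 1) % 5], square[rb * 5 + (cb + 1) % 5]]
        elif ca == cb:               # same column: take the letter below
            out += [square[((ra + 1) % 5) * 5 + ca], square[((rb + 1) % 5) * 5 + cb]]
        else:                        # rectangle: swap the columns
            out += [square[ra * 5 + cb], square[rb * 5 + ca]]
    return "".join(out)
```

The 5×5 square's merged I/J cell and uppercase-only alphabet are exactly the limitations the proposed 6×6 Arabic extension and keyword-matrix construction aim to address.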
In this study, a fast block-matching search algorithm based on block descriptors and multilevel block filtering is introduced. The descriptors used are the mean and a set of centralized low-order moments. Hierarchical filtering and the MAE similarity measure were adopted to nominate the best similar blocks lying within the pool of neighboring blocks. As a next step after block nomination, the similarity of the mean and moments is used to classify the nominated blocks into one of three sub-pools, each representing a certain nomination priority level (i.e., most, less, and least). The main reason for introducing the nomination and classification steps is a significant reduction in the number of matching instances of the pixels belonging to the c
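The core idea of descriptor-based nomination followed by MAE matching can be sketched as below. This is a simplified single-level version using only the mean descriptor (the paper also uses centralized low-order moments and three priority sub-pools); the block size, threshold, and data are illustrative assumptions.

```python
def mean(block):
    """Mean intensity of a 2D block (list of rows)."""
    return sum(sum(row) for row in block) / (len(block) * len(block[0]))

def mae(a, b):
    """Mean absolute error between two equal-sized blocks."""
    n = len(a) * len(a[0])
    return sum(abs(x - y) for ra, rb in zip(a, b) for x, y in zip(ra, rb)) / n

def best_match(target, candidates, mean_threshold=10.0):
    # Level-1 filter: nominate only blocks whose mean is close to the target's,
    # so the expensive pixel-level MAE runs on a much smaller pool.
    t_mean = mean(target)
    nominated = [c for c in candidates if abs(mean(c) - t_mean) <= mean_threshold]
    # Level-2: exhaustive MAE only over the nominated pool.
    return min(nominated, key=lambda c: mae(target, c)) if nominated else None
```

The cheap descriptor comparison rejects most candidates before any pixel-by-pixel matching, which is the source of the speed-up the abstract describes.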
A new algorithm is proposed to compress speech signals using the wavelet transform and linear predictive coding. Signal compression is based on selecting a small number of approximation coefficients produced by wavelet decomposition (Haar and db4) at a suitably chosen level while ignoring the detail coefficients; the approximation coefficients are then windowed by a rectangular window and fed to the linear predictor. The Levinson-Durbin algorithm is used to compute the LP coefficients, reflection coefficients, and prediction error. The compressed files contain the LP coefficients and the previous sample. These files are very small compared to the original signals. The compression ratio is calculated from the size of th
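The Levinson-Durbin recursion named above can be sketched as follows: it solves the LP normal equations from an autocorrelation sequence, yielding the LP coefficients, reflection coefficients, and final prediction error. This is the standard textbook recursion, not the paper's full pipeline; the autocorrelation values in the test are hypothetical.

```python
def levinson_durbin(r, order):
    """Solve for LP coefficients from autocorrelation r[0..order]."""
    a = [0.0] * (order + 1)   # LP coefficients (a[0] implicitly 1)
    k = [0.0] * (order + 1)   # reflection coefficients
    err = r[0]                # prediction error, updated each order
    for m in range(1, order + 1):
        # Reflection coefficient for order m.
        acc = r[m] - sum(a[j] * r[m - j] for j in range(1, m))
        k[m] = acc / err
        # Update the coefficient vector in place (order m-1 -> m).
        a_new = a[:]
        a_new[m] = k[m]
        for j in range(1, m):
            a_new[j] = a[j] - k[m] * a[m - j]
        a = a_new
        err *= (1.0 - k[m] * k[m])
    return a[1:], k[1:], err
```

For an AR(1)-like autocorrelation r = [1, 0.9, 0.81], the recursion recovers a single significant coefficient of 0.9 with error 0.19, a quick sanity check on the implementation.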
The aim of this research is to compare traditional and modern methods for obtaining the optimal solution, using dynamic programming and intelligent algorithms to solve project management problems.
It shows the possible ways in which these problems can be addressed, drawing on a schedule of interrelated and sequential activities. It clarifies the relationships between the activities to determine the beginning and end of each activity, determines the duration and total cost of the project, estimates the time consumed by each activity, and identifies the objectives the project pursues through planning, implementation, and monitoring to stay within the assessed budget.
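The schedule computation described above (activity start/end times, total duration) is typically done with a forward and backward pass over the activity network, as in the critical path method. The sketch below shows that computation on a made-up four-activity example; the network and durations are illustrative assumptions, not the paper's case study.

```python
# Hypothetical project network: name -> (duration, predecessors).
activities = {
    "A": (3, []),
    "B": (4, ["A"]),
    "C": (2, ["A"]),
    "D": (5, ["B", "C"]),
}

# Forward pass: earliest start (es) and earliest finish (ef).
es, ef = {}, {}
for name in activities:                       # insertion order is topological here
    dur, preds = activities[name]
    es[name] = max((ef[p] for p in preds), default=0)
    ef[name] = es[name] + dur

project_duration = max(ef.values())

# Backward pass: latest start (ls) and latest finish (lf).
ls, lf = {}, {}
for name in reversed(list(activities)):
    succs = [s for s, (_, ps) in activities.items() if name in ps]
    lf[name] = min((ls[s] for s in succs), default=project_duration)
    ls[name] = lf[name] - activities[name][0]

# Zero slack (es == ls) marks the critical path.
critical = [n for n in activities if es[n] == ls[n]]
print(project_duration, critical)  # 12 ['A', 'B', 'D']
```

Activities off the critical path (here C) have slack, which is where budget and resource trade-offs of the kind the abstract mentions can be made without delaying the project.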
Reflection on modern critical literary studies, which regard a text as a canvas formed from a set of textual interweavings that precede the produced text or are contemporary with it, drives the reader to dive into the depths of texts, analyse them, uncover the mechanisms and procedures adopted in their production, and bring out the artistic aesthetics that together form a living being expressing human conditions that continue to recur despite the passage of time.
This research therefore sets out
Abstract
The aim of the current research is to identify the effect of the alternative evaluation strategy on the achievement of fourth-grade female students in biology. To test the research objectives, the researchers adopted the null hypothesis that there is no statistically significant difference at the (0.05) level between the average scores of the experimental group, taught according to the alternative evaluation strategy, and the average scores of the control group, taught according to the traditional method. The researchers selected a partial-adjustment experimental design with experimental and control groups and a post-test. The researchers intentionally selected (Al-fed