Background. The presence of black triangles around dental implant-supported prostheses and the failure to construct adequate papillae around them trouble dental implantologists. Peri-implant surgical soft tissue management can improve esthetics, function, and implant survival. Aim. To compare the effects of rolled and nonrolled U-shaped flaps, combined with a temporary crown, in enhancing the soft tissue around dental implants. Materials and Methods. Forty patients were included in this study; all were operated on by the same maxillofacial surgeon at Al-Iraq specialized dental clinics from January 2019 to January 2020. Patients were divided randomly into two groups: in group A, a U-shaped flap without rolling was used at the second stage of implant surgery, in conjunction with temporary crown placement; in group B, a U-shaped flap with rolling was used at the second stage of implant surgery, in conjunction with temporary crown placement. A temporary crown was then fabricated for both groups and kept in place for one month. Two independent maxillofacial surgeons evaluated all patients two weeks after cementation of the final zirconia crown, using the implant soft tissue esthetic score. Results. The highest possible score for the mesial papilla (a score of 2) was present in 92.5% of group A patients but only 77.5% of group B patients. Moreover, a perfect alveolar bone contour (a score of 2) was achieved in 70% of group A patients but only 32.5% of group B patients. Conclusion. The U-shaped flap without rolling, combined with a temporary crown, is a simple technique with good results, especially when there is no severe resorption of the labial bone plate (in canine and premolar areas).
Moment invariants have found wide application in image recognition since they were first proposed.
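As an illustrative aside (not drawn from the work summarized above), the following minimal sketch computes Hu's seven moment invariants for a toy binary image with OpenCV; the image and all values are made up for demonstration.

```python
# Minimal sketch: computing Hu's seven moment invariants with OpenCV.
# The toy image below is a made-up example, not data from the cited work.
import cv2
import numpy as np

# A small binary image containing a simple shape (a filled rectangle).
img = np.zeros((64, 64), dtype=np.uint8)
cv2.rectangle(img, (16, 24), (48, 40), 255, thickness=-1)

moments = cv2.moments(img)             # raw, central, and normalized central moments
hu = cv2.HuMoments(moments).flatten()  # seven invariants to translation, scale, and rotation

# Log-scale the invariants, as is common, to make their magnitudes comparable.
hu_log = -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)
print(hu_log)
```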
In 2020, one of the authors of this paper, in his first study, derived the Modified Weighted Pareto Distribution of Type I using the Azzalini method for weighted distributions; the distribution has three parameters, two for scale and one for shape. This research compares that distribution with two other distributions from the same family, the Standard Pareto Distribution of Type I and the Generalized Pareto Distribution, using the maximum likelihood estimator derived by the researchers for the Modified Weighted Pareto Distribution of Type I. The Monte Carlo method, a simulation technique, was then used to generate random samples of different sizes (n = 10, 30, 50), and in di…
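As an illustrative aside, the sketch below shows the general Monte Carlo / maximum-likelihood workflow the abstract describes, but with the standard Pareto Type I distribution as a stand-in, since the density of the modified weighted distribution is not reproduced here; the sample sizes follow the abstract (n = 10, 30, 50), while the true parameter values and replication count are assumptions.

```python
# Hedged sketch of the Monte Carlo / maximum-likelihood workflow described above,
# using the standard Pareto Type I distribution as a stand-in.
import numpy as np

rng = np.random.default_rng(seed=0)

def sample_pareto1(n, alpha, xm):
    """Draw n variates from Pareto Type I(alpha, xm) by inverse transform."""
    u = rng.uniform(size=n)
    return xm / u ** (1.0 / alpha)

def mle_pareto1(x):
    """Closed-form MLEs: xm_hat = min(x), alpha_hat = n / sum(log(x / xm_hat))."""
    xm_hat = x.min()
    alpha_hat = len(x) / np.log(x / xm_hat).sum()
    return alpha_hat, xm_hat

true_alpha, true_xm = 3.0, 2.0            # assumed true parameters
reps = 1000                               # illustrative number of replications
for n in (10, 30, 50):                    # sample sizes used in the study
    est = np.array([mle_pareto1(sample_pareto1(n, true_alpha, true_xm)) for _ in range(reps)])
    mse_alpha = np.mean((est[:, 0] - true_alpha) ** 2)
    print(f"n={n}: mean alpha_hat={est[:, 0].mean():.3f}, MSE={mse_alpha:.3f}")
```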
This work presents a comparison between Convolutional Encoding (CE), parallel Turbo coding, and Low-Density Parity-Check (LDPC) coding schemes in a Multi-User Single-Output (MUSO) Multi-Carrier Code Division Multiple Access (MC-CDMA) system over multipath fading channels. Iterative decoding was used in the simulation, since it gives maximum efficiency at higher iterations. The modulation scheme used was Quadrature Amplitude Modulation (QAM). Eight pilot carriers were used to compensate for the channel effect with the least-squares estimation method. The channel model used was a Long Term Evolution (LTE) channel following Technical Specification TS 25.101 v2.10 with 5 MHz bandwidth, including indoor to outdoor/pedestrian channels…
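As an illustrative aside (not the paper's simulation), the sketch below shows pilot-based least-squares channel estimation on a single multicarrier symbol with eight equally spaced pilots; the subcarrier count, channel, and noise level are assumed for demonstration.

```python
# Hedged sketch of pilot-based least-squares (LS) channel estimation on one
# multicarrier symbol; all parameters are illustrative, not the paper's setup.
import numpy as np

rng = np.random.default_rng(1)
n_sc = 64                                              # subcarriers in one symbol
pilot_idx = np.linspace(0, n_sc - 1, 8, dtype=int)     # 8 equally spaced pilot carriers
pilot_sym = np.ones(8, dtype=complex)                  # known pilot symbols

# Illustrative frequency-selective channel and noisy received pilots.
h_true = np.fft.fft(rng.normal(size=4) + 1j * rng.normal(size=4), n_sc) / np.sqrt(4)
noise = 0.05 * (rng.normal(size=8) + 1j * rng.normal(size=8))
y_pilots = h_true[pilot_idx] * pilot_sym + noise

# LS estimate at the pilot positions, then interpolate over all subcarriers.
h_ls_pilots = y_pilots / pilot_sym
h_hat = np.interp(np.arange(n_sc), pilot_idx, h_ls_pilots.real) \
        + 1j * np.interp(np.arange(n_sc), pilot_idx, h_ls_pilots.imag)

print("mean estimation error:", np.mean(np.abs(h_hat - h_true)))
```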
The precise classification of DNA sequences is pivotal in genomics, with significant implications for personalized medicine. The stakes are particularly high when classifying key genetic markers such as BRCA, related to breast cancer susceptibility; BRAF, associated with various malignancies; and KRAS, a recognized oncogene. Conventional machine learning techniques often require intricate feature engineering and may not capture the full spectrum of sequence dependencies. To address these limitations, this study employs an adapted U-Net architecture, originally designed for biomedical image segmentation, to classify DNA sequences. An attention mechanism was also tested along with the U-Net architecture to classify DNA sequences more precisely.
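As an illustrative aside (not the authors' exact architecture), the sketch below shows a small 1-D U-Net-style encoder/decoder with a simple attention-pooling head for classifying one-hot encoded DNA sequences into three classes; all layer sizes and the random input data are assumptions.

```python
# Minimal sketch: a 1-D U-Net-style classifier with attention pooling for
# one-hot DNA sequences (A/C/G/T channels). Sizes are illustrative.
import torch
import torch.nn as nn

class MiniUNet1D(nn.Module):
    def __init__(self, n_classes=3):
        super().__init__()
        self.enc1 = nn.Sequential(nn.Conv1d(4, 16, 3, padding=1), nn.ReLU())
        self.pool = nn.MaxPool1d(2)
        self.enc2 = nn.Sequential(nn.Conv1d(16, 32, 3, padding=1), nn.ReLU())
        self.up = nn.ConvTranspose1d(32, 16, 2, stride=2)
        self.dec1 = nn.Sequential(nn.Conv1d(32, 16, 3, padding=1), nn.ReLU())  # 32 = skip + upsampled
        self.attn = nn.Linear(16, 1)          # attention weights over sequence positions
        self.head = nn.Linear(16, n_classes)

    def forward(self, x):                     # x: (batch, 4, seq_len), one-hot A/C/G/T
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        d1 = self.dec1(torch.cat([self.up(e2), e1], dim=1))   # decoder with skip connection
        feats = d1.transpose(1, 2)            # (batch, seq_len, 16)
        w = torch.softmax(self.attn(feats), dim=1)
        pooled = (w * feats).sum(dim=1)       # attention-weighted average over positions
        return self.head(pooled)              # class logits

# Example forward pass on random one-hot sequences (placeholder data).
x = torch.nn.functional.one_hot(torch.randint(0, 4, (8, 256)), 4).float().transpose(1, 2)
print(MiniUNet1D()(x).shape)                  # -> torch.Size([8, 3])
```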
Background. Osteoporosis is characterized by low bone mass and microarchitectural deterioration of bone tissue, leading to increased fracture risk and bone fragility. Fractures may lead to a decreased quality of life and increased medical costs. Thus, osteoporosis is widely considered a significant health concern.
Objective. This study aimed to compare quantitative computed tomography (QCT) and dual-energy X-ray absorptiometry (DXA) for detecting osteoporosis in postmenopausal women.
Subjects and Methods. We measured spinal volumetric bone mineral density (BMD) with QCT, and areal spinal and hip BMD with DXA, in 164 postmenopausal women. We calculated the osteo…
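As an illustrative aside (not the study's analysis), the sketch below applies the standard WHO T-score classification to a DXA BMD value; the reference mean and SD are placeholders, not the study's reference data.

```python
# Hedged sketch: the standard WHO T-score classification for a DXA BMD value.
# The young-adult reference mean/SD below are placeholders.
def t_score(bmd, young_adult_mean, young_adult_sd):
    """T-score = (measured BMD - young-adult mean BMD) / young-adult SD."""
    return (bmd - young_adult_mean) / young_adult_sd

def classify(t):
    """WHO categories based on the DXA T-score."""
    if t <= -2.5:
        return "osteoporosis"
    if t < -1.0:
        return "osteopenia (low bone mass)"
    return "normal"

# Example with illustrative lumbar-spine numbers (g/cm^2).
t = t_score(bmd=0.780, young_adult_mean=1.047, young_adult_sd=0.110)
print(round(t, 2), classify(t))
```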
This research compared two etchants for etching the CR-39 nuclear track detector by calculating the bulk etch rate (Vb), one of the track etching parameters, using two measurement methods (thickness change and mass change). The first etchant was prepared by dissolving NaOH in ethanol (NaOH/ethanol) at several normalities, with etching at 55 °C for 30 min; it was compared with the second etchant, prepared by dissolving NaOH in water (NaOH/water) at several normalities, with etching at 70 °C for 60 min. All detectors were irradiated for 10 min with 5.48 MeV α-particles from a 241Am source. The results showed that Vb increases with the increase of…
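As an illustrative aside, the sketch below shows the two standard bulk-etch-rate calculations named in the abstract, the thickness-change and mass-change methods (Vb = ΔL/2t and Vb = Δm/2ρAt, with etching assumed to remove material from both detector faces); the input numbers are placeholders, not the measured data.

```python
# Hedged sketch of the thickness-change and mass-change bulk-etch-rate formulas;
# input numbers are placeholders, not the study's measurements.
def vb_from_thickness(delta_thickness_um, etch_time_min):
    """Vb = ΔL / (2 t): the detector is etched from both faces, so each face loses ΔL/2."""
    return delta_thickness_um / (2.0 * etch_time_min)

def vb_from_mass(delta_mass_g, density_g_cm3, area_cm2, etch_time_min):
    """Vb = Δm / (2 ρ A t), converted to µm/min (1 cm = 1e4 µm)."""
    return delta_mass_g * 1e4 / (2.0 * density_g_cm3 * area_cm2 * etch_time_min)

# Illustrative numbers only (CR-39 density taken as a nominal 1.3 g/cm^3).
print(vb_from_thickness(delta_thickness_um=6.0, etch_time_min=60))        # µm/min
print(vb_from_mass(delta_mass_g=0.0015, density_g_cm3=1.3, area_cm2=1.0,
                   etch_time_min=60))                                     # µm/min
```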
Abstract
The non-homogeneous Poisson process is an important statistical topic with wide application in other sciences and in areas such as queueing systems, repairable systems, computer and communication systems, and reliability theory. It is also used to model phenomena that occur at a non-constant rate over time (events whose rate changes with time).
This research deals with some of the basic concepts related to the non-homogeneous Poisson process. It applies two models of the non-homogeneous Poisson process, the power law model and the Musa-Okumoto model, to estimate th…
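As an illustrative aside (not the paper's estimation procedure), the sketch below writes down the power-law intensity and the Musa-Okumoto mean-value function and simulates event times from the power-law model with the standard Lewis-Shedler thinning algorithm; parameter values are assumed.

```python
# Hedged sketch of the two NHPP models named above, with event times simulated
# by Lewis-Shedler thinning. Parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(2)

def power_law_intensity(t, beta, theta):
    """Power-law NHPP intensity: lambda(t) = (beta/theta) * (t/theta)**(beta - 1)."""
    return (beta / theta) * (t / theta) ** (beta - 1.0)

def musa_okumoto_mean(t, lam0, theta):
    """Musa-Okumoto expected number of events by time t: mu(t) = ln(1 + lam0*theta*t) / theta."""
    return np.log1p(lam0 * theta * t) / theta

def simulate_nhpp(intensity, t_max, lam_max):
    """Thinning: propose homogeneous events at rate lam_max, accept with prob intensity(t)/lam_max."""
    times, t = [], 0.0
    while True:
        t += rng.exponential(1.0 / lam_max)
        if t > t_max:
            return np.array(times)
        if rng.uniform() < intensity(t) / lam_max:
            times.append(t)

beta, theta, t_max = 1.5, 2.0, 10.0
lam_max = power_law_intensity(t_max, beta, theta)   # intensity is increasing for beta > 1
events = simulate_nhpp(lambda t: power_law_intensity(t, beta, theta), t_max, lam_max)
print(len(events), "events; expected:", (t_max / theta) ** beta)
print("Musa-Okumoto mu(10):", musa_okumoto_mean(10.0, lam0=1.0, theta=0.5))
```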
Multilevel models are among the most important models, widely used in the analysis of data in which observations take a hierarchical form. In this research we examined the multilevel logistic regression model (the random-intercept and random-slope model). The importance of the research lies in the fact that ordinary regression models estimate only a single total variance and cannot separate the variation between levels; for hierarchical data this total variance is inaccurate, whereas multilevel regression models estimate the variance components at each level of the model. The research aims to estimate the parameters of this m…
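As an illustrative aside (not the paper's data or estimator), the sketch below simulates two-level binary data from a random-intercept, random-slope logistic model and reports the intraclass correlation implied by the random-intercept variance; all parameter values are assumptions.

```python
# Hedged sketch: simulating hierarchical binary data from a random-intercept,
# random-slope logistic model. All parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(3)
n_groups, n_per_group = 30, 40                 # level-2 units and level-1 observations per unit

# Fixed effects and level-2 (between-group) variance components.
gamma0, gamma1 = -0.5, 0.8                     # overall intercept and slope
sd_u0, sd_u1 = 0.6, 0.3                        # SDs of random intercepts and random slopes

u0 = rng.normal(0.0, sd_u0, n_groups)          # group-specific intercept deviations
u1 = rng.normal(0.0, sd_u1, n_groups)          # group-specific slope deviations

group = np.repeat(np.arange(n_groups), n_per_group)
x = rng.normal(size=group.size)                # level-1 covariate

# Linear predictor eta_ij = (gamma0 + u0_j) + (gamma1 + u1_j) * x_ij, then logit link.
eta = (gamma0 + u0[group]) + (gamma1 + u1[group]) * x
p = 1.0 / (1.0 + np.exp(-eta))
y = rng.binomial(1, p)

# Level-1 variance on the latent logistic scale is pi^2 / 3, so the intraclass
# correlation attributable to the random intercept is:
icc = sd_u0**2 / (sd_u0**2 + np.pi**2 / 3)
print("simulated outcome rate:", y.mean(), " random-intercept ICC:", round(icc, 3))
```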