Background: The human face has its own special characteristics. Based on its horizontal and vertical dimensions, it may be categorized into essentially three types: short (brachyfacial), medium (mesofacial) and long (dolichofacial). The aim of this study was to describe several orofacial indices and proportions of adult Iraqi subjects, according to gender, using cone beam computed tomography. Materials and methods: This prospective study included 100 Iraqi patients (males and females) aged 20 to 40 years. All subjects attended the Oral and Maxillofacial Radiology Department of the Health Specialist Center for Dentistry in Al Sadr City, Baghdad, for cone beam computed tomography scans taken for different diagnostic purposes from October 2016 to May 2017. The facial index was used to determine facial type. Subjects were divided according to gender, and five variables were then calculated using the Photoshop CS4 program: inferior face index, superior face index, chin-face proportion, chin proportion and mandibular proportion. Results: The variables whose average values presented significant differences between facial types were inferior face index and superior face index for males, and inferior face index, superior face index and mandibular proportion for females. Conclusion: There was variation in some indices and proportions according to gender. In the male group, inferior face index and superior face index differed significantly between the averages of the facial types, while in the female group, inferior face index, superior face index and mandibular proportion differed significantly between the averages of the facial types.
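The facial index mentioned above is conventionally computed as the morphological face height divided by the bizygomatic (face) width, multiplied by 100. A minimal sketch of that calculation follows; the classification cut-off values and the example measurements are illustrative assumptions, not the thresholds used in this study.

def facial_index(face_height_mm, bizygomatic_width_mm):
    """Morphological facial index = (face height / bizygomatic width) * 100."""
    return face_height_mm / bizygomatic_width_mm * 100.0

def facial_type(index):
    # Hypothetical cut-offs for the three facial types named in the abstract.
    if index < 85.0:
        return "brachyfacial (short)"
    if index <= 90.0:
        return "mesofacial (medium)"
    return "dolichofacial (long)"

print(facial_type(facial_index(118.0, 132.0)))  # index ~89.4 -> mesofacial (medium)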
Facial emotion recognition has many real-world applications in daily life, such as human-robot interaction, e-learning, healthcare and customer services. The task of facial emotion recognition is not easy because of the difficulty of determining an effective feature set that can accurately recognize the emotion conveyed by a facial expression. Graph mining techniques are exploited in this paper to solve the facial emotion recognition problem. After the positions of facial landmarks in the face region are determined, twelve different graphs are constructed using four facial components to serve as the source for a sub-graph mining stage using the gSpan algorithm. In each group, the discriminative set of sub-graphs is selected and fed to a Deep Belief Network (DBN) for classification.
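As a rough illustration of the first stage of the pipeline described above, the sketch below builds a landmark graph for a single facial component; the landmark coordinates, the component grouping and the distance threshold are hypothetical, and the gSpan sub-graph mining and DBN classification stages are only indicated in comments because they depend on external implementations.

# Minimal sketch (assumed details): build a graph over the facial landmarks of
# one facial component; edges connect landmarks closer than a hypothetical
# distance threshold.
import math
import networkx as nx

# Hypothetical 2D landmark coordinates (in pixels) for an "eye" component.
eye_landmarks = {0: (36, 40), 1: (42, 37), 2: (48, 37), 3: (54, 40), 4: (48, 43), 5: (42, 43)}

def component_graph(landmarks, max_dist=8.0):
    """Connect landmarks of one facial component whose distance is below max_dist."""
    g = nx.Graph()
    for idx, (x, y) in landmarks.items():
        g.add_node(idx, pos=(x, y))
    for i, (x1, y1) in landmarks.items():
        for j, (x2, y2) in landmarks.items():
            if i < j and math.hypot(x1 - x2, y1 - y2) <= max_dist:
                g.add_edge(i, j, weight=math.hypot(x1 - x2, y1 - y2))
    return g

g = component_graph(eye_landmarks)
print(g.number_of_nodes(), "nodes,", g.number_of_edges(), "edges")
# Next stages (not shown): mine frequent sub-graphs with gSpan, keep the most
# discriminative ones per group, and feed their occurrence vectors to a DBN.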
Background: The peritoneal cavity can be involved in inflammatory and malignant diseases, and computed tomography (CT) findings in exudative ascites may help in the differentiation.
Objectives: 1- Describe CT features in patients with exudative ascites. 2- Obtain useful CT findings to differentiate between tuberculous (TB) peritonitis and peritoneal carcinomatosis.
Patients & methods: A cross-sectional study conducted in the Medical City Teaching Complex from September 2009 to September 2010 studied patients with exudative ascites using CT scan, with the diagnosis later confirmed by histopathological examination. CT scan results were presented according to cytology examination and biochemical analysis.
Results: 35 patients with exudative ascites were ...
Background: Ultrasound offers a non-invasive, rapid and simple method for confirming the clinical diagnosis of maxillary sinus pathologies.
Objective: To evaluate the accuracy of real-time ultrasound compared with computed tomography in the evaluation of maxillary sinusitis.
Patients and materials: This comparative cross-sectional study was done on 42 patients with clinical features suggesting underlying maxillary sinusitis who were referred for computed tomography examination of the paranasal sinuses at Al-Yarmook Teaching Hospital, Baghdad, from October 2012 to February 2013. Ultrasound and computed tomography examinations were carried out on the same day, with ultrasound as the first investigation. The sample of this study consisted of 2...
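Although the truncated abstract does not report the figures, studies of this design typically summarize agreement between ultrasound and the CT reference standard with sensitivity, specificity and overall accuracy. A minimal sketch of those calculations, using hypothetical counts rather than the study's results, is given below.

# Hypothetical counts of ultrasound findings against the CT reference standard.
true_positive, false_positive = 30, 4
false_negative, true_negative = 5, 45

sensitivity = true_positive / (true_positive + false_negative)
specificity = true_negative / (true_negative + false_positive)
accuracy = (true_positive + true_negative) / (
    true_positive + false_positive + false_negative + true_negative
)
print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}, accuracy={accuracy:.2f}")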
Recognizing facial expressions and emotions is a basic skill that is learned at an early age and is important for human social interaction. Facial expressions are one of the most powerful, natural and immediate means that humans use to express their feelings and intentions. Therefore, automatic emotion recognition based on facial expressions has become an interesting research area and has been introduced and applied in many fields such as security, safety, health, and human-machine interfaces (HMI). The transition of facial expression recognition away from controlled environmental conditions, its improvement, and the succession of recent deep learning approaches from different areas have made facial expression representation mostly based on u...
Background: Consideration of the mandibular third molar is important from an orthodontic perspective due to several factors such as lower anterior arch crowding, relapse in the lower anterior region, and interference with uprighting of the mandibular first and second molars during anchorage preparation and molar distalization. The aims of this study were to assess gender differences in mandibular third molar position and to compare and evaluate whether there are any differences in the results provided by CT scan and lateral reconstructed radiograph. Materials and Methods: The sample of the present study consisted of 39 patients (18 males and 21 females) with an age range of 11-15 years. CT images for patients attending Al Suwayra General Hospital/the C...
Background: Determination of local bone mineral density (BMD) immediately after implant insertion plays an important role in the implant success rate; it may offer a comprehensive description of the bone and give the surgeon enough information prior to implant insertion and at follow-up. The aim of the present study is to evaluate the changes of local bone density in dental implant recipient sites by using computerized tomography. Material and method: The sample consisted of (20) dental implant recipient sites; bone density assessment was done twice, immediately after implant insertion and after six months. Results: The mean HU of the bone around the implant insertion site immediately after implant placement was 552.28 HU, and increased ...
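As a simple illustration of the kind of comparison reported above, the snippet below computes the mean Hounsfield unit (HU) value of a set of measurements at two time points and the change between them; the values are hypothetical placeholders, not the study's data.

# Hypothetical HU measurements around implant sites at two time points.
hu_at_placement = [540.0, 560.5, 556.3]
hu_at_six_months = [600.2, 615.8, 610.0]

mean_placement = sum(hu_at_placement) / len(hu_at_placement)
mean_six_months = sum(hu_at_six_months) / len(hu_at_six_months)
change = mean_six_months - mean_placement
print(f"mean HU: {mean_placement:.1f} -> {mean_six_months:.1f} (change {change:+.1f} HU)")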