Synthesis and Preliminary Evaluation of a Polyolefin-based Core for Carrier-based Root Canal Obturation

Introduction: Carrier-based gutta-percha is an effective method of root canal obturation that creates a three-dimensional filling; however, retrieval of the plastic carrier is relatively difficult, particularly with smaller sizes. The purpose of this study was to develop composite carriers consisting of polyethylene (PE), hydroxyapatite (HA), and strontium oxide (SrO) for carrier-based root canal obturation. Methods: Composite fibers of HA, PE, and SrO were fabricated in the shape of a carrier for delivering gutta-percha using a melt-extrusion process. The fibers were characterized using infrared spectroscopy, and their thermal properties were determined using differential scanning calorimetry. The elastic modulus and tensile strength were determined using a universal testing machine. The radiographic appearance was established using digital periapical radiographs. Results: The composite core carrier exhibited a melting point of 111°C to 112°C, which would facilitate removal by heat application. The elastic modulus and tensile strength were found to be lower than those of Thermafil carriers (Dentsply Tulsa Dental, Tulsa, OK). The preliminary radiographic evaluation showed that the novel composite core carrier is sufficiently radiopaque and can be distinguished from gutta-percha. Conclusions: The PE-HA-SrO composites were successfully melt-processed into composite core carriers for delivering gutta-percha into the root canal space.

Publication Date: Thu Jan 01 2015
Journal Name: Journal of Engineering
GNSS Baseline Configuration Based on First Order Design

The quality of Global Navigation Satellite System (GNSS) networks is considerably influenced by the configuration of the observed baselines. This study aims to find an optimal configuration for GNSS baselines, in terms of the number and distribution of baselines, that improves the quality criteria of GNSS networks. The first order design (FOD) problem was applied in this research to optimize the GNSS network baseline configuration, with the sequential adjustment method used to solve its objective functions.

FOD for optimum precision (FOD-p) was the proposed model, based on the design criteria of A-optimality and E-optimality. These design criteria were selected as objective functions of precision, whic…
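The two precision criteria named above can be computed directly from a network's design matrix. The sketch below is a minimal illustration under stated assumptions (a toy design matrix and unit weights), not the paper's FOD-p model: the A-optimality score is the trace of the parameter cofactor matrix, and the E-optimality score is its largest eigenvalue; smaller is better for both.

```python
import numpy as np

def optimality_criteria(A, P=None):
    """Compute A- and E-optimality criteria for a network design matrix A.

    A-optimality: trace of the cofactor matrix Qxx (average parameter variance).
    E-optimality: largest eigenvalue of Qxx (worst-case variance direction).
    """
    if P is None:
        P = np.eye(A.shape[0])            # unit weights if none supplied
    Qxx = np.linalg.inv(A.T @ P @ A)      # cofactor matrix of the parameters
    a_opt = np.trace(Qxx)
    e_opt = np.max(np.linalg.eigvalsh(Qxx))
    return a_opt, e_opt

# Toy 2-parameter design: three observations of a small network
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, -1.0]])
a_opt, e_opt = optimality_criteria(A)
print(a_opt, e_opt)   # trace = 4/3, largest eigenvalue = 1.0
```

Comparing these two scores across candidate baseline sets is the essence of choosing a configuration for optimum precision.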
Publication Date: Wed Sep 01 2021
Journal Name: Baghdad Science Journal
Optimum Median Filter Based on Crow Optimization Algorithm

A novel optimum median filter (OMF) based on the crow optimization algorithm is suggested to reduce random salt-and-pepper noise and improve the quality of RGB-colored and gray images. The fundamental idea of the approach is that, first, the crow optimization algorithm detects noise pixels, which are then replaced with an optimum median value chosen by maximizing a fitness function. Finally, the standard measures peak signal-to-noise ratio (PSNR), structural similarity, absolute square error, and mean square error were used to test the performance of the suggested filters (the original and the improved median filter) in removing noise from images. The simulation was carried out in MATLAB R2019b, and the resul…

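As a rough illustration of the detect-then-replace idea, the sketch below flags suspected salt-and-pepper pixels by their extreme intensity (0 or 255) and replaces only those with a window median. The paper's crow-optimized detection and fitness-maximizing replacement are not reproduced here; the fixed intensity test is an assumption for illustration.

```python
import numpy as np

def median_filter_noisy_only(img, win=3):
    """Replace only suspected salt-and-pepper pixels (0 or 255) with the
    median of their neighbourhood, leaving clean pixels untouched."""
    pad = win // 2
    padded = np.pad(img, pad, mode='edge')
    out = img.copy()
    for r in range(img.shape[0]):
        for c in range(img.shape[1]):
            if img[r, c] in (0, 255):                 # noise candidate
                window = padded[r:r + win, c:c + win] # win x win neighbourhood
                out[r, c] = np.median(window)
    return out

noisy = np.array([[10, 12, 11],
                  [13, 255, 12],
                  [11, 10, 0]], dtype=np.uint8)
print(median_filter_noisy_only(noisy))
```

Because clean pixels are left untouched, this selective variant preserves edges better than blindly filtering every pixel.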
Publication Date: Sun Jan 20 2019
Journal Name: Ibn Al-Haitham Journal for Pure and Applied Sciences
Text Classification Based on Weighted Extreme Learning Machine

The huge number of documents on the internet has created a pressing need for text classification (TC), which is used to organize these text documents. In this research paper, a new model based on the Extreme Learning Machine (ELM) is used. The proposed model consists of several phases: preprocessing, feature extraction, Multiple Linear Regression (MLR), and ELM. The basic idea of the proposed model is built upon the calculation of feature weights using MLR. These feature weights, together with the extracted features, are introduced as input to the ELM, producing the weighted Extreme Learning Machine (WELM). The results showed the clear competence of the proposed WELM compared with the ELM.
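A minimal sketch of the weighting idea follows, under assumptions: the MLR weights below are illustrative values rather than fitted regression coefficients, the data is synthetic, and the ELM uses a random sigmoid hidden layer with a pseudo-inverse (least-squares) solution for the output weights, which is the standard ELM recipe rather than the paper's exact configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_elm(X, Y, hidden=50):
    """Basic Extreme Learning Machine: random input weights, sigmoid
    hidden layer, output weights solved by least squares (pseudo-inverse)."""
    W = rng.normal(size=(X.shape[1], hidden))
    b = rng.normal(size=hidden)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))   # hidden-layer activations
    beta = np.linalg.pinv(H) @ Y             # output weights
    return W, b, beta

def predict_elm(X, model):
    W, b, beta = model
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta

# Hypothetical feature weights standing in for the MLR step:
# each term feature is scaled before the ELM sees it (-> WELM).
X = rng.random((20, 5))                             # 20 documents, 5 features
Y = (X[:, 0] > 0.5).astype(float).reshape(-1, 1)    # toy binary label
mlr_weights = np.array([2.0, 0.5, 0.5, 0.5, 0.5])   # assumed MLR weights
Xw = X * mlr_weights                                # weighted features
model = train_elm(Xw, Y)
pred = (predict_elm(Xw, model) > 0.5).astype(float)
print((pred == Y).mean())
```

The only difference between ELM and WELM in this sketch is the element-wise scaling of the feature matrix, which lets informative terms dominate the random projection.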

Publication Date: Wed Jun 01 2011
Journal Name: Journal of Al-Nahrain University Science
Breaking Knapsack Cipher Using Population Based Incremental Learning

Publication Date: Fri Sep 23 2022
Journal Name: Specialusis Ugdymas
Text Cryptography based on Arabic Words Characters Number

Cryptography is a method used to mask text with an encryption method, such that only the authorized user can decrypt and read the message. An intruder may attack the communication channel in many ways, such as impersonation, repudiation, denial of service, modification of data, threats to confidentiality, and breaking the availability of services. The high volume of electronic communication between people makes it necessary to ensure that transactions remain confidential, and cryptography methods give the best solution to this problem. This paper proposes a new cryptography method based on Arabic words; the method consists of two steps, where the first step is binary encoding generation used t…

Publication Date: Sun Sep 24 2023
Journal Name: Journal of Al-Qadisiyah for Computer Science and Mathematics
Iris Data Compression Based on Hexa-Data Coding

Iris research is focused on developing techniques for identifying and locating relevant biometric features, with accurate segmentation and efficient computation, that lend themselves to compression methods. Most iris segmentation methods are based on complex modelling of traits and characteristics, which, in turn, reduces the effectiveness of the system as a real-time system. This paper introduces a novel parameterized technique for iris segmentation. The method is based on a number of steps, starting from converting the grayscale eye image to a bit-plane representation, then selecting the most significant bit planes, followed by a parameterization of the iris location, resulting in an accurate segmentation of the iris from the origin…

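The bit-plane step described above can be illustrated briefly. The sketch below shows a generic decomposition of an 8-bit image into bit planes and a hypothetical recombination of the most significant planes; it is a stand-in for the paper's plane-selection step, not its full parameterized segmentation.

```python
import numpy as np

def bit_planes(gray):
    """Decompose an 8-bit grayscale image into its 8 bit planes.
    Plane 7 is the most significant."""
    return [(gray >> k) & 1 for k in range(8)]

def significant_mask(gray, top=2):
    """Recombine the `top` most significant planes into a coarse image,
    discarding the noisy low-order planes."""
    planes = bit_planes(gray)
    mask = np.zeros_like(gray)
    for k in range(8 - top, 8):
        mask |= planes[k] << k
    return mask

eye = np.array([[200, 30], [150, 90]], dtype=np.uint8)
print(significant_mask(eye, top=2))   # keeps only bits 7 and 6
```

Keeping only the high-order planes preserves the coarse dark/light structure (pupil versus sclera) while dropping fine texture, which is what makes the representation cheap to threshold and compress.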
Publication Date: Wed Mar 08 2017
Journal Name: Ibn Al-Haitham Journal for Pure and Applied Sciences
Genetic-Based Face Retrieval Using Statistical Features

Content-based image retrieval has been actively developed in numerous fields. It provides more effective management and retrieval of images than keyword-based methods, and has therefore become one of the liveliest research areas of the past few years. Given a set of objects, information retrieval offers ways to search for those matching a particular description; the objects considered may be documents, images, videos, or sounds. This paper proposes a method to retrieve a multi-view face from a large face database according to color and texture attributes. Some of the features used for retrieval are color attributes such as the mean, the variance, and the color image's bitmap. In add…

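The color attributes mentioned (per-channel mean and variance) and a simple distance-based ranking can be sketched as follows. The bitmap and texture features, and the genetic search of the title, are omitted, so this is only an illustrative fragment of such a retrieval system, on synthetic images.

```python
import numpy as np

def color_features(img):
    """Per-channel mean and variance of an RGB image: a 6-dimensional
    statistical feature vector."""
    flat = img.reshape(-1, 3).astype(float)
    return np.concatenate([flat.mean(axis=0), flat.var(axis=0)])

def retrieve(query, database, top=3):
    """Rank database images by Euclidean distance between feature vectors
    and return the indices of the `top` closest matches."""
    q = color_features(query)
    dists = [np.linalg.norm(q - color_features(img)) for img in database]
    return np.argsort(dists)[:top]

rng = np.random.default_rng(1)
db = [rng.integers(0, 256, (8, 8, 3), dtype=np.uint8) for _ in range(5)]
print(retrieve(db[2], db, top=1))   # the query image itself ranks first
```

In a real system the feature vector would be extended with texture descriptors and the distance weights tuned, e.g. by the genetic search the title refers to.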
Publication Date: Sat Jun 26 2021
Journal Name: 2021 IEEE International Conference on Automatic Control & Intelligent Systems (I2CACIS)
Vulnerability Assessment on Ethereum Based Smart Contract Applications

Publication Date: Sat Jul 31 2021
Journal Name: Brain Sciences
Robust EEG Based Biomarkers to Detect Alzheimer’s Disease

Biomarkers to detect Alzheimer’s disease (AD) would enable patients to gain access to appropriate services and may facilitate the development of new therapies. Given the large number of people affected by AD, there is a need for a low-cost, easy-to-use method to detect AD patients. Potentially, the electroencephalogram (EEG) can play a valuable role in this, but at present no single EEG biomarker is robust enough for use in practice. This study aims to provide a methodological framework for the development of robust EEG biomarkers to detect AD with a clinically acceptable performance by exploiting the combined strengths of key biomarkers. A large number of existing and novel EEG biomarkers associated with slowing of EEG, reductio…

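As one concrete example of a "slowing of EEG" biomarker of the kind the study combines, the sketch below computes a theta/alpha power ratio on synthetic signals. The band limits and the ratio itself are illustrative assumptions, not the paper's feature set.

```python
import numpy as np

def band_power(x, fs, lo, hi):
    """Power of signal x in the [lo, hi) Hz band via a simple periodogram."""
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    band = (freqs >= lo) & (freqs < hi)
    return psd[band].sum()

def slowing_ratio(eeg, fs=256):
    """Theta (4-8 Hz) to alpha (8-13 Hz) power ratio: higher values
    indicate a spectrum shifted toward slower rhythms."""
    theta = band_power(eeg, fs, 4, 8)
    alpha = band_power(eeg, fs, 8, 13)
    return theta / alpha

fs = 256
t = np.arange(fs * 4) / fs                           # 4 s of synthetic "EEG"
slowed = np.sin(2 * np.pi * 6 * t) + 0.3 * np.sin(2 * np.pi * 10 * t)
normal = 0.3 * np.sin(2 * np.pi * 6 * t) + np.sin(2 * np.pi * 10 * t)
print(slowing_ratio(slowed, fs), slowing_ratio(normal, fs))
```

A framework like the paper's would compute many such features per channel and combine them, since no single ratio separates AD patients from controls reliably on its own.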
Publication Date: Sun Nov 01 2020
Journal Name: Journal of Physics: Conference Series
Improve topic modeling algorithms based on Twitter hashtags
Today, with the increasing use of social media, many researchers have become interested in topic extraction from Twitter. Twitter text is short, unstructured, and messy, which makes it difficult to find topics in tweets. Topic modeling algorithms such as Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA) were originally designed to derive topics from large documents such as articles and books, and they are often less efficient when applied to short text content like tweets. Luckily, Twitter has many features that represent the interaction between users: tweets carry rich user-generated hashtags as keywords. In this paper, we exploit the hashtag feature to improve the topics learned…
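One common way to exploit hashtags for short-text topic modeling is hashtag pooling: tweets sharing a hashtag are merged into one longer pseudo-document before running LSA/LDA. The sketch below shows that pooling step only, as an assumption about the general technique rather than this paper's exact scheme.

```python
from collections import defaultdict

def pool_by_hashtag(tweets):
    """Hashtag pooling: merge tweets that share a hashtag into one
    pseudo-document, so a standard topic model sees longer texts.
    Tweets without hashtags remain individual documents."""
    pools = defaultdict(list)
    singles = []
    for tweet in tweets:
        tags = [w.lower() for w in tweet.split() if w.startswith('#')]
        if tags:
            for tag in tags:
                pools[tag].append(tweet)
        else:
            singles.append(tweet)
    return [' '.join(v) for v in pools.values()] + singles

tweets = [
    "new phone launch #tech",
    "chip prices falling #tech #economy",
    "great match today",
]
print(pool_by_hashtag(tweets))
```

The pooled pseudo-documents are then fed to a standard topic model in place of the raw tweets, which gives the model enough word co-occurrence per document to learn coherent topics.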