The objective of this study was to compare Class V Er:YAG laser (2940 nm) cavity preparation and conventional bur cavity preparation with respect to intrapulpal temperature rise during cavity preparation in extracted human premolar teeth. Twenty non-carious premolars extracted for orthodontic purposes were used, and a Class V cavity was prepared on both the buccal and lingual sides of each tooth. Samples were divided equally into two major groups according to cavity depth (1 mm and 2 mm), and each major group was further subdivided into two subgroups of ten teeth each (twenty cavities per subgroup). A Twinlight Er:YAG laser (2940 nm) with 500 mJ pulse energy, a pulse repetition rate of 10 Hz, and an energy density of 63.69 J/cm2 was used. Analysis of the collected data revealed a highly significant difference between the subgroups of each group, i.e., between Er:YAG laser and conventional bur cavity preparation. There was also a highly significant difference between the subgroups of group 1 and group 2 (1 mm versus 2 mm cavity depth). The best results were obtained in subgroup A, which represents Class V cavities prepared using the Er:YAG laser at an energy density of 63.69 J/cm2. Taking into account the in vitro temperature rise of Class V cavity preparation, Er:YAG laser preparation at an energy density of 63.69 J/cm2 produced a smaller temperature rise than conventional bur preparation.
In this work, an analytical approximation solution is presented, together with a comparison of the Variational Iteration Adomian Decomposition Method (VIADM) and the Modified Sumudu Transform Adomian Decomposition Method (MSTADM), both of which are capable of solving nonlinear partial differential equations (NPDEs) such as nonhomogeneous Korteweg-de Vries (KdV) problems and the nonlinear Klein-Gordon equation. The results demonstrate the solutions' reliability and excellent accuracy.
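Both VIADM and MSTADM handle the nonlinear term of such equations through Adomian polynomials. As a minimal sketch (not the paper's implementation), the polynomials for a quadratic nonlinearity N(u) = u^2, of the kind that arises from the u·u_x term in KdV, can be generated symbolically from the standard formula A_n = (1/n!) d^n/dλ^n N(Σ_k λ^k u_k) at λ = 0:

```python
import sympy as sp

def adomian_polynomials(N, u_terms, lam):
    """Adomian polynomials A_0..A_{n-1} for a nonlinearity N(u):
    A_n = (1/n!) * d^n/dlam^n [ N(sum_k lam^k * u_k) ] evaluated at lam = 0.
    """
    series = sum(lam**k * u_k for k, u_k in enumerate(u_terms))
    expanded = N(series)
    return [sp.expand(expanded.diff(lam, n).subs(lam, 0) / sp.factorial(n))
            for n in range(len(u_terms))]

lam = sp.symbols('lambda')
u0, u1, u2 = sp.symbols('u0 u1 u2')
# Quadratic nonlinearity N(u) = u**2 as an illustrative example
A = adomian_polynomials(lambda u: u**2, [u0, u1, u2], lam)
# A[0] = u0**2, A[1] = 2*u0*u1, A[2] = u1**2 + 2*u0*u2
```

Each decomposition method then feeds these polynomials into its own recursion (variational iteration or Sumudu-transform inversion) to build the series solution term by term.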
Image enhancement is one of the most significant topics in the field of digital image processing. The basic problem in enhancement is how to remove noise from a digital image or improve its details. In the current research, a method for digital image de-noising and detail sharpening is proposed. The proposed approach uses a fuzzy logic technique to process each pixel in the image and then decides whether that pixel is noisy or needs further processing for highlighting. This decision is made by examining the pixel's degree of association with its neighbouring elements using a fuzzy algorithm. The proposed de-noising approach was evaluated on standard images after corrupting them with impulse noise.
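As an illustration of the pixel-wise fuzzy decision, the sketch below assigns each pixel a fuzzy membership degree in the class "noisy" from its deviation against the 3x3 neighbourhood; the membership function and the thresholds `a`, `b`, and `cutoff` are assumptions for illustration, not the paper's values:

```python
import numpy as np

def noisiness(window, a=10.0, b=60.0):
    """Fuzzy membership 'centre pixel is noisy' for a 3x3 window.

    The degree of association with the neighbourhood is the median
    absolute difference d between the centre and its 8 neighbours,
    mapped through a linear fuzzy membership: 0 for d <= a, 1 for d >= b.
    (a and b are illustrative thresholds.)
    """
    centre = float(window[1, 1])
    neighbours = np.delete(window.astype(float).flatten(), 4)
    d = np.median(np.abs(neighbours - centre))
    return float(np.clip((d - a) / (b - a), 0.0, 1.0))

def denoise(img, cutoff=0.5):
    """Replace pixels judged noisy (membership > cutoff) by the window median."""
    out = img.copy()
    for i in range(1, img.shape[0] - 1):
        for j in range(1, img.shape[1] - 1):
            w = img[i - 1:i + 2, j - 1:j + 2]
            if noisiness(w) > cutoff:
                out[i, j] = np.median(w)
    return out
```

On a flat region, an isolated impulse gets membership 1 and is replaced by the neighbourhood median, while pixels that merely border the impulse keep a low membership and are left untouched.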
Let R be a ring. Given two positive integers m and n, an R-module M is said to be (m,n)-presented if there is an exact sequence of R-modules 0 → K → R^m → M → 0 in which K is n-generated. A submodule N of a right R-module M is said to be (m,n)-pure in M if, for every (m,n)-presented left R-module P, the canonical map N ⊗ P → M ⊗ P is a monomorphism. An R-module has the (m,n)-pure intersection property if the intersection of any two (m,n)-pure submodules is again (m,n)-pure. In this paper we give some characterizations, theorems, and properties of modules with the (m,n)-pure intersection property.
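The abstract's symbols appear to have been lost in extraction; the two defining conditions are restated below in the conventional notation of the (m,n)-purity literature (an assumption, since the original symbols are unrecoverable):

```latex
% (m,n)-presented: an exact sequence with m generators and an n-generated kernel
\[
0 \longrightarrow K \longrightarrow R^{m} \longrightarrow M \longrightarrow 0,
\qquad K \ \text{$n$-generated}.
\]
% (m,n)-purity of a submodule N of M: for every (m,n)-presented
% left R-module P,
\[
N \otimes_{R} P \longrightarrow M \otimes_{R} P
\quad \text{is a monomorphism}.
\]
```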
This work aims to develop a secure lightweight cipher algorithm for constrained devices. Secure communication among constrained devices is a critical issue during data transmission from client to server devices. Lightweight cipher algorithms are a security solution for constrained devices, requiring only low-cost computational functions and small memory. Most lightweight algorithms, however, suffer from a trade-off between complexity and speed when trying to produce a robust cipher. The PRESENT cipher has been successfully adopted as a lightweight cryptographic algorithm; it surpasses other ciphers in that its computational processing requires only low-complexity operations. The mathematical model of
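The low-complexity operations that make PRESENT attractive for constrained devices are a 4-bit S-box and a simple bit permutation. A minimal sketch of these two round layers, following the published PRESENT specification (round-key addition and the key schedule are omitted here):

```python
# PRESENT's 4-bit S-box, as published in the cipher's specification
SBOX = [0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD,
        0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2]

def sbox_layer(state):
    """Apply the S-box to each of the 16 nibbles of the 64-bit state."""
    out = 0
    for i in range(16):
        out |= SBOX[(state >> (4 * i)) & 0xF] << (4 * i)
    return out

def p_layer(state):
    """PRESENT bit permutation: bit i moves to 16*i mod 63 (bit 63 is fixed)."""
    out = 0
    for i in range(64):
        j = 63 if i == 63 else (16 * i) % 63
        out |= ((state >> i) & 1) << j
    return out
```

Both layers use only table lookups and bit shifts, which is exactly the kind of processing a constrained device can afford.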
The efforts in designing and developing lightweight cryptography (LWC) started a decade ago. Many scholarly studies in the literature report enhancements of conventional cryptographic algorithms and the development of new ones. This significant body of work has led to the rise of many review studies on LWC in the IoT. Because of the vast number of review studies on LWC in the IoT, it is not known what these studies cover or how extensive they are. Therefore, this article aimed to bridge the gap by conducting a systematic scoping study. It analysed the existing review articles on LWC in the IoT to discover the extensiveness of the reviews and the topics covered. The results of the study suggested that many re
In this paper, an algorithm is introduced through which we can embed more data than regular spatial-domain methods allow. The secret data are first compressed using Huffman coding, and the compressed data are then embedded using a Laplacian sharpening method. Laplace filters are used to determine the effective hiding places; based on a threshold value, the places with the highest filter responses are selected for embedding the watermark. The aim of this work is to increase the capacity of the embedded information by using Huffman coding and, at the same time, to increase the security of the algorithm by hiding the data in the places with the highest edge values, where changes are least noticeable.
The perform
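The edge-based selection of hiding places described above can be sketched as follows; the kernel, threshold, and border-padding choices here are illustrative, not the paper's exact parameters:

```python
import numpy as np

def laplacian_response(img):
    """Convolve with the 4-neighbour Laplacian kernel (edge-replicated borders)
    and return the absolute response, a simple edge-strength map."""
    k = np.array([[0,  1, 0],
                  [1, -4, 1],
                  [0,  1, 0]], dtype=float)
    f = img.astype(float)
    p = np.pad(f, 1, mode='edge')
    out = np.zeros_like(f)
    for di in range(3):
        for dj in range(3):
            out += k[di, dj] * p[di:di + f.shape[0], dj:dj + f.shape[1]]
    return np.abs(out)

def embedding_sites(img, threshold):
    """Return (row, col) pairs whose |Laplacian| exceeds the threshold,
    i.e. edge-like pixels where embedded bits are least noticeable."""
    resp = laplacian_response(img)
    return list(zip(*np.nonzero(resp > threshold)))
```

The Huffman-compressed payload would then be written into the selected sites (e.g. into their least significant bits), with smooth regions left untouched.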
Image captioning is the process of adding an explicit, coherent description of the contents of an image. This is done using the latest deep learning techniques, which combine computer vision and natural language processing, to understand the content of the image and give it an appropriate caption. Multiple datasets suitable for many applications have been proposed. The biggest challenge for researchers in natural language processing is that these datasets do not cover all languages. Researchers have therefore translated the most famous English datasets with Google Translate in order to understand the content of images in their mother tongue. In this paper, the proposed review aims to enhance the understanding o