Objectives: Bromelain is a potent proteolytic enzyme whose unique functionality makes it valuable for various therapeutic purposes. This study aimed to develop three novel bromelain-based formulations for use as chemomechanical caries removal agents. Methods: The novel agents were prepared using different concentrations of bromelain (10–40 wt. %), with and without 0.1–0.3 wt. % chloramine T or 0.5–1.5 wt. % chlorhexidine (CHX). Based on the enzymatic activity test, three formulations were selected: 30 % bromelain (F1), 30 % bromelain with 0.1 % chloramine T (F2), and 30 % bromelain with 1.5 % CHX (F3). The assessments included molecular docking, Fourier-transform infrared spectroscopy (FTIR), viscosity and pH measurements. The efficiency …
In regression testing, test case prioritization (TCP) is a technique for ordering the available test cases. TCP techniques can improve fault detection performance, which is measured by the average percentage of faults detected (APFD). History-based TCP is one of the TCP techniques; it uses past execution data to prioritize test cases. The allocation of equal priority to different test cases is a common problem for most TCP techniques, but it has not been explored in history-based TCP. To solve this problem in regression testing, most researchers resort to random ordering of the tied test cases. This study aims to investigate equal priority in history-based TCP techniques. The first objective is to implement …
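Since APFD is the metric used throughout, a minimal sketch of its standard definition may help: for a suite of n tests and m faults, APFD = 1 - (TF1 + ... + TFm)/(n·m) + 1/(2n), where TFi is the position of the first test that reveals fault i. The function and data-structure names below are illustrative, and the sketch assumes every fault is detected by at least one test in the ordering.

```python
def apfd(ordering, fault_matrix):
    """Average Percentage of Faults Detected for one test ordering.

    ordering     -- list of test ids, in prioritized execution order
    fault_matrix -- dict: test id -> set of faults that test reveals
    (illustrative names; assumes every fault is revealed by some test)
    """
    n = len(ordering)
    faults = set().union(*fault_matrix.values())
    m = len(faults)
    # TF_i: 1-based position of the first test that exposes fault i
    first_pos = {}
    for pos, test in enumerate(ordering, start=1):
        for fault in fault_matrix[test]:
            first_pos.setdefault(fault, pos)
    tf_sum = sum(first_pos[f] for f in faults)
    return 1.0 - tf_sum / (n * m) + 1.0 / (2 * n)
```

A higher APFD means faults are exposed earlier in the ordering, which is why TCP techniques are compared on it.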
Akaike’s Information Criterion (AIC) is a popular method for estimating the number of sources impinging on an array of sensors, a problem of great interest in several applications. The performance of AIC degrades under low signal-to-noise ratio (SNR). This paper is concerned with the development and application of quadrature mirror filters (QMF) for improving the performance of AIC. A new system is proposed that estimates the number of sources by applying AIC to the outputs of a filter bank consisting of QMFs. The proposed system can estimate the number of sources even under low SNR.
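For context, a minimal sketch of the baseline Wax-Kailath AIC estimator that such a system builds on; the QMF filter-bank stage itself is not shown, and the function name is illustrative.

```python
import numpy as np

def aic_num_sources(snapshots):
    """Baseline Wax-Kailath AIC source-number estimate (no filter bank).

    snapshots -- (M, N) complex array: N snapshots from an M-sensor array
    """
    M, N = snapshots.shape
    R = snapshots @ snapshots.conj().T / N        # sample covariance
    eig = np.linalg.eigvalsh(R)[::-1]             # eigenvalues, descending
    aic = np.empty(M)
    for k in range(M):
        tail = np.maximum(eig[k:], 1e-12)         # M-k smallest eigenvalues
        geo = np.exp(np.mean(np.log(tail)))       # geometric mean
        arith = np.mean(tail)                     # arithmetic mean
        aic[k] = -2 * N * (M - k) * np.log(geo / arith) + 2 * k * (2 * M - k)
    return int(np.argmin(aic))                    # estimated source count
```

At low SNR the noise eigenvalues spread out, which is exactly where this baseline degrades and a filter-bank front end can help.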
This paper presents an improved technique based on the Ant Colony Optimization (ACO) algorithm. The procedure is applied to a Single Machine with Infinite Bus (SMIB) system with a power system stabilizer (PSS) at three different loading regimes. The simulations are performed in MATLAB. The results show that with Improved Ant Colony Optimization (IACO) the system gives better performance with fewer iterations than a previous modification of ACO. In addition, the probability of selecting an arc depends on the best ant's performance and the evaporation rate.
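As a rough illustration of the two ingredients highlighted above, the sketch below shows the standard ACO arc-selection rule and a best-ant pheromone update with evaporation. This is generic (elitist) ACO, not the paper's IACO modification, and all names and parameter values are illustrative.

```python
import random

def select_arc(arcs, tau, eta, alpha=1.0, beta=2.0):
    """Standard ACO transition rule: pick the next arc with probability
    proportional to pheromone^alpha * heuristic^beta (generic sketch)."""
    weights = [tau[a] ** alpha * eta[a] ** beta for a in arcs]
    r = random.uniform(0.0, sum(weights))
    acc = 0.0
    for arc, w in zip(arcs, weights):
        acc += w
        if r <= acc:
            return arc
    return arcs[-1]

def update_pheromone(tau, best_tour, best_cost, rho=0.1):
    """Evaporate all trails, then reinforce only the best ant's arcs."""
    for arc in tau:
        tau[arc] *= (1.0 - rho)          # evaporation rate rho
    for arc in best_tour:
        tau[arc] += 1.0 / best_cost      # deposit scaled by best performance
```

Coupling the deposit to the best ant and tuning the evaporation rate is what makes the selection probability depend on both quantities.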
The introduction of concrete damage plasticity material models has significantly improved the accuracy with which the structural response of concrete elements can be predicted. Research into this method's accuracy in analyzing complex concrete forms has, however, been limited. A damage model combined with a plasticity model, based on continuum damage mechanics, is recommended for effectively predicting and simulating concrete behaviour. Damage parameters, such as compressive and tensile damage, can be defined so that a damaged-plasticity model simulates concrete behaviour accurately. This research aims to propose an analytical model for assessing concrete compressive damage based on stiffness deterioration. …
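As a rough sketch of a stiffness-deterioration damage definition, the snippet below uses the common simplification d = 1 - E_sec/E0 (secant modulus over initial modulus) along the softening branch. This is an assumption for illustration, not necessarily the paper's exact analytical model, and all names are hypothetical.

```python
def compressive_damage(stress, strain, e0):
    """Compressive damage from stiffness deterioration: d = 1 - E_sec / E0.

    stress, strain -- paired points on the uniaxial compressive curve
    e0             -- initial (undamaged) elastic modulus
    (illustrative simplification, not the paper's exact formulation)
    """
    damage = []
    for s, e in zip(stress, strain):
        e_sec = s / e if e > 0 else e0     # secant modulus at this point
        d = max(0.0, min(1.0 - e_sec / e0, 0.99))  # clamp to [0, 0.99)
        damage.append(d)
    return damage
```

Damage values of this form are what a damaged-plasticity model consumes as its compressive damage parameter versus inelastic strain.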
Researchers dream of developing autonomous humanoid robots which behave/walk like a human being. Biped robots, although complex, have the greatest potential for use in human-centred environments such as the home or office. Studying biped robots is also important for understanding human locomotion and improving control strategies for prosthetic and orthotic limbs. Control systems for humans walking in cluttered environments are complex, however, and may involve multiple local controllers and commands from the cerebellum. Although biped robots have been of interest over the last four decades, no unified stability/balance criterion has been adopted for stabilization of the miscellaneous walking/running modes of biped …
The objective of this study was to introduce a recursive least squares (RLS) parameter estimator enhanced by a neural network (NN) to reduce the bit error rate (BER) during channel estimation of a multiple-input multiple-output orthogonal frequency division multiplexing (MIMO-OFDM) system over a Rayleigh multipath fading channel. Recursive least squares is an efficient approach to neural network training: first, the neural network estimator learns to adapt to the channel variations, then it estimates the channel frequency response. Simulation results show that the proposed method performs better than the conventional least squares (LS) method and the original RLS, and it is more robust …
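For reference, a minimal sketch of the conventional complex-valued RLS estimator that the NN-enhanced scheme builds on, applied to pilot-based FIR channel estimation. The function name, forgetting factor, and initialization constant are illustrative choices, not the paper's settings.

```python
import numpy as np

def rls_channel_estimate(pilots, received, order, lam=0.99, delta=100.0):
    """Classic RLS estimate of FIR channel taps from a pilot sequence.

    pilots   -- transmitted pilot symbols (1-D complex array)
    received -- corresponding received samples
    order    -- assumed number of channel taps
    (conventional RLS sketch; the NN-enhanced variant differs)
    """
    w = np.zeros(order, dtype=complex)          # channel tap estimates
    P = delta * np.eye(order, dtype=complex)    # inverse correlation matrix
    for n in range(order, len(pilots)):
        x = pilots[n - order + 1:n + 1][::-1]   # regressor, most recent first
        e = received[n] - w.conj() @ x          # a-priori estimation error
        Px = P @ x
        k = Px / (lam + x.conj() @ Px)          # RLS gain vector
        w = w + k * np.conj(e)                  # tap update
        P = (P - np.outer(k, x.conj() @ P)) / lam
    return w
```

The forgetting factor lam trades tracking speed against noise averaging, which matters on a time-varying Rayleigh fading channel.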
Image compression is a serious issue in computer storage and transmission; it simply makes efficient use of the redundancy embedded within an image itself and, in addition, may exploit the limitations of human visual perception to discard imperceptible information. Polynomial coding is a modern image compression technique based on a modelling concept that effectively removes the spatial redundancy embedded within the image; it is composed of two parts, the mathematical model and the residual. In this paper, a two-stage technique is proposed: the first stage utilizes a lossy predictor model along with a multiresolution base and thresholding techniques; the second stage incorporates the near-lossless …
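To make the "model plus residual" idea concrete, the sketch below fits a first-order polynomial (a plane) to a single image block and returns the coefficients and residual, the two parts of a polynomial-coding representation. This is a generic illustration of polynomial modelling, not the paper's specific predictor, and the function name is hypothetical.

```python
import numpy as np

def plane_model_residual(block):
    """Fit a0 + a1*x + a2*y to one image block (least squares) and return
    (coefficients, residual). Generic polynomial-coding sketch."""
    h, w = block.shape
    ys, xs = np.mgrid[0:h, 0:w]                       # pixel coordinates
    A = np.column_stack([np.ones(h * w), xs.ravel(), ys.ravel()])
    coeffs, *_ = np.linalg.lstsq(A, block.ravel().astype(float), rcond=None)
    model = (A @ coeffs).reshape(h, w)                # reconstructed model
    residual = block.astype(float) - model            # what remains to code
    return coeffs, residual
```

The coefficients are cheap to store, and the residual is small and smooth, so it compresses well under thresholding or near-lossless coding.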