Image compression is an important problem in computer storage and transmission. It makes efficient use of the redundancy embedded within an image itself and may additionally exploit the limitations of human vision to discard imperceptible information. Polynomial coding is a modern image compression technique based on a modelling concept that effectively removes the spatial redundancy embedded within the image; it is composed of two parts, the mathematical model and the residual. In this paper, a two-stage technique is proposed. The first stage utilizes a lossy predictor model along with a multiresolution base and thresholding techniques; the second stage incorporates a near-lossless compression scheme on top of the first stage. The test results of both stages are promising, implicitly enhancing the performance of the traditional polynomial model in terms of compression ratio while preserving image quality.
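The modelling idea described above (a per-block mathematical model plus a residual, with a thresholded near-lossless stage) can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: the first-order polynomial model, the block-wise least-squares fit, and the uniform residual quantizer are all choices made here for demonstration.

```python
import numpy as np

def polynomial_encode_block(block):
    """Fit a first-order 2D polynomial model f(x, y) = a0 + a1*x + a2*y
    to a pixel block by least squares, and return the model coefficients
    plus the residual (block minus model prediction)."""
    h, w = block.shape
    ys, xs = np.mgrid[0:h, 0:w]  # row (y) and column (x) coordinates
    A = np.column_stack([np.ones(h * w), xs.ravel(), ys.ravel()])
    coeffs, *_ = np.linalg.lstsq(A, block.ravel().astype(float), rcond=None)
    prediction = (A @ coeffs).reshape(h, w)
    residual = block - prediction
    return coeffs, residual

def quantize_residual(residual, threshold):
    """Near-lossless stage: uniform quantizer whose step is chosen from the
    threshold, bounding the per-pixel reconstruction error of the residual."""
    step = 2 * threshold + 1
    return np.round(residual / step).astype(int)

def dequantize_residual(q, threshold):
    """Invert the quantizer to recover an approximate residual."""
    return q * (2 * threshold + 1)
```

The decoder would rebuild each block as the polynomial prediction from `coeffs` plus the dequantized residual; transmitting the small coefficient vector and the quantized residual instead of raw pixels is where the compression comes from.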
Background: Endodontically treated teeth have low resistance to fracture against occlusal forces. The strengthening effect of bonded esthetic onlay restorations on weakened teeth has been reported. This study aimed to assess the fracture resistance of endodontically treated premolars restored with composite, with and without cuspal coverage, using direct and indirect techniques. The indirect technique was done with a CAD/CAM (computer-aided design/computer-aided manufacturing) system and by laboratory processing. Materials and methods: Forty extracted human maxillary premolars of approximately comparable sizes were divided into four groups: Group (A): ten endodontically treated teeth directly filled with Filtek Z250xt without cuspal coverage. Group
The data preprocessing step is an important step in web usage mining because of the nature of log data, which are heterogeneous, unstructured, and noisy. Given the scalability and efficiency requirements of pattern-discovery algorithms, a preprocessing step must be applied. In this study, the sequential methodologies utilized in preprocessing web server log data are comprehensively evaluated and meticulously examined, with an emphasis on sub-phases such as session identification, user identification, and data cleansing.
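The three sub-phases named above can be sketched together in a few lines. This is an illustrative pipeline, not the study's method: the simplified log-line layout, identifying users by IP address, the list of filtered file extensions, and the 30-minute session timeout are all common heuristics assumed here for demonstration.

```python
import re
from datetime import datetime, timedelta

# Simplified access-log layout (an assumption for this sketch):
# IP - - [dd/Mon/yyyy:HH:MM:SS] "REQUEST" STATUS
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] "(?P<request>[^"]*)" (?P<status>\d+)'
)

def clean_and_sessionize(log_lines, timeout_minutes=30):
    """Data cleansing (drop malformed lines and non-page resources),
    user identification (crudely, by IP address), and session
    identification (a new session after a 30-minute inactivity gap)."""
    sessions = {}   # user -> list of sessions, each a list of requests
    last_seen = {}  # user -> timestamp of their last kept request
    for line in log_lines:
        m = LOG_PATTERN.match(line)
        if not m:
            continue  # cleansing: drop malformed entries
        request = m.group("request")
        parts = request.split()
        url = parts[1] if len(parts) > 1 else request
        if url.endswith((".css", ".js", ".png", ".gif")):
            continue  # cleansing: drop embedded resources, keep page views
        user = m.group("ip")
        t = datetime.strptime(m.group("time"), "%d/%b/%Y:%H:%M:%S")
        if user not in sessions or t - last_seen[user] > timedelta(minutes=timeout_minutes):
            sessions.setdefault(user, []).append([])  # start a new session
        sessions[user][-1].append(request)
        last_seen[user] = t
    return sessions
```

Real logs add complications (time zones, proxies sharing one IP, user agents, referrer-based path completion) that this sketch deliberately omits.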
The present work aims to study the effect of using an automatic thresholding technique to convert the edge features of images into binary images in order to separate the object from its background. The edge features of the sampled images are obtained from first-order edge detection operators (Roberts, Prewitt, and Sobel) and a second-order edge detection operator (the Laplacian). The optimum automatic threshold is calculated using the fast Otsu method. The study is applied to a personal image (Roben) and a satellite image in order to examine the compatibility of this procedure with two different kinds of images. The obtained results are discussed.
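The Otsu step used above can be sketched as follows: the threshold is chosen to maximize the between-class variance over the image histogram, and the image is then binarized at that threshold. This is a textbook formulation for 8-bit grayscale input, shown for illustration; it omits the edge-operator stage and the fast (cumulative-sum) refinements the work refers to.

```python
import numpy as np

def otsu_threshold(gray):
    """Return the threshold maximizing between-class variance
    over the 256-bin histogram of an 8-bit grayscale image."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    cum_prob = np.cumsum(prob)                       # class-0 weight up to t
    cum_mean = np.cumsum(prob * np.arange(256))      # partial mean sums
    global_mean = cum_mean[-1]
    best_t, best_var = 0, 0.0
    for t in range(1, 256):
        w0 = cum_prob[t - 1]
        w1 = 1.0 - w0
        if w0 == 0.0 or w1 == 0.0:
            continue  # all pixels on one side: no valid split
        mu0 = cum_mean[t - 1] / w0
        mu1 = (global_mean - cum_mean[t - 1]) / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

def binarize(gray, t):
    """Split object from background: 1 where pixel >= t, else 0."""
    return (gray >= t).astype(np.uint8)
```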
The laser micro-cutting process is the most widely applied machining process and can be applied to practically all metallic and non-metallic materials. However, it faces challenges in cutting-quality criteria such as geometrical precision, surface quality, and numerous others. This article investigates the laser micro-cutting of PEEK composite material using a nano-fiber laser, owing to the significant importance and efficiency of lasers in various manufacturing processes. A design-of-experiments tool based on Response Surface Methodology (RSM) with a Central Composite Design (CCD) was used to generate the statistical model. This method was employed to analyze the influence of parameters including laser speed,
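The RSM-CCD run matrix mentioned above has a simple, well-known structure that can be sketched in code: 2^k factorial corner points, 2k axial (star) points at distance alpha, and replicated center points, all in coded units. The alpha value and number of center replicates below are illustrative assumptions, not values from the article.

```python
import itertools

def central_composite_design(k, alpha=1.0, n_center=3):
    """Generate coded design points for a k-factor Central Composite Design:
    2^k factorial corners at (+-1, ..., +-1), 2k axial points at distance
    alpha along each factor axis, and n_center replicated center points.
    alpha=1.0 yields the face-centered variant of the CCD."""
    factorial = [list(p) for p in itertools.product([-1.0, 1.0], repeat=k)]
    axial = []
    for i in range(k):
        for sign in (-alpha, alpha):
            point = [0.0] * k
            point[i] = sign
            axial.append(point)
    center = [[0.0] * k for _ in range(n_center)]
    return factorial + axial + center
```

Each coded point would then be mapped to physical factor levels (e.g., laser speed and power ranges) before the cutting experiments are run and a quadratic response model is fitted.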
In this study, the quality assurance of the linear accelerator available at the Baghdad Center for Radiation Therapy and Nuclear Medicine was verified using Star Track and Perspex. The study was conducted from August to December 2018 and showed an acceptable variation in the dose output of the linear accelerator. This variation was ±2%, which was within the permissible range according to the recommendations of the accelerator's manufacturer (Elekta).
This research presents a comparative study, using simulation, of some semi-parametric estimation methods for the partially linear single-index model. Two approaches were used to estimate this model: the two-stage procedure and MADE. Simulations were used to study the finite-sample performance of the estimation methods under different single-index models, error variances, and sample sizes, with the mean average squared error used as the comparison criterion between the methods. The results showed a preference for the two-stage procedure in all the cases considered.
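The comparison criterion named above, the mean average squared error over Monte Carlo replications, can be sketched as follows. Everything here is an illustrative assumption: the single-index data generator, the constant-mean baseline, and the linear least-squares fit standing in for an estimator; the actual two-stage and MADE estimators are not reproduced.

```python
import numpy as np

def mean_average_squared_error(estimator, simulate, n_reps=100, seed=0):
    """For each replication, simulate (X, y, true_mean), fit the estimator,
    and average the squared error of its fitted values against the true
    regression function; then average that over all replications."""
    rng = np.random.default_rng(seed)
    ases = []
    for _ in range(n_reps):
        X, y, true_mean = simulate(rng)
        fitted = estimator(X, y)
        ases.append(np.mean((fitted - true_mean) ** 2))
    return float(np.mean(ases))

def simulate(rng, n=100):
    """Hypothetical single-index data: E[y|X] = g(X @ theta) with g = sin
    and a unit-norm index vector theta (choices made for this sketch)."""
    X = rng.normal(size=(n, 2))
    theta = np.array([0.6, 0.8])
    true_mean = np.sin(X @ theta)
    y = true_mean + rng.normal(scale=0.2, size=n)
    return X, y, true_mean

def ols_fit(X, y):
    """Stand-in estimator: fitted values of a linear least-squares fit."""
    A = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return A @ beta

def mean_fit(X, y):
    """Baseline estimator: predict the sample mean everywhere."""
    return np.full(len(y), y.mean())
```

Running the criterion on both stand-ins ranks them the way the study ranks its methods: the estimator with the smaller mean average squared error is preferred.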