NeighShrink is an efficient image denoising algorithm based on the discrete wavelet
transform (DWT). Its disadvantage is that it uses a suboptimal universal threshold and an
identical neighbouring window size in all wavelet subbands. Dengwen and Wengang proposed an
improved method that determines an optimal threshold and neighbouring window size
for every subband using Stein's unbiased risk estimate (SURE). Its denoising performance is
considerably superior to NeighShrink, and it also outperforms SURE-LET, an up-to-date
denoising algorithm based on SURE. In this paper, different wavelet transform
families are used with this improved method; the results show that the Haar wavelet has the
lowest performance among the tested wavelet functions. The system was implemented using
MATLAB R2010a. The average improvement in terms of PSNR between Haar and the other
wavelet functions is 1.37 dB.
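The NeighShrink rule shrinks each detail coefficient by the energy of its neighbourhood. The following is an illustrative 1D pure-Python sketch using a single-level Haar DWT and the universal threshold; the paper works on 2D images, and the improved method chooses the threshold and window per subband via SURE instead:

```python
import math

def haar_dwt(signal):
    """One-level Haar DWT: returns (approximation, detail) coefficients."""
    a = [(signal[2*i] + signal[2*i+1]) / math.sqrt(2) for i in range(len(signal)//2)]
    d = [(signal[2*i] - signal[2*i+1]) / math.sqrt(2) for i in range(len(signal)//2)]
    return a, d

def haar_idwt(a, d):
    """Inverse one-level Haar DWT."""
    out = []
    for ai, di in zip(a, d):
        out.append((ai + di) / math.sqrt(2))
        out.append((ai - di) / math.sqrt(2))
    return out

def neigh_shrink(detail, sigma, n, win=3):
    """NeighShrink: shrink coefficient d_k by beta_k = max(0, 1 - lam^2 / S_k^2),
    where S_k^2 sums squared coefficients in a window around k and
    lam = sigma * sqrt(2 ln n) is the universal threshold."""
    lam2 = 2 * sigma**2 * math.log(n)
    half = win // 2
    shrunk = []
    for k in range(len(detail)):
        lo, hi = max(0, k - half), min(len(detail), k + half + 1)
        s2 = sum(c * c for c in detail[lo:hi])
        beta = max(0.0, 1 - lam2 / s2) if s2 > 0 else 0.0
        shrunk.append(beta * detail[k])
    return shrunk
```

Coefficients whose neighbourhood energy falls below the squared threshold are set to zero; strong (edge-like) coefficients are only mildly attenuated, which is the behaviour the improved method tunes per subband.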
The present study included the collection of fresh samples of species of the genus Oxalis and examined the anatomical characteristics of the stem, scape, petiole, leaf and pedicel.
The study addresses the problem of stagnation and declining economic growth rates in Arab countries from the eighties to the present, after the progress these countries made in the sixties of the last century. The study reviews the e
This study investigated the optimization of the wear behavior of AISI 4340 steel based on the Taguchi method under various testing conditions. In this paper, a neural network and the Taguchi design method were implemented to minimize the wear rate of 4340 steel. A back-propagation neural network (BPNN) was developed to predict the wear rate. In developing the predictive model, wear parameters such as sliding speed, applied load and sliding distance were taken as the input variables for the AISI 4340 steel. An analysis of variance (ANOVA) was used to determine the significant parameters affecting the wear rate. Finally, the Taguchi approach was applied to determine
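As an illustration of the Taguchi step, factor levels are ranked by a signal-to-noise ratio; for a response to be minimized, such as wear rate, the smaller-the-better form is used. This is a generic sketch, not the study's data:

```python
import math

def sn_smaller_is_better(values):
    """Taguchi smaller-the-better S/N ratio in dB:
    S/N = -10 * log10(mean of squared responses); higher is better."""
    return -10 * math.log10(sum(v * v for v in values) / len(values))

# Hypothetical wear-rate replicates for two factor-level settings:
# the setting with the higher S/N ratio is preferred.
setting_a = [0.12, 0.10, 0.11]
setting_b = [0.30, 0.28, 0.33]
best = max([setting_a, setting_b], key=sn_smaller_is_better)
```

In a full Taguchi analysis these S/N values are averaged per factor level across an orthogonal array to pick the optimal level of each factor.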
The study was performed to investigate Salmonella in meat and to compare VIDAS UP Salmonella (SPT) with the traditional methods of Salmonella isolation. Forty-two meat samples (beef and chicken, local and imported) were collected from local markets in the city of Baghdad over the period December 2013 to February 2014. The samples were cultured on enrichment and differential media and examined with VIDAS; the isolates were confirmed by cultivation on chromogenic agar, biochemical tests and the API 20E system, in addition to serological tests, and the serotypes were determined at the Central Public Health Laboratory / National Institute of Salmonella. The results showed that contamination in imported meat was higher than in local meat: 11.9% and 2
In this paper, an algorithm is introduced through which more data can be embedded
than with regular spatial-domain methods. The secret data are compressed using
Huffman coding, and the compressed data are then embedded using a Laplacian
sharpening method.
Laplacian filters are used to determine effective hiding places; based on a
threshold value, the places with the highest values produced by these filters are
selected for embedding the watermark. The aim of this work is to increase the
capacity of the embedded information by using Huffman coding, while at the same
time increasing the security of the algorithm by hiding the data in the places
with the highest edge values, where changes are less noticeable.
The perform
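The Huffman compression stage described above can be sketched in pure Python; this is an illustrative implementation of standard Huffman coding, not the authors' code:

```python
import heapq
from collections import Counter

def huffman_codes(data):
    """Build a Huffman code table (symbol -> bit string) for `data`."""
    freq = Counter(data)
    if len(freq) == 1:                      # degenerate case: one symbol
        return {next(iter(freq)): "0"}
    # Heap entries: (frequency, tie-breaker, tree); a tree is a symbol
    # (leaf) or a (left, right) tuple (internal node).
    heap = [(f, i, sym) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)     # two least frequent subtrees
        f2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, count, (t1, t2)))
        count += 1
    codes = {}
    def walk(tree, prefix):                 # assign 0/1 along tree paths
        if isinstance(tree, tuple):
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:
            codes[tree] = prefix
    walk(heap[0][2], "")
    return codes

def huffman_encode(data, codes):
    """Concatenate the code words for each symbol of `data`."""
    return "".join(codes[s] for s in data)
```

The resulting bit stream is shorter than a fixed-length encoding for skewed symbol distributions, which is what raises the embedding capacity before the Laplacian-based hiding step.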
This research applies a fuzzy goal programming model to aggregate production planning at the General Company for Hydraulic Industries / plastic factory, in order to obtain an optimal production plan that copes with fluctuations in demand and employs all available resources. Two strategies are used: an available-inventory strategy and a strategy of changing the workforce level; the costs of these strategies are usually imprecise (fuzzy). The plant administration tries to minimize total production costs, minimize carrying costs and minimize changes in labour levels, depending on the data gained from th
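In fuzzy goal programming, each imprecise cost goal is typically represented by a linear membership function: full satisfaction at or below the aspiration level, falling linearly to zero at the tolerance limit. A minimal sketch with hypothetical numbers (not the factory's data):

```python
def goal_membership(z, aspiration, tolerance):
    """Linear membership for a minimization goal in fuzzy goal programming:
    1.0 at or below the aspiration level, decreasing linearly to 0.0 at
    aspiration + tolerance."""
    if z <= aspiration:
        return 1.0
    if z >= aspiration + tolerance:
        return 0.0
    return (aspiration + tolerance - z) / tolerance
```

A max-min formulation then maximizes the smallest membership value across the production-cost, carrying-cost and workforce-change goals.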
Examining and comparing the image quality of degenerative cervical spine diseases through the application of three MRI sequences: the two-dimensional T2-weighted turbo spin echo (2D T2W TSE), the three-dimensional T2-weighted turbo spin echo (3D T2W TSE), and the T2 turbo field echo (T2_TFE). Thirty-three patients diagnosed with degenerative cervical spine diseases were involved in this study; their age range was 40-60 years. The images were produced with a 1.5 Tesla MRI device using the 2D T2W TSE, 3D T2W TSE and T2_TFE sequences in the sagittal plane. The image quality was examined by objective and subjective assessments. The MRI image characteristics of the cervical spine levels (C4-C5, C5-C6, C6-C7) showed significant difference
This research was carried out to identify the image of the subsurface structure representing the Tertiary period in the Galabat field, northeast of Iraq, using 2D seismic survey measurements. Synthetic seismograms of the Galabat-3 well were generated in order to identify and pick the reflectors in the seismic sections. Structural images were drawn in the time domain and then converted to the depth domain using average velocities. Structurally, the seismic sections show that these reflectors are affected by two reverse faults acting on the Jeribe Formation and the layers below it, with an increase in the density of the reverse faults in the northern division. The structural maps show that the Galabat field consists of longitudinal asymmetrical narr
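The time-to-depth conversion mentioned above follows from the average-velocity relation: for two-way travel time t and average velocity v, depth = v * t / 2. A minimal sketch with hypothetical values, not the survey's velocity model:

```python
def twt_to_depth(twt_s, v_avg_m_per_s):
    """Convert two-way travel time (s) to depth (m) with an average
    velocity: the wave travels down and back, hence the factor 1/2."""
    return v_avg_m_per_s * twt_s / 2.0
```

For example, a reflector picked at 2.0 s two-way time with an average velocity of 3000 m/s maps to a depth of 3000 m.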
This research aims to solve the stock-selection problem using a clustering algorithm. The optimal portfolio is formed using the single index model, with real data consisting of stocks from the Iraqi Stock Exchange over the period 1/1/2007 to 31/12/2019. Because the data series have missing values, a two-stage missing-value compensation method was used. The knowledge gap was the inability of the portfolio models to reduce the estimation error; the inaccuracy of the cut-off rate and the Treynor ratio combined stocks into the portfolio in a way that caused a decline in its performance. All these problems required employing a clustering technique to mine the data and regroup it into clusters with similar characteristics in order to outperform the portfolio
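The Treynor ranking and cut-off rate used in single-index-model portfolio construction (the Elton-Gruber procedure) can be sketched as follows; the numbers in the test are hypothetical, not the study's data:

```python
def treynor(excess_return, beta):
    """Treynor ratio: excess return per unit of systematic risk."""
    return excess_return / beta

def cutoff_rates(stocks, market_var):
    """Elton-Gruber cut-off rates C_i for stocks ranked by Treynor ratio.
    Each stock is (excess_return, beta, residual_variance); a stock is a
    candidate for the portfolio while its Treynor ratio exceeds C_i."""
    ranked = sorted(stocks, key=lambda s: treynor(s[0], s[1]), reverse=True)
    num = den = 0.0
    rates = []
    for er, beta, res_var in ranked:
        num += er * beta / res_var          # cumulative (R - Rf) * beta / var_e
        den += beta * beta / res_var        # cumulative beta^2 / var_e
        rates.append(market_var * num / (1 + market_var * den))
    return ranked, rates
```

Stocks are admitted in Treynor order until the ratio drops below the running cut-off rate; clustering, as in the study, regroups candidates before this step.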