In this article, we develop a new loss function as a modification of the linear exponential (LINEX) loss function, obtained by weighting the LINEX function. We derive estimators of the scale parameter, reliability function, and hazard function based on upper record values from the Lomax distribution (LD). To study the small-sample performance of the proposed loss function, we use a Monte Carlo simulation to compare the maximum likelihood estimator, the Bayesian estimator under the LINEX loss function, and the Bayesian estimator under the squared error (SE) loss function. The results show that the modified method is the best for estimating the scale parameter, reliability function, and hazard function.
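For reference, the standard LINEX loss and a generic weighted variant can be sketched as follows. The article's specific weight function is not reproduced here, so the `1/theta` weight below is purely illustrative:

```python
import numpy as np

def linex_loss(estimate, theta, a=1.0):
    """Standard LINEX loss: exp(a*d) - a*d - 1 with d = estimate - theta.
    Asymmetric: for a > 0, overestimation costs more than underestimation."""
    d = estimate - theta
    return np.exp(a * d) - a * d - 1.0

def weighted_linex_loss(estimate, theta, a=1.0, weight=lambda t: 1.0 / t):
    """Weighted variant w(theta) * LINEX(d); the 1/theta weight is an
    illustration only, not the weight proposed in the article."""
    return weight(theta) * linex_loss(estimate, theta, a)
```

The loss is zero when the estimate equals the true value and grows roughly exponentially on one side and linearly on the other, which is what makes it attractive when over- and under-estimation have different costs.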
In this paper, the maximum likelihood estimates for the parameter ( ) of the two-parameter Weibull distribution are studied, along with White's estimators, Bain and Antle's estimators, and the Bayes estimator for the scale parameter ( ). Simulation procedures are used to obtain the estimators and to compare them by mean squared error (MSE). The methods are also applied to data on 20 patients suffering from headache disease.
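The kind of Monte Carlo MSE comparison described above can be sketched for the Weibull scale MLE with known shape; the shape, scale, sample size, and replication count below are illustrative, not the paper's settings:

```python
import numpy as np

rng = np.random.default_rng(0)
shape, scale, n, reps = 2.0, 1.5, 20, 500   # illustrative settings, not the paper's

def mle_scale(x, c):
    # With known shape c, the Weibull scale MLE is (mean of x**c) ** (1/c)
    return np.mean(x ** c) ** (1.0 / c)

# Monte Carlo: repeatedly draw a sample, estimate the scale, and average
# the squared error against the true value
estimates = np.array([mle_scale(scale * rng.weibull(shape, n), shape)
                      for _ in range(reps)])
mse = np.mean((estimates - scale) ** 2)
```

The same loop, rerun with each competing estimator in place of `mle_scale`, yields the MSE ranking used to compare the methods.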
This paper assesses the impact of changes and fluctuations in bank deposits on the money supply in Iraq. The research constructs an Error Correction Model (ECM) using monthly time series data from 2010 to 2015. The analysis begins with the Phillips-Perron unit root test to ascertain the stationarity of the time series, followed by the Engle-Granger cointegration test to examine the existence of a long-term relationship. Nonparametric regression functions are estimated using two methods: smoothing spline and M-smoothing. The results indicate that the M-smoothing approach is the most effective, achieving the shortest adjustment period and the highest adjustment ratio for short-term disturbances, thereby facilitating a return to the long-run equilibrium.
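The two-step Engle-Granger/ECM procedure described above can be sketched on synthetic stand-in series (the actual deposit and money supply data are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 72                                                  # monthly observations, 2010-2015
deposits = np.cumsum(rng.normal(size=n))                # stand-in I(1) deposits series
money = 0.8 * deposits + rng.normal(scale=0.5, size=n)  # cointegrated stand-in

def ols(y, X):
    # Ordinary least squares with an intercept; returns coefficients and residuals
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta, y - X @ beta

# Step 1 (Engle-Granger): long-run regression; residuals are the equilibrium error
lr_beta, ect = ols(money, deposits)

# Step 2: ECM on first differences with the lagged equilibrium error
dy, dx = np.diff(money), np.diff(deposits)
beta, _ = ols(dy, np.column_stack([dx, ect[:-1]]))
adj_speed = beta[2]   # negative: short-run disturbances are corrected toward equilibrium
```

A negative, significant coefficient on the lagged equilibrium error is what licenses the "adjustment period" and "adjustment ratio" language in the results: the larger its magnitude, the faster short-term disturbances die out.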
Specimens of the sesarmid crab Nanosesarma sarii (Naderloo and Türkay, 2009) were collected in 2012 from the intertidal zone of Khor Al-Zubair, Basrah, Iraq, far from the Arabian Gulf coasts. Morphological features of this species are highlighted and a figure is provided.
Background: This in vitro study measured and compared the effect of light-curing tip distance on the depth of cure, assessed by Vickers microhardness values, for two recently launched bulk-fill resin-based composites, Tetric EvoCeram Bulk Fill and Surefil SDR Flow, at 4 mm thickness, in comparison with Filtek Z250 Universal Restorative at 2 mm thickness. In addition, the bottom-to-top microhardness ratio was measured and compared at different light-curing tip distances. Materials and Methods: One hundred fifty composite specimens were obtained from two cylindrical plastic molds: the first, for the bulk-fill composites (Tetric EvoCeram Bulk Fill and Surefil SDR Flow), with 4 mm diameter and 4 mm depth; the second for Filtek Z250 Universal Restorative
The aim of the current research is to reveal the effect of using brain-based learning theory strategies on the achievement of Art Education students in the subject of Teaching Methods. An experimental design with two independent, equal groups, experimental and control, was used. The research sample totaled (60) male and female students: (30) represented the experimental group and (30) represented the control group. The researcher prepared the research tool, a cognitive achievement test consisting of (20) questions, which was characterized by validity and reliability, and the experiment lasted (6) weeks
Features are descriptions of image content, which may be corners, blobs, or edges. Corners are among the most important features for describing an image, so many algorithms have been developed to detect them, such as Harris, FAST, and SUSAN. Harris is an efficient and accurate corner detection method; it is rotation invariant but not scale invariant. This paper presents an efficient Harris corner detector that is also invariant to scale; the improvement is achieved by applying the Gaussian function at different scales. The experimental results illustrate that using the Gaussian function in this way effectively addresses this weakness of the Harris detector.
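A minimal NumPy sketch of a scale-adapted Harris detector along these lines follows. The paper's exact scales and normalization are not given here, so the scale set and the `sigma**4` normalization below are assumptions drawn from standard scale-space practice:

```python
import numpy as np

def gaussian_kernel(sigma):
    # 1-D Gaussian kernel truncated at 3 sigma, normalized to sum to 1
    r = int(3 * sigma)
    x = np.arange(-r, r + 1)
    k = np.exp(-x ** 2 / (2 * sigma ** 2))
    return k / k.sum()

def smooth(img, sigma):
    # Separable Gaussian blur along both axes
    k = gaussian_kernel(sigma)
    out = np.apply_along_axis(lambda m: np.convolve(m, k, mode="same"), 0, img)
    return np.apply_along_axis(lambda m: np.convolve(m, k, mode="same"), 1, out)

def harris_response(img, sigma, k=0.04):
    # Gradients of the image smoothed at the differentiation scale sigma
    g = smooth(img.astype(float), sigma)
    iy, ix = np.gradient(g)
    # Structure tensor entries, smoothed at the integration scale
    sxx, syy, sxy = (smooth(a, 2 * sigma) for a in (ix * ix, iy * iy, ix * iy))
    return sxx * syy - sxy ** 2 - k * (sxx + syy) ** 2

def multiscale_harris(img, sigmas=(1.0, 2.0, 4.0)):
    # Maximum scale-normalized response over a set of Gaussian scales,
    # which is what makes the detector respond consistently across scales
    return np.max([s ** 4 * harris_response(img, s) for s in sigmas], axis=0)
```

Corners of a bright region produce strong positive responses, edges produce negative ones, and flat regions stay near zero, so thresholding the maximum over scales yields scale-tolerant corner locations.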
In this paper, a fast lossless image compression method is introduced for compressing medical images. It is based on splitting the image into blocks according to their nature, using polynomial approximation to decompose the image signal, and then applying run-length coding to the residue part of the image, which represents the error caused by the polynomial approximation. Finally, Huffman coding is applied as a last stage to encode the polynomial coefficients and the run-length codes. The test results indicate that the suggested method achieves promising performance.
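The per-block polynomial approximation and run-length stage can be sketched as follows; the block layout, polynomial degree, and coding details are assumptions, and the Huffman stage is omitted for brevity. The scheme stays lossless because each block equals the rounded polynomial prediction plus the stored integer residue:

```python
import numpy as np

def compress_block(block, deg=1):
    """Fit a low-order polynomial to one 1-D image block and keep the
    coefficients plus an integer residue. The paper's exact block
    classification is not reproduced here."""
    x = np.arange(block.size)
    coeffs = np.polyfit(x, block.astype(float), deg)
    pred = np.rint(np.polyval(coeffs, x)).astype(int)
    return coeffs, block.astype(int) - pred   # small residues -> long runs

def run_length_encode(a):
    """Run-length coding of the residue as (value, run) pairs."""
    out, run = [], 1
    for prev, cur in zip(a[:-1], a[1:]):
        if cur == prev:
            run += 1
        else:
            out.append((int(prev), run))
            run = 1
    out.append((int(a[-1]), run))
    return out
```

Smooth image blocks fit the polynomial closely, so residues are mostly zeros and compress very well under run-length coding, while the coefficients and run pairs remain compact inputs for the final Huffman stage.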
In this work, we present a technique to extract heart contours from noisy echocardiographic images. Our technique improves the image before applying contour detection, reducing heavy noise and obtaining better image quality. To do so, we combine several pre-processing techniques (filtering, morphological operations, and contrast adjustment) to avoid unclear edges and enhance the low contrast of echocardiographic images; after applying these techniques, heart boundaries and valve movement can be clearly detected by traditional edge detection methods.
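A pipeline of this kind can be sketched in plain NumPy; the specific operators and parameters below (3x3 median, grayscale opening, percentile contrast stretch, Sobel magnitude) are illustrative choices, not necessarily the ones used in this work:

```python
import numpy as np

def neighborhoods3(img):
    """Stack of the 9 shifted copies forming each pixel's 3x3 neighborhood."""
    p = np.pad(img, 1, mode="edge")
    h, w = img.shape
    return np.stack([p[i:i + h, j:j + w] for i in range(3) for j in range(3)])

def median_filter3(img):
    # 3x3 median filter: suppresses speckle noise typical of ultrasound
    return np.median(neighborhoods3(img), axis=0)

def opening3(img):
    # Grayscale opening (erosion then dilation): removes small bright artifacts
    return np.max(neighborhoods3(np.min(neighborhoods3(img), axis=0)), axis=0)

def stretch_contrast(img, lo=2, hi=98):
    # Percentile-based contrast stretch to [0, 1] for low-contrast frames
    a, b = np.percentile(img, [lo, hi])
    return np.clip((img - a) / (b - a + 1e-9), 0.0, 1.0)

def sobel_edges(img):
    # Sobel gradient magnitude as a stand-in for the final edge detection step
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
    n = neighborhoods3(img)
    gx = sum(kx[i, j] * n[3 * i + j] for i in range(3) for j in range(3))
    gy = sum(kx.T[i, j] * n[3 * i + j] for i in range(3) for j in range(3))
    return np.hypot(gx, gy)

def preprocess_and_detect(img):
    # Denoise -> morphology -> contrast enhancement -> edges, as in the text
    return sobel_edges(stretch_contrast(opening3(median_filter3(img))))
```

Running edge detection on the enhanced frame rather than the raw one is the whole point of the pipeline: the denoising and contrast steps keep the weak cardiac boundaries from being drowned out by speckle.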