This paper deals with Bayesian estimation of the parameters of the Gamma distribution under a generalized weighted loss function, based on Gamma and exponential priors for the shape and scale parameters, respectively. Moment and maximum likelihood estimators, together with Lindley's approximation, are used in the Bayesian estimation. The estimators are compared in terms of mean squared error (MSE) using Monte Carlo simulation.
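A minimal sketch of the Monte Carlo MSE comparison described above, using only the method-of-moments estimator of the Gamma shape parameter (the paper's Bayesian and Lindley-approximation estimators are not reproduced here); the parameter values, sample sizes, and replication count are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def moment_estimates(x):
    # Method of moments for Gamma(shape a, scale b):
    # mean = a*b and var = a*b**2, so a = mean**2/var, b = var/mean
    m, v = x.mean(), x.var(ddof=1)
    return m * m / v, v / m

def mse_of_shape(a, b, n, reps=2000):
    # Monte Carlo MSE of the moment estimator of the shape parameter
    est = np.array([moment_estimates(rng.gamma(a, b, n))[0] for _ in range(reps)])
    return np.mean((est - a) ** 2)

# The MSE shrinks as the sample size grows
print(mse_of_shape(2.0, 1.5, 25), mse_of_shape(2.0, 1.5, 200))
```

The same loop, repeated for each competing estimator, yields the MSE tables such studies report.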
In this paper, the parameters and the reliability function of the transmuted power function (TPF) distribution are estimated using several methods: a newly proposed technique for the White method, along with the percentile, least squares, weighted least squares, and modified moment methods. Simulation was used to generate random data following the TPF distribution in three experiments (E1, E2, E3) of real parameter values, with sample sizes (n = 10, 25, 50, 100), (N = 1000) replications, and reliability times (0 < t < 0). The estimators were compared using the mean squared error (MSE). The results showed the
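The inverse-transform sampler behind such a simulation can be sketched as follows, assuming the usual transmuted CDF F(x) = (1 + λ)G(x) − λG(x)² with power-function base G(x) = (x/β)^α on (0, β); the parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

def tpf_sample(alpha, beta, lam, n):
    """Inverse-transform sampling for the transmuted power function (TPF).

    Base power-function CDF: G(x) = (x/beta)**alpha on (0, beta).
    Transmuted CDF: F(x) = (1 + lam)*G - lam*G**2, with -1 <= lam <= 1.
    """
    u = rng.uniform(size=n)
    if lam == 0:
        p = u                       # reduces to the plain power function
    else:
        # Solve lam*p**2 - (1 + lam)*p + u = 0 for p = G(x) in [0, 1]
        p = ((1 + lam) - np.sqrt((1 + lam) ** 2 - 4 * lam * u)) / (2 * lam)
    return beta * p ** (1.0 / alpha)

x = tpf_sample(alpha=2.0, beta=3.0, lam=0.5, n=1000)
print(x.min(), x.max())  # all samples lie in (0, beta)
```

Each estimation method is then applied to samples generated this way, and the MSE is accumulated over the N replications.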
The analysis of survival and reliability is among the vital topics and methods of statistics at present because of its importance in various demographic, medical, industrial, and engineering fields. This research focuses on generating random data samples from the generalized Gamma (GG) probability distribution using the inverse transformation method (ITM). Because the distribution function involves the incomplete Gamma integral, classical estimation becomes difficult, so a numerical approximation method is illustrated and then used to estimate the survival function. The survival function was estimated by Monte Carlo simulation. The entropy method was used for the
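A hedged sketch of the Monte Carlo survival estimate: rather than numerically inverting the incomplete Gamma integral, it uses the equivalent power transform of an ordinary Gamma variate (if X ~ Gamma(a, 1), then X^(1/c) follows a generalized Gamma law); the parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

def gg_sample(a, c, n):
    # If X ~ Gamma(shape=a, scale=1), then Y = X**(1/c) has generalized
    # Gamma density c * y**(c*a - 1) * exp(-y**c) / Gamma(a).
    return rng.gamma(a, 1.0, n) ** (1.0 / c)

def survival_mc(a, c, t, n=100_000):
    # Monte Carlo estimate of the survival function S(t) = P(Y > t)
    return np.mean(gg_sample(a, c, n) > t)

# For c = 1 this reduces to the ordinary Gamma survival function:
# S(1) = 2*exp(-1) ≈ 0.7358 when a = 2
print(survival_mc(2.0, 1.0, 1.0))
```

The empirical survival estimate converges to the true S(t) at the usual 1/sqrt(n) Monte Carlo rate.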
To ascertain the stability or instability of time series, three versions of the Dickey-Fuller test were used in this paper. The aim of this study is to explain the extent of the impact of some economic variables, such as the money supply, gross domestic product, and national income, after establishing the stationarity of these variables. The results show that the money supply, GDP, and exchange rate variables were all stationary at the first difference of the time series, meaning the series are integrated of order one. Hence, the gross fixed capital formation, national income, and interest rate variables
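The regression underlying such unit-root tests can be sketched as follows; this is a minimal constant-only Dickey-Fuller version, and the resulting t-statistic must still be compared against Dickey-Fuller critical values, which are omitted here:

```python
import numpy as np

rng = np.random.default_rng(3)

def df_tstat(y):
    # Dickey-Fuller regression with a constant: dy_t = alpha + rho*y_{t-1} + e_t.
    # A t-statistic near zero suggests a unit root (rho = 0); a large negative
    # value supports stationarity.
    dy = np.diff(y)
    X = np.column_stack([np.ones(len(dy)), y[:-1]])
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    s2 = resid @ resid / (len(dy) - 2)          # residual variance
    cov = s2 * np.linalg.inv(X.T @ X)           # OLS covariance matrix
    return beta[1] / np.sqrt(cov[1, 1])

walk = np.cumsum(rng.normal(size=500))           # unit-root (random walk) series
ar1 = np.zeros(500)
for t in range(1, 500):
    ar1[t] = 0.5 * ar1[t - 1] + rng.normal()     # stationary AR(1) series

print(df_tstat(walk), df_tstat(ar1))  # the AR(1) statistic is far more negative
```

Differencing a unit-root series once and re-running the test is exactly how the "stationary at the first difference" conclusion above is reached.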
The utilization of carbon dioxide (CO₂) to enhance wellbore injectivity presents a cost-effective and sustainable strategy for mitigating greenhouse gas emissions while improving reservoir performance. This study introduces an environmentally friendly method employing a water-soluble chitosan salt (CS) that generates a carbonic acid-rich solution upon contact with dry CO₂ at 25 °C and 508 psi. CS solutions (100–2000 ppm) were prepared and evaluated for CO₂ uptake, acid generation, and rheological behavior. Results show that 1000 ppm achieves an optimal CO₂ uptake (2612 mg/L) with a moderate viscosity increase (from 1.52 to 3.37 cP), while higher concentrations exhibit a sharp rise due to polymer-like network formation. Core floodi
... Show MoreLowpass spatial filters are adopted to match the noise statistics of the degradation seeking
good quality smoothed images. This study imply different size and shape of smoothing
windows. The study shows that using a window square frame shape gives good quality
smoothing and at the same time preserving a certain level of high frequency components in
comparsion with standard smoothing filters.
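A square-frame smoothing window of the kind described can be sketched as follows; the window size and test image are illustrative assumptions. The frame kernel weights only the border of the window, so the centre pixel contributes nothing to its own smoothed value:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def frame_kernel(k):
    # Square "frame" window: nonzero weights on the border of a k x k
    # window only, normalized to sum to 1.
    w = np.ones((k, k))
    w[1:-1, 1:-1] = 0.0
    return w / w.sum()

def smooth(img, kernel):
    # Plain 2-D correlation with edge-replicated padding
    k = kernel.shape[0]
    pad = k // 2
    padded = np.pad(img, pad, mode='edge')
    windows = sliding_window_view(padded, (k, k))
    return np.einsum('ijkl,kl->ij', windows, kernel)

img = np.zeros((9, 9))
img[4, 4] = 1.0                      # single bright pixel (impulse)
out = smooth(img, frame_kernel(5))
print(out[4, 4])  # the centre pixel gets zero weight from its own value
```

Because the impulse response is hollow, isolated high-frequency detail is attenuated more gently than with a solid box filter of the same size.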
Object tracking is one of the most important topics in the fields of image processing and computer vision. It is the process of finding interesting moving objects and following them from frame to frame. In this research, an active-model-based object tracking algorithm is introduced. Active models are curves placed in an image domain that can evolve to segment the object of interest. The Adaptive Diffusion Flow Active Model (ADFAM) is one of the most famous types of active models. It overcomes the drawbacks of all previous versions of active models, especially the leakage problem, noise sensitivity, and long narrow holes or concavities. The ADFAM is well known for its very good capabilities in the segmentation process. In this
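The ADFAM itself is not reproduced here, but the general idea of a curve evolving toward object boundaries can be illustrated with a classical greedy snake on a synthetic image; the energy weights and the shrink-toward-centroid term are illustrative assumptions, not the ADFAM formulation:

```python
import numpy as np

# Synthetic scene: a bright disk of radius 10 on a 64 x 64 image
yy, xx = np.mgrid[0:64, 0:64]
img = ((yy - 32) ** 2 + (xx - 32) ** 2 <= 10 ** 2).astype(float)
gy, gx = np.gradient(img)
edge = np.hypot(gx, gy)                      # edge-strength map

# Initial contour: 24 points on a circle of radius 20 around the object
ang = np.linspace(0, 2 * np.pi, 24, endpoint=False)
snake = np.stack([32 + 20 * np.sin(ang), 32 + 20 * np.cos(ang)], axis=1)

offsets = [(dy, dx) for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
for _ in range(20):
    centre = snake.mean(axis=0)
    for i, (y, x) in enumerate(snake):
        # Greedy step: shrink toward the contour centroid unless a strong
        # edge in the 3 x 3 neighbourhood makes staying (or snapping) cheaper
        def energy(p):
            return np.linalg.norm(p - centre) - 10.0 * edge[int(p[0]), int(p[1])]
        cands = [np.array([y + dy, x + dx]) for dy, dx in offsets]
        snake[i] = min(cands, key=energy)

radii = np.linalg.norm(snake - snake.mean(axis=0), axis=1)
print(radii.mean())  # close to the true object radius of 10
```

The curve contracts until the edge term locks its points onto the disk boundary, which is the same stopping behaviour a diffusion-flow model achieves with a smoother external force field.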
An acceptable bit error rate can be maintained by adapting some of the design parameters, such as modulation, symbol rate, constellation size, and transmit power, according to the channel state.
An estimate of HF propagation effects can be used to design an adaptive data transmission system over an HF link. The proposed system combines the well-known Automatic Link Establishment (ALE) with a variable-rate transmission system. The standard ALE is modified to suit the required goal of selecting the best carrier frequency (channel) for a given transmission. This is based on measuring SINAD (Signal-plus-Noise-plus-Distortion to Noise-plus-Distortion ratio), RSL (Received Signal Level), multipath phase distortion, and BER (Bit Error Rate) fo
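The rate-adaptation idea can be sketched as a constellation selector that keeps the predicted BER below a target, using the standard Gray-coded square M-QAM approximation over AWGN; the target BER and candidate constellations are illustrative assumptions, not the proposed system's parameters:

```python
import math

def qfunc(x):
    # Gaussian tail probability Q(x)
    return 0.5 * math.erfc(x / math.sqrt(2))

def mqam_ber(M, ebn0_db):
    # Standard approximation for Gray-coded square M-QAM over AWGN
    k = math.log2(M)
    ebn0 = 10 ** (ebn0_db / 10)
    return (4 / k) * (1 - 1 / math.sqrt(M)) * qfunc(math.sqrt(3 * k * ebn0 / (M - 1)))

def pick_constellation(ebn0_db, target_ber=1e-4):
    # Adaptive modulation: largest constellation whose predicted BER
    # still meets the target at the measured channel quality
    for M in (64, 16, 4):
        if mqam_ber(M, ebn0_db) <= target_ber:
            return M
    return None  # channel too poor; defer or re-select the channel

for snr in (6, 12, 20):
    print(snr, pick_constellation(snr))
```

In an ALE-style system, the measured SINAD/RSL on each candidate channel would feed the channel-quality input of such a selector.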
The general-topology view of continuous mappings is a general framework that applies to topological graph theory. Separation axioms can be regarded as tools for distinguishing objects in information systems. Rough set theory is one way of mapping topology to uncertainty. The aim of this work is to present graphs, continuity, separation properties, and rough sets in order to put forward new approaches to uncertainty. To introduce various levels of approximation, we introduce several levels of continuity and separation axioms on graphs in Gm-closure approximation spaces.
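The rough-set notion of approximating a set from the blocks of a partition, which the graph-based levels above generalize, can be sketched as follows; the universe and partition are illustrative:

```python
def approximations(classes, target):
    """Lower and upper rough-set approximations of `target` with respect to
    an equivalence relation given by its partition `classes`."""
    target = set(target)
    lower, upper = set(), set()
    for c in classes:
        c = set(c)
        if c <= target:
            lower |= c          # blocks entirely inside the target
        if c & target:
            upper |= c          # blocks that touch the target at all
    return lower, upper

classes = [{1, 2}, {3, 4}, {5}]
lo, up = approximations(classes, {1, 2, 3})
print(lo, up)  # {1, 2} and {1, 2, 3, 4}; the boundary {3, 4} marks uncertainty
```

A set is "rough" exactly when the lower and upper approximations differ; finer partitions (finer closure operators on the graph) shrink that boundary region.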