Bayesian estimation of reliability in the stress-strength model, which describes the life of a component with strength X subjected to stress Y (the component fails if and only if the applied stress exceeds the strength at any time), has been studied; the reliability R = P(Y < X) can then be regarded as a measure of component performance. In this paper, a Bayesian analysis of R is carried out when X and Y are independent Weibull random variables with a common shape parameter α, in order to study the effect of each of the two different scale parameters β and λ, respectively, using three loss functions (weighted, quadratic, and entropy) under two prior distributions (Gamma and extension of Jeffreys), together with an empirical Bayes estimator using the Gamma prior, for singly Type-II censored samples. An empirical study compares the three estimators of the reliability of the stress-strength Weibull model by the mean squared error (MSE) criterion, taking different sample sizes (small, moderate, and large) for the two random variables in eight experiments with different values of their parameters. It was found that the weighted loss function performed best for small sample sizes, while the entropy and quadratic loss functions performed best for moderate and large sample sizes under both prior distributions and for the empirical Bayes estimator.
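For two independent Weibull variables sharing a shape parameter, R = P(Y < X) has a simple closed form, which a quick Monte Carlo check can confirm. The parameter values below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed values: common shape alpha, scale beta for strength X,
# scale lam for stress Y.
alpha, beta, lam = 2.0, 3.0, 1.5
n = 200_000

# Weibull(shape alpha, scale) samples for strength X and stress Y.
X = beta * rng.weibull(alpha, n)
Y = lam * rng.weibull(alpha, n)

# Monte Carlo estimate of the reliability R = P(Y < X).
R_hat = np.mean(Y < X)

# Closed form when X and Y share the shape parameter alpha:
# R = beta**alpha / (beta**alpha + lam**alpha).
R_exact = beta**alpha / (beta**alpha + lam**alpha)
print(round(R_hat, 3), round(R_exact, 3))
```

The agreement between the two numbers is what the simulation studies in the paper rely on when judging estimators by MSE.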
The objective of this study is to determine which has the better predictive ability, the logistic regression model or the linear discriminant function, first using the original data and then using principal components to reduce the dimensionality of the variables. The data come from the socio-economic survey of families in Baghdad province in 2012 and comprise a sample of 615 observations with 13 variables, 12 of them explanatory, the dependent variable being the number of workers and the unemployed.
A comparison of the two methods was conducted, and it became clear that the logistic regression model outperforms the linear discriminant function.
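The comparison described above can be sketched with scikit-learn; the dataset here is a synthetic stand-in for the survey data (615 observations, 12 explanatory variables), and the number of retained components and other settings are assumptions for illustration:

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

# Synthetic stand-in: 615 observations, 12 explanatory variables,
# binary response.
X, y = make_classification(n_samples=615, n_features=12,
                           n_informative=6, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=0)

models = {
    "logistic": LogisticRegression(max_iter=1000),
    "LDA": LinearDiscriminantAnalysis(),
    # The same classifiers after principal-component dimension reduction.
    "logistic+PCA": make_pipeline(PCA(n_components=5),
                                  LogisticRegression(max_iter=1000)),
    "LDA+PCA": make_pipeline(PCA(n_components=5),
                             LinearDiscriminantAnalysis()),
}
scores = {name: m.fit(X_tr, y_tr).score(X_te, y_te)
          for name, m in models.items()}
print(scores)
```

Comparing held-out accuracy before and after PCA mirrors the study's two-stage design (original data first, reduced dimensions second).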
In this work, an explicit formula for a class of bi-Bazilevic univalent functions involving a differential operator is given, and upper bounds for the general Taylor-Maclaurin coefficients of functions belonging to this class are established. Faber polynomials are used as a coordinate system to study the geometry of the coefficient manifold of these functions, and bounds for the first two coefficients of such functions are also determined.
In certain cases, our initial estimates improve some of the known coefficient bounds and connect them to thoughtful results published earlier.
Abstract
In this research, the reliability of the water supply network in Baghdad was estimated in order to assess its performance during a specific period. A fault tree with static and dynamic gates was built, where these gates represent logical relationships between the main events in the network, and it was analyzed using dynamic Bayesian networks. Dynamic Bayesian networks were applied to estimate reliability by translating the dynamic fault tree into a dynamic Bayesian network, from which the reliability of the system was assessed, and the probability of each phase of the network was computed for each gate. The dynamic Bayesian network consists of two parts, as does the AND gate, which includes the three basic units of the
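As a small illustration of the gate logic underlying such a fault tree, a static AND gate fails only when all of its inputs have failed. The sketch below checks the resulting reliability formula by Monte Carlo, with assumed component failure rates; the paper's dynamic gates and DBN translation are considerably richer than this:

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed component failure rates (per year) and mission time; these
# numbers are illustrative, not taken from the study.
rates = np.array([0.10, 0.15, 0.20])
t = 5.0
n = 200_000

# Exponential component lifetimes; an AND gate fails only when ALL
# inputs have failed, so its lifetime is the maximum of the inputs.
lifetimes = rng.exponential(1.0 / rates, size=(n, rates.size))
gate_life = lifetimes.max(axis=1)

# Reliability at time t: probability the gate has not yet failed.
R_hat = np.mean(gate_life > t)

# Closed form: R(t) = 1 - prod_i (1 - exp(-rate_i * t)).
R_exact = 1.0 - np.prod(1.0 - np.exp(-rates * t))
print(round(R_hat, 3), round(R_exact, 3))
```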
In this paper, we derive estimators of the parameters and of the reliability and hazard functions of a new mixed (Rayleigh-Logarithmic) distribution with two parameters and an increasing failure rate, using the Bayes method with the squared error loss function, a Jeffreys prior, and the conditional probability of the observed random variable. The main objective of this study is to assess the efficiency of the derived Bayesian estimator relative to the maximum likelihood estimator of these functions by Monte Carlo simulation under different Rayleigh-Logarithmic parameters and sample sizes. The results show that the Bayes estimator is more efficient than the maximum likelihood estimator for all sample sizes, together with an application.
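The Rayleigh-Logarithmic mixture itself is not specified here, so the MSE comparison between a Bayes estimator and the MLE can only be illustrated on a plain Rayleigh scale parameter; the prior hyperparameters below are assumptions, chosen so that the prior is centered on the true value:

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative only: a plain Rayleigh scale parameter theta, with
# density f(x) = (x/theta) * exp(-x**2 / (2*theta)).  The inverse-gamma
# prior hyperparameters a0, b0 are assumed; prior mean b0/(a0-1) = theta.
theta_true = 2.0
a0, b0 = 3.0, 4.0
n, reps = 20, 5000

mse_mle = mse_bayes = 0.0
for _ in range(reps):
    x = rng.rayleigh(scale=np.sqrt(theta_true), size=n)
    t = np.sum(x**2) / 2.0              # sufficient statistic
    mle = t / n                         # maximum likelihood estimator
    bayes = (b0 + t) / (a0 + n - 1)     # posterior mean (squared error loss)
    mse_mle += (mle - theta_true) ** 2
    mse_bayes += (bayes - theta_true) ** 2

mse_mle /= reps
mse_bayes /= reps
print(round(mse_mle, 4), round(mse_bayes, 4))
```

With a well-centered prior the Bayes estimator's MSE comes out below the MLE's at this small sample size, which is the qualitative pattern the paper reports.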
In this study, the stress-strength model R = P(Y < X < Z) is discussed as an important part of reliability systems, assuming that the random variables follow the inverse Rayleigh distribution. Several traditional estimation methods are used to estimate the parameters, namely maximum likelihood, the method of moments, the uniformly minimum variance unbiased estimator, and shrinkage estimators with three types of shrinkage weight factors. Monte Carlo simulation is then used to compare the estimation methods based on the mean squared error criterion.
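When Y, X, and Z are independent inverse Rayleigh variables, R = P(Y < X < Z) admits a closed form; a Monte Carlo sketch with assumed parameter values (not from the paper) confirms it:

```python
import numpy as np

rng = np.random.default_rng(3)

def rinvrayleigh(lam, size, rng):
    """Sample from the inverse Rayleigh distribution with CDF
    F(x) = exp(-lam / x**2), x > 0, by inverse-transform sampling."""
    u = rng.uniform(size=size)
    return np.sqrt(-lam / np.log(u))

# Assumed parameter values for illustration.
lam_y, lam_x, lam_z = 0.5, 1.0, 2.0
n = 200_000

Y = rinvrayleigh(lam_y, n, rng)
X = rinvrayleigh(lam_x, n, rng)
Z = rinvrayleigh(lam_z, n, rng)

# Monte Carlo estimate of R = P(Y < X < Z): the strength X must exceed
# the lower stress Y while staying below the upper stress Z.
R_hat = np.mean((Y < X) & (X < Z))

# Closed form: R = lam_x * (1/(lam_x+lam_y) - 1/(lam_x+lam_y+lam_z)).
R_exact = lam_x * (1/(lam_x + lam_y) - 1/(lam_x + lam_y + lam_z))
print(round(R_hat, 4), round(R_exact, 4))
```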
Novel metal complexes of Cu(II), Co(II), Cd(II), and Ru(III) with the azo ligand 5-((2-(1H-indol-2-yl)ethyl)diazinyl)-2-aminophenol were synthesized by simple substitution of tryptamine with 2-aminophenol. The structures of all the newly synthesized compounds were characterized by FT-IR, UV-Vis, and mass spectroscopy, elemental analysis, and measurements of magnetic moments, molar conductance, and atomic absorption. Their thermal stability was then studied using TGA and DSC curves, and the DSC curve was used to calculate the thermodynamic parameters ΔH, ΔS, and ΔG. The analytical data showed that all complexes have a metal:ligand ratio of 1:1. In all of the complexes examined, the ligand behaves as a tri
The Weibull distribution belongs to the generalized extreme value (GEV) family of distributions and plays a crucial role in modeling extreme events in various fields such as hydrology, finance, and the environmental sciences. Bayesian methods play a strong, decisive role in estimating the parameters of the GEV distribution because of their ability to incorporate prior knowledge and to handle small sample sizes effectively. In this research, we compare several shrinkage Bayesian estimation methods based on the squared error and linear exponential (LINEX) loss functions; they were adopted and compared by the Monte Carlo simulation method. The performance of these methods is assessed in terms of their accuracy and computational efficiency in estimation.
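For a concrete sense of how the two loss functions lead to different Bayes estimators, the sketch below works with the Weibull rate parameter under a conjugate Gamma prior (known shape; all hyperparameters are assumptions, and the paper's shrinkage machinery is not reproduced). Under squared error loss the Bayes estimator is the posterior mean; under LINEX loss it is -(1/c) * log E[exp(-c*theta) | data]:

```python
import numpy as np

rng = np.random.default_rng(4)

# Weibull density f(x) = k*theta*x**(k-1)*exp(-theta*x**k) with known
# shape k and rate theta; Gamma(a, b) prior on theta is conjugate.
# All numeric values below are assumed for illustration.
k, theta_true = 2.0, 1.5
a, b, c = 2.0, 1.0, 0.5        # prior hyperparameters, LINEX parameter
n = 50

# Sample Weibull(k, rate theta): scale a standard Weibull draw.
x = rng.weibull(k, n) * theta_true ** (-1.0 / k)
S = np.sum(x**k)

alpha_post, beta_post = a + n, b + S   # Gamma posterior (shape, rate)

# Squared error loss -> posterior mean.
theta_se = alpha_post / beta_post
# LINEX loss -> -(1/c) * log E[exp(-c*theta) | data], which has a
# closed form for a Gamma posterior.
theta_linex = (alpha_post / c) * np.log1p(c / beta_post)
print(round(theta_se, 3), round(theta_linex, 3))
```

For c > 0 the LINEX estimator sits below the posterior mean, reflecting the asymmetric penalty on overestimation.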
Abstract:
This research aims to compare the Bayesian method and full maximum likelihood for estimating the hierarchical Poisson regression model.
The comparison was carried out by simulation using different sample sizes (n = 30, 60, 120) and different numbers of replications (r = 1000, 5000) for the experiments. The mean squared error was adopted to compare the estimation methods and to choose the best way to estimate the model. It was concluded that the hierarchical Poisson regression model estimated by full maximum likelihood with sample size (n = 30) is the best at representing the maternal mortality data after relying on the value of the param
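As context for the full-maximum-likelihood fit, the sketch below fits an ordinary (non-hierarchical) Poisson regression by Newton-Raphson on simulated data; the hierarchical model in the study adds group-level random effects on top of this likelihood, and all numbers here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulated design with intercept and two covariates, Poisson response
# with log link: y_i ~ Poisson(exp(x_i' beta)).
n, p = 120, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta_true = np.array([0.5, 0.3, -0.2])
y = rng.poisson(np.exp(X @ beta_true))

# Newton-Raphson maximization of the Poisson log-likelihood.
beta = np.zeros(p)
for _ in range(25):
    mu = np.exp(X @ beta)                      # fitted means
    grad = X.T @ (y - mu)                      # score vector
    hess = X.T @ (X * mu[:, None])             # Fisher information
    beta = beta + np.linalg.solve(hess, grad)  # Newton step

print(np.round(beta, 3))
```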
Liquefied petroleum gas (LPG), natural gas (NG), and hydrogen were each used to operate a Ricardo E6 spark-ignition internal combustion engine. The CO emissions from each case were compared with the emissions from the engine fueled with gasoline.
The study compared engine operation at the highest useful compression ratio (HUCR) for gasoline (8:1) with operation at the HUCR for each alternative fuel. Compression ratio, equivalence ratio, and spark timing were studied at a constant speed of 1500 rpm.
CO concentrations were low at lean equivalence ratios, where they appeared to be only slightly affected by the equivalence ratio; on the rich side the values became higher and were strongly affected by the equivalence ratio. The results s
In data mining, classification is a form of data analysis that can be used to extract models describing important data classes. Two well-known algorithms used in data mining classification are the Backpropagation Neural Network (BNN) and Naïve Bayes (NB). This paper investigates the performance of these two classification methods on the Car Evaluation dataset. A model was built for each algorithm and the results were compared. Our experimental results indicate that the BNN classifier yields higher accuracy than the NB classifier, but it is less efficient because it is time-consuming and difficult to analyze due to its black-box implementation.
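A minimal version of the comparison can be sketched with scikit-learn; the Car Evaluation dataset is not bundled with the library, so a synthetic stand-in is used here, with MLPClassifier playing the role of the backpropagation-trained network:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for the Car Evaluation data (4 classes, matching
# the original unacc/acc/good/vgood labels); all settings are assumptions.
X, y = make_classification(n_samples=1000, n_features=6, n_informative=4,
                           n_classes=4, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=0)

# Backpropagation network vs Gaussian Naive Bayes.
bnn = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000,
                    random_state=0).fit(X_tr, y_tr)
nb = GaussianNB().fit(X_tr, y_tr)

acc_bnn = bnn.score(X_te, y_te)
acc_nb = nb.score(X_te, y_te)
print("BNN accuracy:", round(acc_bnn, 3), "| NB accuracy:", round(acc_nb, 3))
```

The accuracy/efficiency trade-off the paper reports shows up here too: the network typically scores higher but takes far longer to train than the closed-form Naive Bayes fit.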