The δ-mixing ratios of γ-transitions from levels populated in the studied reactions are calculated in the present work using the ratio, constant statistical tensor, and least-squares fitting methods. The results obtained are, in general, in good agreement, or consistent within the associated uncertainties, with those reported in Ref. [9]; the discrepancies that occur are due to inaccuracies in the experimental data. The results of the present work confirm that the ratio method works better for mixed transitions than for pure transitions, because it depends only on the experimental data, whereas the second method relies on γ-transitions that are pure or considered to be pure; the same behavior occurs with the least-squares fitting method.
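As a rough illustration of the least-squares fitting step named above (not the paper's actual analysis), the sketch below scans the mixing ratio δ and minimizes χ² between a measured angular-distribution coefficient and its standard two-multipole theoretical form; the F-coefficient values, the alignment factor, and the measured a2 are hypothetical placeholders.

```python
import numpy as np

# Hypothetical F-coefficients for an L=1 / L'=2 mixed transition; real values
# depend on the spin sequence of the level scheme, which is not given here.
F2_LL, F2_LLp, F2_LpLp = 0.102, 0.680, 0.217

def a2_theory(delta, alignment=0.7):
    """Standard two-multipole form: a ratio of quadratics in delta, / (1 + delta^2)."""
    return alignment * (F2_LL + 2.0 * delta * F2_LLp
                        + delta**2 * F2_LpLp) / (1.0 + delta**2)

# Hypothetical experimental coefficient and its uncertainty.
a2_exp, sigma_a2 = 0.25, 0.03

# Scan arctan(delta) so the grid covers delta in (-inf, +inf) uniformly.
taus = np.linspace(-np.pi / 2 + 1e-3, np.pi / 2 - 1e-3, 2000)
deltas = np.tan(taus)
chi2 = ((a2_exp - a2_theory(deltas)) / sigma_a2) ** 2

best = deltas[np.argmin(chi2)]
print(f"best-fit delta = {best:.3f}, chi2_min = {chi2.min():.3f}")
```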
In this work, surface roughness was measured with an optical technique (the laser speckle technique) that uses the statistical properties of the speckle pattern from the point of view of computer image texture analysis. Four calibration relationships were used to cover a wide measurement range with the same laser speckle technique. The first is based on the intensity contrast of the speckle, the second on analysis of the speckle binary image, the third on the size of the speckle pattern spot, and the fourth on the energy feature of the gray-level co-occurrence matrices of the speckle pattern. With these calibration relationships, the surface roughness of an object surface can be evaluated within the …
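A minimal sketch of two of the calibration quantities named above, computed on a synthetic speckle-like image: the intensity contrast C = σ_I/⟨I⟩ and the energy (angular second moment) of a gray-level co-occurrence matrix. The image, the quantization level, and the pixel offset are placeholder choices, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(0)
# Placeholder speckle-like image: fully developed speckle intensity is
# approximately negative-exponentially distributed.
img = rng.exponential(scale=1.0, size=(256, 256))

# 1) Speckle intensity contrast C = std(I) / mean(I); C -> 1 for fully
#    developed speckle and decreases as surface-dependent averaging sets in.
contrast = img.std() / img.mean()

# 2) GLCM energy (angular second moment) for a horizontal offset of 1 pixel.
levels = 16
q = np.minimum((img / img.max() * levels).astype(int), levels - 1)
glcm = np.zeros((levels, levels))
for i, j in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
    glcm[i, j] += 1
glcm /= glcm.sum()                  # normalize to co-occurrence probabilities
energy = (glcm ** 2).sum()          # ASM / "energy" texture feature

print(f"speckle contrast = {contrast:.3f}, GLCM energy = {energy:.4f}")
```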
To achieve sustainability, waste materials can be used in concrete as alternative components, reducing the production of Portland cement. Lime cement was used instead of Portland cement, and 15% of the cement's weight was replaced with silica fume. Eco-friendly fibers (copper fibers) made from recycled electrical wires were also used. This work examines the impact of utilizing sustainable copper fibers with different aspect ratios (l/d) on some mechanical properties of high-strength green concrete. The work required a high-strength cement mixture with a compressive strength of 65 MPa, proportioned in line with ACI 211.4R. Copper fibers at 1% by volume of concrete were employed in mixes with four different aspect ratios.
Semi-parametric model analysis is one of the most interesting subjects in recent studies because it yields efficient model estimation. The problem considered arises when the response variable takes one of two values, 0 (no response) or 1 (response), which leads to the logistic regression model. We compare two estimation methods, the Bayesian method and maximum likelihood, and the results are compared using the MSE criterion. A simulation study was used to examine the empirical behavior of the logistic model with different sample sizes and variances. The results indicate that the Bayesian method is better than maximum likelihood at small sample sizes.
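A minimal simulation sketch in the spirit of the comparison described above: data are generated from a logistic model, coefficients are estimated by maximum likelihood and by a simple Bayesian stand-in (a MAP estimate under a Gaussian prior, i.e. L2-penalized logistic regression), and the two are compared by the MSE of the coefficient estimates. All settings (true coefficients, prior strength, sample sizes) are illustrative assumptions, not the paper's design.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
beta_true = np.array([1.0, -2.0])          # illustrative true coefficients

def simulate(n):
    X = rng.normal(size=(n, 2))
    p = 1.0 / (1.0 + np.exp(-X @ beta_true))
    return X, rng.binomial(1, p)

def mse(est):
    return np.mean((est - beta_true) ** 2)

for n in (25, 50, 200):
    err_mle, err_map = [], []
    for _ in range(200):                   # Monte Carlo replications
        X, y = simulate(n)
        if y.min() == y.max():             # skip one-class degenerate samples
            continue
        # MLE: essentially unpenalized logistic regression (huge C).
        mle = LogisticRegression(C=1e10, fit_intercept=False).fit(X, y)
        # Bayesian stand-in: MAP under a N(0, 1) prior == L2 penalty with C=1.
        bmap = LogisticRegression(C=1.0, fit_intercept=False).fit(X, y)
        err_mle.append(mse(mle.coef_.ravel()))
        err_map.append(mse(bmap.coef_.ravel()))
    print(f"n={n:4d}  MSE(MLE)={np.mean(err_mle):.3f}  "
          f"MSE(MAP)={np.mean(err_map):.3f}")
```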
The question of estimation has received great interest in engineering and statistical applications and in various applied and human sciences, as the methods it provides help to identify many random processes accurately. In this paper, methods were used to estimate the reliability function, the hazard function, and the parameters of the distribution, namely the moment method and the maximum likelihood method. An experimental study was conducted using simulation in order to compare the methods and show which of them performs best in practical application, based on observations generated from the Rayleigh logarithmic (RL) distribution with sample sizes …
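The density of the Rayleigh logarithmic compound is not given above, so the sketch below uses the standard Rayleigh distribution as a stand-in to show the mechanics being compared: the moment and maximum likelihood estimators of the scale are plugged into the reliability function R(t) = exp(−t²/2σ²), and their MSEs are compared by simulation. All settings are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
sigma_true, n, reps = 2.0, 30, 1000
t0 = 2.5                                   # time point at which R(t) is estimated

def reliability(t, sigma):
    """Rayleigh reliability R(t) = exp(-t^2 / (2 sigma^2)); hazard is t/sigma^2."""
    return np.exp(-t**2 / (2.0 * sigma**2))

R_true = reliability(t0, sigma_true)
err_mom, err_mle = [], []
for _ in range(reps):
    x = sigma_true * np.sqrt(-2.0 * np.log(rng.uniform(size=n)))  # Rayleigh draws
    sig_mom = x.mean() * np.sqrt(2.0 / np.pi)   # moments: E[X] = sigma*sqrt(pi/2)
    sig_mle = np.sqrt((x**2).mean() / 2.0)      # MLE: sigma^2 = sum(x^2) / (2n)
    err_mom.append((reliability(t0, sig_mom) - R_true) ** 2)
    err_mle.append((reliability(t0, sig_mle) - R_true) ** 2)

print(f"MSE of R({t0}): moments = {np.mean(err_mom):.5f}, "
      f"MLE = {np.mean(err_mle):.5f}")
```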
This paper delves into some significant performance measures (PMs) of a bulk arrival queueing system with constant batch size b, in which the arrival and service rates are fuzzy parameters. The bulk arrival queueing system admits arrivals into the system as groups of constant size b before individual customers enter service; this leads to a new tool built with the aid of generating-function methods. The corresponding traditional bulk queueing system model is more convenient under an uncertain environment. The α-cut approach is applied with the conventional Zadeh's extension principle (ZEP) to transform the fuzzy queues with triangular membership functions (Mem. Fs) into a family of conventional bulk queues.
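A minimal sketch of the α-cut machinery described above, not the paper's model: triangular fuzzy arrival and service rates are cut at each membership level α, and interval bounds of an illustrative performance measure are read off at the interval endpoints, which is valid here because the measure is monotone in both rates. The triangular numbers, the batch size, and the performance measure itself are placeholder choices.

```python
import numpy as np

# Triangular fuzzy numbers (left, peak, right) -- placeholder values.
lam_tri = (2.0, 3.0, 4.0)      # fuzzy batch-arrival rate
mu_tri = (20.0, 24.0, 28.0)    # fuzzy service rate
b = 4                          # constant batch size

def alpha_cut(tri, a):
    """Interval [lo, hi] of a triangular fuzzy number at membership level a."""
    l, m, r = tri
    return l + a * (m - l), r - a * (r - m)

def pm(lam, mu):
    """Illustrative monotone performance measure built on rho = lam*b/mu."""
    rho = lam * b / mu
    return rho / (1.0 - rho)   # requires rho < 1 (stable queue)

print(" alpha   PM_lower   PM_upper")
for a in np.linspace(0.0, 1.0, 6):
    lam_lo, lam_hi = alpha_cut(lam_tri, a)
    mu_lo, mu_hi = alpha_cut(mu_tri, a)
    # PM increases in lam and decreases in mu, so the alpha-cut endpoints
    # give the PM bounds (Zadeh's extension principle for monotone maps).
    lo, hi = pm(lam_lo, mu_hi), pm(lam_hi, mu_lo)
    print(f"  {a:.1f}   {lo:8.4f}   {hi:8.4f}")
```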
In this research, the stopping power and range of protons in biological human soft and hard tissues (blood, brain, skeletal cortical bone, and skin) of both children and adults are calculated at energies ranging from 1 MeV to 350 MeV. The tissue data are collected from ICRU Report 46, and the stopping power is calculated using the Bethe formula. The simple-integration (continuous slowing down approximation) method is then employed to calculate the proton range in the target. The stopping power and range values in human tissues are compared with the SRIM program, and the stopping power versus energy and the range versus energy are presented graphically. Proper agreement is found between the gained results …
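A compact sketch of the calculation chain described above, under stated simplifications: a relativistic Bethe formula without shell or density corrections, and water-like soft-tissue constants (Z/A ≈ 0.555, I ≈ 75 eV) standing in for the ICRU 46 compositions. The range is the CSDA integral R(T) = ∫ dE/S(E), accumulated numerically from a 1 MeV cutoff.

```python
import numpy as np

# Physical constants (MeV, cm, g); proton charge z = 1.
K, me_c2, Mp_c2 = 0.307075, 0.511, 938.272   # MeV*cm^2/mol, MeV, MeV
# Placeholder water-like soft-tissue values standing in for ICRU 46 data.
Z_over_A, rho, I = 0.555, 1.0, 75e-6          # -, g/cm^3, MeV

def stopping_power(T):
    """Simplified relativistic Bethe formula, MeV/cm (no shell/density terms)."""
    gamma = 1.0 + T / Mp_c2
    beta2 = 1.0 - 1.0 / gamma**2
    log_term = np.log(2.0 * me_c2 * beta2 * gamma**2 / I) - beta2
    return rho * K * Z_over_A / beta2 * log_term

# CSDA range: accumulate dE / S(E) from the 1 MeV cutoff (sub-MeV range is tiny).
T = np.linspace(1.0, 350.0, 5000)             # MeV
S = stopping_power(T)
R = np.concatenate(([0.0], np.cumsum(np.diff(T) / S[1:])))  # rectangle rule

for t in (10, 100, 350):
    i = np.searchsorted(T, t)
    print(f"T = {T[i]:6.1f} MeV:  S = {S[i]:7.3f} MeV/cm,  R ~ {R[i]:7.2f} cm")
```

At 10 MeV this simplified form gives about 46 MeV/cm in water, close to tabulated values, which is why the bare Bethe formula is a reasonable first approximation above a few MeV.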
The auditory system can suffer from exposure to loud noise, and human health can be affected; traffic noise is a primary contributor to noise pollution. To measure noise levels, three variables were examined at 25 locations. It was found that the main factors determining the increase in noise level are traffic volume, vehicle speed, and road functional class. The data were taken during three different periods per day so that they represent and cover the traffic noise of the city during heavy traffic-flow conditions. Traffic noise prediction was analysed using a simple linear regression model to predict the equivalent continuous sound level accurately. The difference between the predicted and the measured noise shows that …
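A minimal sketch of the regression step described above, fitted on synthetic observations: the equivalent continuous sound level is regressed on traffic volume, mean vehicle speed, and a road-class indicator by ordinary least squares. The synthetic data and the coefficients used to generate them are illustrative assumptions, not the study's measurements.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 25                                       # one synthetic row per location

# Synthetic predictors: hourly volume, mean speed (km/h), road class (0/1/2).
volume = rng.uniform(200, 3000, n)
speed = rng.uniform(30, 90, n)
road_class = rng.integers(0, 3, n)

# Synthetic "measured" Leq in dB(A); generating coefficients are made up.
leq = 55 + 0.004 * volume + 0.08 * speed + 1.5 * road_class + rng.normal(0, 1.5, n)

# Ordinary least squares: Leq ~ 1 + volume + speed + road_class.
X = np.column_stack([np.ones(n), volume, speed, road_class])
beta, *_ = np.linalg.lstsq(X, leq, rcond=None)
pred = X @ beta

print("coefficients:", np.round(beta, 4))
print(f"RMSE between predicted and measured: "
      f"{np.sqrt(np.mean((pred - leq) ** 2)):.2f} dB(A)")
```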
The Dirichlet process is an important fundamental object in nonparametric Bayesian modelling, applied to a wide range of problems in machine learning, statistics, and bioinformatics, among other fields. This flexible stochastic process models rich data structures with an unknown or evolving number of clusters, and it is a valuable tool for encoding the true complexity of real-world data in computer models. Our results show that the Dirichlet process improves, both in distribution density and in signal-to-noise ratio, with larger sample sizes; achieves a slow decay rate to its base distribution; has improved convergence and stability; and thrives with a Gaussian base distribution, which performs much better than the Gamma distribution. The performance depends …
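A minimal stick-breaking sketch of a draw from a Dirichlet process with the Gaussian base distribution favored above: weights come from Beta(1, α) stick breaks and atoms are drawn i.i.d. from the base measure. The concentration α, truncation level, and base parameters are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(4)
alpha, trunc = 2.0, 100          # concentration and truncation level (illustrative)

# Stick-breaking: w_k = v_k * prod_{j<k} (1 - v_j), with v_k ~ Beta(1, alpha).
v = rng.beta(1.0, alpha, size=trunc)
w = v * np.cumprod(np.concatenate(([1.0], 1.0 - v[:-1])))

# Atoms drawn i.i.d. from the Gaussian base distribution G0 = N(0, 1).
atoms = rng.normal(0.0, 1.0, size=trunc)

# A draw G ~ DP(alpha, G0) is the discrete measure sum_k w_k * delta(atom_k);
# sampling observations from it exposes the induced clustering.
obs = rng.choice(atoms, size=500, p=w / w.sum())
print(f"mass in largest 5 atoms: {np.sort(w)[-5:].sum():.3f}")
print(f"distinct atoms among 500 draws: {np.unique(obs).size}")
```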
