Entropy, defined as a measure of uncertainty, is transformed here using the cumulative distribution function and the reliability function of the Burr Type-XII distribution, with the aim of modelling volatile failure data. A formula for the probability distribution resulting from applying this entropy transformation to the continuous Burr Type-XII distribution is derived, and the new function is tested and shown to satisfy the conditions of a probability distribution. Its mean and cumulative distribution function are also derived so that data can be generated for simulation experiments. The parameters of the derived failure distribution are then estimated by three methods: maximum likelihood, White's method, and a mixed estimator. The estimators are compared by mean squared error (MSE) in a simulation study over different sample sizes and different values of the scale and shape parameters. The results reveal that the mixed estimator is best for the shape parameter, while White's estimator is best for the scale parameter.
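The estimator comparison above can be sketched as a small Monte Carlo experiment. This is a hedged illustration, not the study's code: only the maximum likelihood estimator is shown (the White and mixed estimators are not specified in the abstract), and the true parameter values, grid, sample size, and replication count are all assumptions.

```python
import math
import random

C_TRUE, K_TRUE = 2.0, 3.0   # assumed Burr Type-XII shape parameters

def burr_xii_sample(n, c, k, rng):
    # Inverse-CDF sampling from F(x) = 1 - (1 + x^c)^(-k).
    out = []
    for _ in range(n):
        u = max(rng.random(), 1e-12)        # keep x strictly positive
        out.append(((1.0 - u) ** (-1.0 / k) - 1.0) ** (1.0 / c))
    return out

def neg_log_lik(c, k, data):
    # pdf: f(x) = c * k * x^(c-1) * (1 + x^c)^(-k-1)
    ll = 0.0
    for x in data:
        ll += (math.log(c) + math.log(k) + (c - 1.0) * math.log(x)
               - (k + 1.0) * math.log(1.0 + x ** c))
    return -ll

def mle_grid(data):
    # Crude grid-search MLE; a real study would use Newton-type steps.
    grid = [0.5 + 0.1 * i for i in range(40)]
    return min(((c, k) for c in grid for k in grid),
               key=lambda ck: neg_log_lik(ck[0], ck[1], data))

def mse_of_mle(n, reps, rng):
    # Mean squared error of the MLE over repeated simulated samples.
    ests = [mle_grid(burr_xii_sample(n, C_TRUE, K_TRUE, rng))
            for _ in range(reps)]
    mse_c = sum((c - C_TRUE) ** 2 for c, _ in ests) / reps
    mse_k = sum((k - K_TRUE) ** 2 for _, k in ests) / reps
    return mse_c, mse_k

mse_c, mse_k = mse_of_mle(n=100, reps=5, rng=random.Random(1))
print(mse_c, mse_k)
```

The same loop, repeated for each competing estimator, yields the MSE table used to rank them.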
In this paper, we investigate the connection between hierarchical models and the power prior distribution in quantile regression (QReg). For a given quantile, we develop an expression for the power parameter that calibrates the power prior distribution for quantile regression to a corresponding hierarchical model. In addition, we estimate the relation between the power parameter and the quantile level via the hierarchical model. The proposed methodology is illustrated with a real data example.
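The role of the power parameter can be illustrated with a minimal sketch. Under the asymmetric-Laplace working likelihood commonly used in Bayesian QReg, the log-likelihood is (up to constants) minus the check loss, so a power prior with power a0 in (0, 1] simply down-weights the historical data's check loss. The data, the value of a0, and the grid search below are illustrative assumptions, not the paper's method.

```python
import random

TAU = 0.5          # quantile level (median)
A0 = 0.4           # assumed power parameter

def check_loss(u, tau):
    # rho_tau(u) = u * (tau - 1{u < 0})
    return u * (tau - (1.0 if u < 0 else 0.0))

def objective(beta, current, historical, tau, a0):
    # Weighted check loss: current data at weight 1, historical at a0.
    cur = sum(check_loss(y - beta * x, tau) for x, y in current)
    hist = sum(check_loss(y - beta * x, tau) for x, y in historical)
    return cur + a0 * hist

rng = random.Random(0)
current = [(x, 2.0 * x + rng.gauss(0, 1)) for x in range(1, 21)]
historical = [(x, 1.5 * x + rng.gauss(0, 1)) for x in range(1, 21)]

# Grid search for the slope that minimizes the power-prior-weighted loss.
grid = [i * 0.01 for i in range(400)]
beta_hat = min(grid, key=lambda b: objective(b, current, historical, TAU, A0))
print(beta_hat)
```

Varying a0 from 0 to 1 traces the path from ignoring the historical data to pooling it fully, which is the calibration the paper studies.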
In this research, we derive the Bayesian formulas and the expectation-Bayesian estimates for the production system of Atlas Company. The system's units were examined with the help of the company's technical staff and with real data provided by the company that manufactures the system. These data record the failed units in each drawn sample, out of the total number of units manufactured by the company. We compute a range for each estimator using the maximum likelihood estimator. We find that the expectation-Bayesian estimator outperforms the Bayesian estimator on the different partial samples drawn from the production system after inspection.
The research problem lies in the obscurity of the Hanbali approach to establishing the rulings on long and short travel, as well as the obscurity of some rulings on short travel that the Hanbali jurists did not address in their books.
The research aims to clarify the approach and criteria on which the Hanbali jurists based the rulings on long and short travel, and to present, with supporting evidence, the rulings on some issues of long and short travel on which they were silent.
The research comprises a preface and two sections. In the preface, the researcher discusses the nature of long and short travel; the first section addresses the approach of the Hanbali jurists.
Economic performance is one of the most important indicators of economic activity. As the economy's performance improves, the sources of output diversify, rates of economic growth and per capita national income rise, the business environment recovers, investment rates increase, and the financial and monetary institutions and the credit market become more effective. This in turn raises employment rates, reduces unemployment, eliminates many social problems, and improves average per capita income as well as the level of national income.
Input-output tables are a mathematical technique that indicates economic performance.
The aim of the present study was to distinguish between healthy children and children with epilepsy using electroencephalography (EEG). Two biomarkers, the Hurst exponent (H) and Tsallis entropy (TE), were used to investigate the background EEG activity of 10 healthy children and 10 children with epilepsy. EEG artifacts were removed using a Savitzky-Golay (SG) filter. As hypothesized, a t-test (p < 0.05) showed significant changes in the irregularity and complexity of the epileptic EEG compared with the healthy control subjects. The complexity changes observed in the H and TE results of the epileptic subjects make them suggested EEG biomarkers associated with epilepsy and a reliable tool for detecting and identifying this disorder.
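The two biomarkers named above can be sketched on a synthetic signal as follows. This is a hedged illustration: the histogram bin count, the entropic index q, and the single-window rescaled-range (R/S) Hurst estimate are assumptions, not the study's actual settings.

```python
import math
import random

def tsallis_entropy(signal, q=2.0, bins=16):
    # Tsallis entropy TE = (1 - sum_i p_i^q) / (q - 1) over a histogram
    # estimate of the amplitude distribution.
    lo, hi = min(signal), max(signal)
    width = (hi - lo) / bins or 1.0
    counts = [0] * bins
    for v in signal:
        counts[min(int((v - lo) / width), bins - 1)] += 1
    probs = [c / len(signal) for c in counts if c]
    return (1.0 - sum(p ** q for p in probs)) / (q - 1.0)

def hurst_rs(signal):
    # Simple single-window rescaled-range (R/S) Hurst estimate:
    # H ~ log(R/S) / log(n) for the whole series.
    n = len(signal)
    mean = sum(signal) / n
    devs = [v - mean for v in signal]
    cum, z = [], 0.0
    for d in devs:
        z += d
        cum.append(z)
    r = max(cum) - min(cum)                      # range of cumulative sum
    s = math.sqrt(sum(d * d for d in devs) / n)  # standard deviation
    return math.log(r / s) / math.log(n)

rng = random.Random(42)
noise = [rng.gauss(0, 1) for _ in range(1024)]   # stand-in for an EEG epoch
print(hurst_rs(noise), tsallis_entropy(noise))
```

For uncorrelated noise the R/S estimate should land near 0.5; departures toward 0 or 1 indicate anti-persistent or persistent dynamics, which is the kind of irregularity change the study reports between groups.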
A group acceptance sampling plan for testing products was designed for the case where the lifetime of an item follows a log-logistic distribution. The minimum number of groups (k) required for a given group size and acceptance number is determined when various values of the consumer's risk and the test termination time are specified. All results for these sampling plans and the probability of acceptance are presented in tables.
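The plan computation above can be sketched under one common formulation of group acceptance sampling for a truncated life test: find the smallest number of groups such that the lot-acceptance probability at the consumer's-risk quality level does not exceed the stated risk. The log-logistic parameters and plan settings below are illustrative assumptions, not the paper's tabulated values.

```python
from math import comb

def loglogistic_cdf(t, scale, shape):
    # Log-logistic failure probability by time t: F(t) = 1 / (1 + (scale/t)^shape)
    return 1.0 / (1.0 + (scale / t) ** shape)

def lot_acceptance_prob(p, r, c, g):
    # Each group of size r passes if it has at most c failures by the
    # termination time; the lot is accepted only if all g groups pass.
    per_group = sum(comb(r, i) * p ** i * (1 - p) ** (r - i)
                    for i in range(c + 1))
    return per_group ** g

def min_groups(p, r, c, beta, g_max=200):
    # Smallest g whose acceptance probability is within the consumer's risk.
    for g in range(1, g_max + 1):
        if lot_acceptance_prob(p, r, c, g) <= beta:
            return g
    return None

p = loglogistic_cdf(t=1.0, scale=1.0, shape=2.0)   # p = 0.5 here
g = min_groups(p, r=5, c=1, beta=0.25)
print(p, g)   # prints: 0.5 1
```

Repeating min_groups over a grid of consumer's-risk and termination-time values reproduces the structure of the tables the abstract mentions.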
Preserving the Past and Building the Future: A Sustainable Urban Plan for Mosul, Iraq
Banks are among the public services that must be available in a city to ensure easy financial dealings between citizens and state departments, among the state departments themselves, and among the citizens, and to ensure easy access to them. It is therefore very important to choose the best location for a bank, one that can serve the largest number of people while remaining easy to reach. Because of the difficulty of obtaining accurate information with exact coordinates in the country's specific projection, the researcher resorts to a hypothetical application using some of the files available in the ArcView program.
A new algorithm is proposed to compress speech signals using the wavelet transform and linear predictive coding (LPC). Compression is based on selecting a small number of approximation coefficients produced by wavelet decomposition (Haar and db4) at a suitably chosen level, while the detail coefficients are discarded. The approximation coefficients are then windowed with a rectangular window and fed to the linear predictor. The Levinson-Durbin algorithm is used to compute the LP coefficients, the reflection coefficients, and the prediction error. The compressed files contain the LP coefficients and the previous sample, and are very small compared with the original signals. The compression ratio is calculated from the sizes of the compressed and original files.
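The two core steps of the pipeline, one level of Haar decomposition (keeping only the approximation coefficients) and Levinson-Durbin recursion for the LP and reflection coefficients, can be sketched as follows. The test signal, predictor order, and single decomposition level are assumptions for illustration; the paper uses a suitably chosen level and also db4.

```python
import math
import random

def haar_level(signal):
    # One Haar level: pairwise averages (approximation) and
    # differences (detail), each scaled by 1/sqrt(2).
    s = math.sqrt(2.0)
    half = len(signal) // 2
    approx = [(signal[2 * i] + signal[2 * i + 1]) / s for i in range(half)]
    detail = [(signal[2 * i] - signal[2 * i + 1]) / s for i in range(half)]
    return approx, detail

def levinson_durbin(signal, order):
    # Autocorrelation method followed by Levinson-Durbin recursion;
    # returns LP coefficients, reflection coefficients, prediction error.
    n = len(signal)
    r = [sum(signal[i] * signal[i + k] for i in range(n - k))
         for k in range(order + 1)]
    a = [1.0] + [0.0] * order
    err = r[0]
    reflect = []
    for m in range(1, order + 1):
        acc = r[m] + sum(a[j] * r[m - j] for j in range(1, m))
        k = -acc / err
        reflect.append(k)
        new_a = a[:]
        for j in range(1, m):
            new_a[j] = a[j] + k * a[m - j]
        new_a[m] = k
        a = new_a
        err *= (1.0 - k * k)
    return a, reflect, err

rng = random.Random(7)
x = [math.sin(0.2 * i) + 0.05 * rng.gauss(0, 1) for i in range(256)]
approx, _ = haar_level(x)            # detail coefficients are discarded
lpc, refl, pred_err = levinson_durbin(approx, order=4)
print(len(lpc), pred_err)
```

Storing only the few LP coefficients (plus the previous sample for reconstruction) instead of the 256-sample frame is what yields the compression ratio the abstract reports.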