Multiple linear regression is concerned with studying and analyzing the relationship between a dependent variable and a set of explanatory variables, from which the values of the dependent variable are predicted. In this paper, a multiple linear regression model with three covariates is studied in the presence of autocorrelated errors, where the random errors follow an exponential distribution. Three methods are compared (generalized least squares, robust M-estimation, and the robust Laplace method). We employed simulation studies and computed the mean squared error criterion for sample sizes (15, 30, 60, 100). Further, we applied the best method to real experimental data representing varieties of cigarettes according to the US Federal Trade Commission.
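A minimal simulation sketch of this comparison, assuming AR(1) errors driven by centered exponential innovations and illustrative true coefficients; the three estimators map to statsmodels' GLSAR (generalized least squares), RLM with a Huber norm (M robust), and median regression via QuantReg as a Laplace/least-absolute-deviations stand-in. The settings are assumptions, not the paper's actual design.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.regression.quantile_regression import QuantReg

rng = np.random.default_rng(0)
beta_true = np.array([1.0, 2.0, -1.5, 0.5])  # intercept + 3 covariates (illustrative)

def simulate(n, rho=0.5):
    X = sm.add_constant(rng.normal(size=(n, 3)))
    # AR(1) errors driven by centered exponential innovations
    eps = rng.exponential(1.0, size=n) - 1.0
    u = np.zeros(n)
    for t in range(1, n):
        u[t] = rho * u[t - 1] + eps[t]
    return X, X @ beta_true + u

def mse(b):
    return float(np.mean((b - beta_true) ** 2))

for n in (15, 30, 60, 100):
    scores = {"GLS": [], "M": [], "LAD": []}
    for _ in range(500):
        X, y = simulate(n)
        scores["GLS"].append(mse(sm.GLSAR(y, X, rho=1).iterative_fit(maxiter=5).params))
        scores["M"].append(mse(sm.RLM(y, X, M=sm.robust.norms.HuberT()).fit().params))
        scores["LAD"].append(mse(QuantReg(y, X).fit(q=0.5).params))
    print(n, {k: round(np.mean(v), 4) for k, v in scores.items()})
```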
One of the most important elements in achieving food security is livestock, an essential component of the agricultural sector and one of the sectors supported by the state. Animal production (sheep) occupies an important position in this sector due to the economic advantages available in rearing it. Moreover, the success and development of sheep breeding depend on several factors, including financial return and achieving profitability. The study aims to identify the size of the phenomenon of random slaughter, a problem widespread in Baghdad, along with its causes and the factors influencing its development, as well as the possibility of applying the idea of a mobile slaughterhouse to reduce this phenomenon.
Linear discriminant analysis and logistic regression are the most widely used multivariate statistical methods for the analysis of data with categorical outcome variables. Both are appropriate for developing linear classification models. Linear discriminant analysis assumes that the explanatory variables follow a multivariate normal distribution, while logistic regression makes no assumption about the distribution of the explanatory variables. Hence, logistic regression is considered the more flexible and more robust method in case of violations of these assumptions.
In this paper, we focus on the comparison between three forms of classification for data belonging …
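The paper's three specific model forms are not reproduced above; as a hedged illustration of the two base classifiers being contrasted, a scikit-learn sketch on synthetic stand-in data:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic two-class data as a stand-in for the paper's data set
X, y = make_classification(n_samples=300, n_features=5, n_informative=3,
                           random_state=0)

for name, clf in [("LDA", LinearDiscriminantAnalysis()),
                  ("Logistic", LogisticRegression(max_iter=1000))]:
    acc = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
    print(f"{name}: mean CV accuracy = {acc.mean():.3f}")
```

When multivariate normality with equal covariances actually holds, LDA is typically the more efficient classifier; logistic regression trades some of that efficiency for robustness when the assumption fails.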
This research investigates the behavior of self-compacting concrete (SCC) two-way slabs with a central square opening under uniformly distributed loads. The experimental part of this research is based on casting and testing six SCC simply supported square slabs having the same dimensions and reinforcement. One of these slabs was cast without an opening as a control slab, while the other five slabs have opening ratios (OR) of 2.78%, 6.25%, 11.11%, 17.36% and 25.00%. From the experimental results it is found that the maximum percentage decreases in cracking and ultimate uniform loads were 31.82% and 12.17% compared to the control slab for opening ratios (OR) …
This work deals with the Kumaraswamy distribution. Kumaraswamy (1976, 1978) worked with well-known probability distribution functions such as the normal, beta and log-normal, but in 1980 he developed a more general probability density function for doubly bounded random processes, which is known as Kumaraswamy's distribution. Classical maximum likelihood and Bayes estimators are used to estimate the unknown shape parameter (b). Reliability functions are obtained using symmetric loss functions with three types of informative priors: two single priors and one double prior. In addition, a comparison is made of the performance of these estimators with respect to the numerical solution, which is found using the expansion method.
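For concreteness: under the Kumaraswamy density f(x) = a b x^(a-1) (1 - x^a)^(b-1) on (0, 1) with the first shape a known, the MLE of b has the closed form b̂ = -n / Σ ln(1 - x_i^a), and the reliability function is R(t) = (1 - t^a)^b. A minimal sketch, with all parameter values illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
a, b_true, n = 2.0, 3.0, 100          # a assumed known, b to be estimated

# Inverse-CDF sampling: F(x) = 1 - (1 - x^a)^b
u = rng.uniform(size=n)
x = (1.0 - (1.0 - u) ** (1.0 / b_true)) ** (1.0 / a)

# Closed-form MLE of the shape parameter b when a is known
b_hat = -n / np.sum(np.log1p(-x ** a))

def reliability(t, a, b):
    """R(t) = P(X > t) = (1 - t^a)^b for 0 < t < 1."""
    return (1.0 - t ** a) ** b

print(f"b_hat = {b_hat:.3f}, R(0.5) = {reliability(0.5, a, b_hat):.3f}")
```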
In this paper, we provide a proposed method to estimate missing values of the explanatory variables in a non-parametric multiple regression model and compare it with the arithmetic-mean imputation method. The idea of this method is based on employing the causal relationship between the variables to find an efficient estimate of the missing value. We rely on the kernel estimate given by the Nadaraya–Watson estimator, and on least squares cross-validation (LSCV) to estimate the bandwidth, and we use a simulation study to compare the two methods.
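A minimal sketch of the imputation idea, assuming one complete covariate x predicts a covariate z with missing entries: fit a Nadaraya–Watson estimate of E[z | x] with a Gaussian kernel, choose the bandwidth by leave-one-out least-squares cross-validation over a grid, and fill each gap with the fitted value. All data and settings are illustrative stand-ins.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
x = rng.uniform(0, 10, n)                     # fully observed covariate
z = np.sin(x) + rng.normal(0, 0.3, n)         # covariate with missing entries
miss = rng.uniform(size=n) < 0.1              # ~10% missing at random
z_obs, x_obs = z[~miss], x[~miss]

def nw(x0, xs, zs, h):
    """Nadaraya-Watson estimate of E[z | x = x0] with a Gaussian kernel."""
    w = np.exp(-0.5 * ((x0 - xs) / h) ** 2)
    return np.sum(w * zs) / np.sum(w)

def lscv(h):
    """Leave-one-out least-squares cross-validation score for bandwidth h."""
    idx = np.arange(len(x_obs))
    err = sum((z_obs[i] - nw(x_obs[i], x_obs[idx != i], z_obs[idx != i], h)) ** 2
              for i in idx)
    return err / len(x_obs)

h_best = min(np.linspace(0.1, 2.0, 20), key=lscv)
z_filled = z.copy()
z_filled[miss] = [nw(x0, x_obs, z_obs, h_best) for x0 in x[miss]]
print(f"h = {h_best:.2f}, imputed {miss.sum()} values")
```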
This paper deals with estimation of the reliability function and one shape parameter (k) of the two-parameter Burr XII distribution, when the other shape parameter (c) is known (c = 0.5, 1, 1.5) and the initial value (k = 1) is used, with different sample sizes (n = 10, 20, 30, 50). The results depend on an empirical study through simulation experiments, which are applied to compare four methods of estimation as well as to compute the reliability function. The mean squared error results indicate that the jackknife estimator is better than the other three estimators, for all sample sizes and parameter values.
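To make the setup concrete: for the Burr XII survival function R(t) = (1 + t^c)^(-k) with c known, the MLE of k is k̂ = n / Σ ln(1 + x_i^c), and a bias-reduced jackknife estimate can be built from the leave-one-out MLEs. A hedged sketch; the symbols c and k are assumed notation (the abstract's original symbols did not survive extraction), and the values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
c, k_true, n = 1.5, 2.0, 50            # c assumed known, k to be estimated

# Inverse-CDF sampling from Burr XII: F(x) = 1 - (1 + x^c)^(-k)
u = rng.uniform(size=n)
x = ((1.0 - u) ** (-1.0 / k_true) - 1.0) ** (1.0 / c)

def k_mle(sample):
    return len(sample) / np.sum(np.log1p(sample ** c))

k_hat = k_mle(x)
# Jackknife bias correction: n*k_hat - (n-1)*mean(leave-one-out MLEs)
loo = np.array([k_mle(np.delete(x, i)) for i in range(n)])
k_jack = n * k_hat - (n - 1) * loo.mean()

def reliability(t, k):
    return (1.0 + t ** c) ** (-k)

print(f"MLE {k_hat:.3f}, jackknife {k_jack:.3f}, R(1) = {reliability(1.0, k_jack):.3f}")
```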
This study investigates the implementation of the Taguchi design in the estimation of the minimum corrosion rate of mild steel in a cooling tower that uses saline solutions of different concentrations. The experiments were set up on the basis of Taguchi's L16 orthogonal array. The runs were carried out under different conditions, such as inlet concentration of the saline solution, temperature, and flow rate. The signal-to-noise (S/N) ratio and ANOVA analysis were used to define the impact of the cooling tower working conditions on the corrosion rate. A regression model was fitted and optimized to identify the optimum levels for the working parameters, which were found to be 13% NaCl, 35 °C, and 1 L/min. Also, a confirmation run to establish the p…
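For a "smaller-is-better" response such as corrosion rate, Taguchi's signal-to-noise ratio is S/N = -10·log10((1/n) Σ y_i²), and the optimum level of each factor is the one with the highest mean S/N across the orthogonal-array runs. A small sketch with made-up replicate data (the paper's L16 measurements are not reproduced here):

```python
import numpy as np

# Hypothetical corrosion-rate replicates for 4 runs (made-up numbers,
# standing in for the paper's L16 orthogonal-array results)
runs = {
    "run1": [0.42, 0.45],
    "run2": [0.31, 0.29],
    "run3": [0.55, 0.58],
    "run4": [0.27, 0.30],
}

def sn_smaller_is_better(y):
    """Taguchi S/N ratio for a smaller-is-better response."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(y ** 2))

for name, y in runs.items():
    print(f"{name}: S/N = {sn_smaller_is_better(y):.2f} dB")
# Averaging S/N per factor level would then identify the optimum settings.
```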
Error control schemes have become a necessity in network-on-chip (NoC) design to improve reliability, as on-chip interconnect errors increase with the continuous shrinking of geometry. Accordingly, many researchers are trying to present multi-bit error correction coding schemes that achieve high error correction capability with the simplest possible design, to minimize area and power consumption. A recent work, Multi-bit Error Correcting Coding with Reduced Link Bandwidth (MECCRLB), showed a huge reduction in area and power consumption compared to a well-known scheme, namely Hamming product code (HPC) with Type-II HARQ. Moreover, the authors showed that the proposed scheme can correct 11 random errors, which is considered a high error correction capability.
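MECCRLB's construction is not reproduced here; as background for the Hamming-based baseline it is compared against, a minimal single-error-correcting Hamming(7,4) encode/decode sketch (illustrative only; HPC arranges such component codes in a two-dimensional product structure):

```python
import numpy as np

# Generator and parity-check matrices of a systematic Hamming(7,4) code
G = np.array([[1,0,0,0,1,1,0],
              [0,1,0,0,1,0,1],
              [0,0,1,0,0,1,1],
              [0,0,0,1,1,1,1]])
H = np.array([[1,1,0,1,1,0,0],
              [1,0,1,1,0,1,0],
              [0,1,1,1,0,0,1]])

def encode(bits4):
    return (np.array(bits4) @ G) % 2

def correct(word7):
    """Correct a single flipped bit using the syndrome."""
    s = (H @ word7) % 2
    if s.any():
        # Find the column of H matching the syndrome and flip that bit
        err = next(i for i in range(7) if np.array_equal(H[:, i], s))
        word7 = word7.copy()
        word7[err] ^= 1
    return word7

cw = encode([1, 0, 1, 1])
rx = cw.copy(); rx[2] ^= 1          # inject one bit error on the link
assert np.array_equal(correct(rx), cw)
```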
A longitudinal opening is used to construct a hollow-core beam: a cast-in-situ, precast, or prestressed concrete member with continuous voids provided to reduce weight and cost and, as a side benefit, to serve as concealed electrical or mechanical runs. It is primarily used in floor beams or roof deck systems. This study investigates the behavior of six beams (solid or with an opening) of dimensions (length 1000 × height 180 × width 120 mm), simply supported under a partial uniformly distributed load; four of these beams contain a long opening of varied section (40 × 40 mm) or (80 × 40 mm). The effects of vertical steel reinforcement, opening size, and orientation are investigated to evaluate the response of the beams. The experimental behavior is based on the load-deflection …
The partial level density (PLD) of pre-equilibrium reactions described by Ericson's formula has been studied using different formulae for the single-particle level density g. The parameter g was taken from the equidistant spacing model (ESM) and the non-equidistant spacing model (non-ESM), and further formulae for g were derived from the relation between g and the level density parameter a. The formulae used to derive g are the Roher formula, the Egidy formula, the Yukawa formula, and the Thomas–Fermi formula. The partial level density results that depend on g from the Thomas–Fermi formula show good agreement with the experimental data.
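For reference, Ericson's partial level density for p particles and h holes (n = p + h excitons) at excitation energy E, with single-particle level density g, is commonly written as below; the second relation links g to the level density parameter a. These are the standard textbook forms, quoted here rather than taken from the paper.

```latex
\omega(p,h,E) \;=\; \frac{g\,(gE)^{\,n-1}}{p!\,h!\,(n-1)!},
\qquad n = p + h,
\qquad g = \frac{6}{\pi^{2}}\,a .
```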