Continuous improvement, or Kaizen, is a philosophy based on the idea of continuously finding ways to improve things. From this point of view, continuous improvement is not limited to the quality of products or services; it also applies to all processes in the organization. Over the last two decades, several continuous improvement approaches have been developed and marketed. The advocates of each approach claim that theirs is the best; however, every approach has its own advantages and disadvantages and has drawn its share of criticism. The important question is how to choose the right continuous improvement approach. This research addresses the philosophy, concepts, assumptions, and domain of each approach and compares them thoroughly. In addition, we present a theoretical framework to assist an organization in choosing the continuous improvement approach that best corresponds to its culture and quality problems.
A new manifold design for flow injection (FI) coupled with a merging-zone technique was studied for the spectrophotometric determination of sulfamethoxazole (SMZ). The semi-automated FI method has many advantages: it is fast, simple, highly accurate, and economical, with high sample throughput. The suggested method is based on the production of an orange-colored compound of SMZ with 1,2-naphthoquinone-4-sulphonic acid sodium salt (NQS) in alkaline medium (NaOH) at λmax 496 nm. The linearity range of sulfamethoxazole was 3–100 μg mL⁻¹, with a limit of detection (LOD) of 0.593 μg mL⁻¹, an RSD% of about 1.25, and a recovery of 100.73%. All the various physical and chemical parameters that have an effect on the stability and development of …
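The linearity range and LOD reported above come from a calibration curve. As a hedged illustration only (the concentrations, absorbances, and blank readings below are hypothetical, not the paper's measurements), a least-squares calibration line and the common 3.3·σ/slope LOD rule can be sketched as:

```python
import math

# Hypothetical calibration data (concentration in ug/mL vs. absorbance);
# illustrative values only, not the paper's measurements.
conc = [3, 10, 25, 50, 75, 100]
absorbance = [0.031, 0.102, 0.251, 0.498, 0.747, 1.001]

n = len(conc)
mean_x = sum(conc) / n
mean_y = sum(absorbance) / n

# Ordinary least-squares slope and intercept of the calibration line.
sxx = sum((x - mean_x) ** 2 for x in conc)
sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(conc, absorbance))
slope = sxy / sxx
intercept = mean_y - slope * mean_x

# LOD via the common 3.3 * sigma / slope rule, where sigma is the standard
# deviation of blank readings (hypothetical blanks here).
blanks = [0.0018, 0.0021, 0.0015, 0.0019, 0.0022]
mb = sum(blanks) / len(blanks)
sigma_blank = math.sqrt(sum((b - mb) ** 2 for b in blanks) / (len(blanks) - 1))
lod = 3.3 * sigma_blank / slope

print(f"slope = {slope:.5f} AU per ug/mL, intercept = {intercept:.5f}")
print(f"LOD ~ {lod:.3f} ug/mL")
```

In practice the slope and blank deviation would come from replicate measurements on the actual FI system.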
This study was undertaken to introduce a fast, accurate, selective, simple, and environment-friendly colorimetric method to determine the iron(II) concentration in different lipstick brands imported or manufactured locally in Baghdad, Iraq. The samples were collected from 500-Iraqi-dinar stores to establish routine tests using the spectrophotometric method, which was compared with a new microfluidic paper-based analytical device (µPAD) platform as a cost-effective alternative to conventional instrumentation such as atomic absorption spectroscopy (AAS). This method depends on the reaction between iron(II) and the iron(II)-selective chelator 1,10-phenanthroline (phen) in the presence of the reducing agent hydroxylamine (HOA) and sodium acetate (NaOAc) b…
Beta Distribution
Abstract
The Gamma and Beta distributions are very important in practice in various areas of statistics and its applications, such as reliability analysis and quality control of production. There are a number of methods to generate data that behave according to these distributions. These methods are based primarily on the shape parameters of each distribution, on the relationships between these distributions, and on their relationships with some other probability distributions.
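One such relationship between the two distributions can be sketched in a few lines (the shape parameters below are illustrative choices, not values from the paper): if X ~ Gamma(a, 1) and Y ~ Gamma(b, 1) are independent, then X / (X + Y) ~ Beta(a, b), which gives a second way to generate Beta data besides the direct generator.

```python
import random

random.seed(7)  # reproducibility of this illustration

a, b = 2.0, 5.0  # illustrative shape parameters
n = 20000

# Direct generation with the standard-library Beta generator.
beta_direct = [random.betavariate(a, b) for _ in range(n)]

# The same Beta(a, b) law via the Gamma relationship:
# X ~ Gamma(a, 1), Y ~ Gamma(b, 1) independent  =>  X / (X + Y) ~ Beta(a, b).
beta_from_gamma = []
for _ in range(n):
    x = random.gammavariate(a, 1.0)
    y = random.gammavariate(b, 1.0)
    beta_from_gamma.append(x / (x + y))

# Both samples should have mean close to the theoretical a / (a + b).
theoretical_mean = a / (a + b)
m1 = sum(beta_direct) / n
m2 = sum(beta_from_gamma) / n
print(theoretical_mean, round(m1, 3), round(m2, 3))
```

Both sample means land near a / (a + b) = 2/7, confirming the two generation routes agree in distribution.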
A mixture model is used to model data that come from more than one component. In recent years, it has become an effective tool for drawing inferences about the complex data that we might come across in real life. Moreover, it can serve as a powerful confirmatory tool for classifying observations based on the similarities among them. In this paper, several mixture regression-based methods were applied under the assumption that the data come from a finite number of components. A comparison of these methods was made according to their results in estimating the component parameters. Also, observation membership was inferred and assessed for these methods. The results showed that the flexible mixture model outperformed the …
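As a rough sketch of the idea (not the paper's method or data: the two regression lines, noise level, and EM starting values below are all hypothetical), a two-component Gaussian mixture of linear regressions can be fitted by EM, alternating soft membership probabilities with weighted least-squares refits:

```python
import math
import random

random.seed(1)

# Simulate data from two hypothetical regression components:
# component 1: y = 1 + 2x + noise, component 2: y = 8 - x + noise.
data = []
for _ in range(300):
    x = random.uniform(0, 5)
    if random.random() < 0.5:
        data.append((x, 1.0 + 2.0 * x + random.gauss(0, 0.4)))
    else:
        data.append((x, 8.0 - 1.0 * x + random.gauss(0, 0.4)))

def wls(data, w):
    """Weighted least squares for y = b0 + b1 * x."""
    sw = sum(w)
    mx = sum(wi * x for wi, (x, _) in zip(w, data)) / sw
    my = sum(wi * y for wi, (_, y) in zip(w, data)) / sw
    sxy = sum(wi * (x - mx) * (y - my) for wi, (x, y) in zip(w, data))
    sxx = sum(wi * (x - mx) ** 2 for wi, (x, _) in zip(w, data))
    b1 = sxy / sxx
    return my - b1 * mx, b1

def normal_pdf(r, s):
    return math.exp(-r * r / (2 * s * s)) / (s * math.sqrt(2 * math.pi))

# EM: alternate soft membership (E-step) with weighted refits (M-step).
params = [(0.5, 1.5), (7.0, -0.5)]     # rough starting lines (hypothetical)
pi, sigma = [0.5, 0.5], [1.0, 1.0]
for _ in range(50):
    resp = []
    for x, y in data:                   # E-step: membership probabilities
        dens = [pi[k] * normal_pdf(y - b0 - b1 * x, sigma[k])
                for k, (b0, b1) in enumerate(params)]
        tot = sum(dens)
        # Guard against floating-point underflow of both densities.
        resp.append([d / tot for d in dens] if tot > 0 else [0.5, 0.5])
    for k in range(2):                  # M-step: weighted refit per component
        w = [r[k] for r in resp]
        params[k] = wls(data, w)
        b0, b1 = params[k]
        sse = sum(wi * (y - b0 - b1 * x) ** 2 for wi, (x, y) in zip(w, data))
        sigma[k] = math.sqrt(sse / sum(w))
        pi[k] = sum(w) / len(data)

for k, (b0, b1) in enumerate(params):
    print(f"component {k}: y = {b0:.2f} + {b1:.2f}x, weight {pi[k]:.2f}")
```

The recovered slopes land near the true 2 and -1, and the fitted responsibilities give exactly the kind of observation-membership inference the abstract describes.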
The repeated measurement design is called a complete randomized block design for repeated measurement when each subject is given all the different treatments; in this case the subject is considered a block. Many nonparametric methods have been considered, such as the Friedman test (1937), the Koch test (1969), and the Kepner & Robinson test (1988), for when the assumption of normally distributed data is not satisfied, as well as the F test for when the assumptions of the analysis of variance are satisfied, where the observations within blocks are assumed to be equally correlated. The purpose of this paper is to summarize the results of a simulation study comparing these methods, as well as to present the suggested method.
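The Friedman test mentioned above ranks treatments within each block and compares the rank sums. A minimal sketch with hypothetical data (4 subjects as blocks, 3 treatments; the measurements are invented for illustration and contain no ties, so no tie correction is applied):

```python
# Hypothetical data: 4 subjects (blocks), each measured under 3 treatments.
blocks = [
    [8.2, 9.1, 7.0],
    [7.5, 8.8, 6.9],
    [6.9, 8.0, 6.5],
    [8.8, 9.4, 7.2],
]

def ranks(row):
    """Within-block ranks, 1 = smallest (toy data has no ties)."""
    order = sorted(range(len(row)), key=lambda j: row[j])
    r = [0.0] * len(row)
    for rank, j in enumerate(order, start=1):
        r[j] = float(rank)
    return r

n = len(blocks)          # number of blocks (subjects)
k = len(blocks[0])       # number of treatments

# Rank each treatment within each block, then sum ranks per treatment.
rank_sums = [0.0] * k
for row in blocks:
    for j, r in enumerate(ranks(row)):
        rank_sums[j] += r

# Friedman chi-square statistic, compared against chi-square with k-1 df.
q = 12.0 / (n * k * (k + 1)) * sum(R * R for R in rank_sums) - 3 * n * (k + 1)
print("rank sums:", rank_sums, " Friedman Q =", round(q, 3))
```

Here every block shows the same ordering (treatment 2 > 1 > 3), so Q = 8 on 2 degrees of freedom, which is significant at the 5% level.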
Regression models are among the most important models used in modern studies, especially research and health studies, because of the important results they achieve. Two regression models were used: the Poisson regression model and the Conway-Maxwell-Poisson regression model. This study aimed to compare the two models and choose the better one using the simulation method at different sample sizes (n = 25, 50, 100) and with repetitions (r = 1000). The Matlab program was adopted to conduct the simulation experiment, and the results showed the superiority of the Poisson model through the mean square error criterion (MSE) as well as the Akaike information criterion (AIC) for the same distribution.
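As a simplified sketch of the AIC comparison (intercept-only fits on an invented count sample, not the paper's Matlab regression simulation), note that the Poisson is the Conway-Maxwell-Poisson (CMP) with dispersion parameter ν = 1, so one log-likelihood routine serves both; the CMP's extra parameter must then justify its AIC penalty:

```python
import math

# Hypothetical count data (illustrative; not the paper's simulated samples).
y = [2, 1, 3, 0, 2, 4, 1, 2, 3, 2, 1, 0, 2, 3, 2]

def log_fact(m):
    return math.lgamma(m + 1)

def cmp_loglik(lam, nu, data, jmax=200):
    """Conway-Maxwell-Poisson log-likelihood; nu = 1 recovers the Poisson."""
    # Normalizing constant Z(lam, nu) = sum_j lam^j / (j!)^nu, truncated sum.
    log_terms = [j * math.log(lam) - nu * log_fact(j) for j in range(jmax)]
    mx = max(log_terms)
    log_z = mx + math.log(sum(math.exp(t - mx) for t in log_terms))
    return sum(v * math.log(lam) - nu * log_fact(v) - log_z for v in data)

# Poisson fit: the MLE of lambda is the sample mean (one parameter).
lam_pois = sum(y) / len(y)
ll_pois = cmp_loglik(lam_pois, 1.0, y)
aic_pois = 2 * 1 - 2 * ll_pois

# CMP fit by a coarse grid search over (lam, nu) (two parameters).
best = None
for i in range(47):
    for j in range(11):
        lam, nu = 0.5 + 0.25 * i, 0.5 + 0.25 * j
        ll = cmp_loglik(lam, nu, y)
        if best is None or ll > best[0]:
            best = (ll, lam, nu)
ll_cmp, lam_cmp, nu_cmp = best
aic_cmp = 2 * 2 - 2 * ll_cmp

# AIC = 2k - 2*loglik: the extra CMP parameter must earn its keep.
print(f"Poisson AIC = {aic_pois:.2f}   CMP AIC = {aic_cmp:.2f}")
```

A grid search is used here purely to keep the sketch dependency-free; a real study would fit both models by full maximum likelihood with covariates, as the paper does in Matlab.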
In this paper, the research represents an attempt to expand the use of parametric and non-parametric estimators to estimate the median effective dose (ED50) in quantal bioassay and to compare these methods. We have chosen three estimators for comparison: the first is the Spearman-Karber estimator, the second is the moving average estimator, and the third is the extreme effective dose estimator. We used minimum chi-square as a parametric method. We compared these estimators by calculating the mean square error of ED50 for each of them and comparing it with the optimal mean square …
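The Spearman-Karber estimator named above has a closed form: when the response proportions rise monotonically from 0 to 1 over the tested doses, the estimated log ED50 is a midpoint average weighted by the increments in response. A sketch on an invented dose-response table (not data from the paper):

```python
# Hypothetical quantal-bioassay data: log10 doses and number responding
# out of 10 subjects per dose (proportions rise monotonically from 0 to 1).
log_dose = [0.0, 0.5, 1.0, 1.5, 2.0]
responded = [0, 3, 6, 9, 10]
n_per_dose = 10

p = [r / n_per_dose for r in responded]   # observed response proportions

# Spearman-Karber estimate of log ED50: midpoints of adjacent dose pairs,
# weighted by the increase in response proportion across each pair.
m = sum((p[i + 1] - p[i]) * (log_dose[i] + log_dose[i + 1]) / 2
        for i in range(len(p) - 1))
ed50 = 10 ** m
print(f"log10 ED50 = {m:.3f}, ED50 = {ed50:.2f}")
```

For this toy table the estimate is log10 ED50 = 0.85, i.e. ED50 ≈ 7.1 on the original dose scale; non-monotone observed proportions would first need the usual isotonic adjustment, which is omitted here.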
In this paper, we study a non-parametric model in which the response variable has missing observations (non-response) under the MCAR missing-data mechanism. We then suggest kernel-based non-parametric single imputation in place of the missing values and compare it with nearest neighbor imputation using simulation over several different models and different cases, such as sample size, variance, and rate of missing data.
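The two imputation ideas can be sketched side by side; everything below (the sine-curve model, noise level, missing rate, and bandwidth) is a hypothetical setup for illustration, not the paper's simulation design. Kernel-based single imputation here means a Nadaraya-Watson kernel-weighted mean of the observed responses, while the competitor simply copies the nearest observed neighbor:

```python
import math
import random

random.seed(3)

# Simulate responses from a smooth model, then delete ~20% of them (MCAR).
xs = [random.uniform(0, 3) for _ in range(200)]
ys = [math.sin(x) + random.gauss(0, 0.1) for x in xs]
miss = [random.random() < 0.2 for _ in xs]         # True -> y treated as missing

obs = [(x, y) for x, y, m in zip(xs, ys, miss) if not m]

def kernel_impute(x0, obs, h=0.3):
    """Nadaraya-Watson estimate: Gaussian-kernel-weighted mean of observed y."""
    w = [math.exp(-0.5 * ((x0 - x) / h) ** 2) for x, _ in obs]
    return sum(wi * y for wi, (_, y) in zip(w, obs)) / sum(w)

def nn_impute(x0, obs):
    """Nearest-neighbor imputation: copy the closest observed response."""
    return min(obs, key=lambda p: abs(p[0] - x0))[1]

# Score both imputations against the (known) deleted true values.
err_k = err_nn = cnt = 0.0
for x, y, m in zip(xs, ys, miss):
    if m:
        err_k += (kernel_impute(x, obs) - y) ** 2
        err_nn += (nn_impute(x, obs) - y) ** 2
        cnt += 1
print(f"kernel MSE = {err_k / cnt:.4f}, nearest-neighbor MSE = {err_nn / cnt:.4f}")
```

Because the deletion is MCAR, the true deleted values are known in the simulation, so the two methods can be compared by the MSE of their imputed values, exactly the kind of comparison the abstract describes.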
Dimension reduction and variable selection are very important topics in multivariate statistical analysis. When two or more of the predictor variables are linked through complete or incomplete regression relationships, a problem of multicollinearity occurs; it constitutes a breach of one of the basic assumptions of the ordinary least squares method and leads to incorrect estimation results.
There are several methods proposed to address this problem, including partial least squares (PLS), which is used to reduce the dimension of the regression analysis by means of linear transformations that convert a set of highly correlated variables into a set of new independent and unr…
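The transformation idea can be sketched with the first PLS component only (a minimal PLS1 step on invented data with two nearly identical predictors; the coefficients 3 and 3 and the noise level are illustrative assumptions, and a full PLS would extract further components by deflation):

```python
import random

random.seed(5)

# Two highly collinear predictors (x2 ~ x1): the setting where OLS is unstable.
n = 100
x1 = [random.gauss(0, 1) for _ in range(n)]
x2 = [v + random.gauss(0, 0.01) for v in x1]          # almost identical to x1
y = [3.0 * a + 3.0 * b + random.gauss(0, 0.5) for a, b in zip(x1, x2)]

X = [[a, b] for a, b in zip(x1, x2)]

def center(cols):
    means = [sum(c) / len(c) for c in cols]
    return [[v - m for v in c] for c, m in zip(cols, means)], means

cols, _ = center([list(c) for c in zip(*X)])
(yc,), _ = center([y])

# First PLS component: weight vector w proportional to X'y, score t = Xw.
w = [sum(c[i] * yc[i] for i in range(n)) for c in cols]
norm = sum(v * v for v in w) ** 0.5
w = [v / norm for v in w]
t = [sum(cols[j][i] * w[j] for j in range(len(w))) for i in range(n)]

# Regress y on the single score t: one well-conditioned regression.
q = sum(ti * yi for ti, yi in zip(t, yc)) / sum(ti * ti for ti in t)

# Implied coefficients on the original (centered) predictors.
beta = [q * wj for wj in w]
print("PLS coefficients:", [round(b, 2) for b in beta])
```

Where OLS would split the effect erratically between the two collinear columns, the single PLS component shares it stably, recovering coefficients near (3, 3).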