Six Sigma is a widely used methodology for improving processes across various industry sectors. The target failure rate for Six Sigma projects is 3.4 parts per million (allowing for the conventional 1.5σ shift in the process mean) or 2 parts per billion (for a centered process).
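To see where these two figures come from, here is a quick numerical check (a minimal sketch using SciPy; the 1.5σ shift convention and the two-sided centered tail are standard Six Sigma accounting, not specific to this paper):

```python
from scipy.stats import norm

# Long-term rate under the conventional 1.5-sigma mean shift:
# the dominant tail of a "6-sigma" process sits at 6 - 1.5 = 4.5 sigma.
shifted = norm.sf(6 - 1.5)   # ~3.4e-6  -> 3.4 parts per million

# Short-term rate for a centered process, counting both tails at +/- 6 sigma.
centered = 2 * norm.sf(6)    # ~2.0e-9  -> 2 parts per billion

print(f"shifted: {shifted * 1e6:.1f} ppm, centered: {centered * 1e9:.1f} ppb")
```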
In this paper, we show that when a process is exponential, attaining such performance may require a larger reduction in variation than would be needed for a normally distributed process (i.e., greater quality-improvement effort). In addition, identifying whether the process data follow a non-normal distribution is important for estimating more accurately the effort required to improve the process. A key finding of this study is that, at low kσ levels, the variation reduction required to improve an exponentially distributed process is smaller than that required for a normally distributed process, whereas at higher kσ levels the reverse holds. This study also analyzes processes following Gamma and Weibull distributions, and the results further support our concern that simply reporting the Sigma level as an indication of the quality of a product or process can be misleading.
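To illustrate how a normality-based Sigma level can misstate quality, the sketch below places a one-sided upper specification limit at the nominal shifted "Six Sigma" distance (mean + 4.5 standard deviations) and computes each process's true failure rate; the specific means, scales, and shape choices are our illustrative assumptions, not the paper's exact cases:

```python
from scipy.stats import norm, expon, gamma

# For each process, set a one-sided USL at the nominal "6-sigma" distance
# (mean + 4.5*std, per the shifted Six Sigma convention) and compute the
# true fraction beyond it. Parameters are illustrative only.
processes = {
    "normal":            norm(loc=10.0, scale=2.0),
    "exponential":       expon(scale=2.0),         # mean = std = 2
    "gamma (shape = 2)": gamma(a=2.0, scale=1.0),  # mean 2, std sqrt(2)
}

for name, dist in processes.items():
    usl = dist.mean() + 4.5 * dist.std()
    print(f"{name:18s}: {dist.sf(usl) * 1e6:10.1f} ppm beyond the USL")
```

Under these settings the normal process delivers the intended 3.4 ppm, while the exponential and Gamma processes pass roughly 4,100 and 2,200 ppm respectively, about a thousand times more, which is the sense in which the nominal Sigma level overstates quality for right-skewed processes.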
Two optimization models are developed to illustrate the effect of underestimating the quality-improvement effort on the optimal, cost-minimizing solution. In conclusion, the classical and widely used assumption of a normally distributed process may lead to the implementation of quality-improvement strategies, or the selection of Six Sigma projects, that are based on erroneous solutions.
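The paper's two optimization models are not reproduced here, but a toy version of the underlying trade-off (all cost figures, the convex improvement-cost form, and the process parameters below are hypothetical assumptions of ours) shows how the cost-minimizing amount of variation reduction changes when the process is truly exponential rather than normal:

```python
from math import exp
from scipy.optimize import minimize_scalar
from scipy.stats import norm

# Hypothetical illustration only; these are not the paper's models.
USL = 10.0              # one-sided upper specification limit
SCALE0 = 2.0            # current scale: std of the normal, mean (= std) of the exponential
MU = 4.0                # mean of the normal process
C_IMPROVE = 50_000.0    # cost per unit of relative scale reduction (convex in effort)
C_DEFECT = 5_000_000.0  # per-defect cost times annual volume

def total_cost(scale, failure_rate):
    improvement = C_IMPROVE * (SCALE0 / scale - 1.0)  # grows steeply as scale -> 0
    return improvement + C_DEFECT * failure_rate(scale)

def normal_failure(scale):
    return norm.sf((USL - MU) / scale)  # P(X > USL), X ~ N(MU, scale^2)

def expon_failure(scale):
    return exp(-USL / scale)            # P(X > USL), X ~ Exp(mean = scale)

for name, rate in [("normal", normal_failure), ("exponential", expon_failure)]:
    res = minimize_scalar(lambda s: total_cost(s, rate),
                          bounds=(0.1, SCALE0), method="bounded")
    print(f"{name:11s}: optimal scale {res.x:.2f} "
          f"({(1 - res.x / SCALE0):.0%} variation reduction)")
```

With these hypothetical figures the normal model prescribes only a few percent of variation reduction while the exponential model prescribes roughly twenty percent, so a planner who assumed normality would underfund the improvement effort, the effect the paper's models formalize.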