August 2012
This month's newsletter is the first in a multi-part series on using the ANOVA method for an ANOVA Gage R&R study. This method simply uses analysis of variance to analyze the results of a gage R&R study instead of the classical average and range method. The two methods do not generate the same results, but they will (in most cases) be similar.
This newsletter focuses on part of the ANOVA table and how it is developed for the Gage R&R study. In particular, it focuses on the sum of squares and degrees of freedom. Many people do not understand how the calculations work or what information is contained in the sum of squares and the degrees of freedom. In the next few issues, we will put together the rest of the ANOVA table and complete the Gage R&R calculations.
In this issue:
- Sources of Variation
- Example Data
- The ANOVA Table for Gage R&R
- The ANOVA Results
- Total Sum of Squares and Degrees of Freedom
- Operator Sum of Squares and Degrees of Freedom
- Parts Sum of Squares and Degrees of Freedom
- Equipment (Within) Sum of Squares and Degrees of Freedom
- Interaction Sum of Squares and Degrees of Freedom
- Summary
- Quick Links
Any gage R&R study is a study of variation. This means you have to have variation in the results. On occasion, I get a phone call from a customer wondering why their Gage R&R study is not giving them any useful information. And, in looking at the results, I discover that each result is the same - for each part and for each operator. There is no variation. I am asked - Isn't it good that there is no variation in the results? No, not in a gage R&R study. It means that the measurement process cannot tell the difference between the samples. So remember, a gage R&R study is a study in variation - this means that there must be variation.
If you are not familiar with how to conduct a Gage R&R study, please see our December 2007 newsletter. This newsletter also includes how to analyze the results using the average and range method.
As usual, please feel free to leave comments at the end of the newsletter.
Sources of Variation
Suppose you are monitoring a process by pulling samples of the product at some regular interval and measuring one critical quality characteristic (X). Obviously, you will not always get the same result when you measure X. Why not? There are many sources of variation in the process. However, these sources can be grouped into three categories:
- variation due to the process itself
- variation due to sampling
- variation due to the measurement system
These three components of variation are related by the following:

σ_{t}^{2} = σ_{p}^{2} + σ_{s}^{2} + σ_{ms}^{2}

where σ_{t}^{2} is the total process variance; σ_{p}^{2} is the process variance; σ_{s}^{2} is the sampling variance; and σ_{ms}^{2} is the measurement system variance. Note that the relationship is linear in terms of the variance (which is the square of the standard deviation), not the standard deviation.
For our purposes here, we will ignore the variance due to sampling (or, more correctly, just include it as part of the process itself). However, for some processes, sampling variation can greatly impact the results. Thus, we will consider the total variance to be:

σ_{t}^{2} = σ_{p}^{2} + σ_{ms}^{2}
Remember geometry? The right triangle? The Pythagorean Theorem? The above equation can be represented by the triangle below.
The total standard deviation, σ_{t}, for a measurement is equal to the length of the hypotenuse. The process standard deviation, σ_{p}, is equal to the length of one side of the triangle and the measurement system standard deviation, σ_{ms}, is equal to the length of the remaining side.
You can easily see from this triangle what happens as the variation in the process and measurement system changes. If the process standard deviation is larger than the measurement system standard deviation, it will have the larger impact on the total standard deviation. However, if the measurement system standard deviation becomes too large, it will begin to have the largest impact.
Thus, the objective of improving a measurement system is to minimize the % variance due to the measurement system:
% Variance due to measurement system = 100(σ_{ms}^{2}/σ_{t}^{2})
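As a quick numeric illustration of this formula, here is a minimal sketch. The variance values and the function name are made up for the example, not taken from the study data:

```python
# Percent of total variance consumed by the measurement system.
# The variance values below are illustrative, not from the study data.
def pct_variance_ms(var_process: float, var_ms: float) -> float:
    """Return 100 * var_ms / var_total, where var_total = var_process + var_ms."""
    var_total = var_process + var_ms
    return 100.0 * var_ms / var_total

# Roughly 10% of the total variance comes from the measurement system here.
print(pct_variance_ms(0.9, 0.1))
```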
The gage R&R study focuses on σ_{ms}^{2}. In a gage R&R study, you can break down σ_{ms}^{2} into its two components:

σ_{ms}^{2} = σ_{repeatability}^{2} + σ_{reproducibility}^{2}
Repeatability is the ability of the measurement system to repeat the same measurements on the same sample under the same conditions. It represents an assessment of the ability to get the same measurement result each time.
Reproducibility is the ability of the measurement system to return consistent measurements while varying the measurement conditions (different operators, different parts, etc.). It represents an assessment of the ability to reproduce the measurements of other operators.
In this series, we will take a look at how the repeatability and reproducibility are determined using the ANOVA method for Gage R&R.
Example Data
We will re-use the data from our December 2007 newsletter on the average and range method for Gage R&R. In this example, there were three operators who tested five parts three times. A picture of part of the Gage R&R design is shown below.
Operator 1 tests five parts three times each. In the figure above, you can see that Operator 1 has tested Part 1 three times. What is the source of variation in these three trials? It is the measurement equipment itself. The operator is the same and the part is the same. The variation in these three trials is a measure of the repeatability. It is also called the equipment variation in Gage R&R studies or the "within" variation in ANOVA studies.
Operator 1 also runs Parts 2 through 5 three times each. The variation in those results includes the variation due to the parts as well as the equipment variation. Operators 2 and 3 also test the same five parts three times each. The variation in all the results includes the equipment variation, the part variation, the operator variation, and the interaction between operators and parts. The variation due to the operators and the operator-by-part interaction makes up the reproducibility.
The data from the December 2007 newsletter are shown in the table below.
| Operator | Part | Trial 1 | Trial 2 | Trial 3 |
| --- | --- | --- | --- | --- |
| A | 1 | 3.29 | 3.41 | 3.64 |
| A | 2 | 2.44 | 2.32 | 2.42 |
| A | 3 | 4.34 | 4.17 | 4.27 |
| A | 4 | 3.47 | 3.50 | 3.64 |
| A | 5 | 2.20 | 2.08 | 2.16 |
| B | 1 | 3.08 | 3.25 | 3.07 |
| B | 2 | 2.53 | 1.78 | 2.32 |
| B | 3 | 4.19 | 3.94 | 4.34 |
| B | 4 | 3.01 | 4.03 | 3.20 |
| B | 5 | 2.44 | 1.80 | 1.72 |
| C | 1 | 3.04 | 2.89 | 2.85 |
| C | 2 | 1.62 | 1.87 | 2.04 |
| C | 3 | 3.88 | 4.09 | 3.67 |
| C | 4 | 3.14 | 3.20 | 3.11 |
| C | 5 | 1.54 | 1.93 | 1.55 |
The operator is listed in the first column and the part numbers in the second column. The next three columns contain the results of the three trials for that operator and part number. For example, the three trial results for Operator A and Part 1 are 3.29, 3.41 and 3.64.
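For readers who want to follow the calculations in code, the table above can be held in a small Python structure. This is just a sketch; the `trials` dictionary layout and variable names are my own choice, not part of the article:

```python
# Trial results keyed by (operator, part); each cell holds the three trials.
trials = {
    ("A", 1): [3.29, 3.41, 3.64], ("A", 2): [2.44, 2.32, 2.42],
    ("A", 3): [4.34, 4.17, 4.27], ("A", 4): [3.47, 3.50, 3.64],
    ("A", 5): [2.20, 2.08, 2.16],
    ("B", 1): [3.08, 3.25, 3.07], ("B", 2): [2.53, 1.78, 2.32],
    ("B", 3): [4.19, 3.94, 4.34], ("B", 4): [3.01, 4.03, 3.20],
    ("B", 5): [2.44, 1.80, 1.72],
    ("C", 1): [3.04, 2.89, 2.85], ("C", 2): [1.62, 1.87, 2.04],
    ("C", 3): [3.88, 4.09, 3.67], ("C", 4): [3.14, 3.20, 3.11],
    ("C", 5): [1.54, 1.93, 1.55],
}

k = len({op for op, _ in trials})    # 3 operators
n = len({p for _, p in trials})      # 5 parts
r = len(trials[("A", 1)])            # 3 trials per operator-part cell
all_results = [x for cell in trials.values() for x in cell]
print(k, n, r, len(all_results))     # -> 3 5 3 45
```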
We will now take a look at the ANOVA table, which is used as a starting point for analyzing the results.
The ANOVA Table for Gage R&R
In most cases, you will use computer software to do the calculations. Since this is a relatively simple Gage R&R, we will show how the calculations are done; this helps you understand the process better. The software usually displays the results in an ANOVA table. The basic ANOVA table is shown in the table below for the following:
- k = number of operators
- r = number of replications
- n = number of parts
The first column is the source of variability. Remember that a Gage R&R study is a study of variation. There are five sources of variability in this ANOVA approach: the operator, the part, the interaction between the operator and part, the equipment and the total.
The second column is the degrees of freedom associated with the source of variation. The degrees of freedom are simply the number of values of a statistic that are free to vary. For example, suppose you have a sample that contains n observations. We use the sample to estimate something - usually an average. When we want to estimate something, it costs us one degree of freedom. So, if we have n observations and want to estimate the average, then we have n - 1 degrees of freedom left.
The third column is the sum of squares (SS) associated with the source of variation. The sum of squares is a measure of variation. It measures the squared deviations around an average. Remember the equation for the variance? The variance of a set of numbers is given by:

s^{2} = Σ(X_{i} - X̄)^{2}/(n - 1)
The sum of squares for the source of variation is very similar to the numerator. You just take the sum of squares around different averages depending on the source of variation.
The fourth column is the mean square associated with the source of variation. The mean square is the estimate of the variance for that source of variability based on the amount of data we have (the degrees of freedom). So, the mean square is the sum of squares divided by the degrees of freedom. Note the similarity to the formula for the variance above.
The fifth column is the F value. This is the statistic that is calculated to determine if the source of variability is statistically significant. It is the ratio of two variances (or mean squares in this case).
The ANOVA Results
The data above were analyzed using the SPC for Excel software. The resulting ANOVA table is shown below.
| Source | df | SS | MS | F | p Value |
| --- | --- | --- | --- | --- | --- |
| Operator | 2 | 1.630 | 0.815 | 100.322 | 0.0000 |
| Part | 4 | 28.909 | 7.227 | 889.458 | 0.0000 |
| Operator by Part | 8 | 0.065 | 0.008 | 0.142 | 0.9964 |
| Equipment | 30 | 1.712 | 0.057 | | |
| Total | 44 | 32.317 | | | |
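The last three columns of the table above can be reproduced from the df and SS columns. The sketch below assumes the error terms implied by the F values in the table: operator and part are tested against the interaction mean square, and the interaction against the equipment (within) mean square. The results agree with the table to rounding, since the SS values here are rounded:

```python
# Rebuild MS and F from the df and SS columns of the ANOVA table.
ss = {"Operator": 1.630, "Part": 28.909, "Operator by Part": 0.065, "Equipment": 1.712}
df = {"Operator": 2, "Part": 4, "Operator by Part": 8, "Equipment": 30}
ms = {src: ss[src] / df[src] for src in ss}           # mean square = SS / df

# Operator and part are tested against the interaction mean square;
# the interaction is tested against the equipment mean square.
f_operator = ms["Operator"] / ms["Operator by Part"]
f_part = ms["Part"] / ms["Operator by Part"]
f_interaction = ms["Operator by Part"] / ms["Equipment"]
print(round(f_operator, 1), round(f_part, 1), round(f_interaction, 3))
# -> 100.3 889.5 0.142
```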
Let's see where the numbers come from.
Total Sum of Squares and Degrees of Freedom
The total sum of squares (SS_{T}) is the sum of the other sources of variability. So,
SS_{T} = SS_{O} + SS_{P} + SS_{O*P} + SS_{E}
The total sum of squares is the squared deviation of each individual result from the overall average - the average of all results. The overall average of the 45 results is:

X̄ = 132.47/45 = 2.9438
The total sum of squares is then given by:

SS_{T} = Σ_{i}Σ_{j}Σ_{m}(X_{ijm} - X̄)^{2}

where X_{ijm} is the result for the i^{th} operator running the j^{th} part for the m^{th} trial. This equation is simply a fancy way of saying that you subtract the overall average from an individual result and square that difference. This is shown in the figure below for the squared deviation of the first result.
If you do this for each point and add up the results, you will obtain the following:
SS_{T} = 32.317
The calculations are shown in the table below.
| Operator | Part | Trial 1 | Trial 2 | Trial 3 | Squared Deviation Trial 1 | Squared Deviation Trial 2 | Squared Deviation Trial 3 |
| --- | --- | --- | --- | --- | --- | --- | --- |
| A | 1 | 3.29 | 3.41 | 3.64 | 0.120 | 0.217 | 0.485 |
| A | 2 | 2.44 | 2.32 | 2.42 | 0.254 | 0.389 | 0.274 |
| A | 3 | 4.34 | 4.17 | 4.27 | 1.949 | 1.504 | 1.759 |
| A | 4 | 3.47 | 3.50 | 3.64 | 0.277 | 0.309 | 0.485 |
| A | 5 | 2.20 | 2.08 | 2.16 | 0.553 | 0.746 | 0.614 |
| B | 1 | 3.08 | 3.25 | 3.07 | 0.019 | 0.094 | 0.016 |
| B | 2 | 2.53 | 1.78 | 2.32 | 0.171 | 1.354 | 0.389 |
| B | 3 | 4.19 | 3.94 | 4.34 | 1.553 | 0.992 | 1.949 |
| B | 4 | 3.01 | 4.03 | 3.20 | 0.004 | 1.180 | 0.066 |
| B | 5 | 2.44 | 1.80 | 1.72 | 0.254 | 1.308 | 1.498 |
| C | 1 | 3.04 | 2.89 | 2.85 | 0.009 | 0.003 | 0.009 |
| C | 2 | 1.62 | 1.87 | 2.04 | 1.752 | 1.153 | 0.817 |
| C | 3 | 3.88 | 4.09 | 3.67 | 0.877 | 1.314 | 0.527 |
| C | 4 | 3.14 | 3.20 | 3.11 | 0.039 | 0.066 | 0.028 |
| C | 5 | 1.54 | 1.93 | 1.55 | 1.971 | 1.028 | 1.943 |
| Sum | | | | | | | 32.317 |
There were a total of 45 results. We calculated the overall average for these results. So the degrees of freedom associated with the total sum of squares are 45 - 1 = 44. This can also be calculated as nkr - 1.
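The total sum of squares and its degrees of freedom can be sketched in a few lines of Python. The data structure is repeated here so the sketch is self-contained (its layout is my own, not from the article); the results match the table to rounding:

```python
# Total sum of squares: squared deviation of all 45 results from the
# overall average, with degrees of freedom nkr - 1.
trials = {
    ("A", 1): [3.29, 3.41, 3.64], ("A", 2): [2.44, 2.32, 2.42],
    ("A", 3): [4.34, 4.17, 4.27], ("A", 4): [3.47, 3.50, 3.64],
    ("A", 5): [2.20, 2.08, 2.16],
    ("B", 1): [3.08, 3.25, 3.07], ("B", 2): [2.53, 1.78, 2.32],
    ("B", 3): [4.19, 3.94, 4.34], ("B", 4): [3.01, 4.03, 3.20],
    ("B", 5): [2.44, 1.80, 1.72],
    ("C", 1): [3.04, 2.89, 2.85], ("C", 2): [1.62, 1.87, 2.04],
    ("C", 3): [3.88, 4.09, 3.67], ("C", 4): [3.14, 3.20, 3.11],
    ("C", 5): [1.54, 1.93, 1.55],
}
all_results = [x for cell in trials.values() for x in cell]

grand_mean = sum(all_results) / len(all_results)
ss_total = sum((x - grand_mean) ** 2 for x in all_results)
df_total = len(all_results) - 1            # nkr - 1 = 44
print(f"{grand_mean:.4f}", f"{ss_total:.3f}", df_total)  # -> 2.9438 32.317 44
```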
Operator Sum of Squares and Degrees of Freedom
As mentioned before, you obtain a sum of squares by determining the squared deviations between two numbers. With the operator source of variability, you take the squared deviations between the operator averages and the overall average. Algebraically, this is given by:

SS_{O} = nrΣ_{i}(X̄_{i..} - X̄)^{2}

where nr represents the number of results for operator i and the "i.." subscript means the average over all parts and trials for operator i.
In this example, n = 5 and r = 3, so there are 15 results for each operator. The table below shows how the calculations are done:
| Operator | Part | Trial 1 | Trial 2 | Trial 3 | Operator Average | Squared Deviation for Operator |
| --- | --- | --- | --- | --- | --- | --- |
| A | 1 | 3.29 | 3.41 | 3.64 | 3.1567 | 0.0453 |
| A | 2 | 2.44 | 2.32 | 2.42 | | |
| A | 3 | 4.34 | 4.17 | 4.27 | | |
| A | 4 | 3.47 | 3.50 | 3.64 | | |
| A | 5 | 2.20 | 2.08 | 2.16 | | |
| B | 1 | 3.08 | 3.25 | 3.07 | 2.9800 | 0.0013 |
| B | 2 | 2.53 | 1.78 | 2.32 | | |
| B | 3 | 4.19 | 3.94 | 4.34 | | |
| B | 4 | 3.01 | 4.03 | 3.20 | | |
| B | 5 | 2.44 | 1.80 | 1.72 | | |
| C | 1 | 3.04 | 2.89 | 2.85 | 2.6947 | 0.0621 |
| C | 2 | 1.62 | 1.87 | 2.04 | | |
| C | 3 | 3.88 | 4.09 | 3.67 | | |
| C | 4 | 3.14 | 3.20 | 3.11 | | |
| C | 5 | 1.54 | 1.93 | 1.55 | | |
| Sum of Deviations | | | | | | 0.1087 |
| 15 × (Sum of Deviations) | | | | | | 1.6304 |
Thus,
SS_{O} = 1.6304
So, you can see that the sum of squares due to the operators is based on how the operator averages deviate from the overall average. There are three operator averages. Since we calculated the overall average, we lost one degree of freedom. The degrees of freedom associated with the operators are 3 - 1 = 2, or k - 1 = 2.
The variability chart below shows the results by operator by part. The horizontal blue line is the average for the operator. The horizontal green line is the overall average. The difference between those two lines is the deviation.
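The operator calculation above can be sketched in code as well. The data layout is mine; the printed value matches the 15 × (sum of deviations) figure in the table:

```python
# Operator sum of squares: nr times the squared deviation of each
# operator average from the overall average.
trials = {
    ("A", 1): [3.29, 3.41, 3.64], ("A", 2): [2.44, 2.32, 2.42],
    ("A", 3): [4.34, 4.17, 4.27], ("A", 4): [3.47, 3.50, 3.64],
    ("A", 5): [2.20, 2.08, 2.16],
    ("B", 1): [3.08, 3.25, 3.07], ("B", 2): [2.53, 1.78, 2.32],
    ("B", 3): [4.19, 3.94, 4.34], ("B", 4): [3.01, 4.03, 3.20],
    ("B", 5): [2.44, 1.80, 1.72],
    ("C", 1): [3.04, 2.89, 2.85], ("C", 2): [1.62, 1.87, 2.04],
    ("C", 3): [3.88, 4.09, 3.67], ("C", 4): [3.14, 3.20, 3.11],
    ("C", 5): [1.54, 1.93, 1.55],
}
n, r = 5, 3                                # parts, trials per part
all_results = [x for cell in trials.values() for x in cell]
grand_mean = sum(all_results) / len(all_results)

# Average of the 15 results (5 parts x 3 trials) for each operator.
op_means = {
    op: sum(x for (o, _), cell in trials.items() if o == op for x in cell) / (n * r)
    for op in "ABC"
}
ss_operator = n * r * sum((m - grand_mean) ** 2 for m in op_means.values())
df_operator = len(op_means) - 1            # k - 1 = 2
print(f"{ss_operator:.4f}", df_operator)   # -> 1.6304 2
```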
Parts Sum of Squares and Degrees of Freedom
The sum of squares due to the parts is computed in the same manner as for the operators, except the averages you are focusing on are the part averages. Algebraically, the equation for SS_{P} is:

SS_{P} = krΣ_{j}(X̄_{.j.} - X̄)^{2}

where kr is the number of results for a given part (3 operators times 3 trials) and the ".j." subscript means the average of the results for part j across all operators and trials. The table below shows the calculations. The original data have been sorted by part.
| Part | Trial 1 | Trial 2 | Trial 3 | Part Average | Squared Deviation for Part |
| --- | --- | --- | --- | --- | --- |
| 1 | 3.29 | 3.41 | 3.64 | 3.1689 | 0.0507 |
| 1 | 3.08 | 3.25 | 3.07 | | |
| 1 | 3.04 | 2.89 | 2.85 | | |
| 2 | 2.44 | 2.32 | 2.42 | 2.1489 | 0.6318 |
| 2 | 2.53 | 1.78 | 2.32 | | |
| 2 | 1.62 | 1.87 | 2.04 | | |
| 3 | 4.34 | 4.17 | 4.27 | 4.0989 | 1.3343 |
| 3 | 4.19 | 3.94 | 4.34 | | |
| 3 | 3.88 | 4.09 | 3.67 | | |
| 4 | 3.47 | 3.50 | 3.64 | 3.3667 | 0.1788 |
| 4 | 3.01 | 4.03 | 3.20 | | |
| 4 | 3.14 | 3.20 | 3.11 | | |
| 5 | 2.20 | 2.08 | 2.16 | 1.9356 | 1.0165 |
| 5 | 2.44 | 1.80 | 1.72 | | |
| 5 | 1.54 | 1.93 | 1.55 | | |
| Sum of Deviations | | | | | 3.2122 |
| 9 × (Sum of Deviations) | | | | | 28.9094 |
Thus,
SS_{P} = 28.9094
Again, you can see how the sum of squares due to the parts is based on how the part averages deviate from the overall average. There are five parts. Again, we calculated the overall average, so one degree of freedom is lost. There are n - 1 = 5 - 1 = 4 degrees of freedom associated with the parts sum of squares.
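The same sketch works for the parts, swapping the operator averages for the part averages (data layout again my own; the printed value matches the 9 × (sum of deviations) figure):

```python
# Part sum of squares: kr times the squared deviation of each
# part average from the overall average.
trials = {
    ("A", 1): [3.29, 3.41, 3.64], ("A", 2): [2.44, 2.32, 2.42],
    ("A", 3): [4.34, 4.17, 4.27], ("A", 4): [3.47, 3.50, 3.64],
    ("A", 5): [2.20, 2.08, 2.16],
    ("B", 1): [3.08, 3.25, 3.07], ("B", 2): [2.53, 1.78, 2.32],
    ("B", 3): [4.19, 3.94, 4.34], ("B", 4): [3.01, 4.03, 3.20],
    ("B", 5): [2.44, 1.80, 1.72],
    ("C", 1): [3.04, 2.89, 2.85], ("C", 2): [1.62, 1.87, 2.04],
    ("C", 3): [3.88, 4.09, 3.67], ("C", 4): [3.14, 3.20, 3.11],
    ("C", 5): [1.54, 1.93, 1.55],
}
k, r = 3, 3                                # operators, trials per part
all_results = [x for cell in trials.values() for x in cell]
grand_mean = sum(all_results) / len(all_results)

# Average of the 9 results (3 operators x 3 trials) for each part.
part_means = {
    p: sum(x for (_, q), cell in trials.items() if q == p for x in cell) / (k * r)
    for p in range(1, 6)
}
ss_part = k * r * sum((m - grand_mean) ** 2 for m in part_means.values())
df_part = len(part_means) - 1              # n - 1 = 4
print(f"{ss_part:.4f}", df_part)           # -> 28.9094 4
```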
Equipment (Within) Sum of Squares and Degrees of Freedom
The equipment sum of squares uses the deviation of the three trials for a given part and a given operator from the average for that part and operator. This can be expressed as:

SS_{E} = Σ_{i}Σ_{j}Σ_{m}(X_{ijm} - X̄_{ij.})^{2}

where the "ij." subscript means the average of the trials for operator i and part j.
The calculations are shown in the table below.
| Operator | Part | Trial 1 | Trial 2 | Trial 3 | Average of 3 Trials | Squared Deviation Trial 1 | Squared Deviation Trial 2 | Squared Deviation Trial 3 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| A | 1 | 3.29 | 3.41 | 3.64 | 3.447 | 0.025 | 0.001 | 0.037 |
| A | 2 | 2.44 | 2.32 | 2.42 | 2.393 | 0.002 | 0.005 | 0.001 |
| A | 3 | 4.34 | 4.17 | 4.27 | 4.260 | 0.006 | 0.008 | 0.000 |
| A | 4 | 3.47 | 3.50 | 3.64 | 3.537 | 0.004 | 0.001 | 0.011 |
| A | 5 | 2.20 | 2.08 | 2.16 | 2.147 | 0.003 | 0.004 | 0.000 |
| B | 1 | 3.08 | 3.25 | 3.07 | 3.133 | 0.003 | 0.014 | 0.004 |
| B | 2 | 2.53 | 1.78 | 2.32 | 2.210 | 0.102 | 0.185 | 0.012 |
| B | 3 | 4.19 | 3.94 | 4.34 | 4.157 | 0.001 | 0.047 | 0.034 |
| B | 4 | 3.01 | 4.03 | 3.20 | 3.413 | 0.163 | 0.380 | 0.046 |
| B | 5 | 2.44 | 1.80 | 1.72 | 1.987 | 0.206 | 0.035 | 0.071 |
| C | 1 | 3.04 | 2.89 | 2.85 | 2.927 | 0.013 | 0.001 | 0.006 |
| C | 2 | 1.62 | 1.87 | 2.04 | 1.843 | 0.050 | 0.001 | 0.039 |
| C | 3 | 3.88 | 4.09 | 3.67 | 3.880 | 0.000 | 0.044 | 0.044 |
| C | 4 | 3.14 | 3.20 | 3.11 | 3.150 | 0.000 | 0.003 | 0.002 |
| C | 5 | 1.54 | 1.93 | 1.55 | 1.673 | 0.018 | 0.066 | 0.015 |
| Sum | | | | | | | | 1.712 |
Thus,
SS_{E} = 1.712
Again, note that the sum of squares is examining variation around an average. For the within variation, it is the variation in the three trials around the average of those three trials.
We calculated an average for each set of three trials, so we lost one degree of freedom for each set, or r - 1. There were nk sets of three trials, so the degrees of freedom associated with the equipment variation are nk(r - 1) = 30.
The variability chart below shows the results by part by operator.
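The within calculation above can be sketched the same way: each trial is compared to the average of its own operator-part cell (data layout my own; the printed value matches the table's sum):

```python
# Equipment (within) sum of squares: deviation of each trial from the
# average of the three trials in its own operator-part cell.
trials = {
    ("A", 1): [3.29, 3.41, 3.64], ("A", 2): [2.44, 2.32, 2.42],
    ("A", 3): [4.34, 4.17, 4.27], ("A", 4): [3.47, 3.50, 3.64],
    ("A", 5): [2.20, 2.08, 2.16],
    ("B", 1): [3.08, 3.25, 3.07], ("B", 2): [2.53, 1.78, 2.32],
    ("B", 3): [4.19, 3.94, 4.34], ("B", 4): [3.01, 4.03, 3.20],
    ("B", 5): [2.44, 1.80, 1.72],
    ("C", 1): [3.04, 2.89, 2.85], ("C", 2): [1.62, 1.87, 2.04],
    ("C", 3): [3.88, 4.09, 3.67], ("C", 4): [3.14, 3.20, 3.11],
    ("C", 5): [1.54, 1.93, 1.55],
}
r = 3                                       # trials per operator-part cell

ss_equipment = sum(
    (x - sum(cell) / r) ** 2
    for cell in trials.values()
    for x in cell
)
df_equipment = len(trials) * (r - 1)        # nk(r - 1) = 30
print(f"{ss_equipment:.3f}", df_equipment)  # -> 1.712 30
```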
Interaction Sum of Squares and Degrees of Freedom
We will make use of the equality stated earlier to find the interaction sum of squares. This equality was:

SS_{T} = SS_{O} + SS_{P} + SS_{O*P} + SS_{E}

Rearranging to solve for the interaction term:

SS_{O*P} = SS_{T} - (SS_{O} + SS_{P} + SS_{E})
SS_{O*P} = 32.317 - (1.63 + 28.909 + 1.712)
SS_{O*P} = 0.065
The same equality holds for the degrees of freedom:

df_{O*P} = df_{T} - (df_{O} + df_{P} + df_{E})
df_{O*P} = 44 - (2 + 4 + 30)
df_{O*P} = 8
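Putting the pieces together, the sketch below recomputes every sum of squares from the raw data and recovers the interaction term by subtraction, as in the derivation above. The data layout and names are my own; results match the ANOVA table to rounding:

```python
# Recompute all sums of squares from the raw data, then obtain the
# interaction sum of squares and degrees of freedom by subtraction.
trials = {
    ("A", 1): [3.29, 3.41, 3.64], ("A", 2): [2.44, 2.32, 2.42],
    ("A", 3): [4.34, 4.17, 4.27], ("A", 4): [3.47, 3.50, 3.64],
    ("A", 5): [2.20, 2.08, 2.16],
    ("B", 1): [3.08, 3.25, 3.07], ("B", 2): [2.53, 1.78, 2.32],
    ("B", 3): [4.19, 3.94, 4.34], ("B", 4): [3.01, 4.03, 3.20],
    ("B", 5): [2.44, 1.80, 1.72],
    ("C", 1): [3.04, 2.89, 2.85], ("C", 2): [1.62, 1.87, 2.04],
    ("C", 3): [3.88, 4.09, 3.67], ("C", 4): [3.14, 3.20, 3.11],
    ("C", 5): [1.54, 1.93, 1.55],
}
k, n, r = 3, 5, 3                          # operators, parts, trials
all_results = [x for cell in trials.values() for x in cell]
grand = sum(all_results) / len(all_results)

ss_total = sum((x - grand) ** 2 for x in all_results)
ss_operator = n * r * sum(
    (sum(x for (o, _), c in trials.items() if o == op for x in c) / (n * r) - grand) ** 2
    for op in "ABC"
)
ss_part = k * r * sum(
    (sum(x for (_, q), c in trials.items() if q == p for x in c) / (k * r) - grand) ** 2
    for p in range(1, 6)
)
ss_equipment = sum((x - sum(c) / r) ** 2 for c in trials.values() for x in c)

# Interaction by subtraction, for both SS and df.
ss_interaction = ss_total - (ss_operator + ss_part + ss_equipment)
df_interaction = (n * k * r - 1) - ((k - 1) + (n - 1) + n * k * (r - 1))
print(f"{ss_interaction:.3f}", df_interaction)  # -> 0.065 8
```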
Summary
This is the first of a multi-part series on using ANOVA to analyze a Gage R&R study. It focused on providing a detailed explanation of how the calculations are done for the sum of squares and degrees of freedom. We will finish out the ANOVA table as well as complete the Gage R&R calculations in the coming issues.
Quick Links
SPC for Excel Software - Version 5 is now available!
Thanks so much for reading our publication. We hope you find it informative and useful. Happy charting and may the data always support your position.
Sincerely,
Dr. Bill McNeese
BPI Consulting, LLC