
Some CoP statistics

The CoP is intended to inject a certain amount of uniformity and objectivity into the judging process. Our main statistical tool for deciding whether this goal is being met is called "Analysis of Variance."

Here is a simplified example. There are two skaters and three judges. The scores were

Skater A: 2 3 4 (average 3)
Skater B: 4 5 6 (average 5)

The grand mean of all of these six numbers is 4. But some of the scores are higher than 4 and some are lower. Why?

There are two factors. First, the scores might be different because the performances of the skaters are different: Skater B was better in this competition.

Secondly, the scores might be different because the criteria of the judges are different: Judge #1 appears to be stingier with his/her marks across the board, and Judge #3 is more generous.

The technique of Analysis of Variance assigns numbers to gauge the relative importance of these two sources of variation. First we compute the total variation. For each of the six scores, how far is it from the average value of 4? Square the differences and add them up:

Sum of squares (total) = 4+1+0+0+1+4 = 10 units of variation in all.
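
If you would like to check that arithmetic yourself, here is a minimal Python sketch; it uses nothing beyond the six scores in the example.

import numpy as np

# Scores from the example: rows are Skaters A and B, columns are Judges 1-3
scores = np.array([[2.0, 3.0, 4.0],
                   [4.0, 5.0, 6.0]])

grand_mean = scores.mean()                      # 4.0
ss_total = ((scores - grand_mean) ** 2).sum()   # 4+1+0+0+1+4 = 10.0
print(grand_mean, ss_total)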

To measure how much of this variation is due to the actual skating performances, we eliminate the differences in the judges by replacing each judge's score with the average for that skater.

Skater A: 3 3 3
Skater B: 5 5 5

Now the variation among the scores (the sum of the squared differences between each score and the grand average of 4) is

Sum of squares (Skaters) = 1+1+1+1+1+1 = 6

Thus 6 units of variation, or 60% of the total, reflect the fact that Skater B really did outperform Skater A.

But how much variation is there among the three judges? This time we are judging the judges, not the skaters, so we replace the scores by the average for each judge.

Skater A: 3 4 5
Skater B: 3 4 5

Sum of squares (judges) = 1+0+1+1+0+1 = 4 units of variation.

So in this example 60% of the total variation is correlated with the actual skating performances and 40% with the personal peccadilloes of the judges.
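
Continuing the little Python sketch from above (the setup is repeated so this runs on its own), the skater and judge sums of squares and the two percentages come out the same way:

import numpy as np

scores = np.array([[2.0, 3.0, 4.0],    # Skater A
                   [4.0, 5.0, 6.0]])   # Skater B
grand_mean = scores.mean()                         # 4.0
ss_total = ((scores - grand_mean) ** 2).sum()      # 10.0

n_skaters, n_judges = scores.shape
skater_means = scores.mean(axis=1)                 # [3.0, 5.0]
judge_means = scores.mean(axis=0)                  # [3.0, 4.0, 5.0]

# Replace each score by the relevant average, then sum the squared deviations
ss_skaters = n_judges * ((skater_means - grand_mean) ** 2).sum()   # 6.0
ss_judges = n_skaters * ((judge_means - grand_mean) ** 2).sum()    # 4.0

print(100 * ss_skaters / ss_total, 100 * ss_judges / ss_total)     # 60.0 40.0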

In a perfectly objective scoring system we would have 100% skaters, 0% judges.
 
So how does it work out in practice?

Just for fun I did an Analysis of Variance on the total component scores for the top four ladies in the Euros LP. Since we are in part judging the judging panel, I included the scores of all twelve judges.

The results were:

Total variation -----------------------190 units
Variation due to SKATERS---------89
Variation due to JUDGES-----------35
Variation due to all other factors---66

The "all other factors" part is called "sampling error" (statistical noise).

These raw figures, however, are somewhat misleading because they are based on different sample sizes (4 skaters versus 12 judges). After compensating for this, the comparable numbers are

Mean square (SKATERS) = 30
Mean square (JUDGES) = 3
Mean square (Error) = 2
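
For anyone checking the arithmetic: the standard compensation is to divide each sum of squares by its degrees of freedom (the number of skaters minus one, the number of judges minus one, and the product of those two for the error term). A quick Python sketch using only the figures quoted above:

# Mean square = sum of squares / degrees of freedom
n_skaters, n_judges = 4, 12
ss_skaters, ss_judges, ss_error = 89, 35, 66    # sums of squares quoted above

df_skaters = n_skaters - 1                      # 3
df_judges = n_judges - 1                        # 11
df_error = df_skaters * df_judges               # 33

print(ss_skaters / df_skaters)   # 29.7, about 30
print(ss_judges / df_judges)     # 3.2, about 3
print(ss_error / df_error)       # 2.0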


So the conclusion is, unless the judges are colluding or simply recording pre-determined evaluations, the CoP seems to be doing its job of rewarding the best performances.

Mathman:)
 

ChiSk8Fan

ANOVA

ANOVA is a well-known, traditionally used statistical tool taught in the very first courses in statistics. It is as basic as the mean, median, mode, standard deviation, chi-square and other standard techniques. Software packages perform ANOVA on numerical data easily.
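
For example, in Python the whole two-way analysis takes only a few lines with the statsmodels package. The little data set below is made up purely to show the mechanics (the skater and judge labels are placeholders, not real protocol data):

import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Made-up scores in "long" form: one row per (skater, judge) mark
data = pd.DataFrame({
    "skater": ["A", "A", "A", "B", "B", "B"],
    "judge":  ["J1", "J2", "J3", "J1", "J2", "J3"],
    "score":  [5.00, 5.25, 5.50, 5.75, 6.00, 6.50],
})

# Two-way ANOVA: how much of the variation goes with skaters vs. judges
model = ols("score ~ C(skater) + C(judge)", data=data).fit()
print(sm.stats.anova_lm(model, typ=2))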

Since this is SO basic, monitoring how the CoP is functioning should be easy, provided the ISU has bothered to consult a statistician trained in even the basics of the field. But it doesn't seem to be doing this simple work to refine the system, at least not publicly.

An ANOVA of the CoP should be presented, in all its objectivity, to the ISU and all member federations, and a vote taken on whether to change the system or keep it, with the goal of increasing the share of variance due to the skaters' performances and decreasing the share due to the judges.

This is so basic, and the case for it so obvious, that as a fan and a student I find it maddening that they don't do it to make our sport fair.
 
I have always assumed that the ISU is in fact keeping tons of statistics and running them through all sorts of analyses. For instance, it is certainly easy enough to test whether the scores of two or more judges have a suspiciously high correlation. I fear that this information will never be made public.
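
For what it is worth, that kind of check is nearly a one-liner. With each judge's scores in a column, the pairwise correlations fall right out; the numbers below are invented for illustration, not real protocol data:

import numpy as np

# Invented scores: rows = skaters, columns = judges
scores = np.array([
    [7.00, 7.25, 7.00, 6.75],
    [6.50, 6.50, 6.75, 7.25],
    [6.00, 6.25, 6.00, 6.50],
    [5.50, 5.75, 5.50, 6.00],
])

# Correlation between every pair of judges (columns). A value very close
# to 1.0 for one particular pair, event after event, would be the red flag.
print(np.round(np.corrcoef(scores, rowvar=False), 2))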

IMHO this shroud of secrecy over the inner workings of the entire skating world is a public relations disaster for the ISU and the whole sport.

MM
 

Doggygirl

Thanks Mathman!!

I always find your analysis interesting. You already know that my fondest hopes are that CoP is working - on both judging and political fronts. (eternal optimist) I just find this information hopeful. :)

Now we know what Mathman does on Superbowl Sunday. :)

DG
 