Historical Chatterbowl Dominance Report

For the longest time I have been searching for a way to measure relative performance across the entire league, and from one season to the next. Fantasy football performance can’t be directly compared year on year because the source of the point scoring, the NFL, has so much variance. League-wide trends mean points scoring, both in real life and in fantasy leagues, can vary significantly from one year to the next. One season an average team may score a century every other week; the next season an average team scores one only every six weeks. A team in the second of those two seasons may totally dominate the league, but their numbers look very ordinary compared to teams in the first season. So how do you compare them and determine the best?

Finally, I think I have a reliable system to compare teams year on year. Let me explain the maths…

In any given week there are 16 scores produced by the teams in the Chatterbowl. These obviously vary, sometimes hugely. The first step of this methodology is to calculate the standard deviation of this collection of 16 scores. Thanks to Microsoft Excel, this is as simple as using the formula =STDEV([range of scores]). This gives you a figure which, over the past 6 seasons of the Chatterbowl, typically comes in somewhere between 15 and 25, though occasionally outside this range. The higher the number, the more variability there is across the 16 scores.
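
If you would rather do this outside Excel, here is a minimal Python sketch of step one. The scores are invented purely for illustration; note that statistics.stdev matches Excel’s STDEV, which uses the sample (n-1) formula.

```python
from statistics import stdev  # sample standard deviation, same formula as Excel's STDEV

# One week's 16 Chatterbowl scores (invented for illustration, not real data)
scores = [92, 74, 101, 63, 88, 79, 110, 55, 97, 70, 84, 66, 91, 77, 103, 59]

week_sd = stdev(scores)
print(f"{week_sd:.2f}")  # the higher this is, the more the week's scores varied
```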

To explain: one standard deviation (what is calculated above) is the range above and below the mean within which you would expect 68.2% of all scores to fall. Two standard deviations is used in a lot of statistical models because it covers approximately 95% of scores, but we don’t need to do that for our purposes. Instead, we go on to step two…

Step two is to take an individual team’s score, subtract that week’s average score from it, and then divide the result by the standard deviation calculated in step one, above.

This step essentially calculates how many standard deviations away from the mean the given team’s score was that week. If the standard deviation were exactly 20 and a team scored 20 more than the mean then the score calculated would be 1.00. However, if another team scored 10 below the mean, the score calculated would be -0.50. And so on.

In fact, here’s a real-world example. In week 1 of the 2017 season the average score was 74.44 and the standard deviation was 20.97. Chris Hill scored lowest with 43 points, which is 31.44 below the mean and gives a score of -1.50, while the top score was Steve Smith’s 117, which is 42.56 above the mean and gives a score of 2.03.
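
The same calculation as a quick Python sketch, using the week 1 2017 figures quoted above (the function name is just mine for illustration):

```python
def weekly_rating(score: float, week_mean: float, week_sd: float) -> float:
    """How many standard deviations a score sits above or below that week's mean."""
    return (score - week_mean) / week_sd

# Week 1 of the 2017 season, figures as quoted above
week_mean, week_sd = 74.44, 20.97
print(f"{weekly_rating(43, week_mean, week_sd):.2f}")   # Chris Hill:  -1.50
print(f"{weekly_rating(117, week_mean, week_sd):.2f}")  # Steve Smith:  2.03
```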

Step three is to do this for every week of the season and then take an average of all the weeks. Finally, take the square root of this average and you have a final score or rating (for a below-average team that weekly average comes out negative, so the root is taken of the absolute value and the minus sign kept, which is how the negative ratings further down exist). Now, full disclosure here: I do not now recall the reason for taking the square root, though I am sure someone more statistically sound than I will be able to tell you where this element of the methodology comes from. Honestly though, there is a proper reason for doing this.
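
Putting steps one to three together, here is a sketch of the season rating. Two details are my inference rather than anything stated above: the square root must be sign-preserving (root the absolute value and keep the minus sign), or the negative ratings in the tables below couldn’t exist; and the full-season figures appear to weight a 13-week regular season against a 3-week playoff, since that reproduces the published numbers (Chris Braithwaite’s 2014 line, for example):

```python
import math
from statistics import mean

def season_rating(weekly_ratings: list[float]) -> float:
    """Average a team's weekly ratings, then take a sign-preserving square root."""
    avg = mean(weekly_ratings)
    return math.copysign(math.sqrt(abs(avg)), avg)

# Sanity check against Chris Braithwaite 2014 (regular 1.036, playoffs -0.026).
# Squaring a rating recovers that period's underlying weekly average; weight
# 13 regular weeks against 3 playoff weeks, then re-root.
reg, post = 1.036, -0.026
full_avg = (13 * math.copysign(reg**2, reg) + 3 * math.copysign(post**2, post)) / 16
print(f"{math.copysign(math.sqrt(abs(full_avg)), full_avg):.3f}")  # 0.934, as in the table below
```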

I have done this in three different ways – for the regular season, for the playoffs, and for the entire season as a whole – and have done it for each season the Chatterbowl has been running, since 2012. Some interesting scores have come out of this. In total there have been 92 team seasons completed so far (12 teams in 2012 plus 16 in each of the five seasons since), and the top ten regular season scores are as follows:

Rank  GM & Year               Regular Season Score
  1   Chris Braithwaite 2014  1.036
  2   Mat Ward 2017           0.827
  3   Neil Hawke 2017         0.824
  4   David Slater 2013       0.821
  5   Chris Braithwaite 2013  0.811
  6   Ben Hendy 2017          0.809
  7   David Slater 2015       0.795
  8   Jamie Blair 2015        0.735
  9   Chris Braithwaite 2012  0.716
 10   Dan Smith 2013          0.685

Some interesting things here:

  • Chris Braithwaite’s 2014 season was astonishingly consistent – during the regular season his score was never below zero. However, he truly shat the bed in the playoffs, scoring -0.73 in the first round.
  • None of these teams won the Chatterbowl.
  • Three of the top 6 teams came from 2017, which was by far the worst scoring season so far. The average weekly score in 2017 was 78.85 – the previous worst was 83.08 in 2016, while the highest was 86.26 in 2012 (the single 12-team season, which would be expected to have a higher average points total).

The table below gives you the regular season performance of each Chatterbowl winner (champions are marked with a C here and in the tables that follow):

Rank  GM & Year               Regular Season Score
 11   C – Ben Hendy 2014       0.675
 18   C – Max Cubberley 2012   0.614
 31   C – David Slater 2016    0.364
 36   C – James Goodson 2017   0.348
 38   C – Pete Conaghan 2013   0.295
 49   C – Ben Hendy 2015      -0.191

The playoffs, unsurprisingly, tend to show slightly better performance from the eventual champions and, with only 3 games to take into account, are liable to higher scores overall.

Rank  GM & Year               Playoffs Score
  1   C – Ben Hendy 2015      1.360
  2   C – Ben Hendy 2014      1.300
  3   Chris Braithwaite 2012  1.215
  4   Dan Smith 2013          1.210
  5   Max Cubberley 2015      1.171
  6   Jay Kelly 2014          1.021
  7   C – David Slater 2016   1.012
  8   Jay Kelly 2017          0.947
  9   Dan Sayles 2013         0.926
 10   Mat Ward 2017           0.924

Of those seasons, only Ben Hendy 2015 and Jay Kelly 2014 had regular season performance worse than zero. The full list of champions is below.

Rank  GM & Year               Playoffs Score
  1   C – Ben Hendy 2015      1.360
  2   C – Ben Hendy 2014      1.300
  7   C – David Slater 2016   1.012
 11   C – Pete Conaghan 2013  0.923
 12   C – Max Cubberley 2012  0.920
 22   C – James Goodson 2017  0.681

Next, overall performance for the full season – regular season and playoffs combined, with the split shown:

Rank  GM & Year               Regular  Playoffs   Full
  1   Chris Braithwaite 2014    1.036    -0.026  0.934
  2   Mat Ward 2017             0.827     0.924  0.848
  3   David Slater 2013         0.821     0.914  0.839
  4   Chris Braithwaite 2012    0.716     1.215  0.832
  5   C – Ben Hendy 2014        0.675     1.300  0.829
  6   Neil Hawke 2017           0.824     0.816  0.822
  7   Dan Smith 2013            0.685     1.210  0.810
  8   David Slater 2015         0.795     0.638  0.768
  9   Jamie Blair 2015          0.735     0.709  0.730
 10   Chris Braithwaite 2013    0.811    -0.346  0.715

And for the Champions alone:

Rank  GM & Year               Regular  Playoffs   Full
  5   C – Ben Hendy 2014        0.675     1.300  0.829
 12   C – Max Cubberley 2012    0.614     0.920  0.681
 19   C – Ben Hendy 2015       -0.191     1.360  0.563
 23   C – David Slater 2016     0.364     1.012  0.548
 28   C – Pete Conaghan 2013    0.295     0.923  0.480
 33   C – James Goodson 2017    0.348     0.681  0.430

And finally, while I don’t want to dwell on it, I know you’ll all want to know about the 10 worst seasons ever (2013 was not a good year), so without comment, here they are:

Rank  GM & Year               Regular  Playoffs    Full
 83   Jay Kelly 2016            -0.775     0.118  -0.697
 84   Neil Hawke 2015           -0.733    -0.541  -0.701
 85   Jay Kelly 2013            -0.684    -0.970  -0.746
 86   Chris Hill 2015           -0.801    -0.624  -0.771
 87   Max Cubberley 2016        -0.672    -1.184  -0.794
 88   Geoffrey Manboob 2013     -0.916    -0.538  -0.858
 89   Philip Malcolm 2013       -0.824    -1.033  -0.867
 90   Ben Hendy 2013            -0.850    -1.024  -0.885
 91   Philip Malcolm 2014       -0.929    -0.959  -0.935
 92   Jamie Blair 2017          -0.940    -1.286  -1.014

Oh, and this data is available here: Chatterbowl Database Download

Commish

