SuperCoach Points Distribution

Written by Schwarzwalder on February 4 2019

(Written & Created By AllSaints)

Dunno about everyone else, but I’ve often wondered whether certain sides are more ‘SC-friendly’ than others when it comes to taking a greater share of SC points than the result of the actual game would indicate. Well, some are! And this piece is about just that. So what are we gonna look at?

1. How the points get divvied up

2. Identifying which clubs were and weren’t SC-friendly in 2018

Ladder position vs SC pts position

3. Despite losing the AFL game, who manages to win the SC pts game and

4. Despite winning the AFL game, who manages to lose the SC pts game

5. Ave. scores for winning sides and losing sides

What are the implications for sides we expect to improve/worsen in 2019?

6. Correlation between Round winners %ages and the SC scores of winning sides, and finally

7. Gini-coefficients by club (more on this later)

* please note, in the in-depth analysis I have only looked at individual games up to and including Rd14, that is 13 games per side

 

Introduction

As many SCTers are aware, the Supercoach Gods scale the scores at the end of each AFL game to get as near as dammit to a total of 3,300 SCpts, to divvy up between the two sides. The average for the entire 2018 season was 3300.008 points per game! Like I say, near as dammit.
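Championship Data’s actual scaling method isn’t public, so treat this as a minimal sketch of the effect described above: a simple proportional rescale of every player’s raw score so the rounded total lands on the fixed pool. The function name and toy numbers are mine, not the real formula.

```python
def scale_to_pool(raw_scores, pool=3300):
    """Proportionally rescale the 44 players' raw scores so the
    rounded total lands on (or very near) the fixed points pool."""
    factor = pool / sum(raw_scores)
    return [round(s * factor) for s in raw_scores]

# Toy round: 44 raw scores that total 3740 before scaling
raw = [80 + (i % 11) for i in range(44)]
scaled = scale_to_pool(raw)
print(sum(scaled))  # lands on, or within a point or two of, 3300
```

Rounding each of 44 players individually is why finished games drift a point or two either side of 3300, exactly as the season data shows.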

This averages out to 1650pts, per side, per game but of course it never works out that way.

There are very few games (less than 10) that err more than two points away from this figure, with one game delivering a high of 3304 (ADE v GWS which was a ripper) and one game a low of 3296 (we don’t need to go there).

The biggest margin in a single game was GWS v GCS in Rd12 (2001pts to 1299). Interestingly GWS also achieved the second highest score (1943) in their trouncing of the Dogs (1356) in Rd1!

The biggest reverse margin in a game was RIC v COL in Rd6. Richmond beat Collingwood 113-70, BUT the SC points were spectacularly different: RIC 1563 lost to COL 1734, which brings us nicely on to the next piece …

 

Winners & Losers

The following sides gained the majority of SC pts, on multiple occasions, despite losing the corresponding AFL game: WBD (three times), COL, GEE and BRL (twice each), and STK (once, plus in the draw with GWS)

The following sides forwent the majority of SC points, on multiple occasions, despite winning the game: RIC, WCE, SYD and GEE (all twice)

The Cats are the only side to appear on both lists.

These are specific events, but to see if there are underlying trends we need to look at the bigger picture. The chart below details all the summary data used for this analysis; let’s take a closer look:

 

I’ve ranked the sides based on their SC ave. scores relative to their ladder position, for all games up to and including Rd14. The best performers (up a min. of three places) are: GWS (five places), COL, PTA and MEL (all up 3 spots). The worst performers are: RIC (down 8 places), WCE (6), NTH and GCS (both 3). There’s a lot more to see here. To make it easier to read, all figures in green are positive for SC and all figures in red (or orange) are negative, relative to each other.

 

Winning and Losing Averages

The average SC score of a winning team in Rds1-14 was: 1762.7

The average SC score of a losing team in Rds1-14 was: 1537.3

That’s a 225.4 pts differential, on average. That equates to an average of 10.25pts/player (+5 for the winning side and -5 for the losers).
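The per-player arithmetic above checks out against the league-wide average of 75 points per player (3300 pts split across 44 players). A quick worked version of the numbers in this section:

```python
win_avg, lose_avg = 1762.7, 1537.3   # ave. SC team scores, Rds 1-14
diff = win_avg - lose_avg            # 225.4 pts between the two teams
per_player = diff / 22               # ~10.25 pts per player
league_avg = 3300 / 44               # 75.0 pts per player overall
print(win_avg / 22 - league_avg)     # winners sit roughly +5 above
print(lose_avg / 22 - league_avg)    # losers sit roughly -5 below
```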

The chart below shows the correlation between winning margins (measured in %ages, as per the AFL ladder) by Round, against the ave. SC scores of the winning teams that week. To see if there is a correlation, I then ranked the data by largest SC ave. scores of winning teams against the %ages by which they won the AFL game in the corresponding Round.

The green line is the distribution of the AFL % figures which, compared to the SC scores index (blue line), does suggest a correlation, but there are significant extremes either way, so there might not be much in this. Most statisticians would likely take the two extremes (highest and lowest) out of the analysis (admittedly with a lot more data to hand); if we were to do that, the correlation would be compelling. What does that all mean then? Basically that, on average, a higher winning %age generally does deliver a greater share of SC points, but there are exceptions.
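For anyone wanting to firm up the eyeball test, a rank correlation is the natural tool here, since the chart compares rankings rather than raw values. A minimal sketch of Spearman’s rho (my own helper, assuming no tied values):

```python
def spearman_rho(xs, ys):
    """Spearman rank correlation: correlate the *ranks* of two
    series (as the chart does informally). Assumes no tied values."""
    def ranks(vals):
        order = sorted(range(len(vals)), key=lambda i: vals[i])
        r = [0.0] * len(vals)
        for rank, i in enumerate(order, 1):
            r[i] = float(rank)
        return r

    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    mean = (n + 1) / 2  # mean of ranks 1..n
    cov = sum((a - mean) * (b - mean) for a, b in zip(rx, ry))
    var = sum((a - mean) ** 2 for a in rx)
    return cov / var  # +1 = same ordering, -1 = reversed ordering

# e.g. winning %age by round vs winners' ave. SC score that round
print(spearman_rho([1, 2, 3, 4], [10, 20, 30, 40]))  # 1.0
```

A rho near +1 would say higher winning percentages really do travel with bigger SC shares; a value near zero would confirm the “extremes either way” caveat.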

 

Gini-coefficients

Back in the days of studying development economics, we would do in-depth analyses of income inequality, by continent and country. To do this you would use something called a Gini coefficient, which essentially measures the degree of asymmetry in income levels across an entire population. The higher the coefficient, the greater the degree of asymmetry, or inequality.

So, what does that mean for SuperCoach scores? Well, let’s call our teams the countries and their players the population. Let’s call it the Hagman Index (genie? lame at all?!?). For this analysis, I have only included players who played 7 or more games, to minimise any skew in the data.

What the below Hagman indices tell us is that teams like Collingwood and Hawthorn have a greater number of higher and lower SC scorers than the average (and relatively few in between), while North and the Lions have a much more even distribution of points across their playing lists. In short, decent sides on the left-hand side of the below chart are likely to deliver consistently high scorers (plural) every week, relative to those on the right-hand side.
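For the curious, the standard Gini calculation behind the Hagman Index can be sketched like this. The club numbers below are made up purely for illustration, not real 2018 data:

```python
def gini(scores):
    """Gini coefficient via the sorted-values identity:
    0 = perfectly even scoring; higher = more inequality."""
    scores = sorted(scores)
    n, total = len(scores), sum(scores)
    weighted = sum(i * x for i, x in enumerate(scores, 1))
    return (2 * weighted) / (n * total) - (n + 1) / n

# Hypothetical weekly spreads across a 22-player team:
even_club = [75] * 22                              # everyone on 75
uneven_club = [130, 125, 115, 30, 25] + [75] * 17  # extremes both ways
print(gini(even_club))    # 0.0
print(gini(uneven_club))  # noticeably higher
```

A COL-style profile (big scores and duds, few in between) pushes the index up; a NTH-style profile (everyone near the team mean) pulls it toward zero, which matches how the chart separates the clubs.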

I think I might have overthunk this! Let’s just sum up.

 

Conclusion

What the analysis does clearly indicate is:

- There is a definite correlation between winning AFL games and SC points earned
  - Likely an ave. 10 pts/player (+/- per side)
  - So teams who improve in 2019 should score more SC points, and those on the slide might earn less.

- I would add that the SC-friendly teams listed below will likely gain more than 10 pts/player on average, while the unfriendly sides will likely claim less (so look out)

- There are definitely sides that are more SC-friendly than the average, namely:
  - COL, GWS, MEL, PTA, WBD, STK and BRL

- And there are definitely sides who are not so friendly:
  - RIC, WCE, NTH and SYD

- The rest should tend to the norm.

Hope it proves of some value. Happy planning SCTers!!

 


12 thoughts on “SuperCoach Points Distribution”

  1. That’s excellent Allsaints! I’m starting to cast some doubts over Anthony Miles’ output in a seemingly poor GCS side. His 95+ averages in 2014 & 2015 came when Richmond made the 8 whilst his average dropped to 89 in 2016 when they finished 13th. With less points to go around I feel an 85-90 average is looking more likely than 95-100. Anyone else with me on this?

    1. My concerns for Miles are quite different, Ash: that he will be the same as Barlow or Lyons at the Suns. A good scorer, but not playing every week!

    2. Thanks Ash. I’m less and less convinced by Miles, BUT I still think he’s a value pick. If he averages 88, he’s gonna make you $100k+ and likely give you more pts/wk than a rookie.
      Whilst I take Duffer’s points on board, I’d have thought he’ll have a more permanent role, with the younger kids more likely to be rotated/rested. If he can average 90+ in the first 6-7games, job done. Can’t see him being left out that early in the season.
      For me, his selection or not, will be based purely on the number of decent mature-aged rookies available in the Midfield come Rd1. 6 or less, yes, 7 or more, no.

    3. Hey Ash

      Just looking at this analysis again, I don’t think your concerns about Miles need worry you too much. If you look at GCS’ Hagman Index, like COL they tend to have more SC outputs at the extremes. So while they may not get as many pts each week as other teams, we can be sure they will have a fair number of woeful scorers, but there are also likely to be a few high-scorers who stand out from the pack each week too. While the names may change each week, you might expect Miles to be one of the more permanent fixtures in that mix.

      Hope that helps!

  2. Great analysis AS. Given there is always going to be some subjective assessment of each SuperCoach scoring event in AFL games, does your research suggest that the people at Championship Data may be knowingly or unknowingly biased? i.e. was that a hitout to advantage or not, was that a contested ball?

    1. It doesn’t unfortunately. Perhaps we can start a weekly thread on “Flavours of CD” to accompany the brilliant “Flavour of the Week” piece.
      People can add in this what they noticed from CD over the last weekend’s Round.
      e.g. Hutta stating that the Bont used to get pts for tying his boot-laces! Players like Macrae and Grundy (less so) seemed last year to get a bunch of pts over and above the norm on numerous occasions. It might help us all to identify positive/negative bias and therefore potential targets/no-no’s. There always seem to be comments in specific match-day threads, but to have a designated spot for this type of discussion could prove valuable to all SCTers.
      Just a thought bubble.

  3. Nope I don’t see it.
    “What it tells us is that decent sides who appear on the left-hand side of the below chart are likely to deliver consistently high scorers (plural), every week, relative to those on the right-hand side.”
    “There are definitely sides that are more SC-friendly than the average, namely COL, GWS, MEL, PTA, WBD, STK and BRL”
    BRL is 2nd from bottom on the right (the not-friendly side); STK is 5th from top on the left (the friendly side)???

    1. Hey Kev

      There are essentially two separate analyses. The first (with the colour excel chart) is explaining how some teams have a greater propensity for SC points than their relative AFL results suggest. Similarly, there are others that don’t.

      The ones that are SC-good (relative to actual match results), ie have a particularly SC-friendly game, are:
      WBD, COL, BRL, STK, GWS, PTA and MEL (HAW maybe too).
      The ones that are SC-bad (relative to AFL results) are:
      RIC, WCE, NTH and GCS
      These are ranked in order of SC-relevance (ie Dogs are best, Tigers are worst).

      The second analysis is based on the GINI-coefficient (and I would put a lot less weight on these results). This analysis looks at how points are shared within each team and has NO BEARING on how good or bad they are at Supercoach. The higher the ‘Hagman’ value is, is simply telling us that there is much higher inequality within that team’s weekly scoring. For example, the analysis shows that COL tend to have comparatively more high AND low SC scorers each week and comparatively few in that middle scoring bracket (60-80), when compared to all other teams. At the other extreme, NTH will share the SC points each week much more evenly across their 22 players (ie far less extreme high and low scorers). This analysis is independent of how relatively good or bad a team is at scoring SC points, it is simply trying to convey HOW their points tend to be shared across their playing list each week.

      I hope that helps you make a bit more sense of it.

      1. Thanks mate, I understand it a bit better now. Think I must just stick to a player’s recent historical form for now; that’s gotten me a couple of top-100 finishes in the past five years. I think the best way for me to improve my consistency is to break my early-season trade addiction.

  4. Below are the AVE. SC scores by club for their entire seasons (incl. finals), ranked by SC ave.

    COL 1729.9
    MEL 1708.7
    GEE 1699.3
    PTA 1696.5
    HAW 1695.8
    GWS 1690.9
    ESS 1681
    WCE 1681
    SYD 1670.3
    ADE 1662.6
    RIC 1657
    BRL 1635.7
    NTH 1632.1
    WBD 1629.1
    STK 1610.5
    FRE 1583
    CAR 1515.8
    GCS 1482.5

    PLEASE NOTE that these stats are sourced from footywire, whereas my initial analysis of the first 14 rounds’ data was taken DIRECTLY from SuperCoach (post-scaling) itself.
    I have noticed doing other analyses that there can be anomalies/discrepancies across the two sources.

  5. Great work, Allsaints.

    One thing I would like to point out to people though, is that a lot of the ±10 point effects of winning/losing will likely be concentrated in the middle-to-lower rungs of players, rather than at the top end.

    Furthermore, the impact of their team’s performance on their SuperCoach output is also likely to depend on their role; because of the way the system is designed, there is a lot of emphasis on on-ball production, so inside midfielders are unlikely to be affected. Forwards, whose supply depends on said midfielders, are likely to be affected, and there could also be an inverse effect on intercept/rebounding defenders, with more of the ball in their part of the ground likely to translate into bigger scores (although in the case of intercept players, this is somewhat dependent on there still being sufficient pressure up the ground to give them a chance).

  6. Hey AllSaints, good work, but I’m not sure that you covered off point 1. ‘How the points get divvied up”.

    So there is a fixed number of points allocated per game .. there are fixed points allocated to specific actions per player.. So how is the adjustment made .. is it a % across all players or across a specific score type? And do they round up or down.. would be an interesting formula… or do they just make arbitrary adjustments based on perceptions..
    food for thought ..

