Rank Instability on OGS

Hi @AdamR

There could be an issue with the way OGS calculates the deviation of player ratings. This results in large fluctuations of player ranks.

I found this because I wanted to understand Glicko2 in depth and to get a better insight into how OGS implements it.

In the plot below, you can see my rating as shown in the OGS rating history (upper black line)
and the rating as calculated by me (upper green line).
The lines at the bottom of the graph are the corresponding deviations.

The red lines are again my implementation, but with the deviation forced to match the OGS values.

As you can see, the green rating line is much more stable than the OGS one, while the red line closely follows OGS. The deviation in the OGS data is much higher than mine (roughly a factor of 2; I didn't compute the exact number).
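To give a feel for why the deviation matters so much, here is a rough sketch (not my actual analysis script) of how the size of a single rating update scales with the deviation. It follows the update formulas in glicko2.pdf, but holds the volatility σ fixed at 0.06 and assumes a single win against an equally rated opponent, so the numbers are only illustrative:

```python
import math

SCALE = 173.7178  # Glicko-2 scale factor from the paper

def glicko2_win_step(rd, opp_rd, vol=0.06):
    """Rating gain (in rating points) for one win against an equally
    rated opponent, with the volatility held fixed for simplicity."""
    phi, phi_j = rd / SCALE, opp_rd / SCALE
    g = 1 / math.sqrt(1 + 3 * phi_j**2 / math.pi**2)
    e = 0.5                       # expected score vs. an equal-rated opponent
    v = 1 / (g**2 * e * (1 - e))  # estimated variance of the game outcome
    phi_star = math.sqrt(phi**2 + vol**2)
    phi_new = 1 / math.sqrt(1 / phi_star**2 + 1 / v)
    return phi_new**2 * g * (1 - e) * SCALE

low = glicko2_win_step(rd=60, opp_rd=60)    # a "settled" deviation
high = glicko2_win_step(rd=120, opp_rd=60)  # a doubled deviation
print(round(low, 1), round(high, 1))        # roughly 10.2 vs 36.8 points
```

So doubling the deviation more than triples the per-game rating swing, which is consistent with the red line bouncing around while the green line stays calm.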

I hope this helps to pin down the possible issue with the unstable ranks. Sorry for the necro.


Implementation details:

I implemented Glicko2 as described in the Glicko2 paper http://www.glicko.net/glicko/glicko2.pdf. I tested my implementation against this one https://github.com/sublee/glicko2 and found no discrepancies.
All the OGS data I used (player ratings, opponent ratings, game results) comes from https://online-go.com/termination-api/player/449941/rating-history.
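For reference, here is a condensed sketch of the per-period update along the lines of what I implemented, following steps 2 through 8 of glicko2.pdf (the system constant τ = 0.5 below is the paper's example value, not necessarily what OGS uses):

```python
import math

SCALE = 173.7178  # Glicko-2 <-> Glicko-1 scale factor
TAU = 0.5         # system constant; the paper's example value, OGS may differ
EPS = 1e-6

def g(phi):
    return 1 / math.sqrt(1 + 3 * phi**2 / math.pi**2)

def expected(mu, mu_j, phi_j):
    return 1 / (1 + math.exp(-g(phi_j) * (mu - mu_j)))

def update(r, rd, sigma, games):
    """One rating period. `games` is a list of
    (opponent_rating, opponent_rd, score) tuples, score in {0, 0.5, 1}."""
    mu, phi = (r - 1500) / SCALE, rd / SCALE
    opp = [((rj - 1500) / SCALE, rdj / SCALE, s) for rj, rdj, s in games]
    # steps 3-4: outcome variance and the improvement estimate
    v = 1 / sum(g(pj)**2 * expected(mu, mj, pj) * (1 - expected(mu, mj, pj))
                for mj, pj, _ in opp)
    delta = v * sum(g(pj) * (s - expected(mu, mj, pj)) for mj, pj, s in opp)
    # step 5: new volatility via the paper's iterative procedure
    a = math.log(sigma**2)
    def f(x):
        ex = math.exp(x)
        return (ex * (delta**2 - phi**2 - v - ex)
                / (2 * (phi**2 + v + ex)**2) - (x - a) / TAU)
    A = a
    if delta**2 > phi**2 + v:
        B = math.log(delta**2 - phi**2 - v)
    else:
        k = 1
        while f(a - k * TAU) < 0:
            k += 1
        B = a - k * TAU
    fA, fB = f(A), f(B)
    while abs(B - A) > EPS:
        C = A + (A - B) * fA / (fB - fA)
        fC = f(C)
        if fC * fB < 0:
            A, fA = B, fB
        else:
            fA /= 2
        B, fB = C, fC
    sigma_new = math.exp(A / 2)
    # steps 6-8: new deviation and rating, back on the Glicko-1 scale
    phi_star = math.sqrt(phi**2 + sigma_new**2)
    phi_new = 1 / math.sqrt(1 / phi_star**2 + 1 / v)
    mu_new = mu + phi_new**2 * sum(g(pj) * (s - expected(mu, mj, pj))
                                   for mj, pj, s in opp)
    return mu_new * SCALE + 1500, phi_new * SCALE, sigma_new

# the paper's worked example: three games in one rating period
r, rd, s = update(1500, 200, 0.06,
                  [(1400, 30, 1), (1550, 100, 0), (1700, 300, 0)])
# should land near the paper's result: r' ~ 1464.06, RD' ~ 151.52
```

Reproducing the paper's worked example like this is how I convinced myself the implementation is correct before comparing against the OGS data.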

If you need more details, just ask. I can also send you my Python scripts.
