
Sunday, 29 March 2015

Laser Tag Rating System

This is a rough outline of an idea that Phil Ophus and I had. We want a rating system for laser tag. We want it to be a cumulative reflection of a player's skill over multiple games. We want a way to compare the laser tag skill of players who have a substantial play record but haven't necessarily played against each other in the same match. In short, we want an Elo or TrueSkill style metric that is shorthand for "this player is X skilled at laser tag".
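
For concreteness, here is a minimal sketch of the standard Elo update for a single head-to-head result. The K-factor of 32 and the 400-point logistic scale are the usual chess defaults; any laser tag version would need its own tuning.

def elo_update(rating_a, rating_b, score_a, k=32):
    """Standard Elo update for one head-to-head result.
    score_a is 1 if A won, 0 if A lost, 0.5 for a draw."""
    # Expected score for A under the logistic model with a 400-point scale
    expected_a = 1.0 / (1.0 + 10 ** ((rating_b - rating_a) / 400.0))
    # Move each rating toward the observed result, scaled by k
    new_a = rating_a + k * (score_a - expected_a)
    new_b = rating_b + k * ((1 - score_a) - (1 - expected_a))
    return new_a, new_b

# Example: a 1500-rated player beats a 1600-rated player and gains about 20 points
print(elo_update(1500, 1600, 1))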

However, laser tag has less standardization than the sports and games that these metrics are usually applied to.


As it is now, single-game score cards don't give any context for a player's skill beyond the single match displayed. Scores are generally higher in matches with many players and in longer 'ironman' matches. By these factors alone, a score of 50,000 can demonstrate just as much achievement against equally skilled opponents as a score of 100,000.
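
One crude way to put raw scores from different matches on a common footing, sketched here just to illustrate the idea (within-match z-scores are our own choice here, not an established method), is to standardize each score against the other scores in the same match before any rating update:

import numpy as np

def standardize_match(scores):
    # Convert raw scores from one match into within-match z-scores,
    # so match length and player count no longer inflate the numbers.
    scores = np.asarray(scores, dtype=float)
    return (scores - scores.mean()) / scores.std(ddof=1)

# The same relative performance in a short match and a longer 'ironman' match
print(standardize_match([50000, 40000, 30000]))   # [ 1.  0. -1.]
print(standardize_match([100000, 80000, 60000]))  # [ 1.  0. -1.]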


There are strategic factors that affect score but don't fairly reflect skill, such as targeting mostly weak players, or running around aggressively rather than protecting the base. Besides within-game luck, more random noise comes from equipment effects: some guns are in better shape than others.


Scores are naturally different between free-for-all and team games, and some players do better in one format than another. However, it's reasonable to assume a strong enough association between a given player's skill across formats that one format can inform the other.

All of this variation, and this is only from a single location: Planet Lazer near Braid Station, New Westminster. At this location, the scoring system rewards hitting much more than it penalizes being hit. Also, every target on a vest is worth the same amount, although this isn't necessarily true at other locations.

We want a rating method that can be used to compare players in different arenas that may be using different rules, ideally something that lets anyone see how well they stack up at a regional or even worldwide level. However, even if we restrict attention to places that use comparable equipment, the arenas are substantially different, whereas in many other sports the arena effect is negligible. The rules and scoring systems also differ from place to place.

Our intuition, and a short train ride's worth of literature searching, suggest that no such system yet exists that can handle the non-uniform, free-for-all situations of laser tag. I'm hoping that further developing the cricket simulator to handle data from cricket players who compete in multiple formats for multiple teams in a single year will be a useful starting point.
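
One way a rating system could cope with free-for-all matches, a common trick in multiplayer rating systems that we sketch here with invented players and scores, is to decompose each scoreboard into implied head-to-head results and feed those into an Elo-style update like the one above:

from itertools import combinations

def pairwise_results(scoreboard):
    # Turn one free-for-all scoreboard, {player: score}, into implied
    # head-to-head outcomes: the higher scorer gets 1, ties count as 0.5.
    results = []
    for a, b in combinations(scoreboard, 2):
        if scoreboard[a] == scoreboard[b]:
            results.append((a, b, 0.5))
        else:
            winner, loser = (a, b) if scoreboard[a] > scoreboard[b] else (b, a)
            results.append((winner, loser, 1.0))
    return results

# One four-player match becomes six pairwise comparisons
print(pairwise_results({"Ana": 52000, "Ben": 47000, "Cam": 47000, "Dee": 31000}))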

On the sampling design and logistics side, there are issues with data collection. What if a location's equipment doesn't record a key variable? How long is data retained? Are there enough returning players? It seems like the next step is to draw up a research proposal, bring it to Planet Lazer, and see if they would let us record their player data in that way.

For after the thesis, if at all.

1 comment:

  1. Good summary, I think you hit all the main points we talked about.

    I was thinking it might also be interesting to try something on a smaller scale, local to our group. We wouldn't have quantity, but we could make up for it with the quality of data we could capture (i.e. greater depth, as detailed as we like).

    Maybe something with board games. I wonder if it's possible to have a viable system that spans multiple different games (assuming they are at least somewhat related). Perhaps we could break each game down into the skills it requires, and the degree to which it requires them, and go from there.

    I haven't looked into this at all so maybe something like this already exists, but I think it could be useful. It seems intuitively obvious that someone who is good at one game of a certain type will be good at other games of that type, but what about games of other types? What if those games have partial overlap in the skills required, or no overlap at all?

    For example, based on the skills required by Dominion (e.g. strategy, adaptability, math), we can say that a good Dominion player will probably also be good at similar games like MtG or 7 Wonders, but how good will he be at something like Coup (bluffing, perception, memory)? What about something that has elements of both, like poker?
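
    As a toy illustration of that idea (the skill categories and weights below are completely made up), each game could be described as a weighted skill profile, and games compared by something like cosine similarity:

    import numpy as np

    # Invented weights over (strategy, adaptability, math, bluffing, perception, memory)
    games = {
        "Dominion":  np.array([0.9, 0.6, 0.8, 0.0, 0.1, 0.3]),
        "7 Wonders": np.array([0.8, 0.5, 0.7, 0.1, 0.2, 0.2]),
        "Coup":      np.array([0.3, 0.4, 0.1, 0.9, 0.8, 0.5]),
        "Poker":     np.array([0.6, 0.5, 0.7, 0.8, 0.6, 0.4]),
    }

    def similarity(a, b):
        # Cosine similarity between two skill profiles
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    for name in ("7 Wonders", "Coup", "Poker"):
        print(name, round(similarity(games["Dominion"], games[name]), 2))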

    Just a random idea that might be easier to play around with considering the accessibility of data.
