Monday, March 28, 2005

Curmudgeon Gamer on Reader Reviews

The piece is titled GameSpot vs. the Public, and the initial post poses a guessing game between two distribution graphs: one of GameSpot's professional review scores and the other of average user review scores for the same games. The post was later updated, revealing that the graph with the lower average and more normal distribution belonged to the paid writers. What did the author take away from all this? "The GameSpot reviewers may be more reasonable on average than I would give them credit for, but given how highly the users rate games, I'm guessing that mostly-positive reviews is really what the readers want."

The problem with letting anyone review a game is that anyone can review a game, no matter what other games they've played or how much experience they have. Heck, they don't even have to have played the game in question to offer up a "10.0 BEST GAME EVER" review. In a triumph of technology over logic, some outlets even allow readers to post their thoughts on a game before it comes out, leading to countless reader "reviews" written entirely in the future tense ("This game is gonna be so awesome! The graphics look sweet and I bet the gameplay will be OK").

User reviews also suffer from an extreme self-selection bias. Most readers aren't going to take the time to post their thoughts on a game unless they really love it or really hate it (who's going to take time out of their day to say a game is just OK?). The split between glowing and scathing reviews isn't even, either... people are more likely to seek out information about games they already like, and thus more likely to submit a review for that game while they're there. The net result is that reader reviews tend to skew high.
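
To make that skew concrete, here's a minimal simulation (a sketch in Python; the "true" opinion distribution and the posting probabilities are invented numbers, not real data) of a population whose honest average opinion of a game is a middling 6, but where only the people with strong feelings bother to post:

```python
import random

random.seed(42)

# Assumed "true" opinions: every player privately scores the game 0-10,
# centered on a middling 6. All of these numbers are invented for illustration.
players = [min(10.0, max(0.0, random.gauss(6, 2))) for _ in range(100_000)]

def submits_review(score):
    """Self-selection: lovers and haters post, lukewarm players mostly don't.
    Lovers are assumed likelier to post than haters, since fans seek out
    coverage of games they already like."""
    if score >= 8:
        return random.random() < 0.30   # the enthusiasts
    if score <= 3:
        return random.random() < 0.15   # the burned
    return random.random() < 0.02       # the "it's just OK" crowd

posted = [s for s in players if submits_review(s)]

print(f"True average opinion:  {sum(players) / len(players):.2f}")
print(f"Average posted score:  {sum(posted) / len(posted):.2f}")
```

The posted average comes out well above the true average, and collecting more reviews doesn't fix it, because the bias is in who posts, not in how many.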

If reader reviews were really useful to the average reader, professional game journalists would be out of a job. Luckily, most reader reviews aren't worth the silicon they're encoded on.

6 comments:

  1. I just finished reading James Surowiecki's _Wisdom of Crowds_, and I wonder if gamers collectively couldn't judge the quality of a game better than a single reviewer?

    I would suggest that most readers have accepted an inflated curve for most games, no doubt imposed by game reviewers or maybe even by scholastic grades. A 6 would be a really crappy game (60% - right?), and a 7 might be what should really rank as a 5: average.

    You're right, Kyle - surely there are readers who review with extreme prejudice. But after reading _Wisdom_, I wonder if something like the Hollywood Stock Exchange for video games couldn't help shed some light on which games will do well and which won't. My guess is that gamers would be surprisingly accurate at predicting the quality of a game.

  2. And, I might add, in discerning the overall quality. Most reader reviews aren't worth the silicon, but collectively I think they might be more accurate than GameSpot.

  3. If you really want a good laugh, read some of the reader reviews for games that are on pre-order and haven't been fully reviewed yet.
    You'll see people insulting each other back and forth, never saying anything about the game, and giving it either the highest or lowest score possible. I generally ignore reader reviews because of this. One would think that the people who run the site would do SOMETHING to keep this from skewing the honest reviews people are putting up.
    A simple answer would be to watch for users who abuse the reviews, delete whatever they post, and ban them from posting reviews again.

    Timmay!

  4. In regards to jwb's "wisdom of crowds" comment, that only works if the sample giving reviews is a totally random selection of people who have played the game (see the sketch after these comments). As I said in my post, the sample of people who write reader reviews skews toward people who really enjoyed the game, and it also includes people who haven't played the game at all. Heck, by only counting people who read gaming websites, you've already skewed the sample toward more serious players, leaving out the "wisdom" of the casual fan.

    If you want the wisdom of crowds, why not use a source like GameRankings.com? There you know the reviewers have played the game, you know they're somewhat experienced at playing games, and you get a good sample from across different media sources.

  5. Maybe this is related to "self-selection" as well, but I think there's another factor at work that I'll call "self-justification." When someone plunks down $50 for a game, they want to feel justified in that choice and probably derive some of their self-worth from making the decision.

    Contrast the reader reviews you typically see with the scores found at Gamefly.com (where registering a score is available only to subscribers). There the scores are usually similar to, if not lower than, GameSpot's, and always lower than IGN's.

    For example, let's take Mercenaries for Xbox.

    IGN Reviewer Score: 9.1
    IGN Reader Review: 9.1

    GameSpot Reviewer: 8.8
    GameSpot Reader Review: 9.1

    Gamefly.com Reader Review: 8.2

    After playing Mercenaries for a couple weeks, I'd agree with the Gamefly score.

    The main reason for the more "honest" scores is that the GameFly subscriber has no personal stake in the game, so a low score doesn't cost them anything emotionally.

    --T.

  6. This isn't some task with a definitive right-or-wrong answer, like counting the jelly beans in a jar. The only reason the "wisdom of crowds" would apply is that a larger sampling of the general population more accurately reflects the overall opinion of, well, the general population.

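A footnote on the sampling argument in comment 4: here's a minimal sketch (Python; the quality score, taste noise, and posting probabilities are all assumptions made up for illustration) comparing the average of a self-selected, reader-review-style sample against a plain random sample of the same population of opinions:

```python
import random

random.seed(0)

TRUE_QUALITY = 6.0  # assumed consensus score for a hypothetical game

def opinion():
    """One player's honest rating: the true quality plus personal-taste noise."""
    return min(10.0, max(0.0, random.gauss(TRUE_QUALITY, 2)))

def self_selected_sample(n):
    """Reader-review-style sample: fans of the game post far more often."""
    sample = []
    while len(sample) < n:
        score = opinion()
        post_chance = 0.30 if score >= 7 else 0.05  # assumed posting rates
        if random.random() < post_chance:
            sample.append(score)
    return sample

def random_sample(n):
    """Aggregator-style sample: every drawn opinion gets counted."""
    return [opinion() for _ in range(n)]

for n in (10, 100, 10_000):
    biased = sum(self_selected_sample(n)) / n
    fair = sum(random_sample(n)) / n
    print(f"n={n:>6}  self-selected avg: {biased:.2f}   random avg: {fair:.2f}")
```

On this toy setup the random average converges to the true score as the sample grows, while the self-selected average settles well above it no matter how large n gets. A bigger crowd only sharpens whatever the sampling process actually measures, which is the case for an aggregator like GameRankings.com that draws from reviewers known to have played the game.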