OPINION: STOP RATING DEMOS, IT'S UNFAIR...TO COMPLETED GAMES
Completed Games Have More Opportunity To Make Mistakes
- Max McGee
- 11/21/2014 09:06 PM
- 22575 views
So, today I had a minor revelation in the shower, as I often do. If I play a one-hour demo and it's really good, I am likely to give it a high rating based on the content I've seen.
If, on the other hand, I play an eight-hour completed game, that game has seven more hours of gameplay in which to screw up. Seven more hours of "rope" with which to hang itself. Seven more hours of content in which I can find fault with the gameplay, story, characters, or graphics. Seven more hours in which to make a design decision that will piss me off. And as a result of that expanded content and additional opportunity to make mistakes, it will almost always get a lower score than an equivalent demo.
And that is not as it should be.
Surely, by now it is long-codified conventional wisdom that a game must put its best foot forward. Games in this community are typically frontloaded with their best and most polished content, because of the truism that if your game doesn't hook players in the first fifteen minutes, it won't get a chance to hook them at all. The corollary is obvious: maintaining a high level of quality and polish across a feature-length game is much, much harder than maintaining it for a short demo.
How does one counteract this inequity? You could add one full star to completed games, or subtract one full star from game demos, to try to offset the imbalance. But doing so would clearly skew the entire numerical component of the review rubric.
A better solution is not to allow starred reviews for incomplete games (demos) at all.
Nota Bene: Until this point, I have obviously been a part of this problem, because I have attached starred reviews to numerous demos. I have no defense except to say that I had not reached the conclusion presented here--which I do believe is the correct one--until just now.
"But Max," a reasonable person interrupts, "creators of games need feedback on their demos. LOTS of feedback!" And of course, being a creator of games myself, I 100% agree. And here we arrive at an opportunity to restate the crux of the issue:
Game Reviews on this site uneasily serve two masters. They exist simultaneously for two separate purposes:
1) To publicly evaluate the merits of a game and recommend it (or not) to an audience of potential players. This function is served primarily by the review's rating, and then by its text.
2) To provide feedback, critique, evaluation and suggestions to the game's developer. This function is served primarily by the review's text, and then by its rating.
Demos require the second function of a review. I would argue that they do NOT require the first function of reviews (evaluating a game for the benefit of potential players). And therefore, they do not require STARRED reviews.
The reason I would argue this is that even without starred reviews, potential players have ample means to evaluate a demo they are considering playing. Consider:
- Demos are shorter than full games, and are less of a time investment. Evaluation is therefore de facto less of a concern than for a complete game.
- The quality of a demo can be evaluated by its screenshots and its game description.
- The quality of a demo can be further evaluated by its user comments.
- The demo itself is only an evaluation tool and an advertisement for the completed game, not a product that requires evaluation in its own right.
- Finally, "Not Rated" reviews provide both an avenue of direct feedback to the game developer and yet another option for potential players to use in evaluating a demo before deciding to play it.
In conclusion, going forward, reviews should be allowed for game demos, but should not be allowed to have official star ratings that are aggregated and tracked on the site. Official star ratings should be reserved for completed games.
Reviewers should ABSOLUTELY be allowed to informally include star ratings for game demos in the text of their review if they desire. The stars can even be inserted as an image in the body of the text. But for incomplete games, the score should not be officially aggregated or tracked by the site's review mechanics.
Reviews of incomplete games (demos)--with or without informal star ratings--should be labeled and handled as Critiques.
The Problem Of Implementation
Toggling off the option to attach an official starred rating to future reviews of game demos should be easy. But what about the thousands of starred reviews of demos already on the site? Those present a greater challenge. I am not technically skilled and I have no real understanding of the site's codebase, but it seems to me it should be possible to use a script to find every review with a star rating attached to a game with a status other than Complete and change all of those ratings to Not Rated. The text of the reviews would not need to be altered at all, which is a good thing, because that could only be done manually. After all, including an informal rating or score in the body of a critique is fine, as long as the score is not tracked by the site.
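For what it's worth, here is the kind of script I am imagining, written in Python against a made-up schema. The reviews and games table names, the rating and status columns, and the database file are all guesses on my part, not the site's real code:

import sqlite3  # assuming, purely for illustration, a SQLite-style database

# Hypothetical schema: a reviews table with a rating column and a game_id,
# and a games table with an id and a status column.
conn = sqlite3.connect("site.db")  # placeholder database file
cur = conn.cursor()

# Clear the star rating on every review attached to a game whose status is
# anything other than Complete. The review text itself is left untouched.
cur.execute("""
    UPDATE reviews
       SET rating = NULL  -- NULL stands in for "Not Rated" here
     WHERE game_id IN (SELECT id FROM games WHERE status != 'Complete')
""")
print(f"Cleared ratings on {cur.rowcount} demo reviews.")

conn.commit()
conn.close()

Someone who actually knows the codebase would know the real tables and the right way to represent Not Rated, but the idea is the same: clear the stored rating and leave the text alone.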
The sitewide change, if this were implemented, is that the official score of every single game without a status of Complete would change to Not Rated. Which actually makes perfect sense: IGN, Kotaku, GiantBomb, GameInformer and so on don't rate games before they are finished, so this is perfectly logical.
One more change would need to be made to the site's infrastructure to accommodate this, and that is in the Search Filter on the Games page tab. The Minimum Rating dropdown would have to be hidden unless Completed was selected from the Status dropdown.
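The same rule could also be enforced on the back end when a search is run. Again, this is only a sketch with invented names (build_game_filters, min_rating, and the status strings are not the site's real identifiers): simply ignore any Minimum Rating value unless the search is restricted to Completed games.

from typing import Optional

def build_game_filters(status: Optional[str], min_rating: Optional[float]) -> dict:
    """Only honor a Minimum Rating filter when the search targets Completed games."""
    filters = {}
    if status:
        filters["status"] = status
    # Every non-complete game now reads as Not Rated, so a minimum-rating
    # filter only makes sense when the status filter is Completed.
    if min_rating is not None and status == "Completed":
        filters["min_rating"] = min_rating
    return filters

print(build_game_filters("Demo", 4.0))       # {'status': 'Demo'}
print(build_game_filters("Completed", 4.0))  # {'status': 'Completed', 'min_rating': 4.0}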
Now ... sadly, I do not expect the chances of this positive change actually being implemented to be terribly high. That goes with the territory of writing an op-ed piece in any environment: your opinion does not become law just because you've put it in writing. And beyond the various and sundry objections to this suggestion that I can't yet anticipate, the mere idea of making any major change to the site can be a difficult one to gain traction for. Change is inherently inconvenient, difficult, and sometimes even scary. I am the guy who freaked out over signatures being removed six years ago, so I know.
But I look forward to the discussion this will generate, and I embrace the possibility of being pleasantly surprised by the way this idea is received.
Posts
I agree. For people like you and me, stars are going to be a factor - but not the only one, which is what really counts in the end. If the title, screenshot, and snippet are interesting enough, you'll take a further look, see the gamepage, perhaps browse the reviews to see why there's such a contrast between the score and the apparent value. People who decide based on stars *only* are probably the same people who would never open a novel in a bookstore just because it looked interesting.
Anyway, people are angry at rating systems all over the internet, and yet if you allow enough people to cast a vote rather than trying to limit it, scores can be quite robust (even to unfair attacks) across various systems, especially in terms of relative ranking.
I don't think anyone would go for this, but I think I'd like a system on RMN where a score wasn't displayed for a game until it had at least, say, five reviews. (Individual reviews would have their scores displayed normally but the game as a whole would show as Not Yet Rated until total reviews = N, where N is 4 or 5 or whatever.)
Unfortunately, far too few people review games for this to be practical.