OPINION: STOP RATING DEMOS, IT'S UNFAIR...TO COMPLETED GAMES
Completed Games Have More Opportunity To Make Mistakes
- Max McGee
- 11/21/2014 09:06 PM
- 22559 views
So, today I had a minor revelation in the shower, as I often do. If I play a one-hour-long demo, and it's really good, I am likely to review it with a high rating based on the content I've seen.
If, on the other hand, I play an eight-hour-long completed game, that game has seven more hours of gameplay in which to screw up. Seven more hours of "rope" with which to hang itself. Seven more hours of content in which to find fault with the gameplay, story, characters, or graphics. Seven more hours to make a design decision that will piss me off. And as a result of its expanded content and additional opportunity to make mistakes, it will almost always get a lower score than an equivalent demo.
And that is not as it should be.
Surely, by now it is long-codified conventional wisdom that a game must put its best foot forward. Games in this community are typically frontloaded with their best and most polished content, because of the truism that if your game doesn't hook players in the first fifteen minutes, it won't get a chance to at all. The inverse of this is logically obvious. Maintaining a high level of quality and polish during a feature-length game is much, much harder than maintaining it for a short demo.
How does one counteract this obvious inequity? You could add one full star for completed games, or subtract one full star for game demos, to try to offset the imbalance. But doing so would clearly skew the entire numerical component of the review rubric.
A better solution is not to allow starred reviews for incomplete games (demos) at all.
Nota Bene: Obviously, until this point, I have been a part of this problem, because I have attached starred reviews to numerous demos. I have no defense for this except to say that I had not come to the conclusion presented here--which I do believe is the correct one--until just now.
"But Max," a reasonable person interrupts, "creators of games need feedback on their demos. LOTS of feedback!" And of course, being a creator of games myself, I 100% agree. And here we arrive at an opportunity to restate the crux of the issue:
Game Reviews on this site uneasily serve two masters. They exist simultaneously for two separate purposes:
1) To publicly evaluate the merits of a game and recommend it (or not) to an audience of potential players. This function is served primarily by the review's rating, and then by its text.
2) To provide feedback, critique, evaluation and suggestions to the game's developer. This function is served primarily by the review's text, and then by its rating.
Demos require the second function of a review. I would argue that they do NOT require the first function of reviews (evaluating a game for the benefit of potential players). And therefore, they do not require STARRED reviews.
The reason I would argue this is that even without starred reviews, potential players have ample means to evaluate a demo they are considering playing. Consider:
- Demos are shorter than full games, and are less of a time investment. Evaluation is therefore de facto less of a concern than for a complete game.
- The quality of a demo can be evaluated by its screenshots and its game description.
- The quality of a demo can be further evaluated by its user comments.
- The demo itself is only a tool for evaluating (and advertising) the completed game, not a product that requires evaluation in its own right.
- Finally, "Not Rated" reviews provide both an avenue of direct feedback to the game developer and yet another option for potential players to use in evaluating a demo before deciding to play it.
In conclusion, going forward, reviews should be allowed for game demos, but should not be allowed to have official star ratings that are aggregated and tracked on the site. Official star ratings should be reserved for completed games.
Reviewers should ABSOLUTELY be allowed to informally include star ratings for game demos in the text of their review if they desire. The stars can even be inserted as an image in the body of the text. But the review score should not be officially aggregated or tracked by the site's mechanics for reviews of incomplete games.
Reviews--with or without informal star ratings--of incomplete games (demos) should be labeled and handled as Critiques.
The Problem Of Implementation
Toggling off the option to attach an official starred rating to future reviews of game demos should be easy. But what about the thousands of starred reviews of demos already on the site? Those present a greater challenge. I am not technically skilled and have no real understanding of the site's codebase, but it seems to me it should be possible to use a script to find every review with a star rating attached to a game whose status is anything other than Complete, and change all of those ratings to Not Rated. The text of the reviews would not need to be altered at all, which is a good thing, because that could only be done manually. After all, including an informal rating or score in the body of a critique is fine, as long as the score is not tracked on the site.
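Again, I have no idea what the site's actual schema or stack looks like, but purely as an illustration, a migration along these lines should only be a few lines of code. The table and column names below (games, reviews, status, rating) are my own invented placeholders, with a NULL rating standing in for Not Rated:

```python
# Hypothetical one-off migration sketch -- NOT the site's real schema or code.
# Assumed placeholders: a "games" table with "id" and "status" columns, and a
# "reviews" table with "game_id" and "rating" columns, where NULL = Not Rated.
import sqlite3

def unrate_reviews_of_incomplete_games(db_path: str) -> int:
    conn = sqlite3.connect(db_path)
    try:
        with conn:  # wraps the UPDATE in a transaction
            cursor = conn.execute(
                """
                UPDATE reviews
                   SET rating = NULL
                 WHERE game_id IN (SELECT id FROM games
                                    WHERE status <> 'Complete')
                   AND rating IS NOT NULL
                """
            )
            return cursor.rowcount  # how many reviews were switched to Not Rated
    finally:
        conn.close()

if __name__ == "__main__":
    # Run something like this against a backup copy first.
    print(unrate_reviews_of_incomplete_games("rmn_backup.db"))
```

Note that nothing in the review text is touched; only the tracked rating changes, which is exactly the point.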
The sitewide change that would happen if this were implemented is that the official score of every single game without a status of Complete would change to Not Rated. Which actually makes perfect sense. IGN, Kotaku, GiantBomb, GameInformer, and so on don't rate games before they are done, so this seems perfectly logical.
One more change would need to be made to the site's infrastructure to accommodate this, and that is in the Search Filter on the Games page tab. The Minimum Rating dropdown would have to be hidden unless Completed was selected from the Status dropdown.
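In code terms that rule is tiny; it amounts to something like the following sketch (the function and field names are mine, not the site's):

```python
# Toy sketch of the proposed search-filter rule; names are placeholders.
def show_minimum_rating_dropdown(selected_status: str) -> bool:
    # The Minimum Rating filter is only meaningful for completed games,
    # since everything else would now be Not Rated.
    return selected_status == "Completed"
```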
Now... sadly, I do not expect there is a terribly high chance of this positive change actually being implemented. That goes with the territory of writing an op-ed piece in any environment: your opinion does not become law just because you've put it in writing. And beyond the various and sundry objections to this suggestion that I can't yet anticipate, the mere idea of making any major change to the site at all can be a difficult one to gain traction for. Change is inherently inconvenient and difficult and sometimes even scary. I am the guy who freaked out over signatures being removed six years ago, so I know.
But I look forward to the discussion this will generate, and I embrace the possibility of being pleasantly surprised by the way this idea is received.
Posts
As an afterthought:
I guess one way to get around it would be to further divide the first purpose you mention into two different aims: evaluation and diffusion, the first being less relevant for demos while the second is actually important for feedback (and hype building, I guess). But even then, it's not completely obvious.
"Pure" product evaluation is well separated from feedback for AAA games because there is no overlap between consumers and contributors, and few consumers are interested in versions < 1.0. It's less convincing for our community, or even for indie games in general (and open software too).
Another equivalent could be found in episodic material, including games that are released that way. Reviews for episodes in TV series cannot work exactly like movie reviews, especially as the TV series may very well be discontinued by producers before their intended conclusion, and it gets worse when the feedback loop between the making and the reviewing is even stronger, like in our format.
author=Hasvers
they [unstarred reviews] are good enough to find good games too and stars are entirely irrelevant, complete game or not.
For the most part, this is correct.
Both you and I probably belong to the category of people who actually read reviews, which is not the entire population. That being said, stars do help you spot really good or bad games, and the degree of consensus about them, at a single glance. All the kinds of unfairness that may result from a scoring system can plausibly be avoided altogether by a very simple measure: appropriate filtering of games (by genre, completion, length, good and bad traits...) so that those very different items are not actually in competition for our attention.
Overall, I could get behind an option for each game (whatever its state of completion) to make its score private or public, though I really think we give such a high weight to scores only because of years of Pavlovian conditioning by a terrible school system :P
Oh, don't get me wrong. Scores, in general, bother me. Look at one of my own games and you have two people who thought it was stellar and amazing and one person who thought it was crap. The third person also thought that the critically acclaimed Grave Spirit and the wildly popular Legacies of Dondoran deserved 1.5 Star reviews, so clearly my vicious panning at least put me in august company.
The way review averaging works means that, because none of the scores are weighted, the game winds up with an aggregated score that is 'slightly above average'. I should be happy enough with this, but... the truth is, two of the people who bothered to review the game thinking it was great and the one other person who bothered to review it thinking it was crap doesn't mean the game is, in some objective sense, slightly above average. It doesn't mean anything except that two of the people who bothered to review the game thought it was great and the other one thought it was crap. Basically, scores are nothing but a useful evaluative shorthand. They're not a "true measure of a game's worth," whatever that is. I don't think scores are problematic in and of themselves; I think it's the weight that other factors, like the site's structural interface/design and our own psychological design, give to these scores that creates the problem.
An interesting idea I just had: what if, instead of using an average review score when there is more than one starred review, the site staff picked one of the reviews to be the "featured review" for that game and displayed its star score? This would obviously set one person's opinion above all others, and the staff would be making a judgement call based on the quality of the review. I don't know; this opens a whole can of worms of its own, and I thought of about half a dozen problems with it just writing this paragraph, so I'm not actually advocating for it, I'm just thinking out loud.
I don't think getting rid of stars entirely is feasible, because honestly, when you search the site for a game to play, as a lot of our 'silent user majority' do, one of the parameters you want to be able to filter on is 'overall goodness of game'. And let's say you arbitrarily set the minimum rating you were interested in to Four Stars (****): even if you were looking for a science fiction RPG, you'd automatically miss out on my game, which one person thought was a 4.5 and another thought was a 5, just because one other person thought it was a 1.5. So yeah, the system as it exists is not awesome, but abolishing stars entirely is not a practical solution, because end users wanting to search games by rating is a reasonable request. Hmm...
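Just to make the arithmetic concrete (these are only the made-up numbers from my example above, not how the site actually computes or filters anything):

```python
# Toy illustration of unweighted averaging and a minimum-rating filter.
# The scores and cutoff come from the example above; the real site code
# presumably looks nothing like this.
scores = [4.5, 5.0, 1.5]                 # two raves, one pan

average = sum(scores) / len(scores)
print(round(average, 2))                 # 3.67 -- "slightly above average"

minimum_rating = 4.0                     # a player filtering for **** and up
print(average >= minimum_rating)         # False -- the game never appears
```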
I think the idea of a game creator being able to opt out of displaying its score is fascinating...but I do worry about how the public would perceive games with 'hidden' scores, and this idea opens up its own whole host of issues.
But all that is a bigger and thornier issue than the question of whether or not to disable scores just for reviews of incomplete games specifically. I do have specific arguments I want to make here--namely an argument that unstarred reviews are "good enough" for end user evaluation of demos but not complete games--but it's going to have to wait until I am substantially less tired.
There's also the issue that different users are going to offer wildly varying levels of criticism or tolerance.
There are people who can play through FFX without being annoyed by the UI or the voice acting, because they are either unfazed by such things or because they derive enough enjoyment elsewhere (cough blitzball cough) to overlook the game's flaws. Scores are probably even more flawed, in that regard.
Perhaps reviews should be given a 200-character tagline, much like game profiles have.
author=LouisCyphre
There are people who can play through FFX without being annoyed by the UI or the voice acting, because they are either unfazed by such things or because they derive enough enjoyment elsewhere (cough blitzball cough) to overlook the game's flaws. Scores are probably even more flawed, in that regard.
Maybe that's because the inclusion of blitzball was a great design decision?
My solution is that when I look for games, I look at all of that game's reviews and who they are written by. If those same reviewers share a similar opinion of other games I liked, or if they are devs of games I enjoyed, then I will listen. Otherwise I'll read the review and make a judgment on whether the review makes good points or not. E.g. at this point, I will listen to anything that calunio says because he has very similar taste to me, and his writing is impeccable and enjoyable. Nhubi has extremely well-written reviews, even if I don't always agree with her. She tends to be very harsh on quite a few games, but the opposition is always very well explained and delivered. Addit gives very well-structured reviews that I don't always agree with either, but you can always tell that he has put a lot of effort into making them great.
I myself try to be a very fair reviewer, even when it comes to scoring demos. Often the only demos I score are the ones that people request me to review. If I feel like I am unequipped to give criticism, I will give an N/A score, and then re-review it when the game is complete.
What I'm trying to say is, rather than make a blanket rule like "no scores for incomplete games", I think it would be nice to give reviewers more freedom rather than less. I think that fostering a positive review culture would be a solution. More comments on those reviews saying they're great, people! And when reviewers have been unnecessarily harsh, try to say so without exacerbating things.
I don't know, I'd like to think of it as a really fluid process that doesn't need these blanket rules suppressing critical freedom. Next thing, people will be calling to outlaw negative criticism. Sometimes you need the stinging flies to keep the horse alive. Otherwise it's just a circle jerk of positive comments.
author=CashmereCat
Maybe that's because the inclusion of blitzball was a great design decision?
That's exactly why.
I'd play an FFX-0 that starred Jecht, Braska, and young Auron. SQEX, fund it.
author=CashmereCat
Nhubi has extremely well-written reviews, even if I don't always agree with her. She tends to be very harsh on quite a few games, but the opposition is always very well explained and delivered.
Nhubi is a weird case, man. I haven't done like an extensive reading, but I am pretty sure I agree with her relative ranking of almost all games...I'd just give 0.5-1.5 more stars to just about ALL of them compared to her ratings. So I've kind of like mentally added one star to all of her ratings to make her reviews more useful for me, lol.
I think that goes with the general idea that having a slightly harsh but fair ranking system helps people to differentiate between the good and the bad. I like the idea of a harsh reviewer, because it means that if you got a high score, it means that much more. E.g. if nhubi gives a game a 4.5, I'll be like, Holy Schmores, this game must be fantastic, I gotta try this right away. Whereas if some dude gives 4s and 5s to every game (even I'm a little guilty of this), then his rating... means less, somehow. As a reviewer, I am often scared to give a game a low rating. I only do it if I feel it's warranted. But even then I feel bad, just because someone might have put years of effort into it. Yet at the same time I want to tell the truth and be unbiased. It's a tough battle.
Actually looking through all of her reviews the other day I noticed that nhubi had given almost NO games a rating of 4+, possibly actually none (I don't recall at the moment). I remember thinking... "MAN, what is she LOOKING for? doesn't she love ANYTHING?"
oh well. /tangent
game demos that receive sub-par reviews (aka a 3.5 or less) are much less likely to get attention because of the pretentious twat who decided it would be appropriate to rate an unfinished product, and people won't put time into mediocre material. and it's not like most users who review games on this site are active enough (or active long enough) to see the game finished and be willing to re-review it.
that sort of risk discourages demo releases imo, which makes the feedback process needlessly difficult to go through
author=Adon237
that sort of risk discourages demo releases imo, which makes the feedback process needlessly difficult to go through
Thank you for putting what I wanted to say into words.
@Adon237 What if the dev asks for a review? Or what if the reviewer states that they will reevaluate the score if a new demo is released? All they need to do is PM the reviewer.
Not only can ANYONE review a game, given they pass the submission guidelines, but not all reviewers stay active or ever check this website again after a certain amount of time, and many reviewers aren't going to be that flexible... that doesn't make the situation any better.
If the developer asks for a review with a star rating, they're going to have to accept it because they brought that upon themselves. I'm not saying games shouldn't be reviewed, but perhaps giving the developer the option to stop reviews from happening until the game is complete or whatever would be a good idea? I don't know what would work best though.
Adon, you are assuming that people who don't even read reviews before ignoring a 3.5-star (or 2.5-star, for that matter) game would give relevant feedback on a demo. I'm not really convinced by this.
It's generally easy enough, from the title and the first few lines of a review, to know whether A) the demo might be worth trying after all (because it caters to my tastes if not to the reviewer's) or B) the reviewer is just being pretentious or hateful.
This minimal effort is the difference between random visitors who are pure consumers and care only about stars, and community members who may actually be helpful critics. Apart from some hurt feelings (which are a factor, sure), I have yet to see how the system is so broken.
I don't know what kind of person is automatically ignoring all games with reviews of 3.5 Stars or less but I'm gonna go with an asshole? Yeah, let's go with an asshole. Fuck this person.
author=Max McGee
I don't know what kind of person is automatically ignoring all games with reviews of 3.5 Stars or less but I'm gonna go with an asshole? Yeah, let's go with an asshole. Fuck this person.
This is pretty dismissive. This hypothetical person may not look too closely into how the site is built or how it works; they just want to play some games and not risk wasting their time. Much as a movie buff doesn't use Rotten Tomatoes as a guide and instead searches for movies by more advanced criteria, not everyone can simply do their part and become a useful player.
I guess that's kind of a valid point...it still pisses me off that anyone might reject games with such a broad brush, but you make a good point.
Can I be completely honest here and say that stars matter to me? I place great importance on a rating scale, and when I see a game with 2.5 or less, I automatically think "it's probably not that good of a game", whereas if I see one with 4 stars or more, I think "this will probably be a good game". I say this because it's generally true: people will, in general, give more stars to a game that's better. It doesn't always follow this rule, but generally, stars have been a good indicator in leading me to good/bad games, along with things like the quality of the gamepage, which usually indicates how much effort went into it, and the kinds of things to expect. Stars matter, but I think people make a judgement call. For example, if a game has 2.5 stars but the gamepage looks amazing, I might still try it out, because I might just think that that reviewer doesn't like the same things I like.