The Death of Honest Review Scores
My parents need a new security suite for their computer. They’re reasonably computer savvy but still asked for my help in picking one. Since I’m a Linux/Mac kind of guy, I decided to do a few searches to see what was new and ran across PC World’s chart of Internet security suites.
PC World is a magazine I have a lot of respect for. They do a lot of great work in the field of reviewing computer products. But I noticed something when I looked at the bottom of the chart. Trend Micro got the lowest score on the list and their “Bottom Line” review reads as follows:
“Trend Micro’s latest suite fails at the most basic task of detecting and blocking malicious software. Not recommended.”
Ouch, seems pretty scathing to me. But then look at the score. It’s a 74/100. On a regular grading scale that would be a middle “C”. On a 1-10 system it would be a 7.4, well above an “average” of 5.
How the Hell do you say something “fails at the most basic task” and then give it an average or even good score? That makes no sense. The full review isn’t much kinder, saying:
“Trend Micro’s suite has some good points, but there’s no getting around the fact that Internet Security Pro 2009 fails at detecting malicious software, and therefore fails as a security program. We cannot recommend buying it.”
Clearly something is wrong with review scores when a 74 is a total failure of a program. It’s time we woke up a little bit and realized that review scores on the Web are garbage and reliance on them is dangerous, especially when dealing with security software.
However, this isn’t a new problem; it’s been going on for quite some time.
A Bit of History
I’m not the first to have noticed this strange problem. About two years ago, Destructoid made a similar observation as it relates to video games and lamented how many assume anything under a “5” might as well be a “1”.
Our brains have been trained to assume that a review of a five is a slam, an insult, even though, mathematically, it should be the most average, least offensive review possible. If I give a game/application/movie/product/etc. a review of 5/10, what I should be saying is that it is mediocre, middle of the pack, but what I am really saying, at least in most people’s minds, is that it stinks, it’s lousy.
This is part of the reason why RottenTomatoes, according to Destructoid, had to raise its “fresh” rating threshold on video games from 60% to 70%: too many bad games were scoring well enough to count as fresh.
It’s obvious that online reviews are skewed toward the positive, the likely exception being Web hosting (have you SEEN those reviews?), but the real question is why?
As a person who regularly does reviews, I lament and loathe having to give a numerical score. First, nothing is more frustrating than having to condense a 1,000-word review into a single number. Second, I find myself giving average scores with a very high level of frequency. For example, on Kongregate, I find myself giving a review of 3/5 about half of the time, with 2s and 4s making up most of the rest (a slight lean to 4, though, as I do tend to play games that are already highly reviewed).
So, rather than play the “scoring” game, I usually skip such simplified reviewing. However, the few times I have assigned scores, I have noticed a few things:
- It is Easy to Find Something You Like: Very few things are so bad that you can’t find at least one thing that you like. As humans, when thinking back on something (and all reviews really are reflections) it’s easier to think about the good than the bad. As such, even when we didn’t like something, we don’t want to discredit the good that it did bring.
- Positive Reviews Get More Attention: If you want to get a lot of links and traffic to your post, you do much better with positive reviews. Sure, snarky over-the-top negative reviews get a lot of attention as well, but positive reviews that make people want to try the product being rated generate the most attention overall. Mediocre reviews, no matter how honest, get overlooked.
- We Self-Adjust: Subconsciously, we all know that online reviews are skewed so we tend to skew our own to match. Whether it is an attempt to make sure we are not misunderstood or just mimicking what we see elsewhere, we do it. It’s that simple and it creates the cycle of destruction that we see.
However, why we are where we are is not important. The more urgent question is what we do about it. After all, if we can’t stop the upward spiral of review scores, soon we’ll have nowhere left to go.
Fixing the Problem
When it comes to dealing with this issue, there are three solutions that I see:
- Forgo Any Kind of Rating: This one is simple: you write or produce your review, say what’s good and what’s bad, and leave it there. It’s a system adopted by Yahtzee of Zero Punctuation fame and it works well for many products. The only people hurt are those who skim reviews looking for the final verdict. But there’s nothing stopping you from creating a short “Conclusions” section, like I seem to do on every stupid thing I write…
- Use Non-Numeric Ratings: The whole numeric rating system doesn’t make a great deal of sense anyway. What am I supposed to do with this information? If a product scores below a five, should I forget it? Five to seven, get it on sale? Seven to nine, buy it? Nine to ten, step over my own mother to get it? Why not do like ScrewAttack’s VGR series and make a real suggestion? They condense every review to “F’ It”, “Rent It” or “Buy It”. That’s useful information. Sure, with no standard system you can’t take an average, but the averages we have today are broken, so that’s no real loss.
- Reset the Scale: Do a manual reboot of the system and make sure that reviews are centered on the actual middle of whatever range is chosen. It’s not easy, and it will make comparisons with old reviews impossible, but it’s necessary from time to time to beat the scores back down before they make like Spinal Tap and take it to 11.
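To make the third option concrete, here is a minimal sketch of what “resetting the scale” could mean in practice: shift a batch of inflated scores so their average lands back on the scale’s true midpoint. The `recenter` helper and the sample scores are my own hypothetical illustration, not any real publication’s method:

```python
def recenter(scores, low=0, high=100):
    """Shift scores so their mean sits at the scale's midpoint, clamped to the range."""
    midpoint = (low + high) / 2
    mean = sum(scores) / len(scores)
    shift = midpoint - mean
    return [min(high, max(low, s + shift)) for s in scores]

# Hypothetical batch of inflated scores where "everything is above 70"
inflated = [74, 80, 85, 90, 95]
print(recenter(inflated))
```

After recentering, the batch averages 50, so a 74 (the lowest of the lot) finally reads like the below-average score the reviewer intended. A simple shift like this preserves the gaps between products; a real reset would also have to decide how to handle scores pushed past the ends of the scale.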
Which of these will work? Probably none. No one seems convinced that this is a problem worth talking about, but if we continue at this pace, in a few years anything below an eight will be deemed “bad”. At that point, all credibility will have gone out the window…
The bottom line is this: review scores are hopelessly inflated for most products on the Web and have reached a point where they are meaningless. If you want to know whether a product rocks or sucks, the best thing you can do is actually read the review. Most reviewers will tell you even if their scores don’t.
Sure, reading something can be hard when you have a pretty, shiny, simple score to look at, but it really is necessary. Even if the scale is fair, there’s no way to condense a complicated opinion to just a single numerical value.
Trust me, I’ve tried.
The reviewer took the time to write out his or her thoughts and experiences, the least we can do is take the few minutes required to read it.
It at least beats blindly complaining about how wrong the score was.