Why beer ratings are great & awards are overrated

This editorial was contributed by Jacob McKean, founder of Modern Times Beer, a brewery-in-planning in San Diego. Previously, he was Communications Specialist at Stone Brewing Co., and he has also worked as a freelance writer for a smattering of beer publications. He’s a beer geek and longtime reader of this site. You can follow the development of Modern Times Beer at moderntimesbeer.com and on Facebook and Twitter.

Something has always bothered me about beer industry gatherings: the inevitable negative comments about beer ratings. Put enough brewers in a room together, crack open a few beers, and without fail, someone will bemoan the unfair, unqualified reviews of their beers on ratings sites.

At least that’s been my experience. I’ve only been in the industry for a few years, but it’s happened enough times that I feel compelled to offer a counterpoint. Personally, I can’t wait for people to start reviewing my beers online.

First, let me offer a big ol’ qualification: beer reviews and ratings websites are far from perfect. There are all kinds of ways in which they are suboptimal: imperfect tasting conditions, geographical bias, a ticker mentality that favors novelty over consistency, raters who use the sites as personal tasting archives, a narrow-minded preference for sledgehammer-type beers, and the list goes on.

But, quite simply, there is no such thing as a perfect ratings system. And more often than not, beer ratings—which are dominated by the sites BeerAdvocate and RateBeer—are a good indication of a beer’s quality.

So please: bring on the ratings of Modern Times beers (as soon as they exist). I want doctors in Ohio, lawyers in Orange County, and accountants in Louisiana to taste my beer in front of a computer and publish their judgments for all the world to read.

It’s not that I’m hungry for someone to unfairly compare my beer to someone else’s, or to thrash my beer based on an ancient, ill-treated sample. It’s that I know, on the whole, the average of all those ratings will eventually be the best, most qualified feedback I will receive about my beer.

I will get to read the judgments of people who, in many cases, have tried a tremendous amount of craft beer, know a great deal about brewing, and also—crucially—have no investment in whether or not I succeed. They will give their unvarnished, aficionado’s opinion—unlike the friends, family, and employees who are constantly saying that all of our beer is brilliant.

Also, let’s take a moment to consider how remarkable it is that anyone is passionate enough about what we make to write lengthy, detailed reviews of it. There is simply no denying that beer ratings websites have played a crucial role in popularizing craft beer and putting our industry on the cusp of mainstream success. A brewery like mine can plausibly enter the industry with a marketing budget of zero in large part due to the efforts of passionate beer geeks who spread the good word about good beer.

So yes, there will be times when beer reviews will be maddening and beer geeks will be frustrating. That comes with the territory. But we should at least acknowledge that the imperfect system of online beer ratings has made us better and stronger.

It’s also worth considering the alternative: wine ratings. Instead of the occasional off-base review on ratings websites, imagine if our livelihoods as brewers were determined by one or two self-proclaimed super tasters whose personal biases dictated the commercial success of any given beer. By comparison, the profoundly democratic nature of beer ratings is a model of fairness.

But what about competitions, you say? Surely medals awarded by learned judges are a better determinant of quality than the drunken musings of 400-pound men sweating over keyboards in their mothers’ fetid basements.

In the 7 years I’ve been involved in craft beer, I have never once looked at the list of winners in a beer competition and thought to myself, “Beyond a handful of exceptions, these are clearly—or even arguably—the best beers in their styles.”

In fact, the winners of beer competitions often seem nearly random. That is because, in all likelihood, they are.

Wine competitions—despite employing rigorously trained judges, controlled environments, and blind tastings—distribute medals essentially at random. The chance of a wine winning a gold medal in any given competition is 9%, no matter how good the wine or how prestigious the competition.

Although a similar study hasn’t been done on beer competitions, there’s no reason to believe the results would be much different. And my experience indicates that they are not.

Every time I’ve gone to the GABF, I’ve raced around to taste the winners after the awards have been announced. Obviously I don’t get to try all of them or form my judgment in the same environment as the judges, but without fail, the exercise just leads me to confusion, not affirmation. Mirroring the results of the wine study, there are always a few clearly deserving winners—which has the unfortunate effect of validating the whole exercise in the minds of most people—but many more that are not.

That is, of course, a subjective opinion, but enough people whose views I trust—many of whom stand to gain a great deal by winning an award—agree with me to make me think that beer competitions are exactly like wine competitions: more or less random.

I may be wrong. Perhaps a rigorous scientific study of beer competitions will show that they are totally unlike wine competitions. But I suspect that they are not.

For that reason, I will not be spending thousands of dollars on entry fees for my own beers, and I will not be waiting breathlessly to hear the results of the GABF judging or the World Beer Cup.

But I am truly very excited to read the first online review of my beer. Even if I disagree with it.


20 thoughts on “Why beer ratings are great & awards are overrated”

  1. Great article about reviews, but I have to disagree about awards. They seem random because it’s a blind tasting, but the winners are almost always excellent beers for their styles, aside from maybe one in ten; I think most BJCP judges get it right. I also think awards help identify new, up-and-coming breweries and give them equal recognition.

  2. “400-pound men sweating over keyboards in their mothers’ fetid basements”

    Come on man, why the hate?

  3. @Jmac: I was being facetious. That line was meant to satirize the way criticism of beer geeks is usually phrased, by dismissing them based on obsessiveness or stylistic issues. The rest of the article is a full-throated defense of beer geeks, so it wouldn’t really make sense for that line to be read seriously.

  4. Totally agree with you. There are great reasons to think that awards are a complete crock, and that’s ignoring that almost none of them pass the “smell test” of making a reasonable list.

  5. Great article and a fun topic to discuss. I have to disagree with you a bit here, though. Both systems have obvious flaws, but I think you understate how big the above-mentioned flaws are with review sites. I love review sites and agree with you that they are good for the biz, but I would much rather win a GABF medal than have a beer crack the BA Top 250. Does this mean that I care about a medal more than the collective voice of consumers? No, because neither BJCP judges nor BAs represent a fraction of 1% of who my consumers are. The reason I would take the medal is because I do believe competitions are by far the most accurate way of determining how well a beer represents its style. As you know, BJCP judges are experts in their field, especially the ones doing final rounds of judging and best-of-shows. Does losing at a competition mean that your beer isn’t worthy? Of course not; it may not fit a style or may have been overlooked. But I believe if you do win, it does prove you are worthy: maybe not the best beer of that style, but you have to be put in the discussion. I believe this because when the GABF awards come out, I’m rarely in disagreement with the beers on the list that I’ve tried. Meaning, I don’t think they are necessarily the best I’ve had, but if I were judging them I would also give them respectable scores. Firestone Walker and other world-class breweries that clean up at these events time and time again show that judges aren’t simply pulling names out of hats.

    Best of luck to your brewery and congrats on the early success. I’ll definitely be stopping by soon when in San Diego.

  6. Jake – I really dig this article. If I weren’t a regular, happily married guy working for a living, had another 120 pounds on me, and my mother had a basement, I’d definitely get started on making it fetid.

    I have similar feelings when I review the winners of GABF and other events, so I decided to start my own award. I have no idea whether an award that judges beers of different styles against each other carries any more or less weight than one that judges beers within the walls of style guidelines at a festival, but I’m excited to see how it pans out.

    I’m looking forward to your launch, trying the beers I’ve been reading about (and drooling over), and typing a slightly biased review that will most likely be lost in a swarm of Untappd check-ins.

  7. To me, craft beer is a form of art. Art is not “scored.” You can talk about how much or how little you personally like an example of art but to give it some rather baseless, yet seemingly definitive score (in the eyes of the beerholder) seems to miss the mark.

    To drink a beer is to assess how it matches up to your personal taste preferences, not to give it a number score meant to represent “good,” “bad,” or “OK.”

    Taste is an individual experience, not a popularity contest. I hope to put more emphasis on this very soon with a new way to look at beer tasting.

  8. If awards are so random, why do breweries like Pizza Port, Firestone Walker, and Devils Backbone consistently win them? It’s your money, spend it how you want, but I see this as more of an appeal to the beer geeks out there to show them how much you respect them so that they will buy your beer. Am I the only one who wants less talk from Modern Times until they have a product on the market?

  9. “@Jmac: I was being facetious.” I figured, just thought I would point it out because sometimes people joke about these things to a point where it’s not clear if they really mean it or not.

    A good point overall that you’re making. I think with the ratings, “a narrow-minded preference for sledgehammer-type beers” is the most important issue. It’s crazy how an excellent Pilsner or an English bitter will get mediocre ratings. But once you account for that, I agree with you that there is a lot of useful information in the ratings. At least a lot more than from the awards.

  10. @Joe Callender: I think your point about art is totally valid. It’s just kind of an “agree to disagree” situation, but I respect your view. Personally, I think there’s still value in scores even if there’s always going to be a subjective component to them.

    Cheers,
    Jacob McKean
    Modern Times Beer

  11. Competitions like the one at the GABF have no credibility for me, simply because they use the BJCP guidelines as gospel (the same goes for the “Cicerone” program).
    As to the beer rating sites, they can be entertaining to read, but that’s about it. I certainly wouldn’t rely on them for any kind of objective assessment of a given brew.

  12. @LuskusDelph, if you are holding competitions, don’t you need to have a standard that you judge everything against? If you aren’t using BJCP guidelines, what would you use? I’m not saying you are wrong, I’m just curious. BJCP guidelines are based on classic styles and provide a standardization that is used to compare like and unlike beers. Aren’t the style guidelines used to reduce the amount of subjectivity? Clearly you can’t get rid of it all.

    I’m really not sure what the alternative could be. Would you open it up to more types of specialty categories? If you didn’t have a standard, how would you effectively judge great beers against each other across styles – one being a Kolsch, one being a DIPA, and one being an Imperial Bourbon Stout? These beers are so different from each other that it would be hard to compare them strictly from a palate standpoint.

    Perhaps there is a creative solution…

  13. I think we should all take note that BJCP judges are used for homebrew competitions, not for commercial competitions. GABF and WBC use industry professionals, not beer geeks, to decide the winners. Also, both comps use style guidelines and categories that differ from the BJCP’s.

  14. I have a question... Since when did a brewery in planning, soon to open, slated, etc... whatever... get the same press and chance at an “insightful” op-ed as a seasoned craft beer company? I just find it ridiculous that “in planners” have had months if not a year or more of media and press in the last few years, almost like politics and campaigning; it’s lame and shameless. Here is an idea: bust your ass... pipe down... get all your ducks in a row... keep your cards close... do some very calculated pre-press a few weeks before opening and blow everyone away. It’s business 101. People get the shits of listening to someone pontificate only to bump out their opening date... then bump it out again... then again. And it’s not just this guy, it’s every in-planner from East to West. Here’s a news flash: this industry is tough, and it will tear you up quick. I hope all of these newbies are ready for the challenges at hand.

  15. I think you’ll be singing a different tune, Jacob, when your (insert style here) gets shitty reviews because people are judging & “scoring” it while readily admitting they don’t like (insert same style here) beers. Those reviews affect your overall “score”; enough silly, crappy reviews will turn legit consumers away from your product without even tasting it. Good luck with your start-up; hope the reviews are good if your customer base puts stock in them... I certainly don’t, & look forward to forming my own opinion about your beers.

  16. Pingback: Musings about the World Beer Cup | eat.drink.give.go
