This editorial was contributed by Jacob McKean, founder of Modern Times Beer, a brewery-in-planning in San Diego. Previously, he was Communications Specialist at Stone Brewing Co., and he has also worked as a freelance writer for a smattering of beer publications. He’s a beer geek and longtime reader of this site. You can follow the development of Modern Times Beer at moderntimesbeer.com and on Facebook and Twitter.
Something has always bothered me about beer industry gatherings: the inevitable negative comments about beer ratings. Put enough brewers in a room together, crack open a few beers, and without fail, someone will bemoan the unfair, unqualified reviews of their beers on ratings sites.
At least that’s been my experience. I’ve only been in the industry for a few years, but it’s happened enough times that I feel compelled to offer a counterpoint. Personally, I can’t wait for people to start reviewing my beers online.
First, let me offer a big ol’ qualification: beer reviews and ratings websites are far from perfect. There are all kinds of ways in which they are suboptimal: imperfect tasting conditions, geographical bias, a ticker mentality that favors novelty over consistency, raters who use the sites as personal tasting archives, a narrow-minded preference for sledgehammer-type beers, and the list goes on.
But, quite simply, there is no such thing as a perfect ratings system. And beer ratings—which are dominated by the sites BeerAdvocate and RateBeer—are, more often than not, a good indication of a beer’s quality.
So please: bring on the ratings of Modern Times beers (as soon as they exist). I want doctors in Ohio, lawyers in Orange County, and accountants in Louisiana to taste my beer in front of a computer and publish their judgments for all the world to read.
It’s not that I’m hungry for someone to unfairly compare my beer to someone else’s, or to thrash my beer based on an ancient, ill-treated sample. It’s that I know, on the whole, the average of all those ratings will eventually be the best, most qualified feedback I will receive about my beer.
I will get to read the judgments of people who, in many cases, have tried a tremendous amount of craft beer, know a great deal about brewing, and also—crucially—have no investment in whether or not I succeed. They will give their unvarnished, aficionado’s opinion—unlike the friends, family, and employees who are constantly saying that all of our beer is brilliant.
Also, let’s take a moment to consider how remarkable it is that anyone is passionate enough about what we make to write lengthy, detailed reviews of it. There is simply no denying that beer ratings websites have played a crucial role in popularizing craft beer and putting our industry on the cusp of mainstream success. A brewery like mine can plausibly enter the industry with a marketing budget of zero in large part due to the efforts of passionate beer geeks who spread the good word about good beer.
So yes, there will be times when beer reviews will be maddening and beer geeks will be frustrating. That comes with the territory. But we should at least acknowledge that the imperfect system of online beer ratings has made us better and stronger.
It’s also worth considering the alternative: wine ratings. Instead of the occasional off-base review on ratings websites, imagine if our livelihoods as brewers were determined by one or two self-proclaimed super tasters whose personal biases dictated the commercial success of any given beer. By comparison, the profoundly democratic nature of beer ratings is a model of fairness.
But what about competitions, you say? Surely medals awarded by learned judges are a better determinant of quality than the drunken musings of 400-pound men sweating over keyboards in their mothers’ fetid basements.
In the 7 years I’ve been involved in craft beer, I have never once looked at the list of winners in a beer competition and thought to myself, “These are clearly, or even arguably, the best beers in their styles.” At best, that holds for a few of them.
In fact, the winners of beer competitions often seem nearly random. That is because, in all likelihood, they are.
Wine competitions—despite employing rigorously trained judges, controlled environments, and blind tastings—distribute medals essentially at random. The chance of a wine winning a gold medal in any given competition is 9%, no matter how good the wine or how prestigious the competition.
Although a similar study hasn’t been done on beer competitions, there’s no reason to believe the results would be much different. And my experience indicates that they are not.
Every time I’ve gone to the GABF, I’ve raced around to taste the winners after the awards were announced. Obviously I don’t get to try all of them, or to form my judgment in the same environment as the judges, but without fail, the exercise leads me to confusion, not affirmation. Mirroring the results of the wine study, there are always a few clearly deserving winners—which has the unfortunate effect of validating the whole exercise in the minds of most people—but many that are not.
That is, of course, a subjective opinion, but enough people whose views I trust—many of whom stand to gain a great deal by winning an award—agree with me to make me think that beer competitions are exactly like wine competitions: more or less random.
I may be wrong. Perhaps a rigorous scientific study of beer competitions will show that they are totally unlike wine competitions. But I suspect it would show no such thing.
For that reason, I will not be spending thousands of dollars on entry fees for my own beers, and I will not be waiting breathlessly to hear the results of the GABF judging or the World Beer Cup.
But I am truly very excited to read the first online review of my beer. Even if I disagree with it.