Trust me, I’m a wine expert

I am fascinated by any scientific study in the field of wine tasting.  So often the results challenge conventional thinking in the wine world and provide much food for thought. Here I shall describe just one piece of research that I think deserves greater recognition.  It is published in The Wine Trials book, and in an academic paper that you can download for free.  Do take a look at the paper, but I am not sure I would advise buying the book.  I have the 2008 edition, and most of it is devoted to “100 wine recommendations under $15”.  I enjoyed some of the commentary in the earlier chapters, but I’m a sucker for that sort of thing and even I am not convinced it justifies the purchase price.

The study involved 17 blind tasting events in the USA, held in 2006 and 2007.  There were 506 participants, and 523 wines.  In total 6,175 samples were tasted and rated.   For analysis purposes the participants were classified as expert or non-expert tasters.  Experts were defined as those having had some formal wine training.

The main result is that while the experts’ ratings correlated with price, the non-experts actually preferred cheaper wines.

To give a feel for the magnitude of the effect, the authors give an example of the predictions of one of the models they fitted to the data.  Using the 100 point scale, if there were two wines, one costing 10 times as much as the other, experts would rate the expensive bottle seven points higher than the cheaper one, but non-experts would rate it four points lower.  The book contains a paragraph of specific results, which I think are useful to put this into perspective: “On the whole, tasters preferred a nine-dollar Beringer Founders’ Estate Cabernet Sauvignon to a $120 wine from the same grape and the same producer: Beringer Private Reserve Cabernet Sauvignon. They preferred a six-dollar Vinho Verde from Portugal to a $40 Cakebread Chardonnay and a $50 Chassagne-Montrachet 1er Cru from Louis Latour.  And when we concealed the labels and prices of 27 sparkling wines and asked people to rate them, the Dom Pérignon finished 17th – behind 14 sparkling wines that cost less than $15, eight of which cost less than $10.”
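To see how a model like that behaves, here is a minimal sketch.  It assumes, as described above, that predicted rating is linear in the base-10 logarithm of price; the slopes are just the round numbers quoted in the paragraph, not the fitted coefficients from the paper, and the function name is mine.

```python
import math

# Points gained per 10x increase in price, per taster type.
# These are the approximate values quoted above, not fitted coefficients.
SLOPE = {"expert": 7.0, "non-expert": -4.0}

def predicted_rating_shift(price_cheap, price_dear, taster):
    """Predicted rating difference (expensive minus cheap) on the 100-point scale."""
    return SLOPE[taster] * math.log10(price_dear / price_cheap)

# A $12 bottle versus a $120 bottle (a 10x price difference):
print(predicted_rating_shift(12, 120, "expert"))      # +7 points
print(predicted_rating_shift(12, 120, "non-expert"))  # -4 points
```

Note that the effect scales with the price ratio, not the price difference: a $10 bottle against a $100 bottle gives the same predicted shift as $100 against $1,000.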

There is one very practical lesson to be drawn from this study: if you consider yourself a non-expert you would probably do best ignoring recommendations from experts!

But what is really going on here?  There is probably no single explanation.  A few possibilities spring to mind, but I think the main reason is that the wine trade, from producers to critics, is too inward-looking.  The trade decides amongst themselves what defines a good wine, prices wines accordingly, and then seeks to educate neophytes in the mysteries of the art.  Meanwhile, everyone else feels too intimidated by the whole thing to question the clothes of the emperor.  It seems to me that the negative correlation between ratings and prices indicates that the wine market is organised very strangely.

Does it matter?  Well, yes, it has some very important consequences if sellers of wine are hoping that their punters are readily going to part with more money to get a more enjoyable product.  From my reading of the situation it seems that most drinkers are only likely to trade up if they get so interested in wine that they attend a wine course, or if they decide they need to impress by serving a wine with a prestigious label.

Perhaps that is just the way of the world, but I would be really interested in exploring what non-experts tend to enjoy as a group.  Do they really just prefer sugary pap to Proper Wines?  Or is there a new wine aesthetic waiting to be discovered? Something that future wine makers could aim for with the resources that potential higher prices will yield?

Warts and all

Have you ever wondered why you so rarely seem to see negative reviews of wines?  Or indeed other things?  I am very much aware that the first few reviews on my blog have tended to be positive, so I shall start by answering for myself.

Initially at least, I decided only to write about what I know well, and by and large that is what I have done – certainly my restaurant reviews and longer tasting notes have been for restaurants and wines I am very familiar with.  I didn’t want to be proclaiming judgements based on one meal, or a quick slurp and a spit.  But unfortunately a by-product of that policy is that I have only written in detail about things that I like.  I try to show dedication to my blog, but I draw the line at repeating bad experiences just so I can say with conviction that it was truly bad.

The other reason I might feel tempted to put a positive spin on a wine that was not great, or more likely say nothing at all, is if I know and like the person that supplied it to me.  I hope using the word “supplied” does not sound too much like having a drug habit fed; I use it to cover both being offered wine by a friend, and being sold wine.  Naturally I do not want to sound ungrateful for freely offered wine, and criticising it in public might be taken as ingratitude, but to a lesser extent I find myself reluctant to criticise wine sold by a merchant I know well.  Though having said that, the careful reader of this blog will find some examples of the latter.

As for other critics… well, I know of at least one who thinks that there are so many bad and mediocre wines it is not worth writing about them, and even listing them it seems.  The consumer of tasting notes is thought only to be interested in hearing about good wines.  I am not at all sure about that.  If someone else has tried a wine and found it to be bad, I would rather not buy it myself to make the discovery independently.  And if no one mentions a wine, how am I meant to know whether it is of poor quality, or simply not assessed?

This is also frustrating for the consumer when reading the results of large wine competitions.  We get to know the wines with trophies, medals and commendations, but how are we to know whether DRC again neglected to submit the requisite number of bottles of La Tâche, or it was judged to be unworthy even of a commendation? And this is where I start to get cynical.  Such a large proportion of wines get medals that to be unclassified is not at all good.  And no producer would want to go to the expense of entering a competition with the possibility of being slighted like that.  So if the competition published the failures, it would not get anywhere near the number of contestants, and would probably not be viable.

To an extent I think the same applies when writers get sent samples or invited on a jolly – er, sorry, fact-finding mission – to a wine producing region.  If there were too many bad reviews the offers of samples and trips would slowly dry up – in general, if not for individual writers.  I am not accusing anyone of professional misconduct here, but I think we have to accept that however hard writers and critics strive to be independent it is hard to be totally objective when your livelihood depends on freebies.  Besides which, as I have noted above, it is really difficult to be critical about a product that is associated with someone you have got to know, like the producer you met on that trip.  I think it is also fair to admit that we as consumers of wine writing get what we pay for.  It is all too easy these days to expect to get opinions for free on the net, but those who give their opinions for free need the means to get hold of things to write about.

I am sure that part of the knack of getting the truth from a tasting note or more general review lies in looking for what has not been said, but that sadly is still a bit like looking for wines that do not have medals.  Did the critic not mention the intensity of flavour because it was insipid, or because he did not think it worthwhile commenting on?  Or perhaps it was not intense, but had an understated elegance?  We will never know.

Another trick in tasting note deconstruction is to look at the score.  I did not realise it until it was explained to me, but apparently a score of below 90 means that a wine is not recommended, while anything you should consider buying will be in the range 90 to 100.  But sadly that now means that some critics are reluctant to give 89 points – so even the points cannot always be used as a coded hint that a wine is under-par.

But you can still get some glimpses of warts.  The blind panel tastings in Decanter for example.  There, often you will find first growths and similar summarily dismissed in favour of more modest wines. What I miss there though is an explanation of the thought processes of the taster.  Of course, better wines need more time to come around, but shouldn’t professionals be able to recognise a young but promising wine from a good stable?

Six suggestions for wine tasting hosts
The level of hospitality shown by producers is often very high, and I am grateful to them all.  However, some tastings could be even better, with very little effort and cost.  For all wine producers who host tastings for small groups, here are six things that are sometimes neglected.  Please don’t see them as demands, but as suggestions to be considered if you want to present yourself in the best possible light.

  1. A comfortable environment.  Outside can work, but often it is too hot, cold or windy.   Usually inside  – with a cool temperature and good lighting –  is best.
  2. Somewhere to rest glass and notebook.  A place to sit is nice, but I’d much rather stand by a bar-height table or ledge than have a chair with no table.
  3. A white surface, to show the colour of the wine.  If the table top is white that’s great.  If not, something like a sheet of A4 paper would be fine.
  4. Basic information about each wine.  The official designation, vineyard if on the label, the grape variety or blend, vintage, alcohol content. Give the information clearly, and repeat it.  Leave the bottle with us after pouring so we can see the label.
  5. Good access to spittoons.  And not just for emptying glasses – even amateurs sometimes want to spit.
  6. Serviette, or sheet of paper towel.  There are nearly always dribbles of wine to be dealt with.

I cannot emphasise point 4 enough.  You the producer are familiar with your wines, and are very keen to let us taste and give us more detailed information about the wine, but if we do not get the basic information all is lost.  Remember language barriers, and the fact that it might not be so easy to hear at the far end of the room.  In fact I would suggest that, if at all possible, you provide a sheet of information customised to the particular tasting you are giving us.  You could prepare a computer file with all your current releases, and delete the wines not being offered before printing it out.

When wine tastes best

For me the answer is… root days!

And isn’t that what you might expect if you subscribe to a rather literal interpretation of the importance of terroir? Or could it just be that the whole idea is a load of bollocks? I am of course talking about the biodynamic theory that lunar cycles affect the taste of wine, fruit days being the most auspicious.

Here’s what I did to test the hypothesis. I analysed all 568 of my tasting note scores from last year. The scores range from 1 to 6, corresponding to the number of stars in my rating system. At the time of tasting I was unaware of the type of day. I used this 2009 biodynamic calendar for the analysis. I presume it is reasonably accurate. I did check a few days against another calendar, and they were in agreement. I have no idea at what time the type of day changes on any particular date, but as I could not find this information and very few people seem to care, I decided to ignore the issue. Most wines would have been tasted at some time in the evening. If you want to reanalyse my raw data feel free. In the meantime, here is my summary of the scores awarded on each type of day.

Day      Mean   Std Dev   Number tasted
Fruit    3.02   1.209      94
Leaf     3.28   1.057     102
Root     3.30   1.020     184
Flower   3.09   1.125     188

So, if anything, I think wines taste best on root days, and worst on fruit days. But actually there are barely any significant differences at all. A one-way ANOVA test gives a p-value of 0.091. Or to put it another way, one would expect to get such a large spread in the means about one time in ten purely from random variation.
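If you want to check the arithmetic, the F statistic and p-value can be recovered from the summary table alone, since one-way ANOVA needs only the group means, standard deviations and counts. Here is a sketch in Python using scipy – not my original analysis script, and because the table figures are rounded the p-value comes out close to, but not exactly, 0.091.

```python
# One-way ANOVA reconstructed from summary statistics (mean, std dev, n)
# rather than raw scores. Values taken from the table above.
from scipy import stats

groups = {
    "Fruit":  (3.02, 1.209, 94),
    "Leaf":   (3.28, 1.057, 102),
    "Root":   (3.30, 1.020, 184),
    "Flower": (3.09, 1.125, 188),
}

N = sum(n for _, _, n in groups.values())                       # 568 tastings
grand_mean = sum(m * n for m, _, n in groups.values()) / N

# Between-group sum of squares: weighted squared deviations of group means.
ss_between = sum(n * (m - grand_mean) ** 2 for m, _, n in groups.values())
# Within-group sum of squares: pooled from the group standard deviations.
ss_within = sum((n - 1) * sd ** 2 for _, sd, n in groups.values())

df_between = len(groups) - 1   # 3
df_within = N - len(groups)    # 564

F = (ss_between / df_between) / (ss_within / df_within)
p = stats.f.sf(F, df_between, df_within)   # upper tail of the F distribution

print(f"F({df_between}, {df_within}) = {F:.2f}, p = {p:.3f}")
```

Run on the rounded table values this gives an F of about 2.1 and a p-value of roughly 0.10, in line with the 0.091 from the full data.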

As far as I am concerned I got pretty much the results I expected, and I don’t feel any need to research this issue further.  To be frank I think I have already given this nonsense a lot more time than it deserves. However, if you have any more evidence to bring to light I’d be interested in seeing it.  But please – no more anecdotes about tasting wines when you were aware what sort of day it was.  And no half-baked argument along the lines of “if Tesco believe in it, there must be something in it”.  Hard data only.

Or perhaps you could explain from a theoretical point of view why this agricultural calendar has any relevance at all for wine tasting.  Why should fruit days be any better than, say, Fridays – which is when I think wine tastes best?