Posts Tagged ‘David Bonney’

Part 5: My thoughts on effectiveness

July 12, 2009

David Bonney, a freelance creative planner based in Berlin with DDB/Tribal DDB and McCann Erickson in London on his CV, concludes his series on effectiveness from an international and somewhat cynical perspective with a somewhat unsettling question.

5) Does effectiveness ruin planners?
I don’t know the answer to this one either. But let me finish with a warning that I feel in my bowels – the more it is a planner’s job to think about effectiveness, the less he will be free to dream and guess and feel and intuit the extraordinary half-crazy ideas that can lead to wonderful, risk-embracing, business-exploding briefs. We planners live in a dark place between creativity and analysis, but I think the trend is for us to move slowly towards the analysis side. Enough I say, can’t we get someone else to do that? We planners have to be amongst the finest creatives in our agencies, we have to be untethered and open and ambitious when inspiring our creatives… and if you don’t have a planner who is free to be this way, then you won’t have great work coming out of your agency… whatever “great” is of course.

What is more, in the last 20 years, the IPA in London has accumulated over 800 IPA Effectiveness case studies. Papers written selectively, with bias and without creative pride. And now the well-intentioned scientists amongst us, Les Binet and Peter Field for example, cannot resist meta-analysing these papers, taking them at their word, and on their basis drawing rigorous conclusions as to “what works” and what doesn’t in advertising. If I didn’t intuitively agree with most of their findings (e.g. on the power of emotion) I’d be very worried indeed.

That’s all I can think to say right now. I’m a little fickle, so maybe I’ll disagree with some of these thoughts come tomorrow, but I hope at the least it provokes some healthy debate.


Part 4: My thoughts on effectiveness

July 7, 2009

David Bonney, a freelance creative planner based in Berlin with DDB/Tribal DDB and McCann Erickson in London on his CV, shares his thoughts on effectiveness awards from an international and somewhat cynical perspective. After three previous posts, he concludes his series on effectiveness this week.

4) Who reads effectiveness case studies?
I’m really not convinced the right people read them. Do the city traders who calculate the value of Unilever or Vodafone really inform their decisions with a sound belief that branding, and brand communications to boot, are key strengths for these companies? I don’t know the answer to this one.

Part 3: My thoughts on effectiveness

May 27, 2009

David Bonney, a freelance creative planner based in Berlin with DDB/Tribal DDB and McCann Erickson in London on his CV, shares his thoughts on effectiveness awards from an international and somewhat cynical perspective. Part three of five.

3) Are we scared to know whether our intuitions are right?

I said earlier that I avoid reading case studies unless I really have to. But why?

Well, if you look at most of the campaigns that have won IPA gongs in recent years, often you will never have heard of them. Do you remember getting excited when campaigns broke for Learndirect, Morrisons, Cathedral City Cheese, Radley, Felix, O2, Branston Beans or ING Direct? Do you remember where you were when you heard they’d won in Cannes?

Why do the campaigns that excite us creatively just not make it into effectiveness awards? And why don’t we force them into effectiveness awards? Or are we scared that our instincts will be shown to be misguided, that years of ‘creative knows best’ and gut instinct have not created the value we believe they have?

Really, if we want to continue to sell ourselves as the intuitive masters of demand creation, the subtle manipulators of a complex and powerful art, then we should put into effectiveness awards the campaigns we love best, the ones we really believe in, the ones we WANT to be effective. Maybe we take the 20 most critically acclaimed campaigns of any year and force them through a gruelling process of accountability… then we’d be a grown-up industry, with no need to hide behind posturing or flimsy creative awards.

At worst, maybe we’ll learn that campaigns by O2 and ING are best practice for their consistency, their simple emotional appeals, their idea-lessness and their ability to work on a visceral level?

Part 2: My thoughts on effectiveness

May 26, 2009

David Bonney is a freelance creative planner based in Berlin. He has worked for DDB/Tribal DDB and McCann Erickson in London, as well as Plantage in Berlin. This is part two of the five thoughts on effectiveness awards that David shares from an international and somewhat cynical perspective.

2) Uneffectiveness Awards

Of course, you may argue, it is impossible to conduct effectiveness studies on all advertising campaigns. Maybe the best we can hope for is a random, representative sample of campaigns, chosen out of a hat. And surely the people conducting the studies should be wholly independent? Either the advertising trade body itself, or clients, or maybe Fallon would get to write “Uneffectiveness Case Studies” for BBH and vice versa? Then we could really trust that the authors are good scientists, out to disprove advertising effectiveness.

My thoughts on effectiveness

May 25, 2009

David Bonney

David Bonney is a freelance creative planner based in Berlin. He has worked for DDB/Tribal DDB and McCann Erickson in London, as well as Plantage in Berlin. One of his oldest dreams is to live and work in Stockholm. Here on Bloggen om effekt, David airs his views on effectiveness awards from an international and somewhat cynical perspective. Published in five parts over five days.

Johan Östlund is a dear old friend and former colleague of mine at DDB London, where we were both planners back in the day.

Johan has asked me to share with you my thoughts on advertising effectiveness awards. But, to be honest, I’m not really sure why – I promise you there are some really good reasons why my thoughts are no more worthwhile than the next man’s.

Firstly, I’m not incredibly experienced when it comes to effectiveness awards – I’m not a patch on the great Les Binet (all hail Les) and I’ve only ever been exposed to one competition, the IPA Effectiveness Awards in the UK, for which I’ve written one entry which, quite rightly, didn’t win a thing.

Secondly, I’m not the most passionate advocate of effectiveness – I’m more of a “front-end”, creative planner, preferring to feel my way along and shed blood for the idea, the inspiration, the intuition, rather than the “back-end” aspects of accountability, econometrics and ROI.

Thirdly, I don’t feel all that educated about effectiveness – ok, not true, I recently completed the IPA Excellence Diploma which is like an MBA for advertising practitioners and is meant to put me amongst the most educated planners in the world. But, I admit I haven’t read an effectiveness paper in two years and I studiously ignore the IPA case studies when they get published (more on this later).

But, despite all this, I am happy to share my views (just try shutting me up!)

1) The un-science of effectiveness awards

I trained as a psychologist before going into the seedy world of advertising. Psychology is a science, and in science you set out to disprove a hypothesis, to show that something isn’t true: that male pheromones don’t make females play computer games better, or that a 30-second ad from a chocolate company doesn’t increase sales of chocolate. Now anyone conducting econometrics would argue that this methodology seeks to disprove – to account for the other things, besides communications, that can explain a brand’s growth. Fair enough.

But if you take a step back from the statistical methodology employed in the writing of effectiveness award entries, the rest of our approach to “disproving effectiveness” is patently wrong. First, the agencies that produce the communications are responsible for conducting the “science” themselves – and it is clearly always in their interests to prove rather than disprove effectiveness. Now, excited at the prospect of being able to call themselves the most effective agency in the land, these agencies look at all the campaigns they have launched over the previous three years and whittle the list down to only those that may be twisted into plausible case studies of effectiveness. I’ve gone through this process with two leading agencies in London, and each time it was clear that only a small minority of our campaigns were deemed sufficiently effective or provable to be written up as case studies.

There’s our first mistake – surely every campaign that ever ran is evidence for or against the effectiveness of advertising? And surely a scientific method cannot willy-nilly decide which bits of the results to keep and which to get rid of? I binned half the results of the research for my undergraduate thesis and I repress the guilt to this day.

To be continued tomorrow.