Choice. The problem is choice.


Business, like life, is all about making choices. Making the best choices you can with the information you have to hand and hoping things go your way.

Where should my next store be? Where should I advertise? How should I segment my customers? Who should I target with this offer? What products should I range in this store? What is the right creative message?

Sometimes the choices you make come off. Sometimes they fail. But ideally, you want to give yourself the highest probability that you’re making the right ones.

That, in theory, is where insights and insight development come in. Insights are sometimes talked about as something definitive that will give you a 100% correct answer and therefore an easy choice. In my 20 years of experience working with and running insight teams, that is vanishingly rare.

So, given I seem to have some time on my hands, I wanted to try to help those building a more insight-led culture in their business. I am no stats, analytics or research expert, but I think I know enough to add some value. Either that or you’re about to read a great example of the Dunning-Kruger effect.*

1. Be sure you know what an insight is

Merriam-Webster describes an insight as:

An understanding of the true nature of something.

That to me means that for something to be considered an insight it needs to tick a few boxes:

  • It can’t just be an opinion or observation without any proof or validation.
  • It can’t just be a fact. For example, “25% of people don’t have life insurance” is a fact, not an insight.
  • It must have a “Why?” and it must be substantiated. So “X Research shows that 25% of people don’t have life insurance because they don’t have dependents and, therefore, don’t believe it applies to them.” is an insight that can be worked with.

2. Start with questions, not answers

Our insight briefs at DDB Sydney were built around the same premise as our creative briefs. You don’t tell talented creatives that you want x image, y copy and z colour palette (unless you’re a moron). You give them a great creative proposition, some compelling insights to back that up and let them do their thing.

The same is true of analysts, data scientists and market researchers, in my experience. Give them a question to answer — don’t hand them a table or chart you want to see. Don’t tell them how to do a job they are better at than you.

The key to getting great insight work is therefore very similar to getting great creative work — a great brief. Really think through your questions and always ask yourself: why do you want to know the answer to this question? Is it useful or just interesting? What insight will this give me that will make a difference in how we operate?

3. Answers also mean more questions

Any piece of insight work will give you some answers, but I guarantee it will also present you with even more questions. Embrace that. Insight generation tends not to be one and done; it requires multiple iterations as you uncover more and more.

The key is to know when to stop, and that can be obvious, dictated by timeframes, or a judgement call. But if you’re always clear about why you’re asking a question, you have a good shot at stopping at the right point.

4. Learn how to interpret data.

To bastardise a quote from Professor Aaron Levenstein: statistics are like underwear. What they reveal is suggestive, but what they conceal is vital.

Here are a few ideas on things to look out for:

a. See past face value and put insights and facts into context.

Let’s say only 10% of people who buy a new games console return to the place they bought it within 6 months. At first glance, that sounds pretty crappy and cause for concern.

But is it? Is it different for different consoles? Is there a seasonal element to it? Is it better or worse than last year? Does the number of products you buy in the same transaction as the console make a difference (e.g. an extra controller, games, etc.)?

There are loads of variables that can contribute to giving you different perspectives on how much of an issue that initial number is before you go off half-cocked trying to “solve” the problem.

b. Beware of sneaky ways data can be manipulated to tell a specific story.

For example, truncating the axes on graphs to make differences seem bigger, or quoting percentage increases without showing what something has increased from and to.

c. Correlation is not causation.

Everyone loves stories, but our propensity to impose them sometimes leads us to incorrect conclusions. Just because two things are correlated doesn’t mean there is a causal link between them.

A correlation is required for a causal link but you can only really identify the latter through experimental design.
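A toy sketch of the trap, with entirely made-up numbers: two metrics that are both driven by a third factor (here, hypothetical store foot traffic) will correlate strongly, even though neither causes the other.

```python
import random
import statistics

# Hypothetical illustration: ice cream and sunglasses sales both depend on
# foot traffic, plus independent noise. Neither one causes the other.
random.seed(42)
foot_traffic = [random.gauss(1000, 100) for _ in range(200)]
ice_cream_sales = [0.05 * t + random.gauss(0, 2) for t in foot_traffic]
sunglasses_sales = [0.03 * t + random.gauss(0, 2) for t in foot_traffic]

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson(ice_cream_sales, sunglasses_sales)
print(f"correlation: {r:.2f}")  # strongly positive despite no causal link
```

Stop selling ice cream and sunglasses sales won’t budge; the correlation is real, the causal story is not. That is the gap only experimental design can close.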

In short, don’t take numbers at face value. Question. Be a sceptic.

5. Consider the source (and show it)

Simple desk research can be a great source of insight. There are tons of great reports, analysis, research, white papers, books and more out there. Just make sure you do two things:

a. Consider how the source of the data can colour the way it was collected and analysed.

Also, consider how it was interpreted and by whom. You might find that an insight about direct mail being hugely popular was funded by a print house. That doesn’t necessarily invalidate the findings, but it certainly should be taken into account.

b. Please, please, please list your sources for any insight.

Otherwise, you could just be making shit up. This also applies to insights in creative briefs, where I have often seen it lacking. If you’re using a pithy observation about the human condition as an insight, you still need to ground it in something — a psychological study, a research paper, market research. Which leads me to…

6. Insights first, story second.

I remember having a conversation with the MD of a company I worked for. I owned the relationship with our fantastic analytics agency and she was building a presentation for which she needed some insights. Sounds fine.

But she had her story already laid out and wanted to back it up with some cherry-picked data. That’s dangerous. Our brains are wired to construct and believe stories that seem plausible to us, but plausible doesn’t mean true**. Coming up with a story you think you can sell to your boss or a client and then finding insights to fit it is frankly a bit cheap. If the insight isn’t there, your story is iffy and you need to rethink.

There’s no harm in having a question based on a hunch, but if the insight doesn’t back it up then you need to move on.

7. Watch out for your own, and other people’s, biases.

We are essentially slaves to our psychological biases, and they colour how we look at data. Confirmation bias, for example, is our tendency to embrace insights that confirm our world view and discredit those that contradict it.

I can vividly remember many occasions, when presenting solid insights, where someone has said something like: “Well, that’s as may be, but when my son was looking for x on the website he couldn’t find it.” Fine, but that doesn’t invalidate the findings being presented, and it certainly doesn’t mean we should use that one anecdote to drive our strategy.

8. Culture is Crucial

For insight (and, by proxy, insight teams) to be seen as a strategic advantage, it requires a company-wide culture that embraces insight. That means a few things:

a. The Insight team has to understand the commercials of the business

I have worked with analysts and data scientists who are brilliant at analytics but lacked the commercial nous needed to apply their insights to a real-world business. Suggesting an insight-led product price that delivers negative margin won’t win you any friends and, worse, will undermine the value of insight.

b. The leadership team must be invested and demand others are too.

If your C-suite or leadership team demands insight-driven action and plans, it filters down. If they don’t, it won’t, and people will carry on as they have before.

c. A spirit of experimentation.

Insight development creates questions and hypotheses, and these need to be tested through solid experimentation to assess whether they are useful. Does that new wall bay of product really drive sales? If we change our window display based on insight, do we actually get more people into the store? With strong experimental design, you can get real-world proof that your insight has value. And never assume new is better.

The bottom line.

We tend to develop stories based on the information at hand, piecing what we do have into a narrative, often without asking, “What information am I missing?”

Insights let you plug those gaps and improve your odds of making a good choice compared to gut feel. Just be wary of how you create, use and apply them, and of what you expect of them — they won’t always give you the complete answer you want.

I am an evidence-driven marketer and consultant with over 20 years of experience across both client and agency side. I have applied my knowledge across brands as diverse as Vodafone, McDonald’s, Volkswagen, Disney, Westpac and GAME.

If you liked this article please like and share with your social network.

Thanks to Jen Clinehens for the editing and advice and Mark Razzell for co-writing the presentation with me on which parts of this article are based.

*https://en.wikipedia.org/wiki/Dunning%E2%80%93Kruger_effect

**https://play.google.com/store/books/details/Daniel_Kahneman_Thinking_Fast_and_Slow?id=oV1tXT3HigoC
