Is Personalisation A Step Too Far?

This week's Gartner for Marketers report is hugely interesting.

The main headline was that 80% of marketers will abandon personalisation by 2025 but that, for me at least, wasn't the real take-out of a set of predictions stretching well beyond the usual December instalments of blogs looking ahead to next year.

To finish off the personalisation point, the tricky issue behind four in five marketers giving up by 2025 is a combination of teams not seeing the ROI and concerns over using data to draw out personal insights on customers and prospects. The report offered a few tips on how to get around these issues but, to be honest, the fact that tips had to be given on how to handle data underlines the importance of what follows.

With their future-gazing hats on, the Gartner researchers are forecasting that, and I quote:

  • By 2024, artificial intelligence identification of emotions will influence more than half of the online advertisements you see.
  • By 2022, 25% of marketing departments will have a dedicated behavioral scientist or ethnographer as part of their full-time staff.
  • In 2023, one-third of all brand public relations disasters will result from data ethics failures.

I've been talking to companies about emotion in advertising and how it can be used to predict which ads strike a chord with us human beings, and which ones just go unnoticed.

It would be an obvious extension, then, to equip ad networks with an ability to pick up on the emotion of a viewer, listener or reader and serve up a relevant ad.

This reliance on emotion and how it influences behaviour will then create demand for people who specialise in the area, which leads us to the very real question of the ethics of reading emotion and behaviour in the first place.

By definition, trying to lift the lid on what consumers are doing and feeling at any particular moment is going to raise major ethics questions that could land a brand in hot water.

One can almost imagine the outcry if a brand were to target, for example, sad young women online with chocolate ads, or fill a young man's computer screen with dating sites after detecting he is feeling down following a break-up.

This brings us around to personalisation and the ethics of knowing so much about each customer. Hence marketers making the point to Gartner that the biggest obstacle is gathering and processing enough data on a consumer for it to be helpful while, at the same time, not breaking the law or looking creepy.

I don't know about you, but doesn't this rather suggest the prediction is that marketers will give up on personalisation and instead begin to track mood and emotion? And doesn't the prediction then also hint that this will lead to a third of all their PR disasters being caused by -- you guessed it -- intruding on people's emotions and using data in other ethically questionable ways?

So the following may fall on unreceptive ears, but I'm wondering whether we need to leave a little bit of marketing and advertising budget to chance.

Rather than kid ourselves that we can understand each customer well enough to have many thousands of individual one-to-one conversations, isn't it about time we recognised two limits -- what we can do and what we should do?

Personalisation in an email has its place -- "hey, you ordered this, wanna upgrade to that?" is a useful message that is really a rules-based campaign, probably applied to many other customers on any given day.
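To make the distinction concrete, here is a minimal sketch of that kind of rules-based upsell campaign -- the product names, the upgrade table and the message wording are all hypothetical, the point being that one rule fires for a whole segment rather than drawing on any individual's personal profile:

```python
# Minimal sketch of a rules-based upsell campaign.
# All product names and the upgrade mapping are hypothetical.

from dataclasses import dataclass


@dataclass
class Order:
    customer_id: str
    product: str


# One rule for everyone: if you bought the basic plan, you get the
# same upgrade message as every other basic-plan customer that day.
UPGRADES = {"basic_plan": "pro_plan"}


def upgrade_offers(orders):
    """Return (customer_id, message) pairs for orders the rule matches."""
    offers = []
    for order in orders:
        upgrade = UPGRADES.get(order.product)
        if upgrade:
            offers.append((order.customer_id,
                           f"You ordered {order.product} -- "
                           f"wanna upgrade to {upgrade}?"))
    return offers


orders = [Order("c1", "basic_plan"), Order("c2", "stickers")]
print(upgrade_offers(orders))
```

Note that the rule knows nothing about who the customer is or how they are feeling -- only what they bought, which is exactly the narrow grouping discussed below.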

Such narrow, focussed customer groupings are fine and helpful. It's when a brand considers letting you know that an upgrade would help with the emotional issue you've been going through, or that there's a pharmacy down the road with the cream you need for that personal rash, that it gets beyond creepy.

So, maybe it's not a bad thing that brands are giving up on personalisation, if it's the creepy kind, and are going to focus instead on building small audience segments and smart rules-based campaigns.

It has to make a lot more sense than giving up on personalisation because the rules around data are hard, only to open up the bear trap of selling ads based on detected emotions.
