As the machines rise, earning your customer’s trust is more important than ever

23.05.2018

Right now, it’s hard to escape talk of the brave but opaque new worlds of algorithms, machine learning and AI. These seem to promise the ability to anticipate consumer needs, to predict customers’ behaviour before they know it themselves, and for brands to be generally much smarter about how they ‘manage’ their customers.

But for customers there’s a serious risk that many brands will get the balance wrong between the greater convenience that technology brings and the dehumanising effect of being ‘too smart’. If decisions about, for example, how much an insurance premium should be are influenced by seemingly irrelevant factors such as social media activity, it won’t take long for consumers to react very negatively towards brands that get it wrong.

What’s more, these concerns have not escaped the notice of policy-makers. A House of Lords Select Committee on Artificial Intelligence has recently called for greater transparency – allowing people to see exactly how algorithms are used by companies to make decisions that directly impact their lives.

Coupled with scandals over the misuse of data and a general atmosphere of GDPR hype, we believe this makes it all the more vital that brands double down on getting the basics right: understanding what their customers truly value from them and how best to deliver it.

By all means let’s take advantage of technologies where they help, but we should never be tempted to rely on a ‘black box’ approach that forgets that customers are human too.

Customers do indeed have rising expectations for seamless experiences, but they also want to feel that they are being treated as individuals. The art is in knowing how to design an experience around the customer, not the algorithm.

At Draw we have long recognised this challenge and have developed an approach based around an insight-led, low-risk and low-cost pilot to test customer experience concepts.

Get in touch and we’ll tell you more.
