Nudged away from nuggets: Is using AI to prompt healthier choices really creepy? 

by J Cromack | 7 Nov 2024 | Blog

You may have seen in the news recently that Tesco could use artificial intelligence (AI) to encourage its shoppers to make healthier choices. This news, unsurprisingly, led to plenty of concern and discussion about AI going ‘too far’ and whether it’s right for a supermarket to influence consumer decisions.

In my view, Tesco’s potential use of AI to nudge healthier choices is a great idea… in theory. But for it to be done well and ethically, it’s crucial that the individual is in control. Not everyone is going to want to be nudged towards a healthy choice, and people have differing needs throughout their lives. Perhaps you can only stomach certain foods for a period – due to illness or pregnancy – or maybe you’re also shopping for your neighbour for a couple of weeks. So, for it to be truly helpful, shoppers must be able to opt in or out of the service, putting them in control of the purposes their data can be processed for.


Is there any way to stop this feeling a bit… weird?  

Image: a man with a camera peers through the blinds. Is this how you imagine the Tesco data team?

I always use this analogy: if you’re sitting across the kitchen table from your gran, explaining what you’re doing, and it starts to sound a bit creepy, it probably is!

“Hey Gran, we look at all the products you’ve purchased over the past 12 months, and the card transactions from the debit card you linked to your loyalty account over the past 36 months, and then start to make decisions about your lifestyle, such as what you like, what you don’t like and what we think you should be eating to keep you healthy…”

I’m not saying Tesco Clubcard would specifically do this. And it doesn’t sound too creepy if I know it’s happening, have control over what data I’m sharing, and understand how it can benefit me. However, if I’ve no idea (or it’s buried in the T&Cs), it feels more than a little creepy, and who knows where my data may end up?

My recommendation for tackling all this is to be transparent. Be clear with the customer about what data is being processed, and for what purposes, and empower them to make decisions over how their data can and can’t be used. Give them meaningful control, using simple language and design. Also, be clear that AI is being used to make personalised decisions for them. Transparency builds trust, and trust generates loyalty.
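To make that concrete, here’s a minimal sketch (in Python, with purpose names I’ve invented for illustration, not any retailer’s real scheme) of what purpose-level opt-in could look like behind the scenes: nothing gets processed for a purpose the shopper hasn’t explicitly switched on.

```python
from dataclasses import dataclass, field

# Hypothetical purposes a shopper could individually opt in or out of.
PURPOSES = {
    "loyalty_points",       # core loyalty-scheme functionality
    "healthy_nudges_ai",    # AI-driven healthy-choice recommendations
    "third_party_sharing",  # sharing with partners
}

@dataclass
class ConsentPreferences:
    """One record per shopper: every purpose is off unless explicitly opted in."""
    opted_in: set = field(default_factory=set)

    def opt_in(self, purpose: str) -> None:
        if purpose not in PURPOSES:
            raise ValueError(f"Unknown purpose: {purpose}")
        self.opted_in.add(purpose)

    def opt_out(self, purpose: str) -> None:
        self.opted_in.discard(purpose)

    def allows(self, purpose: str) -> bool:
        return purpose in self.opted_in

def healthy_swap_nudges(basket: list, consent: ConsentPreferences) -> list:
    # No opt-in for this purpose means no processing at all, not a quiet default.
    if not consent.allows("healthy_nudges_ai"):
        return []
    # The AI recommendation logic would sit here; this placeholder just echoes the basket.
    return [f"Consider a lower-sugar alternative to {item}" for item in basket]
```

The point of the sketch is the shape, not the detail: the consent check sits in front of the processing, so opting out turns the nudging off rather than quietly degrading it.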

You know the saying about making assumptions 

A healthy choice for one person may not be the right choice for another. Supermarkets also need to be careful about what they’re inferring about a person, as they’re often missing vital data to help make those recommendations. For example, if generative AI infers someone might be diabetic based on their purchase history, you’re now creating ‘special category data’ and you may need to process it differently. The chances are these assumptions may be wrong too, so accuracy also becomes an issue. If any of the major supermarkets do this, I’m sure they’ll have compliance teams all over it, but retaining control and auditability of the model is going to be critical.
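As a rough illustration of that point (the attribute names and rules below are assumptions, not any retailer’s actual schema), an inferred health attribute could be tagged as special category data the moment it’s created, so it can’t be stored or used without an Article 9 condition such as explicit consent, and so the inference stays auditable.

```python
from dataclasses import dataclass

# Illustrative only: which inferred attributes count as health data.
SPECIAL_CATEGORY_ATTRIBUTES = {"inferred_diabetes_risk", "inferred_pregnancy"}

@dataclass
class InferredAttribute:
    name: str
    score: float   # the model's confidence, e.g. 0.72
    source: str    # provenance, e.g. "purchase_history_model_v2"

    @property
    def is_special_category(self) -> bool:
        # Health inferences fall under UK GDPR Article 9 even though they are
        # only guesses derived from ordinary purchase data.
        return self.name in SPECIAL_CATEGORY_ATTRIBUTES

def store_inference(attr: InferredAttribute, has_explicit_consent: bool) -> None:
    if attr.is_special_category and not has_explicit_consent:
        # Without a valid Article 9 condition, don't persist the inference at all.
        raise PermissionError(f"Cannot store {attr.name} without explicit consent")
    # Keeping the score and provenance makes the inference auditable and correctable.
    print(f"Storing {attr.name}={attr.score:.2f} (source: {attr.source})")
```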

Clarity reduces creepiness 

If it isn’t clear to a consumer how much of their data is being held, then I’d argue the supermarkets and retailers holding this data are likely in breach of UK GDPR. Transparency is one of the fundamental requirements of Article 5(1)(a), which states that personal data shall be processed lawfully, fairly and in a transparent manner.

I think COVID-19 made consumers a lot more aware of data collection, whether it was using QR codes to enter a pub or the NHS App. Data collection is part of everyday life, and we rightly expect organisations to ensure our data is processed in line with all the regulations.

It’s great to see how retailers, especially supermarkets, are now visualising our data for our benefit, such as spend per category and savings generated. This adds value to my data for my benefit, creating a mutual value exchange.

Treat your data like a date 

Image: champagne and roses on a blue background. Woo the information from your clients.

The concept of ‘nudging’ is fundamental to behavioural economics, and ideally a nudge should be subtle. That said, in my opinion, a retailer can build trust with consumers if they’re taken on the journey, with the retailer progressively enriching the information it holds. I’ve always used the analogy that the data game is a bit like the dating game – you don’t jump in and ask EVERYTHING about someone at the start; that’s just plain creepy. You learn more about a person over time, as you build a relationship and earn their trust.

You could programmatically ask questions at different stages of the consumer journey where you think having a bit more information about someone might help make a better recommendation or deliver a more discerning nudge. It’s a great opportunity to engage a consumer and ask them – rather than just infer – whilst being really clear why you’re asking the question. AI is giving us the ability to do this more effectively and at scale. 
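Here’s a toy sketch of what that progressive questioning could look like in practice (the journey stages and questions are invented for illustration): each question is tied to a stage, asked at most once, and carries its own plain-English reason for being asked.

```python
# Invented journey stages and questions, purely for illustration.
JOURNEY_QUESTIONS = {
    "first_shop": {
        "question": "Any dietary preferences we should know about?",
        "why_we_ask": "So recipe suggestions match how you actually eat.",
    },
    "tenth_shop": {
        "question": "Are you shopping for anyone else at the moment?",
        "why_we_ask": "So nudges reflect your household, not just you.",
    },
    "health_goals_opt_in": {
        "question": "Would you like lower-sugar swaps highlighted?",
        "why_we_ask": "So we only nudge towards goals you've chosen.",
    },
}

def next_question(stage: str, already_asked: set):
    """Return the question for this stage (with its reason), or None if there
    isn't one or it has already been asked."""
    entry = JOURNEY_QUESTIONS.get(stage)
    if entry is None or entry["question"] in already_asked:
        return None
    return entry

# Example: a shopper reaching their tenth shop who has only answered the first question.
print(next_question("tenth_shop", {"Any dietary preferences we should know about?"}))
```

Asking, rather than inferring, also gives you an answer you can trust and a clear record of why you hold it.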

The downside

If the correct ethical AI framework isn’t in place, or isn’t being followed, to mitigate the potential risks and biases that come with hyper-personalised ads and shopping experiences, then retailers could expose themselves to potentially catastrophic risks.

We’ve all heard the story about Target in the US creating a pregnancy prediction score and then sending baby product offers to consumers, with a father finding out his daughter was pregnant because of the offers she received. Apart from being creepy, that kind of processing would be prohibited under UK GDPR Article 9, because health data (even when inferred) is special category data being processed without consent or another valid exception – and it could cause massive reputational damage.

If you’ve ever read The Circle by Dave Eggers, you’ll know what a dystopian future might look like. Reading it challenged my ethics around data mining and how I understand human behaviour, and made me look at how we could use gentle nudges to influence that behaviour. It also helped me understand why new regulations, such as the UK GDPR, were needed to catch up with the digital age. If trust in how organisations handle data declines and people start to go off grid, as in The Circle, a digital society would be halted and we’d all suffer – society, business and people themselves.

A vision for the future 

It’s great to see the new Labour government making ‘smart data’ a core pillar of its new data regulation bill. My vision of smart data is making the individual the data controller, not the organisation – this is the utopia. Gen AI gives us the chance to do the processing and nudging on the person’s device. The data stays with them, while new UX designs and voice commands will make these types of solutions more accessible.

Back in 2018, when I presented at the MyData Conference in Helsinki, I showed a conceptual product that stored my personal data on my device and synced it with my Tesco Clubcard transactional data, as well as my MyFitnessPal app. Through simple iconography, I was in control of the data I wanted to share with each party, for defined purposes.

Being able to get MyFitnessPal to give me recipe recommendations without having to scan all the barcodes of the products I’d bought that week was a no-brainer. Then I could share my health and fitness goals from the app back to Tesco, so they could make genuinely healthy nudges and recommendations – creating a personalised experience I was invested in.

I’m hoping this is now a vision of the near future, as we can use AI to easily build data integrations, standardise schemas and create personalised recommendations at device level. This is a future I’d buy into – but I want the choice.
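To sketch what that on-device model could look like (the function and field names here are mine, not a real Clubcard or MyFitnessPal API): the raw purchase and fitness data is synced to a local store on the phone, the recommendations are generated there, and only the fields the individual has explicitly chosen to share ever leave the device.

```python
from dataclasses import dataclass

@dataclass
class OnDeviceStore:
    # Both datasets are synced to, and processed on, the person's own device.
    clubcard_purchases: list   # e.g. ["porridge oats", "cola", "chicken breast"]
    fitness_goals: list        # e.g. ["reduce_sugar"]

def recommend_recipes(store: OnDeviceStore) -> list:
    # All processing happens locally against the on-device copy of the data.
    if "reduce_sugar" in store.fitness_goals:
        return [f"Low-sugar recipe using {p}" for p in store.clubcard_purchases[:3]]
    return [f"Recipe idea using {p}" for p in store.clubcard_purchases[:3]]

def share_with_retailer(store: OnDeviceStore, share_goals: bool) -> dict:
    # Only the fields the individual has explicitly chosen to share leave the device.
    payload = {}
    if share_goals:
        payload["fitness_goals"] = store.fitness_goals
    return payload

store = OnDeviceStore(["porridge oats", "cola", "chicken breast"], ["reduce_sugar"])
print(recommend_recipes(store))          # generated entirely on the device
print(share_with_retailer(store, True))  # only the chosen goals go back to the retailer
```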
