How to tackle the privacy challenge when utilising big data insights

Big Data insights have become indispensable for optimising customer experience strategies and personalising customer journeys, but how do you ensure you don’t overstep privacy boundaries and lose invaluable customer trust as a result?

If you ask the public what they think about the growing use of Big Data, you’ll get an answer that can be summarised as ‘mixed blessing’. Yes, it’s good to get offered things online at the right time and in the right place. And it’s even better if we get things like improved public transport because of clever insight into traffic flows. But no, we aren’t comfortable that our habits have all become predictable to faceless programmers, and we certainly aren’t comfortable that ‘our data’ is being used in ways we don’t know about.

As Big Data sweeps all before it, in its train comes a growing body of consumer protection rules. Some uses of Big Data come with very few, or no, privacy issues; for example, using transaction patterns to detect credit card fraud, or using wind and weather numbers to improve energy efficiency. But others, especially when they shape what a company does or doesn’t offer to an individual, bring privacy issues that cannot be ignored.

A Xerox-Forrester survey* found that companies wanting to use Big Data rated ‘data security and privacy’ as their number 1 obstacle. The data processing, analytics and insight were the easy bits!

There are now some clear guiding principles on data use and privacy that have been put forward by, among others, the UK’s Information Commissioner. They give Big Data users a strong framework from which to assess whether they are being fair to consumers. It’s my view that if companies collect and use data well, they can build trust, and then data privacy can become an opportunity as much as a challenge.

Key Principles for Fair Data Processing

Whether you look at the EU, the US or elsewhere, consumer protection rules and penalties are in flux, but you can be pretty sure that if you’re using the data for any kind of sales process, you’ll need to follow these key principles:

1 Transparency – be open about what you’re doing with the data;
2 Consent – collect the consumer’s consent, making clear what you’ll use the data for;
3 Security – build security into all your data processes;
4 Minimisation – keep data only if you intend to use it and it is current, and get rid of data you are finished with;
5 Access – if asked, be able to show people how and where their data has been used.
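To make these principles a little more concrete, here is a minimal sketch, in Python, of how consent and retention might be recorded against each customer record so that purpose limitation and minimisation can be checked automatically. The class and field names are purely illustrative assumptions, not a reference to any particular tool or legal standard.

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Illustrative sketch only: record what each customer consented to and when,
# so consent and minimisation can be checked before any new use of the data.

@dataclass
class ConsentRecord:
    customer_id: str
    purposes: set                 # the uses the customer explicitly agreed to
    collected_on: date
    retention_days: int = 365     # illustrative retention period, not a legal rule

    def permits(self, purpose):
        """Consent: only use the data for a purpose the customer agreed to."""
        return purpose in self.purposes

    def is_expired(self, today=None):
        """Minimisation: flag records that are due for deletion."""
        today = today or date.today()
        return today > self.collected_on + timedelta(days=self.retention_days)


# Usage: refuse a new use of the data unless the recorded consent covers it.
record = ConsentRecord("cust-001", {"service_improvement"}, date(2015, 1, 5))
assert not record.permits("credit_marketing")   # this use would need fresh consent
```

The same pattern extends naturally to logging each use of a record, which is what makes the ‘access’ principle answerable when a customer asks.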

Such common-sense principles turn out to be hard to follow in practice. For example, how do you get consent for all the possible uses data analytics might drive? How do you explain to someone exactly what goes into the logic that chooses them for an offer and not someone else?

The honest answer is, you can’t, at least not all of the time. It’s not practical, and it isn’t always what the customer wants. What you can do is bring the right approach: set up the right structure and ask yourself the right questions as you manage and process large sets of personal data.

How to Manage the Big Data Privacy Issue

Top priority: Build customer trust

You need to think of the people in your data (‘data subjects’ is the jargon) as partners, starting from when you first collect data from them. Make it clear that you are asking for something in order to improve what you offer or what you do.

Lengthy terms and conditions, which nobody reads through, are to be avoided – people may scroll through them or tick them, but that doesn’t mean they trust them! Instead, summarise in bullets what people are signing up to and invite them to dig to the final level of detail only if they want to.

They should feel that the company has nothing to hide, and that they only need to ask to be given an open account of what their data is being used for.

Set up and run a robust data processing structure


For your company, the value of the data you hold may sometimes be limited, but for the individuals concerned it’s always precious. So you need a structure, in terms of people, internal systems and processes, that keeps data safe, keeps it up to date, and allows the people in the database to access their records easily.

Depending on your organisation’s size, you may want to do these things through a third party, but if you have (big) Big Data aspirations, you will almost certainly want someone to take on the role of ‘data protection champion’ or even Data Protection Officer. It’s part of what Boston Consulting Group (BCG) calls good ‘data stewardship’.

Ask the key data privacy questions when new data projects start

  • For this project, have we thought through how individuals’ privacy may be affected?
  • Can we design it not to use personal data at all?
  • Have we got consent for the use we’re putting the data to?
  • Are we minimising the amount of data we use and have we got a plan to dispose of it after use?
  • Are we happy to tell customers about it if they ask?
  • Can we easily show them how their data is being used and what the impact is?
  • Have we got the means in place to monitor the privacy impact and respond to feedback?

Against all the possible uses of Big Data, this is quite a demanding list of questions, but one that regulators will increasingly insist upon. As the Information Commissioner’s Office* puts it, in respect of consent to use data, "the complexity of Big Data analytics is not an excuse for failing to obtain consent where it is required".
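One way to make sure this checklist is applied consistently, rather than remembered ad hoc, is to treat it as a gate that every new data project must clear before any personal data is processed. The sketch below is a hypothetical illustration, assuming a simple Python helper whose question keys mirror the list above; it is not a compliance tool.

```python
# Illustrative sketch: the privacy questions above expressed as a gate that a
# new data project must clear before processing any personal data.

PRIVACY_QUESTIONS = [
    "privacy_impact_considered",
    "personal_data_avoided_where_possible",
    "consent_covers_this_use",
    "data_minimised_and_disposal_planned",
    "happy_to_tell_customers",
    "can_show_usage_and_impact",
    "monitoring_and_feedback_in_place",
]

def outstanding_questions(answers):
    """Return the questions that are unanswered or answered 'no'."""
    return [q for q in PRIVACY_QUESTIONS if not answers.get(q, False)]

# Usage: hold the project until every question can honestly be answered 'yes'.
gaps = outstanding_questions({"consent_covers_this_use": True})
if gaps:
    print("Project on hold; unresolved privacy questions:", gaps)
```

Whether the gate takes the form of a script, a form or a meeting agenda matters less than the discipline of not starting until every item is closed.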

Exercise reasonable judgement from a basis of trust


Is Big Data doomed, then, to get bogged down in attempts to meet impossible privacy standards? Not in my view. If data processors have followed the three guidelines above, they can – indeed should – exercise judgement on where absolute adherence to rules is not necessary.

Here’s a real-life example from the sensitive world of credit offers. If you have 100k potential customers on your database, but you are reasonably sure from Big Data that 40k of them would get turned down for the credit product if they applied for it, should you screen them out? I think so, because otherwise you are wasting those customers’ time as well as your own. But in doing so you may have contravened the rule that says ‘only use the data for the purpose the customer provided it for’.

The key question then becomes: will your customers accept the judgement you make because, in general, your data stewardship has won their trust? That is the outcome organisations should aspire to.

A BCG study* found that companies which build this trust not only have permission to do more with data, they also get far more data in the first place – "the more trusted organisation will be able to access at least five to ten times more data than the less trusted one".

Their conclusion that companies who manage data well "will create better products and services and generate more value for consumers, leading to meaningful shifts in market shares and faster growth" sums up the high rewards for meeting the data privacy challenge.

*References: Xerox-Forrester survey, 2015; BCG Perspectives, 2014; ICO, Big Data and Data Protection, 2014. Additional input from Graeme McDermott, Head of Customer Data at The AA.

