Google has announced a new service that will allow advertisers to see whether online ad campaigns lead shoppers to make purchases later. Announcing the service, Google claimed that it captures around 70% of credit and debit card transactions in the US.
As we might expect, reactions to Google’s announcement were mixed. Critics argue that it represents another blow to privacy. Should we be alarmed to learn that our internet browsing history, including the ads we have seen, can be analysed alongside the purchases we make later, offering insights that will help marketers sharpen their advertising targeting still further? Do we ask ourselves when and how we gave permission for this sort of analysis of our personal data to take place? Or do we shrug our shoulders and write it off as the kind of intrusion into our privacy that happens daily in the 21st century? We might even dismiss it as harmless; ‘nothing to hide, nothing to fear’ is a position that privacy campaigners like to challenge in the strongest terms!
Companies regularly tell us that they want to get to know us better – and that there will be something in it for us; perhaps a more personalised service, or great offers on products we love. But critics would point out that in order to receive these ‘benefits’, we are paying a price by sharing our personal data, sometimes unwittingly. Are we correct to worry that “if you are not paying for the product, you are the product”?
We live in an era of what is termed “big data”. One of the key ideas at the heart of the “big data” concept is that data from different sources can be pulled together to give a richer picture of me, the customer. The Google service mentioned above is a good example. Yet it seems that many consumers have a belief in their own insignificance. For instance, if I wear a fitness tracker, who on earth would be interested in how many steps I take today? However, if my step data could be linked with other data about me, perhaps the picture changes. What if my step data was merged with data held by the NHS – potentially very sensitive data about my health? What if it was also merged with my internet browsing history (perhaps shopping for other health and fitness products and services) – and what about data on my personal finances – my bank account, my credit card purchases? At what point do alarm bells start to ring for us?
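The linkage described above can be illustrated with a short sketch. Everything below is invented for illustration: the data sources, the identifier used to join them, and the function name are all hypothetical, standing in for the kind of matching that data brokers and analytics platforms perform at scale.

```python
# A hypothetical sketch of how records from unrelated sources can be
# linked on a shared identifier (here, an email address) to build a
# richer profile. All names and data are invented for illustration.

fitness_data = {
    "jo@example.com": {"avg_daily_steps": 3200},
}
browsing_data = {
    "jo@example.com": {"recent_searches": ["knee pain", "running shoes"]},
}
purchase_data = {
    "jo@example.com": {"last_purchase": "ibuprofen"},
}

def link_profiles(*sources):
    """Merge per-person records from several sources into one profile."""
    profiles = {}
    for source in sources:
        for person, record in source.items():
            profiles.setdefault(person, {}).update(record)
    return profiles

profiles = link_profiles(fitness_data, browsing_data, purchase_data)
# Each source alone looks innocuous; combined, the profile is far
# more revealing.
print(profiles["jo@example.com"])
```

The point of the sketch is that no single source here is alarming on its own; it is the join that changes the picture.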
If you have not heard the term GDPR (the General Data Protection Regulation) before, you are likely to hear quite a bit about it over the coming months through to May 2018, when this new legal framework concerning our personal data comes into force. One of the GDPR principles states that our personal data must be “collected for specified, explicit and legitimate purposes and not further processed in a manner that is incompatible with those purposes”.
Recall the last time you signed up to a website, perhaps to do some online shopping. Did you read the terms and conditions before you ticked the box to say you agreed to share your personal data? Research suggests that most of us don’t. That must mean that most of us have (at best) a hazy notion of what data we are sharing, with whom, how it can be used, and whether it can be passed on to third parties. GDPR will mean that organisations need evidence that we have given our informed consent when sharing our personal data. One suggestion is that when we agree to a set of terms and conditions, we should receive a “data receipt” that records what data we have agreed to share, with whom, for what purposes, and so on. But will we study the receipt any more closely than we currently read the terms and conditions?
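To make the “data receipt” idea concrete, here is a minimal sketch of what such a record might contain. No standard format exists; the field names, the issuing function, and the example values below are all assumptions made for illustration.

```python
import json
from datetime import datetime, timezone

# A hypothetical "data receipt": a machine-readable record, issued at
# sign-up, of what the person agreed to share. Field names are invented
# for illustration; no standard format is implied.

def make_data_receipt(user, controller, data_items, purposes, third_parties):
    return {
        "issued_at": datetime.now(timezone.utc).isoformat(),
        "data_subject": user,
        "data_controller": controller,
        "data_shared": data_items,
        "purposes": purposes,
        "shared_with": third_parties,
    }

receipt = make_data_receipt(
    user="jo@example.com",
    controller="ExampleShop Ltd",
    data_items=["name", "email", "purchase history"],
    purposes=["order fulfilment", "marketing"],
    third_parties=["delivery partner"],
)
print(json.dumps(receipt, indent=2))
```

A receipt like this would at least give the consumer something to check later, even if, as the paragraph above asks, most of us would never read it.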
In a world of big data, we are all “prosumers”, i.e. we are both producers and consumers. Some campaigners are looking for a way to “take back control” of their personal data – arguing that “it’s my data, after all!” As part of this “taking back control”, some would like to “monetise” their data – in other words, shouldn’t companies be prepared to pay me for my data, rather than accessing it for free? And if my aim is not to benefit financially, perhaps I would be happy to share my fitness tracker data (for example) with medical researchers or my favourite health charity, so that my data can benefit others in the future. We are likely to see more “data donation” of this kind in the years to come.
Other important principles within GDPR include the “right to rectification” and the “right to erasure”. In other words, if I want to correct data that a company holds about me that is inaccurate or incomplete, or perhaps I want my personal data to be removed from a database altogether, I should be able to make this happen without any great difficulty along the way. Have you ever tried to “unsubscribe” from an email list, and struggled to do so? This will be another challenge for organisations as GDPR comes into force: how to correct or remove personal data promptly and efficiently when the “data subject” (that’s you or me) wants this to happen.
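What honouring these requests might look like inside an organisation’s systems can be sketched as follows. The store, method names, and example data are all invented for illustration; real systems would also need to propagate corrections and deletions to backups and third parties.

```python
# A minimal sketch of handling rectification and erasure requests
# against an in-memory record store. All names here are hypothetical.

class PersonalDataStore:
    def __init__(self):
        self._records = {}  # keyed by data-subject identifier

    def add(self, subject_id, data):
        self._records[subject_id] = dict(data)

    def rectify(self, subject_id, corrections):
        """Right to rectification: correct inaccurate or incomplete data."""
        if subject_id not in self._records:
            raise KeyError(f"no record for {subject_id}")
        self._records[subject_id].update(corrections)

    def erase(self, subject_id):
        """Right to erasure: remove the subject's data entirely."""
        self._records.pop(subject_id, None)

    def get(self, subject_id):
        return self._records.get(subject_id)

store = PersonalDataStore()
store.add("subject-42", {"name": "Jo Smith", "postcode": "OX1 1AA"})
store.rectify("subject-42", {"postcode": "OX2 6NN"})  # fix a wrong postcode
store.erase("subject-42")                             # then remove entirely
print(store.get("subject-42"))  # → None
```

The hard part in practice is not the code above but doing this promptly across every copy of the data an organisation holds.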
We may be getting used to hearing the idea that the internet is enabling companies to adopt “new business models”. A favourite example is Uber, which has changed the way that many of us travel by taxi. Campaigners around privacy and surveillance have raised concerns about the impact of these new ways of doing business on our privacy. But no one has answered the question: what does a privacy-friendly business model look like? Where are the examples of companies putting customer privacy at the heart of how they operate? Surely, as we become more aware of the issues around sharing our personal data with anonymous organisations, we will see more consumers demanding a value proposition that reflects their privacy requirements.
For more information on the project ‘Monetize Me: Privacy and the Quantified Self in the Digital Economy’ (supported by the EPSRC), please contact email@example.com.