Who do you trust? Companies, big data and trustworthiness

By Prof Maureen Meadows, Dr Carlos Ferreira, Dr Alessandro Merendino, Centre for Business in Society – Faculty of Business and Law

As a customer, are you happy when a business shares your personal data with another organisation, perhaps to make money? Do you trust that business, or do you feel obliged to share your data with them because you want the product or service?

As customers who are invited, or even required, to share our data with companies if we wish to buy a product or service, we might ask ourselves whether the organisation we are about to deal with is trustworthy. For instance, if we want to buy a train ticket, we have to agree to the ‘terms and conditions’ of a train operator such as Virgin Trains. But is Virgin Trains a trustworthy organisation?

Researchers[1] have argued that trustworthiness involves three key dimensions:

  • Ability refers to the perception that an organisation has a certain set of skills to carry out its mission and achieve its goals in a competent way.
  • Benevolence refers to the perception that an organisation has a genuine concern for the well-being of its stakeholders, such as customers.
  • Integrity refers to the perception that an organisation will adhere to a set of values and principles that are acceptable to stakeholders.

If we believe that a company has competence, benevolence and integrity, we are perhaps more likely to agree to share our personal data. So, if firms want to gain access to our resources, such as our data, they might seek to gain our trust – the trust of individual customers who represent their present and potential future stakeholders. For instance, if we perceive that TicketMaster, the ticket sales website, is competent, benevolent and honest, we are perhaps more likely to provide sensitive data, such as our bank details, address and date of birth.

Image from: Silicon Republic

Trustworthiness raises questions about the organisation’s future actions. Do we feel we can rely on a particular company, in terms of its future decisions about how to use our data? We may perceive an organisation to be untrustworthy when its actions arouse suspicion.

Image from: Beger&Co

Examples of companies that appear to lack competence, benevolence and/or integrity are all too plentiful. On 18th April 2019, it was reported[2] that Facebook had admitted to “unintentionally” uploading the address books of 1.5 million users without consent. The company said that it would delete the collected data and notify those affected – but in the minds of many users, its trustworthiness must once again be in question. And on 12th April 2019, it was reported[3] that the parenting club Bounty had been fined £400,000 – one of the largest penalties possible – for sharing its data with marketing agencies without users’ permission. Bounty offers support and advice to new parents, who sign up through its website and mobile app, or are recruited directly on maternity wards; surely many users will feel that its actions indicate a lack of benevolence and integrity, if not competence.

Image from: nextgov (in Tweak Library)

So, if it is in the interests of organisations of all kinds to build trustworthy relationships with their customers – how can this be done? How can an organisation convince us that it possesses competence, benevolence and integrity?

Image from: Tweak Library

Last month, we saw an interesting example of an organisation trying to do just that. On 10th April, the New York Times (NYT)[4] surprised many readers by announcing that it was re-thinking its policies and practices around data. Under the label “The Privacy Project”, the paper was open about the fact that it makes money by using customer data to sell advertisements and subscriptions, often working with other companies such as Google and Facebook. It went on to assure its readers that it is reflecting on how it collects, uses and shares data about them. Under the new initiative, the NYT claims that it is reducing the volume and type of data that it shares with social media companies, for example – but it also admits that its website uses cookies and other trackers to study internet use. The NYT says that it hosts these trackers for three purposes: to learn how people use its website and apps so that the customer experience can be improved; to reach readers it hopes will subscribe; and to sell targeted advertising. However, it insists that it maintains clear internal guidelines on how data is collected, used and shared with third parties.

All of these arguments will be familiar territory for many organisations around the world. One important question is whether customers are happy that the potential benefits of such a relationship – possibly receiving advertising that is better targeted to the individual, and a better customer experience – are sufficient ‘reward’ for present practices of data sharing and use by the organisations that they deal with.

Image from: Tweak Library

Many companies are wrestling with similar trade-offs, and would suggest – as the New York Times did – that they are doing “the best they can” within a digital ecosystem that needs reform. The NYT argued that change needs to be driven at a societal level – by politicians, by the leaders of major technology companies, and by the public at large. Perhaps the strongest driver of behaviour change in firms will be the last of these: companies needing to avoid negative reactions from customers. We are already seeing examples of customers engaging more cautiously with companies they do not trust, or boycotting firms whose actions indicate a lack of trustworthiness – for example, the #DeleteFacebook campaign[5].

As customers, we have become accustomed to providing our personal data to organisations – and to observing that our data is often monetised. Going forward, organisations will be under greater pressure to reassure us that they are trustworthy – to demonstrate that they have ability, benevolence and integrity – before we are happy to share our data with them.

[1] Mayer, Roger C., James H. Davis, and F. David Schoorman. 1995. “An Integrative Model of Organizational Trust.” The Academy of Management Review 20 (3): 709–734.

[2] https://www.theguardian.com/technology/2019/apr/18/facebook-uploaded-email-contacts-of-15m-users-without-consent

[3] https://www.theguardian.com/technology/2019/apr/12/parenting-club-bounty-fined-selling-users-data

[4] https://www.nytimes.com/2019/04/10/opinion/sulzberger-new-york-times-privacy.html

[5] https://www.campaignlive.co.uk/article/one-20-brits-delete-facebook-accounts-cambridge-analytica-scandal/1460836
