In 2016, the Center for the Future of Work released The Business Value of Trust, a report highlighting how and why established brands lose consumer trust. Unfortunately, the consumer trust situation is only getting worse. Over the past few weeks, there has been a flood of media headlines about Facebook and Cambridge Analytica and the misuse of consumer data, seemingly indicating that while people do care about privacy after all, they still have no clue what it means or how to handle it.
The latest “Quit Facebook!” frenzy recently escalated when Elon Musk removed the Tesla and SpaceX pages from Facebook, prompting a large crowd to follow suit. In fact, my latest Google search returned 7,690,000 results for “How to delete my Facebook account” and 5,370,000 results for “How to protect privacy on Facebook.” So why wasn’t everyone this concerned about privacy before? And will joining the #DeleteFacebook campaign address the real issue of privacy? I really don’t think so. When the dust surrounding the ongoing Facebook fiasco settles, those who deleted their accounts will likely get back to connecting: sharing the contents of their lunchtime sandwich, the details of their relationship status, the latest news about their pets and families, and pictures from their travels, all while downloading free apps and almost everything else imaginable. So, will data misuse and privacy issues occur again? They undoubtedly will (although not necessarily at Facebook). In fact, as the digital economy expands, data privacy and trust issues will only multiply.
In today’s society, with everything from shopping to dating happening online, it is easy to wonder if data privacy really exists. At the heart of today’s data privacy debates are a number of assumptions that need to be dispelled: 1) that digital regulations alone will address privacy-related issues; 2) that developing self-control will help protect our privacy online; and 3) that companies will be more cognizant of our privacy on their own. Unfortunately, none of these assumptions is really accurate, and currently our only hope for increased privacy in the short term lies on the other side of the pond with the General Data Protection Regulation (GDPR), which takes effect in the European Union on May 25th. However, while the establishment of digital regulations is important, I am hesitant to agree that they should be the only means used to protect consumer data.
On a daily basis, many people feel that companies are accessing personal information that they did not explicitly provide. Yet, in spite of all these concerns, very few people have sworn off the Internet entirely. While we generally voice a desire for privacy, we are also very open with the information we share about ourselves, a conflict that seems to have become a permanent fixture in our everyday lives. Social media platforms have become critical to maintaining social relationships, and there is no way we can turn our backs on them. Let’s face it: Although we claim to want our privacy, we tend to focus more on the benefits we’ll get out of our online activities than on the risks we take by engaging in them.
Perhaps, as Wired magazine founder Kevin Kelly rightly said, “When given a sliding scale from anonymous to fully tracked, we are usually more likely to let ourselves be tracked. Why? We know we will gain a more tailored experience.”
Over the next five years, the notion of “privacy” will undergo a radical change, and perhaps what is seen as unethical today will be acceptable tomorrow. As consumers become more educated about how a company is using their data, they may be willing to assume more risk in exchange for value in the form of personalized experiences, discounts, or coupons. In fact, this kind of trade-off might just be the new norm of privacy in the future.
There Is No Privacy Without Trust
Without trust between consumers and service providers, there is no common ground for privacy in this digital world, especially when companies use personal data in ways consumers were not expecting. In an age when personal data is the key to honing a competitive edge, data ethics is at the heart of business success. Unfortunately, many companies believe they have done their duty by publishing data privacy and security policies, but many consumers claim, “These policies are Greek to us!” and simply skim over the “Terms & Conditions” before pressing the Accept button. Communication is a two-way street, so merely stating your organization’s policy and then hiding behind the law will not create a sustainable level of trust. Instead, if a company is transparent about how it intends to use consumer data, I think consumers will be more willing to share it. When companies share responsibility for minimizing risk, and show a genuine interest in doing so, consumers become more likely to trust them.
As transparency emerges as the new competitive differentiator, keeping customers informed about data usage policies in language they can understand will become increasingly beneficial. If you want to delve further into the trust crisis and how to fix it, you can read more discussions here and here. As the digital revolution unfolds, trust will become even more important because consumers will not just assume, but rather expect, that businesses have put consumer interests above everything else.