As we move into the new machine economy, it's time to face up to the end of old-world privacy and define what privacy and trust mean today.
Sorry to break it to you, but unless you’re reading this on parchment illuminated by a whale oil lantern, you’ve been hacked one way or another. By “hacked,” we mean a cybersecurity breach or any other violation of personal data stewardship and governance that results in a breakdown of trust, privacy, ethics, legislation or industry regulation.
It’s no surprise that our new machines – Internet-enabled, AI-fueled – are at the center of redefining trust, transparency and privacy. If you have a password to anything online these days, you’re a user, a product and a target.
But while there’s lots of rethinking to do about the state of privacy as we move into our new machine economy, we are making progress – recognizing our new issues, putting modern guardrails in place and debating what privacy and trust mean.
The Staggering Reality of Our Online Lives
The numbers describing our machine-age lives are astronomic and math-phobia-inducing. Facebook has 1.28 billion active daily users. Twitter hosts 500 million tweets per day. YouTube’s popular T-Series channel, featuring Indian music, has more subscribers than the entire population of Germany.
From these countless interactions, we create a “halo” of data that is every bit as real as – and more monetizable and steal-able than – our physical selves (which we discussed in our book Code Halos).
All that money and information in transit makes attacking privacy and trust a profitable business. According to Shape Security, in 2017 alone, 2,328,576,631 credentials – our username and password information – were “spilled.” (That’s what the boffins call it.) They also found that credential “stuffing” – using stolen data – accounts for 80% to 90% of retailer e-commerce logins. By 2021, the estimated impact of cybercrime alone, not including ruptured societies and nudged elections, will be around $6 trillion (more than the 2018 GDP of Japan). (And crime can pay. There’s already an exchange-traded fund (ETF), called HACK, that outperforms the S&P 500 by focusing on cybersecurity companies.)
From Crisis of Trust to Moment of Reckoning
The daily barrage of news about intentional and unintentional use and misuse of our personal data has left many feeling vulnerable, used and even angry, and progress can seem maddeningly slow. The W3C’s Tracking Protection Working Group shut down in January 2019 after eight years of limited progress. Then, just days ago, Apple removed the largely ignored “Do Not Track” setting from Safari, citing its potential misuse as yet another way to fingerprint users’ browsing habits.
A boisterous – even rancorous – debate is gaining in volume and intensity in private companies, legislative halls, regulatory agencies, the media and around kitchen tables over the best way to manage trust and privacy in what the World Economic Forum calls the Fourth Industrial Revolution.
We might think: If we’re all hacked, there’s no privacy anyway. I’m not rich or famous, and I’m a decent human, so what’s the harm? What can I do about it anyway? I’m sure “they” will figure it out for us, right?
Well, no. We’re now facing a period of reckoning where we must wrestle with some fundamental questions that demand responses: What is true? Who can we trust? Who owns the digital me? Is privacy a right or a commodity?
What We Say vs. What We Do
Even as consumers and politicians express outrage, our own behavior doesn’t match our rhetoric. We have a paradox between what we say about security and trust and what we actually do about it.
Consumers agree that trust matters. They don’t want to be tracked on social media, and most say personalized ads are unethical. Even members of Gen Z – perceived to be completely laissez-faire with their information – are only slightly more liberal with their personal data, with 2019 Cognizant data showing that nearly 70% are “concerned that companies know too much about them.”
We consistently say we want privacy and that trusting companies matters to us, but when it comes to convenience and cost savings, we’re actually pretty forgiving. Mega-hacks erode trust, dent stock valuations and – for a time – keep people away, but so far the reputational and financial damage has proved short-lived.
The Truth About Trust
Here’s a hard truth: We are not going back. A handful of people may be able to unplug, but they are the outliers. You and I are not those people.
In spite of trust and privacy concerns, humans simply are not coded to resist the rush we get from being online. Even among people who do try to quit a social media platform, the vast majority return. (Not as high as the 90% relapse rate for smoking, but it’s a lot.)
For all of us, this means:
- Technology is more central to every interaction.
- Trust in all technology is eroding.
- Because of this, trust seems to be draining from our social institutions and even personal relationships.
- Our responses to this, so far, have been pretty anemic.
The Future of Trust?
This puts us at a moment of reflection and inflection. We’ll either end up with a digital economy of pure weaponized capitalism (where every click is monetized unless you can afford to go “dark”), pure Big Brother authoritarianism (where content and clicks are tightly controlled) or (hopefully) somewhere in between.
As a result, we have a lot of work to do. The sea of unknown and questions is vast:
- What is the future of trust? Of privacy?
- How should we govern and regulate data in an economy increasingly driven by new machines?
- How should companies be thinking about encoding greater protection into the technologies that the industry is developing and using?
- How can we foster trust in our institutions and even with each other?
The good news – and we should call it that – is that we are now recognizing the problem. Legislation such as GDPR in Europe is just the first necessary step as we renegotiate our terms of endearment with the new machines. In fact, Facebook is now staring down the barrel of a multi-billion-dollar fine from the U.S. Federal Trade Commission over privacy violations tied to Cambridge Analytica.
Regardless of how this conundrum plays out, what’s clear is that modern democratic societies and forward-thinking companies are taking steps to apply new guardrails to data monetization, privacy and transparency to salvage trust in our increasingly digital interactions.
It’s a lot to wrestle with, but there are a few certainties we can build on:
- We must face the paradox. Hoping our conventional notions of privacy will suffice is naïve at best and, at worst, dangerous for us all.
- We need better rules – via legislation, regulation and industry oversight – that encode the values of our civil societies. (What happens in the UK, China, South Africa, Russia, the U.S. and India will be, and should be, different.) Letting individuals or even commercial organizations try to police this on their own is to abdicate our responsibilities and invite trouble into our homes and businesses.
- We get to decide. We get to determine whether privacy is a right or a product we have to pay for. We get to decide if a social credit score is worth having. I think not, but you might; the point is, we get to work it out together.
Our current debates show democracy is working (slowly, where it’s applied). We’re fighting about trust and privacy and governance, which, in a healthy society, is what we’re supposed to do. A little shouting is a positive sign.
In It Together
It certainly won’t be easy, but we needn’t despair. We have remarkable new tools, huge oceans of data, capital, social constructs, intellectual property and structures of law. Most importantly, we have each other. Global-scale technology makes us more interdependent, so to manage trust and privacy, we have to cooperate – even if we may not agree – or we all fail. This is a solid foundation for solving thorny problems.
In the end, the smart money is on humans making a mess of things at times but moving forward into a future that is always better.