How to Avoid Walking on the Dark Side of Data Privacy
The high-stakes issues stirred up by data privacy risks and pitfalls are in the headlines (and our Twitter feeds) every single day. While our Code Halos have brought us great joy and happiness (and a surfeit of cardboard boxes), and given rise to a new class of business leader (Bezos, Zuckerberg, et al.), their "dark side" is also becoming more apparent, and the potential for dystopian nightmare is increasingly palpable.
It’s a subplot that’s sidetracking the possibilities of the digital age and, in particular, the burgeoning opportunities of artificial intelligence (AI).
Meanwhile, with Europe’s General Data Protection Regulation (GDPR) now in effect, the first phase of the Internet — the Wild West Days — has come to an end, and a new world order is emerging, based on how companies and governments respond to the issue of privacy. For business leaders around the world, the work for today and tomorrow is to get the right ethics baked into their business models — while there is still time.
Indeed, without privacy and trust — elements that have historically given people confidence in commerce — the digital economy as we know it will fall apart.
New guidance from the Cognizant Center for the Future of Work examines this important moment in time and provides analysis, advice and insight for organizations around the world that need to respond to the changes ahead. Here's a brief summary of the findings:
“Privacy today is dead; long live privacy.” What privacy means now — and has meant in the past and will in the future — will continue to be intensely debated. Our contention is that despite assertions to the contrary, privacy is not dead; if it were, then the fervor over Facebook wouldn’t be such a big deal. The public outcry has been so loud, and the white-hot heat lamps of congressional testimony so glaring, that the episode has served as the first real wake-up call to members of the public regarding their level of exposure. The strategic challenge for the future of the digital economy will be to keep data open and free but, simultaneously, protected. This vision hinges on a balance of legislation and business ethics.
GDPR is here: learn it, live it, love it. The EU’s privacy framework may represent the stuff of your red-tape nightmares, but the fact is, all companies will need to master the new algebra of personally identifiable information mandated by GDPR to win in the future of work. And GDPR is just the beginning of similar codification of privacy laws: in November 2018, Californians will vote on a measure that would allow users to request information on what data companies are gathering about them, how it is being used and whether they can opt out of its collection and use in the future.
The absence of trust will lead to antitrust: Trust has underwritten the rich history of liberal economies, and has long been a crucial element for making the wheels of commerce spin. The basis for trust — such as generally accepted conventions and a common understanding of “the truth” — however, has been called into question of late. Quaint notions like “a person’s word is their bond” have given way to dark data, fake news and deep-fakes-gone-wild, which will accelerate government legislation and antitrust measures. Getting on board with regulation, like the proverbial apple a day, may help keep antitrust away.
Prepare for the Internet to become the “Splinternet”: Attitudes toward data privacy appear to be splintering among different regions of the world, namely Europe, the U.S. and China. Which view will win out? Or will all three co-exist? Geopolitical responses to issues of security, trust and data sovereignty will define what our world looks like, how it operates, who wins and who loses, over the generations to come.
Give your customers — and all their data and metadata — a delete button: In addition to regulations, now’s the time for business leaders to dig into Ethics 101 to help fill the potential moral vacuum. In our 2014 book Code Halos, we outlined a path forward. Among these recommendations is allowing people to “check out,” that is, easily delete their data (and metadata) anytime they want. Other guidance: Offer complete transparency into customer data; maintain favorable “give-to-get” ratios when it comes to giving something of value in exchange for customer data; and apply a good dose of self-control over how you use data.
Consider #deletefacebook as your preemptive pressure test for privacy: The raging debate over privacy is bigger than GDPR and codifying rules. It’s fundamentally about pressure-testing ethics in the digital age. So, going forward, what lessons can every company learn from the 2018 #deletefacebook movement? Think of it as "the Durbin question." In recent congressional testimony, Senator Dick Durbin asked Facebook CEO Mark Zuckerberg if he’d share the name of the hotel where he stayed the night before and the names of the people he’d messaged in the last week. “Senator, I would probably not choose to do that publicly here,” Zuckerberg said. To which Durbin responded, “I think that might be what this is all about: your right to privacy.” Maybe your own personal attitude toward how your data, and data about you, is used will be a crucial determinant of how and where you want to live in the future.
What You Can Do to Restore Data Privacy
In the digital era, privacy will matter as much to stakeholders as to shareholders. The enduring lesson is that you can’t bet the brand by playing poker with customers’ privacy. With that in mind, here is a succinct list of six actions (three things to start doing, and three things to stop doing) to help data privacy flourish in the digital age.
Privacy in the Age of the Algorithm
In this conversation, moderated by Gary Beach, publisher emeritus of CIO magazine and a columnist with the Wall Street Journal’s CIO Journal, our panel of experts lays out what privacy means in the age of the algorithm, why it is not dead, and how the data privacy fight is playing out across the global internet.