In the Chinese city of Zhengzhou, a police officer wearing facial recognition glasses spotted a heroin smuggler at the train station and arrested him. Now, just for a minute, shut your eyes and imagine an officer of Her Majesty's Government wearing those same glasses at the Glastonbury festival (or Coachella, for that matter), scanning the crowd. I suspect it leaves you a tad uneasy, no? Maybe it's just me.
Then there is another Chinese city, Qingdao, where AI-powered surveillance cameras helped the police nab more than 20 criminal suspects during a beer festival held there (Qingdao was once a German colony, hence the beer festival). And in Wuhu, a fugitive murder suspect was identified by a tiny camera as he bought food from a street vendor. Today, China is the world's biggest market for security and surveillance technology. Government contracts are fuelling R&D into technologies that track our faces, our clothing, and even our gait. With millions of cameras and a gazillion lines of code, China is building a high-tech authoritarian future. The state is beginning to govern by algorithm. That might sound scary to us, but it's happening here in the West as well, by stealth (the Chinese are at least open about it).
In the West, serious questions are being raised in the wake of recent reports that police and government agencies in the US are increasingly gaining access to facial recognition tools that allow them to identify individuals quickly and in context: not just from mug shots, but in the moment, on the street, in crowds, or captured from above by a drone (yep, a bit like "Blade Runner"). As these questionable uses of facial recognition technology multiply, people are beginning to react accordingly: employees at Amazon, Google, Microsoft, and Salesforce have fired ethical broadsides at their leadership, urging them to articulate how their firms' facial biometrics technology is being deployed around the world. Who is using it, and for what purpose?
Like me, I suspect we all want to ensure that the technology never violates people's basic human rights. So, with exquisite timing, the UK government has just published Five Basic Principles to Keep Humans Safe From AI, with a desire to shape AI positively for the human good. This opening salvo from an ethical/regulatory perspective will be the first of many from governments and companies around the world (just as the CFOW predicted with the rise of the "Chief Trust Officer" as a viable profession in our 21 Jobs of the Future report). How does this affect the future of work?
Well, back at the start of the year, Indian Prime Minister Modi nailed the point at Davos in Switzerland. On the main stage, he said: "the one who controls data, will be the world leader." It's sage advice from the leader of the world's biggest democracy, and prescient for what is happening today. The steady drip of news reports on China and the US, and the recent tumults of Cambridge Analytica, Russian hacking, and the rest, firmly place data, its control, and ultimately trust center stage. This matters for the future of work, and for how a company will capture value from work in the future, from the data streaming off the processes that touch us all. The CFOW has been writing on these themes for some years now, from Mani Bahl's early call in The Business Value of Trust to our latest ideas, set out by Ben Pring and Rob Brown in Every Move You Make: Privacy in the Era of the Algorithm, published on the eve of Europe's big data push-back, GDPR.
I urge you to read Ben and Rob's report: it is the first serious attempt to put some markers down around data in the GDPR era, and it maps out an effective response. We are living in an age where exceptionally high volumes of data, both structured and unstructured, are captured routinely by technology. While large amounts of data may not be valuable in themselves, data refined through algorithms can produce efficiencies and revenue opportunities, but also rather alarming ways to control behavior. We in the West might hide our willingness to mine citizen data, but according to what I've read, China is the other way around. Far from hiding their efforts, the Chinese authorities regularly overstate their capabilities. In China, even the perception of surveillance can keep the public in line. This is a new way for a government to try to control its people. Modi is right.
PS. On a lighter-hearted note, surveillance tech is creeping into my life. Last week, at 11 pm, I was in Zurich when I got a WhatsApp from my son (back in the UK): "How's the kebab, Fatboyslim?" WHAT?! How did he know? I'd just been to see France play Belgium and had nipped into the only place still open for a spot of food at that hour (Zurich shuts down at 9 pm). I may be shattering visions of my dining at the Baur au Lac, but how did he know? It turns out that the surveillance system called Google had kicked in, and he was able to track me on my travels across Europe.