All is not well in the global talent arena. The digital “skills gap” that emerged earlier this decade is widening into a chasm. According to International Data Corp.’s FutureScape 2019 report, two million jobs in artificial intelligence (AI), the Internet of Things, cybersecurity and blockchain will remain unfilled by 2023 due to a lack of human talent. Some experts claim the only solution is a structural reset focused on how individuals learn. Most agree that the transition won’t be easy.
That’s because the skills gap has deepened over time. It became measurable in 1964, when the International Association for the Evaluation of Educational Achievement fielded the First International Mathematics Study (FIMS), which ranked the math proficiency of students in 13 developed countries. The U.S., which finished last, was already experiencing a skills imbalance. In fact, the first signs had emerged in 1942, when the U.S. War Department’s Army General Classification Test indicated that 40% of Americans aged 17 to 24 had the cognitive ability of an 8-year-old.
By 1983, officials in the Reagan Administration were so concerned that they commissioned a report entitled “A Nation at Risk,” whose ominous conclusion warned: “Our once unchallenged preeminence in commerce, industry, science and technological innovation is being overtaken by competitors throughout the world. If an unfriendly foreign power had attempted to impose on America the mediocre educational performance that exists today, we might well have viewed it as an act of war.”
Alarmed by this finding, education administrators and politicians used the report to usher in the era of standardized testing, seeking accountability for the nation’s investment in public education.
But times have changed. Rick Miller, president of Olin College of Engineering, proclaimed in a November 2014 speech at Olin that “we live in an age of just-in-time learning facilitated by powerful online search engines, and in the workplace of the future, what one knows will be less important than what one can do with what one knows.”
Who will fill the gap?
And yet, in a recent study by Cognizant’s Center for the Future of Work (CFoW), only 27% of business executives said their employees have the skills to work with or interact with top emerging technologies, such as AI, big data/analytics, IoT, mobile technology, open APIs and cybersecurity.
That is a huge skills gap. And corporations aren’t necessarily looking to higher education institutions for help. In the CFoW study, 67% of business leaders said they’re concerned about higher education institutions’ effectiveness in preparing the workforce of the future.
Although AI skills are needed in the workplace right now, it could be 2025 before many college students find an AI course in their school’s syllabus. Higher education institutions, after all, refresh their curricula only every two to six years, according to the CFoW study.
No wonder, then, that roughly six in 10 companies are beginning to bear the burden of learning for their employees, according to the study, whether by overhauling their corporate learning and development programs (65%), increasing their investment in reskilling (62%) or offering specialized training on emerging technology (60%).
That’s encouraging. But many chief information officers (CIOs) with whom I speak are reluctant to fully embrace these kinds of programs.
They remain convinced that once they reskill employees in an emerging technology area, those employees will add the new skill to their resumes and head off to a different employer.
I disagree with this contention, and thankfully, so do many forward-looking business executives who, according to the CFoW study, are prioritizing skill enhancement programs for workers in robotics/AI (82%), human-centric skills like communication, collaboration and problem-solving (80%), tech skills/web design/UI design (73%), project planning (67%) and discrete tech skills in STEM disciplines (63%).