All is not well in the global talent arena. The digital “skills gap” that emerged earlier this decade is widening into a chasm. According to International Data Corp’s FutureScape 2019 report, two million jobs in artificial intelligence (AI), the Internet of Things, cybersecurity and blockchain will remain unfilled by 2023 due to a lack of human talent. Some experts claim the only solution is a structural reset focused on how individuals learn. Most agree that the transition won’t be easy.
That’s because the skills gap has deepened over time. Its modern measurement began in 1964, when the International Association for the Evaluation of Educational Achievement fielded the First International Math Study (FIMS), which ranked the math proficiency of students in 13 developed countries. The U.S., which finished last, was already experiencing a skills imbalance; the first signs had emerged in 1942, when the U.S. War Department’s Army General Classification Test indicated that 40% of Americans aged 17 to 24 had the cognitive ability of an 8-year-old.
By 1983, officials in the Reagan Administration were so concerned that they commissioned a report entitled “A Nation at Risk,” whose ominous conclusion warned: “Our once unchallenged preeminence in commerce, industry, science and technological innovation is being overtaken by competitors throughout the world. If an unfriendly foreign power had attempted to impose on America the mediocre educational performance that exists today, we might well have viewed it as an act of war.”
Alarmed by this finding, education administrators and politicians used the report to usher in the era of standardized testing, seeking accountability for the nation’s investment in public education.
But times have changed. Rick Miller, president of Olin College of Engineering, proclaimed in a November 2014 speech at Olin that “we live in an age of just-in-time learning facilitated by powerful online search engines, and in the workplace of the future, what one knows will be less important than what one can do with what one knows.”
And yet, in a recent study by Cognizant’s Center for the Future of Work (CFoW), only 27% of business executives said their employees have the skills to work or interact with top emerging technologies, such as AI, big data/analytics, IoT, mobile technology, open APIs and cybersecurity.
That is a huge skills gap. And corporations aren’t necessarily looking to higher education institutions for help. In the CFoW study, 67% of business leaders said they’re concerned about how effectively higher education institutions prepare the workforce of the future.
Although AI skills are needed in the workplace right now, it could be 2025 before many college students find an AI course in their school’s course catalog. Higher education institutions, after all, refresh their curricula only every two to six years, according to the CFoW study.
No wonder, then, that roughly six in 10 companies are beginning to bear the burden of learning for their employees, according to the study, whether by overhauling their corporate learning and training development programs (65%), increasing their investment in reskilling (62%) or offering specialized training on emerging technology (60%).
That’s encouraging. But many chief information officers (CIOs) with whom I speak are reluctant to fully embrace these kinds of programs.
They remain convinced that once employees are reskilled in an emerging technology area, those employees will add the new skill to their resumes and head off to a different employer.
I disagree with this contention, and thankfully, so do many forward-looking business executives who, according to the CFoW study, are prioritizing skill enhancement programs for workers in robotics/AI (82%), human-centric skills like communication, collaboration and problem-solving (80%), tech skills/web design/UI design (73%), project planning (67%) and discrete tech skills in STEM disciplines (63%).
But here’s the rub. These approaches to upskilling are often grounded in 20th century learning methods such as instructor-led classes rather than on-the-job training, e-learning and video learning. If there’s a looming shortage of two million workers for jobs in emerging technology areas, where are companies going to find the competent instructors to teach them? I was also surprised to see learning approaches based on AI (28%) and augmented reality (19%) far down the list. That’s another skills gap to reckon with.
A study by global recruitment firm Harvey Nash offers further insight into how CIOs are strategically dealing with the skills gap. Respondents were first asked about which tech areas are most impacted by a skills gap. Responses included big data (46%), enterprise architecture (36%) and security (35%).
For me, the key question in the study is this: “Which method do you use to find the right skills?” Rather than innovative reskilling, the leading responses were “using contractors/consultants to fill the gap” (85%), “using outsourcing/offshoring to supplement internal teams” (71%) and “using automation to remove the need for headcount” (67%).
If corporations aren’t interested in reskilling workers for emerging technologies, and higher education institutions are reluctant to change their insular business models, what options remain for workers looking to learn emerging technology skills?
The answer? Look in the mirror.
A new approach to learning has emerged in the past five years called “digital badges.” They work like this: Imagine a 25-year-old is interested in learning more about digital engineering or AI. With the digital badge model, this person signs up for a course and completes the curriculum, mostly online. Rather than be awarded a certificate suitable for framing in the office, the person is given a hyperlink to a digital badge administered by the organization offering the course.
The digital badge holder can embed the link in his or her profile on social media sites like LinkedIn, or when responding to open roles on job sites like Indeed. Prospective employers can simply click on the digital badge link to verify the applicant’s skills and course accomplishments. Verification of skills and competency is the hallmark feature of digital badges; this separates them from traditional e-learning initiatives.
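The verification flow described above can be sketched in code. The following Python example is a minimal, hypothetical illustration loosely modeled on open badge metadata formats; the field names, salt, URL and email addresses are all illustrative assumptions, not any issuer’s actual schema. It shows the two sides of the transaction: the course provider issues a badge record that encodes the earner’s identity as a salted hash, and a prospective employer checks a candidate’s claim against that record.

```python
import hashlib
import json

# Hypothetical badge assertion, loosely modeled on open badge metadata.
# Field names, salt and URLs are illustrative, not a real issuer's schema.
def make_badge_assertion(recipient_email: str, badge_url: str, issued_on: str) -> dict:
    """Issuer side: build a badge record. The earner's email is salted and
    hashed so the publicly hosted record doesn't expose it directly."""
    salt = "example-salt"  # illustrative; real issuers generate a fresh salt
    digest = hashlib.sha256((recipient_email + salt).encode()).hexdigest()
    return {
        "type": "Assertion",
        "recipient": {
            "type": "email",
            "hashed": True,
            "salt": salt,
            "identity": "sha256$" + digest,
        },
        # URL of the badge class: course criteria, issuer, image, etc.
        "badge": badge_url,
        # "hosted" means an employer fetches this record from the issuer's site
        "verification": {"type": "hosted"},
        "issuedOn": issued_on,
    }

def verify_recipient(assertion: dict, candidate_email: str) -> bool:
    """Employer side: does the candidate's email match the hashed identity?"""
    salt = assertion["recipient"]["salt"]
    digest = hashlib.sha256((candidate_email + salt).encode()).hexdigest()
    return assertion["recipient"]["identity"] == "sha256$" + digest

if __name__ == "__main__":
    assertion = make_badge_assertion(
        "jane@example.com",
        "https://issuer.example.org/badges/intro-to-ai.json",
        "2019-06-01",
    )
    print(json.dumps(assertion, indent=2))
    print(verify_recipient(assertion, "jane@example.com"))     # True
    print(verify_recipient(assertion, "mallory@example.com"))  # False
```

The design choice worth noting is the salted hash: it lets the badge record live at a public URL, clickable from a LinkedIn profile, while an employer can still confirm that the badge belongs to the specific applicant in front of them.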
Scott Bittle, director of communications at Burning Glass Technologies, says digital badges address two skilling challenges: “Employers need a more precise way of determining whether potential hires have the required skills, and workers need to earn these credentials in short training sessions that are both quicker and cheaper than a traditional degree.”
Kathleen deLaski, founder of Education Design Lab, says digital badges have gained a lot of traction quickly, but “we need corporate hiring managers to give clearer signals to validate these as credentials.”
A recent study from iCIMS of 1,000 technology hiring executives offers findings that suggest these “clearer signals” are emerging.
Digital badges have shortcomings. Most notable are the lack of industry standards for course quality and for the amount of personal commitment required to earn one. But as Roger Schank, founder of Experiential Teaching Online and former chief education officer at Carnegie Mellon University, tells me: “In the end, credentials mean what we think they mean. ‘I’m a high school graduate’ used to mean something; now if you bragged about that, you would be laughed at. The real issue is what one has actually done, supported by a credential that means something to corporations. The future belongs to digital credentials.”
What’s clear is that companies and higher education institutions are not doing enough to bridge the widening technology talent gap. Frank Gens, senior vice president and chief analyst at IDC, offers this ominous prediction: By 2023, the global economy will create 500 million new native applications – the same number created in the past 40 years. To compete in that environment, Gens says, C-suite executives “must consider everyone a developer.”
Voltaire, the 18th century French philosopher, wrote, “One day everything will be well, that is our hope. Everything is fine today, that is our illusion.” With a shortage of 900,000 emerging-technology workers looming in a global digital economy seeking to roll out 500 million native apps, all is not well. This is especially so in an ecosystem in which “talentism is the new capitalism,” as Klaus Schwab, founder of the World Economic Forum, puts it. Business and technology leaders know this firsthand.
It’s foolish to continue believing that higher education institutions and corporate training programs grounded in traditional 20th century approaches will offer meaningful solutions to this talent gap.
In the Fourth Industrial Revolution, individuals can no longer primarily rely on higher education institutions or corporations to learn new skills. While it’s incumbent on these entities to restructure how they train and teach, that isn’t happening quickly enough. In the interim, it will be up to workers themselves to relearn how to learn, or rely on organizations that offer more agile and less costly approaches to upskilling.
As the landscape of work continues to shift in the digital era, all participants – higher-education institutions, corporations and workers themselves – have a role to play in making the future of work and the future of learning a reality.
Gary Beach is the Publisher Emeritus of CIO magazine. He is also a guest columnist for The Wall Street Journal and author of the best-selling book The U.S. Technology Skills Gap. He can be reached at Garybeachcio@gmail.com | Twitter @gbeachcio.
To learn more, read the full Cognizanti journal, Navigating the digital age: What senior leaders worldwide have learned from pushing the boundaries of change, or contact us.