Traditionally, knowledge management (KM) applications have focused on building repositories that help companies gather, organize and circulate assets such as documents and graphics. These applications will remain useful, even necessary, for the foreseeable future — but forward-looking enterprises are taking the next step, strengthening their KM ecosystem in order to get the most out of artificial intelligence (AI).
In particular, robust KM is necessary for companies that are implementing conversational AI — the systems that facilitate human-to-machine conversations, commonly known as chatbots. At this point in the chatbot maturity curve, we see many businesses rushing to adopt a point solution. But that’s a mistake. A chatbot purchased in haste, merely to keep up with the competition, stands to cause as many problems as it solves and, from our observation, will not unlock sufficient value. Indeed, according to Gartner, “Through 2020, 99% of AI initiatives in IT service management will fail, due to the lack of an established KM foundation.”
Here, we explain why enterprises seeking to harness the true potential of conversational AI should first create a superior knowledge ecosystem.
According to Gartner’s Market Guide to Conversational Platforms, only 4% of enterprises have deployed conversational interfaces. While chatbot adoption is in its very early stages, the upside is considerable. BI Intelligence estimates that companies can save up to 30% in customer support service costs by offering a convenient, low-effort way for customers to find information. Those savings, in turn, allow organizations to scale contact center operations without a corresponding rise in operating expenditure.
Additionally, chatbots can improve the user experience by allowing customers to multitask — obtaining the assistance they need in a chat session while simultaneously working or even making calls. Because nearly all conversational AI systems are built with mobile computing in mind, they work as well on phones and tablets as they do on PCs. Thus, untethered users can be helped, for instance, while riding the train or attending a conference.
One mistake we see frequently: business and IT leaders discuss conversational AI as if all systems work with the same high level of effectiveness. Nothing could be further from the truth, and that’s where a solid KM ecosystem enters the picture.
When responding to a customer-initiated interaction or query, a chatbot typically starts by asking a question or sequence of questions to ascertain the user’s intent. To do so, the chatbot uses natural language processing (NLP) to analyze the customer query. It then uses that analysis to make its own query to a stand-alone KM solution (or knowledge base) for the appropriate content, answer or solution.
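The flow described above — parse the customer’s query with NLP, infer intent, then query a stand-alone knowledge base for the matching answer — can be sketched in miniature. This is a hedged illustration only: the knowledge base entries, intent names and keyword-overlap scoring below are hypothetical stand-ins for a production NLP pipeline and KM system, not any particular vendor’s implementation.

```python
# Minimal sketch: query -> intent detection -> knowledge-base lookup.
# All names and data are illustrative assumptions.

# Stand-in for a stand-alone KM solution (knowledge base).
KNOWLEDGE_BASE = {
    "reset_password": "To reset your password, go to Settings > Security.",
    "billing_inquiry": "Invoices are available under Account > Billing.",
    "shipping_status": "Track your order from the Orders page.",
}

# Keyword sets standing in for a trained NLP intent classifier.
INTENT_KEYWORDS = {
    "reset_password": {"password", "reset", "login", "locked"},
    "billing_inquiry": {"invoice", "billing", "charge", "payment"},
    "shipping_status": {"order", "shipping", "delivery", "track"},
}


def detect_intent(query: str):
    """Return the intent whose keywords overlap the query most, or None."""
    tokens = set(query.lower().split())
    scores = {
        intent: len(tokens & keywords)
        for intent, keywords in INTENT_KEYWORDS.items()
    }
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else None


def answer(query: str) -> str:
    """Map a user query to knowledge-base content, or escalate."""
    intent = detect_intent(query)
    if intent is None:
        return "I'm not sure - let me connect you with an agent."
    return KNOWLEDGE_BASE[intent]


print(answer("I forgot my password and now I'm locked out"))
```

The key design point the article makes is visible even at this scale: the bot’s usefulness depends entirely on `KNOWLEDGE_BASE` being populated and maintained — the matching logic adds nothing if the content behind it is thin or stale.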
On the front end, this user experience must ideally be seamless, making a call or email to the contact center unnecessary. A well-formed KM strategy is what makes that possible.
In the end, any bot is only as good as the intelligence driving it. To consistently feed chatbots effective content, companies need a system that continuously captures, reuses and extends the organization’s knowledge in a structured, efficient way. With such a knowledge framework in place, it is far easier for chatbots to not only match users’ intent to the exact content they need, but also to expand their capacity to do so as operations ramp up.
Point solutions, no matter how attractive they appear at first glance, cannot work effectively until properly conceived and implemented KM is in place; until then, chatbots are at risk of becoming lifeless shells that diminish the customer experience and contribute to the very scalability issues that they’re designed to address.
This Perspectives article was written by Himadri Sarkar, Director, Consulting.