Traditionally, knowledge management (KM) applications have focused on building repositories that help companies gather, segregate and circulate such assets as documents and graphics. These applications will remain useful, even necessary, for the foreseeable future — but forward-looking enterprises are taking the next step, strengthening their KM ecosystem in order to get the most out of artificial intelligence (AI).
In particular, robust KM is necessary for companies that are implementing conversational AI — the systems that facilitate human-to-machine conversations, commonly known as chatbots. At this point in the chatbot maturity curve, we see many businesses rushing to adopt a point solution. But that’s a mistake. A chatbot purchased in haste, merely to keep up with the competition, stands to cause as many problems as it solves and, from our observation, will not unlock sufficient value. Indeed, according to Gartner, “Through 2020, 99% of AI initiatives in IT service management will fail, due to the lack of an established KM foundation.”
Here, we explain why enterprises seeking to harness the true potential of conversational AI should first create a superior knowledge ecosystem.
Chatbots: Reducing costs, improving the customer experience
According to Gartner’s Market Guide to Conversational Platforms, only 4% of enterprises have deployed conversational interfaces. While chatbot adoption is in its very early stages, the upside is considerable. BI Intelligence estimates that companies can save up to 30% in customer support service costs by offering a convenient, low-effort way for customers to find information. Those savings, in turn, allow organizations to scale contact center operations without a corresponding rise in operating expense.
Additionally, chatbots can improve the user experience by allowing customers to multitask — obtaining the assistance they need in a chat session while simultaneously working or even making calls. Because nearly all conversational AI systems are built with mobile computing in mind, they work as well on phones and tablets as they do on PCs. Thus, untethered users can be helped, for instance, while riding the train or attending a conference.
KM: Foundational for chatbots
One mistake we see frequently: business and IT leaders discuss conversational AI as if all systems work with the same high level of effectiveness. Nothing could be further from the truth, and that’s where a solid KM ecosystem enters the picture.
When responding to a customer-initiated interaction or query, a chatbot typically starts by asking a question or sequence of questions to ascertain the user’s intent. To do so, the chatbot uses natural language processing (NLP) to analyze the customer query. It then uses that analysis to make its own query to a stand-alone KM solution (or knowledge base) for the appropriate content, answer or solution.
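The intent-to-content flow described above can be sketched in a few lines. This is a minimal illustration only: simple keyword matching stands in for a real NLP engine, and the intent names, knowledge-base entries and function names are all hypothetical.

```python
# Hypothetical sketch: map a customer utterance to an intent, then use
# that intent to query a stand-alone knowledge base for content.
INTENT_KEYWORDS = {
    "reset_password": {"password", "reset", "locked"},
    "billing_query": {"bill", "invoice", "charge"},
}

KNOWLEDGE_BASE = {
    "reset_password": "Use the 'Forgot password' link on the sign-in page.",
    "billing_query": "Your latest invoice is under Account > Billing.",
}

def detect_intent(utterance: str):
    """Return the intent whose keywords best overlap the utterance."""
    words = set(utterance.lower().split())
    best, best_overlap = None, 0
    for intent, keywords in INTENT_KEYWORDS.items():
        overlap = len(words & keywords)
        if overlap > best_overlap:
            best, best_overlap = intent, overlap
    return best

def answer_query(utterance: str) -> str:
    """Resolve the user's intent, then look up content in the KM store."""
    intent = detect_intent(utterance)
    if intent is None:
        return "escalate"  # no intent recognized; hand off to a human agent
    return KNOWLEDGE_BASE.get(intent, "escalate")
```

In practice the NLP step would be a trained classifier or a conversational-AI platform, but the shape of the interaction, first intent resolution and then a knowledge-base lookup, is the same.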
On the front end, this experience should ideally be seamless, making a call or email to the contact center unnecessary. A well-formed KM strategy enables this by:
Standardizing and structuring information with decision-based workflows. Traditional KM platforms are a bit like organized chaos. They have different authors, each contributing diverse content in different writing styles. Some of that content is bound to be irrelevant, incomplete or conflicting, especially as time passes. As a result, the bot may give different answers to the same end-user question, depending on how it interprets the content. Decision-based content workflows, integrated with specific nodes in the system of record, help the bot locate the right content in the shortest response time and orchestrate query processing smoothly.
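One way to picture a decision-based content workflow: rather than free-form articles, knowledge is stored as decision nodes, so the same question always resolves to the same answer. The node structure, field names and answers below are invented for illustration.

```python
# Hypothetical decision-based workflow: each node is either a decision
# point (with branches) or a terminal answer node.
WORKFLOW = {
    "start": {
        "question": "Is the issue about access or billing?",
        "branches": {"access": "access_node", "billing": "billing_node"},
    },
    "access_node": {"answer": "Reset credentials via the self-service portal."},
    "billing_node": {"answer": "Route to billing knowledge article KB-102."},
}

def resolve(node_id: str, choices: list) -> str:
    """Walk the workflow, consuming one user choice per decision node."""
    node = WORKFLOW[node_id]
    if "answer" in node:
        return node["answer"]  # terminal node: deterministic answer
    next_id = node["branches"][choices[0]]
    return resolve(next_id, choices[1:])
```

Because every path through the tree is explicit, two users asking the same question receive the same answer, which is the consistency the workflow is meant to guarantee.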
Identifying and removing debts across the customer interaction journey. Customer interactions addressed by the chatbot are tagged to outcomes and journey points and classified as a query, a transaction or an appeal/complaint. Debt categories (in this context, “debt” refers to the root cause of a customer’s dissatisfaction) are then derived, since each interaction occurrence has an associated debt reason defined in the KM system. Real-time analysis of debts helps the organization understand issues related to process, people and technology, and identify corrective measures.
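The debt-tagging idea can be sketched as a simple classification and rollup. The debt reasons and their mapping to process/people/technology categories here are illustrative, not a standard taxonomy.

```python
from collections import Counter

# Hypothetical mapping from a debt reason (recorded per interaction in
# the KM system) to its root-cause category.
DEBT_REASONS = {
    "unclear_invoice": "process",
    "agent_error": "people",
    "portal_outage": "technology",
}

def debt_summary(interactions) -> Counter:
    """Roll up tagged interactions into counts per root-cause category."""
    return Counter(DEBT_REASONS[i["debt_reason"]] for i in interactions)

# Example interactions, each tagged with a type and a debt reason.
interactions = [
    {"type": "query", "debt_reason": "unclear_invoice"},
    {"type": "appeal", "debt_reason": "agent_error"},
    {"type": "query", "debt_reason": "unclear_invoice"},
]
```

A real-time version of this rollup is what lets an operations team see, for example, that most dissatisfaction traces back to a process gap rather than a technology failure.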
Bringing about personalization. A well-formulated KM ecosystem makes a customer interaction journey individualized, adaptive, impactful and scalable. Using the healthcare industry — which requires great tact and sensitivity to customer needs — as an example, Figure 1 illustrates how member profiling at different health stages would lead to a tailored response by a chatbot. In the example, the knowledge engine has identified an inquiring member as being in the “acute” health stage based on pre-determined criteria (current and past diagnoses, sentiment scores, health stage and the interaction reason). This proactive identification triggers the decision-based workflow in the knowledge database. The chatbot is then prepared with a customized script, a history of recent activity, and other prompts that help it address the member’s needs instantly, greatly reducing the possibility of a call.
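The Figure 1 scenario can be approximated as a rule that classifies a member profile into a health stage and then selects a matching script. The profile fields, stage criteria and scripts below are hypothetical stand-ins for the pre-determined criteria the knowledge engine would actually use.

```python
# Hypothetical profiling rule: diagnoses plus a low sentiment score
# place the member in the "acute" stage; otherwise "stable".
def classify_stage(profile: dict) -> str:
    if profile.get("current_diagnoses") and profile.get("sentiment_score", 1.0) < 0.4:
        return "acute"
    return "stable"

# Illustrative scripts the knowledge database could hand the chatbot.
SCRIPTS = {
    "acute": "Empathetic script with recent-activity summary and care contacts.",
    "stable": "Standard self-service script.",
}

def prepare_chatbot(profile: dict) -> str:
    """Trigger the stage-specific workflow and return the bot's script."""
    return SCRIPTS[classify_stage(profile)]
```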
Creating a continuous feedback loop to enable automated version control of missing information. A real-time feedback mechanism ensures timely corrective action and helps avoid the passing of incorrect or redundant information to the customer. The KM strategy is formulated to account for this in a three-step process:
Error notification: If a chatbot cannot find the required information for a request, it sends an auto-notification with the error description to the content development team.
Error analysis: The content development team, which includes representatives from training, quality assurance and operations, identifies the cause of the error. If new content must be added or existing content modified, an impact analysis gauges the effect on cross-functional upstream and downstream processes.
Error resolution: Finally, a content writer works on the request. The knowledge database is updated, with a quality check performed by the content development core team.
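The three-step loop above can be sketched as a small pipeline: a failed lookup raises a notification, the ticket is analyzed, and a writer's update closes it. The queue, ticket fields and function names are assumptions made for illustration.

```python
# Hypothetical feedback loop for missing knowledge-base content.
knowledge_base = {"billing_query": "See the invoice FAQ."}
error_queue = []

def lookup(intent: str):
    """The bot queries the KB; a miss triggers step 1 (notification)."""
    content = knowledge_base.get(intent)
    if content is None:
        error_queue.append({"intent": intent, "status": "notified"})
    return content

def triage(ticket: dict) -> dict:
    """Step 2 (analysis): the content team identifies the error cause."""
    ticket["status"] = "analyzed"
    return ticket

def resolve_ticket(ticket: dict, new_content: str):
    """Step 3 (resolution): a writer updates the KB; the gap is closed."""
    knowledge_base[ticket["intent"]] = new_content
    ticket["status"] = "resolved"
```

Once the ticket is resolved, the next lookup for the same intent succeeds, which is what closes the loop and keeps incorrect or missing answers from reaching customers twice.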
Facilitating shared services and driving economies of scale. A cloud-enabled KM infrastructure supported by chatbots can help businesses form a shared-services organization quickly and cost-effectively, for several reasons. Chatbots don't need to be cross-skilled on processes, as long as the knowledge ecosystem is kept current and structures information in decision-based workflows. They can support 24x7 global operations, since geo-specific dependencies on working hours are eliminated. Language translators integrated with NLP and knowledge applications enable multilingual service. And because operations are governed predominantly by cloud systems and software bots, there is far less need for physical infrastructure or for business continuity planning around political unrest and natural calamities.
In the end, any bot is only as good as the intelligence driving it. To consistently feed chatbots effective content, companies need a system that continuously captures, reuses and extends the organization’s knowledge in a structured, efficient way. With such a knowledge framework in place, it is far easier for chatbots to not only match users’ intent to the exact content they need, but also to expand their capacity to do so as operations ramp up.
Point solutions, no matter how attractive they appear at first glance, cannot work effectively until a properly conceived and implemented KM foundation is in place. Until then, chatbots risk becoming lifeless shells that diminish the customer experience and contribute to the very scalability issues they're designed to address.
This Perspectives article was written by Himadri Sarkar, Director, Consulting.