Jan 21, 2021 8:09:00 AM
The battle of bots: NLP vs. multiple-choice chatbot
One bot to serve them all – what a powerful idea! Hold your horses, though: this does not mean that chatbots will replace human agents. Ever.
Rather, they complement each other perfectly by dividing the workload effectively. The emphasis here is on one bot that can serve your customers in the best way possible. Let’s dig deeper into the chatbot sphere and have a look at the different species out there.
Why chatbots provide a great customer service experience
Customer service chatbots have been around for a while and they are here to stay. They are the embodiment of instant communication and key to keeping your customers engaged and happy by providing a great service experience. Why? Let’s have a quick look at the most important aspects of a good customer service experience according to the Zendesk Customer Experience Trends Report 2020:
- Resolve issues quickly
- 24/7 support availability, i.e. answers in real time
- Friendliness of agent
- Contact via desired method
- No need to repeat information
- Proactive outreach for support
- No need to contact a human agent
A chatbot meets all of the criteria above. Any further questions?
However, the provided service experience can only be as good as the bot. And this is directly related to how the bot is built:
A well-built bot is your first line of defense, a poorly built bot is a ticking bomb.
Yet, it is not all about good bot, bad bot, but rather about the way customers can interact with it based on the bot’s technical backbone.
Because not all chatbots are the same. When it comes to interacting with a bot, the customer has mainly two options: either using natural language processing (NLP) or multiple-choice questions. Let’s dive into both chatbots to evaluate their strengths and weaknesses and furthermore investigate their potential opportunities and risks for customers and the service landscape.
What are NLP chatbots?
Natural language processing - or NLP - means the bot is trained with machine learning to understand a human’s intent by identifying patterns used in human language, and respond to them with predetermined answers. In practice, this means that the user will type messages to the bot and the bot will try to understand them and send an appropriate reply from its repository.
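The intent-matching step described above can be illustrated with a toy sketch. All intents, example phrases, and replies below are hypothetical, and the simple word-overlap score merely stands in for a real trained classifier:

```python
# Toy sketch of NLP-style intent matching: score the user's message
# against example phrases per intent and answer from a fixed repository.
# All intents, phrases, and replies are hypothetical illustrations;
# a real bot would use a trained model instead of word overlap.

def tokenize(text):
    return set(text.lower().split())

INTENTS = {
    "reset_password": {
        "examples": ["i forgot my password", "reset my password please"],
        "reply": "You can reset your password from the account settings page.",
    },
    "order_status": {
        "examples": ["where is my order", "track my order status"],
        "reply": "Let me look up your order for you.",
    },
}

def classify(message, threshold=0.2):
    """Return the best-matching intent, or None if nothing is close enough."""
    words = tokenize(message)
    best_intent, best_score = None, 0.0
    for intent, data in INTENTS.items():
        for example in data["examples"]:
            example_words = tokenize(example)
            # Jaccard word overlap as a stand-in for a learned similarity
            score = len(words & example_words) / len(words | example_words)
            if score > best_score:
                best_intent, best_score = intent, score
    # Below the threshold, the bot should ask a clarifying question
    # or hand over to a human agent instead of guessing.
    return best_intent if best_score >= threshold else None

print(classify("I forgot my password"))  # matches "reset_password"
print(classify("hello there"))           # no match: None, so hand over
```

The threshold is the interesting design choice here: set it too low and the bot guesses wrongly, too high and it hands over conversations it could have resolved.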
Its biggest advantage is equally its biggest disadvantage: free text input. To the customer, it looks and feels like chatting with a friend, because it is a familiar mode of communication - aka the Eliza effect. The bot therefore seems trustworthy and is intuitive to use, offering the good UX that is crucial for a world-class customer service bot. It basically adds a human touch to the bot - but it also sets high expectations.
At the same time, free text input is also the chatbot’s weakness: because it is not a human, it will not always understand the customer correctly. The error margin here is quite vast, resulting in misunderstandings or no resolution at all when the chatbot misses the real intent or the nuances of the conversation. This creates a really frustrating experience for the customer.
Yet, the chatbot, or rather the algorithm behind it, will learn from every single interaction, successful or not, offering the company and customer service team insights into what its customers are looking for. Applying machine learning to these insights improves the chatbot for future requests. But it is still a long way to a hundred percent.
What are multiple-choice chatbots?
On the other end of the spectrum sit multiple-choice chatbots, which have traditionally been built on a rule-based, static decision tree. This means the bot asks a series of predetermined questions, fixed to the conversation branch. This approach unfortunately makes the interaction often long and tedious: as the chat flow is fixed, the user has to go through every option - even the ones that do not apply to them. (A classic example, while not text-based, are the decision trees used when calling service hotlines: “Press 1 if X; press 2 if Y”.)
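A static decision tree of this kind can be sketched as a nested lookup where every branch is fixed in advance. The topics and answer texts below are hypothetical illustrations:

```python
# A rule-based, static decision tree: every customer walks the same
# predetermined branches, one multiple-choice question at a time.
# Topics and answer texts are hypothetical illustrations.

TREE = {
    "question": "What do you need help with?",
    "options": {
        "Billing": {
            "question": "Is this about an invoice or a refund?",
            "options": {
                "Invoice": "You can download all invoices from your account page.",
                "Refund": "Refunds are processed within 5 business days.",
            },
        },
        "Shipping": {
            "question": "Do you want to track or change a delivery?",
            "options": {
                "Track": "Use the tracking link in your confirmation email.",
                "Change": "Delivery changes are possible until the parcel ships.",
            },
        },
    },
}

def walk(tree, choices):
    """Follow the customer's choices through the fixed tree."""
    node = tree
    for choice in choices:
        node = node["options"][choice]
        if isinstance(node, str):  # reached a leaf: the canned solution
            return node
    return node["question"]  # still mid-tree: ask the next fixed question

print(walk(TREE, ["Billing", "Refund"]))
print(walk(TREE, ["Shipping"]))
```

Because the branches are hard-coded, every customer sees the same questions in the same order - exactly the rigidity that makes long flows tedious.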
But there is also a second mode for multiple-choice chatbots to provide solutions to the customer: a dynamic flow, which is generated for every conversation. This model provides more flexibility than the rule-based approach since it is more adjustable based on a combination of training data and mathematical calculations. These dynamic decision trees lead customers to the right solution much quicker and more precisely than the rule-based option.
Although predefined choices limit the user, they also provide more precise solutions, as the bot cannot misunderstand free text. These chatbots define the topics on which they can support the customer very clearly from the start of the conversation. The downside is that the solution the customer is looking for might not be included in the chatbot’s repertoire. But then again, a handover to an agent, e.g. via live chat with free text input, is always possible.
The best of both worlds: Combining NLP and multiple-choice
Behold, there is a third option, combining both structures in one bot: dynamic multiple-choice questions for ease and simplicity, and machine learning for speed and precision. This means that using machine learning, the chatbot applies conditional probability to suggest the next question, and thus learns from every interaction. As a result, the dynamic chat flow improves over time. On top of this, some products offer the alternative to converse with the bot via typing (Natural Language Processing), for a more emotional, engaging experience.
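The conditional-probability idea above can be sketched from logged conversations: count which topic customers chose after each previous topic, then rank suggestions by the estimated probability. The conversation log below is a hypothetical illustration:

```python
from collections import Counter, defaultdict

# Hypothetical log of past conversations: the sequence of topics each
# customer chose before reaching a solution.
LOGS = [
    ["billing", "refund"],
    ["billing", "refund"],
    ["billing", "invoice"],
    ["shipping", "tracking"],
]

# Estimate P(next topic | current topic) from observed transitions.
transitions = defaultdict(Counter)
for conversation in LOGS:
    for current, nxt in zip(conversation, conversation[1:]):
        transitions[current][nxt] += 1

def suggest_next(current, k=3):
    """Rank candidate next questions by conditional probability."""
    counts = transitions[current]
    total = sum(counts.values())
    return [(topic, count / total) for topic, count in counts.most_common(k)]

print(suggest_next("billing"))  # "refund" ranks first with probability 2/3
```

Each finished conversation can be appended to the log and the counts recomputed, which is how the dynamic flow improves with every interaction.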
This option offers the possibility to actively guide your customers and let them explain the details, while making sure that the chatbot is context-aware and able to precisely understand them. The approach of this unique Contextual Conversation Engine™ is to offer both options simultaneously: it suggests relevant choices while enabling users to type in their requests directly, which in turn results in the chatbot offering tailored options to choose from.
The benefits of a chatbot interface are vast: automating the most repetitive support tickets frees your team from a large part of its workload, creating the time and space needed to serve individual customers better and to focus on those harder-to-crack cases that require special dedication and attention – aka the human touch. An agent is always just one message or click away and is empowered by the conversation history to guide the customer seamlessly to the right solution. On top of this, enabling the user to write freely and the bot to ask follow-up questions enhances the impression of having a real conversation, making it more meaningful.
Beyond text: NLP meets Voice
Is this already the kind of bot that serves them all? Not quite yet. With the power of NLP comes another cool feature which further sparks the engagement of the customer with the chatbot: speech input. Just like you tell Siri and Alexa to dim the lights or to order something online, you can tell the chatbot. Pretty neat, isn’t it?
With the added option of voice, a whole new world of instant-contact customer service opens up. Like messaging, it mimics the way your customers already communicate with their friends and family every day, which makes it all the more desirable for them to use. This communication mode is especially important to Gen Z and their demand for convenient support, preferably with as little human interaction as possible. Voice is a channel that currently only 10% of companies offer, but according to Zendesk that figure will double within the next 12 months. Stay tuned.
Karen takes care of Solvemate's content universe as content & communications specialist. When not writing about chatbots, you will find her watching Danish TV series (Dear Netflix, please talk to DR and add some new ones!), doing (aerial) yoga or trying out every recipe from Yotam Ottolenghi.