During our May meet-up “Designing Chatbot – Part II”, Elvia Vasconcelos, senior UX designer at SapientRazorfish, delivered a really useful talk on the process of designing chatbots. She detailed how to think through a design before its conception.
She explained key concepts in artificial intelligence and even gave detailed practical advice on how to design your own chatbot. Elvia documented her talk in this article and we decided to share a part of it here! Without further ado, here are Elvia’s tips for building a successful chatbot!
This talk is the product of a goal I set for myself late last year: to have a practical toolkit for UX designers working with chatbots.
Before you Start
The outcome of these three presentation sprints is this talk, which I delivered at Mobile UX London. It looks at the tools and methods I would use if I were to start a new chatbot project tomorrow.
Caveat: this is based on my learnings and what I have experienced first-hand. I will most likely get something wrong and say something stupid, but that’s all right, because you live and you learn.
David talked about designers wanting to resolve problems by slapping screens onto everything instead of thinking about why we are using the tech in the first place. He urged us to think about the problems people face and to take it from there. Conversely, he also talked about the things conversational interfaces are good for, and how we can use that to make the case for whether or not they are the right solution.
Shey Cobley brought it home by telling us that:
- Success starts with people, not tech
- Focus on the places where people are already having conversations
- Think hard about what problem you are trying to solve and why a conversational interface is a good fit for it
What I got from that evening was that our responsibility as designers is to challenge the tech-solutionist approach that we are routinely faced with when we start a new project.
Furthermore, Elvia highlights two important actions that should be taken before starting a new project. Firstly, you should gather a team, ideally including a content strategist and a copywriter. Secondly, you should create a team atmosphere where critical feedback is welcomed, which will facilitate team communication and help your product be the best it can possibly be.
Understanding Artificial Intelligence
Building your knowledge around the key concepts and definitions in AI.
Looking at definitions very quickly turns tricky because nobody can agree on an absolute truth.
The term AI was coined at a conference in 1956 by American scientist John McCarthy. In short, AI refers to a machine that mimics cognitive human functions such as learning and problem-solving.
Natural language processing is part of AI and refers to the ability of a computer program to understand human speech as it is spoken or written. NLP is also one of the biggest challenges for conversational interfaces because of accents, grammar, slang, different languages, you name it.
And final definition, a Conversational interface has been defined as any UI that mimics chatting with a real human. There are two types of conversational interfaces:
1. Chatbots — written input
2. Voice assistants — respond to speech
On the right I have placed a few of the players in the field, like Watson, OK Google, Slack, etc. There are many more.
This is my simplified view of what natural language processing is. I have used Api.ai’s documentation for my definitions.
There’s a person who speaks, the NLP platform translates human language for the machine, and the application works as a rules engine that executes actions and gives feedback to the user.
An intent represents a mapping between what a user says and what action should be taken by your application.
This is how you structure input and output into rules.
Final definition: entities are reference values that are mapped into natural language.
I think there is some work to be done in making these definitions clearer, but I understood it as entities are the actionable words in a sentence.
e.g. I am looking for a flight to Copenhagen.
Here, flight would be an entity; its synonyms might be trip, travel, etc.
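As a rough illustration of how intents and entities fit together, here is a hypothetical, much-simplified matcher. Real NLP platforms like Api.ai use trained language models rather than keyword lists; the trigger words, entity values, and synonyms below are made up for the example.

```python
# Minimal sketch of intent/entity matching (hypothetical; real NLP
# platforms use trained language models, not keyword lookups).

# Each entity maps a reference value to the synonyms users might type.
ENTITIES = {
    "flight": {"flight", "trip", "travel"},
    "city": {"copenhagen", "london", "paris"},
}

# Each intent maps trigger words to an action the application should take.
INTENTS = {
    "book_flight": {"looking", "book", "need"},
}

def parse(utterance):
    """Return the matched intent and entities for a user utterance."""
    words = utterance.lower().strip("?!.").split()
    intent = next(
        (name for name, triggers in INTENTS.items()
         if any(w in triggers for w in words)),
        None,
    )
    entities = {
        name: w
        for w in words
        for name, synonyms in ENTITIES.items()
        if w in synonyms
    }
    return intent, entities

intent, entities = parse("I am looking for a flight to Copenhagen")
print(intent)    # book_flight
print(entities)  # {'flight': 'flight', 'city': 'copenhagen'}
```

The point of the sketch is the mapping itself: the intent decides *what action* the application takes, and the entities supply the *actionable values* that action needs.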
In past experiences I found myself hacking conventional UX tools and methods to better suit working with conversations. The reason for this is that the design process is led by the logic map, which feeds into the code, and by the copy structure, which also feeds into the code.
If I were to start a project tomorrow, I would sit down with my team and play with a few bot tools. I would try a few and encourage discussion, so that together we could come up with the right combination of tools to suit us.
I’ve listed a few examples, from really visual drag-and-drop-style interfaces, to templates, to more promising-sounding content-management-style ones. I’ve been compiling my list of tools on Evernote.
Elvia explains that a decision tree tool is extremely important in helping to map out the different inputs and outputs of your bot. She recommends some visual vocabulary tools as well.
4. Interaction Design
I always fall back on this diagram by David Armano as I find it always helps in conversations around prioritisation.
I interpret this as a way to break experiences into building blocks and to prioritise them, from the absolutely critical (useful) to more complex dimensions of experience like the social. Although not necessarily linear, it makes it obvious why we should focus on nailing the basics first (useful and usable) before moving on to delivering other aspects of the experience. Most products and services are still at that stage.
The key is moving from simple to complex. You should first make sure that the key actions the bot is supposed to perform are working, and only then move on to more complex aspects of the product.
Additionally, she suggests focusing on the starts of conversations, as users often feel uncomfortable or do not know how to engage with the bot:
If we synthesize the things that are stopping people from starting a conversation into user needs, we can start designing to meet those. In our case we identified that (1) people wanted to know what this was, (2) what it did, and (3) how they could use it.
So we opened with an introduction, a clear purpose and followed with how it can help them and how they can use it. We used buttons to make it even faster for people to start engaging with the bot. This tested very positively.
No matter what kind of chatbot you are creating, there will be generic scenarios that apply to all of them: hello, goodbye, existential questions like ‘Who are you?’ and ‘Where are you?’, responding to provocative questions, handling abuse, etc.
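One common way to handle these generic scenarios is a simple table of canned responses, checked before any domain-specific logic runs. The phrases and replies below are illustrative, not from the talk:

```python
# Sketch: generic "small talk" scenarios as a canned-response table,
# checked before any domain-specific logic (phrases are illustrative).

SMALL_TALK = {
    "hello": "Hi there! I'm a demo bot.",
    "goodbye": "Bye! Come back any time.",
    "who are you": "I'm a chatbot, here to help with your bookings.",
    "where are you": "I live in the cloud, but I'm always right here.",
}

def small_talk_reply(utterance):
    """Return a canned reply if this is a generic scenario, else None."""
    return SMALL_TALK.get(utterance.lower().strip("?!. "))

print(small_talk_reply("Who are you?"))   # canned reply
print(small_talk_reply("Book a flight"))  # None: fall through to domain logic
```

Returning `None` for unmatched input lets the bot fall through to its normal intent handling, so the small-talk layer never blocks real tasks.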
Machines don’t understand humans because of all the nuances in language. There will be lots of errors.
We came up with a fairly simple, 3 level approach:
- First message, “I’m sorry, I don’t know anything about < the last thing they typed>”
- Second and third levels will vary to acknowledge that this has happened before and to offer the user different options to manage this and ultimately to speak to someone if they are getting nowhere.
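This escalation pattern can be sketched as a counter over consecutive misunderstandings. The wording here is illustrative, not the actual copy from the project:

```python
# Sketch of the 3-level fallback pattern: each consecutive
# misunderstanding escalates the response, ending with an offer
# to hand off to a human. (Illustrative wording only.)

class FallbackHandler:
    def __init__(self):
        self.misses = 0  # consecutive utterances we failed to understand

    def respond(self, last_input):
        self.misses += 1
        if self.misses == 1:
            return f"I'm sorry, I don't know anything about '{last_input}'."
        if self.misses == 2:
            return ("Sorry, I'm still not getting it. You can try "
                    "rephrasing, or type 'help' to see what I can do.")
        # Third strike and beyond: offer a way out to a live agent.
        return ("I seem to be stuck. Would you like to speak to a "
                "person? Type 'agent' and I'll connect you.")

    def reset(self):
        # Call this whenever an utterance IS understood.
        self.misses = 0

bot = FallbackHandler()
print(bot.respond("flux capacitor"))  # level 1: simple apology
print(bot.respond("flux capacitor"))  # level 2: offer alternatives
print(bot.respond("flux capacitor"))  # level 3: offer a live agent
```

The counter resets whenever the bot does understand an utterance, so escalation only happens for failures in succession, matching the pattern described above.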
This pattern generated really positive responses because it acknowledged that users were struggling repeatedly. They felt more comfortable trying again once they knew they could reach a live agent at any point.
There are plenty of situations where users will want help.
e.g. they hit a dead end, something has gone wrong, something didn’t happen as expected.
We addressed this by adding permanent help i.e. an icon and by programming the chatbot to respond to things like ”I need to speak to someone”.
3. Starting over
In the absence of common navigation, we knew we had to create a loop back to the beginning throughout the conversation.
We started by the easiest situation, when users had completed their task, we offered them a way to start again. We then mapped the other situations where users might want to start again and sprinkled that option in the conversation.
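One way to sprinkle a “start over” option through the conversation is to treat it as a global escape hatch, checked on every turn before normal intent handling. This is a hypothetical sketch, not the team’s actual implementation, and the trigger phrases and state shape are made up:

```python
# Sketch: "start over" as a global escape hatch checked on every turn
# before normal intent handling (hypothetical, not the actual product code).

RESTART_PHRASES = {"start over", "restart", "start again", "go back"}

def handle_turn(utterance, state):
    """Route one user turn; returns (reply, new_state)."""
    text = utterance.lower().strip()
    if text in RESTART_PHRASES:
        # Loop back to the beginning from anywhere in the conversation.
        return ("No problem, let's start from the top. What can I "
                "help you with?", {"step": "intro"})
    # ... normal intent handling would go here ...
    return ("OK.", state)

reply, state = handle_turn("Start over", {"step": "checkout"})
print(state)  # {'step': 'intro'}
```

Checking the restart phrases before anything else means the loop back to the beginning is available at every point in the conversation, not just at the end of a task.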