How to build a chatbot interface
Chatbots are being discussed at all Generate conferences this year. In New York on Friday, Capital One's Mindy Gold will talk about what it means to design for conversation, while in San Francisco on 9 June Huge's Sophie Kleber will explore how to design emotionally intelligent machines. Finally, at Generate London in September, Giles Colborne will analyse the opportunities as well as the pitfalls of conversational interfaces. Get your tickets today.
In the mid-2000s, virtual agents and customer service chatbots received a lot of attention, even though they were not very conversational and, under the hood, amounted to little more than scripted data exchanges with web servers. Almost a decade later, chatbots are the latest form of artificial intelligence to cause a stir in the tech scene. Even though a huge number of examples of this 'weak AI' already exist (including Siri, web search engines, automated translators and facial recognition), backed by major investment from big companies, there remain plenty of opportunities to hack the future.
As well as being trendy, chatbots can actually be very useful. They don’t need to feel like a basic replacement for a standard web form, where the user fills in input fields and waits for validation – they can provide a conversational experience. Essentially we’re enhancing the UX to feel more natural, like conversing with an expert or a friend, instead of web browser point-and-clicks or mobile gestures. The aim is that by providing empathetic, contextual responses, this technology will become embedded directly in people’s lives.
What follows is a very practical way to design and build a chatbot, based on a real project-intake application in a service design practice. As this practice serves over 110,000 members globally, the goal was to provide a quick, convenient and natural interface through which internal stakeholders could request effective digital services.
We designed the project intake application so that anyone can make a project request with simple text queries, instead of having to fill out confusing forms in the company's web-based task management software. The chatbot became the initial client-facing approach, which enabled the team to focus on business goals like lowering operating costs and increasing efficiencies.
Personality and UI
The first step was to establish the chatbot’s personality, as this would represent the voice of the service design team to its stakeholders. We built on Aarron Walter’s seminal work on design personas. This greatly helped our team develop the bot’s personality traits, which then determined the messages for greetings, errors and user feedback.
This is a delicate stage, as it affects how the organisation is perceived. To make sure we had as much information as possible, we immediately set up stakeholder workshops to nail down the appropriate personality, colour, typography, imagery and user flow for engaging with the bot.
After we had gained all the necessary approvals – including seeking legal counsel – we set out to convert archaic request forms into a series of back-and-forth questions that mimicked a conversation between the stakeholders and a representative of our design services team.
We knew we didn’t want to get too deep into AI markup language for the processing part – we just needed enough to jump-start the experience. RiveScript is a simple chatbot API that is easy to learn and sufficed for our needs. Within a few days we had the logic down to take in a project request from the bot, and to parse it with enough business logic to validate and categorise it so it could be sent through JSON REST services to the appropriate internal project tasking queue.
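To give a feel for the validate-and-categorise step, here is a hedged sketch in plain JavaScript – the required fields, category keywords and queue names are illustrative assumptions, not the team's actual business logic:

```javascript
// Hypothetical sketch: validate an intake request collected by the bot
// and pick an internal tasking queue via simple keyword matching.
function categoriseRequest(request) {
  // Basic validation: every field the conversation collects must be present.
  var required = ['description', 'timeline', 'budget'];
  var missing = required.filter(function (field) {
    return !request[field] || String(request[field]).trim() === '';
  });
  if (missing.length > 0) {
    return { valid: false, missing: missing };
  }

  // Naive categorisation by keyword; a real system would be richer.
  var text = request.description.toLowerCase();
  var queue = 'general-intake';
  if (text.indexOf('website') !== -1 || text.indexOf('web') !== -1) {
    queue = 'web-projects';
  } else if (text.indexOf('app') !== -1) {
    queue = 'mobile-projects';
  }
  return { valid: true, queue: queue };
}
```

The result object can then be serialised and posted to the tasking queue's REST endpoint.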
To get this basic chatbot working, head to the RiveScript repo, clone it and install all the standard Node dependencies. The example snippets in the repo will also give you a taste of the interactions you can add. Next, run the web-client folder, which serves the bot as a web page via a basic Grunt server. At this point we can enhance the experience to suit our needs.
The next step is to generate the ‘brain’ of our bot. This is specified in files with the .rive extension, and thankfully RiveScript already comes with basic interactions out of the box (for example, questions such as ‘What is your name?’, ‘How old are you?’ and ‘What is your favourite colour?’). When you start the web-client app using the proper Node command, the HTML file is instructed to load these .rive files.
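A minimal brain file for this kind of small talk might look something like the following – a sketch in RiveScript syntax, not the exact file that ships with the repo (triggers are written in lowercase with no punctuation):

```
! version = 2.0

+ what is your name
- You can call me the intake bot.

+ how old are you
- I was deployed quite recently, so I am still rather young.

+ what is your favourite colour
- I am rather partial to hex #33A0FF.
```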
Next we need to generate the part of our chatbot’s brain that will deal with project requests. Our main goal is to convert a selection of project tasking intake questions into a regular conversation.
So, for example:
Hello, how can we help?
Great, how soon do we need to start?
Can you give me a rough idea of your budget?
Tell me more about your project …
How did you hear about us?
A typical accessible web form would look like this:
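The original listing isn't reproduced here, but based on the field labels used in this article, an accessible version of such a form might be marked up along these lines (a sketch, not the team's actual markup):

```html
<form action="/project-request" method="post">
  <label for="request">Request</label>
  <input id="request" name="request" type="text" required>

  <label for="timeline">Timeline</label>
  <input id="timeline" name="timeline" type="text">

  <label for="budget">Budget information</label>
  <input id="budget" name="budget" type="text">

  <label for="description">Project description</label>
  <textarea id="description" name="description"></textarea>

  <label for="reference">How did you hear about us?</label>
  <input id="reference" name="reference" type="text">

  <button type="submit">Submit</button>
</form>
```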
With web forms we’re very familiar with certain patterns: you click the ‘Submit’ button, all form data is sent to another page where the request is processed, and then most likely a cheeky ‘Thank you’ page pops up. With chatbots, we’re able to take the interaction of submitting a request, and make it more meaningful.
Designing a voice
To convert this form to a conversational user interface served in RiveScript’s chatbot web client, we need to convert the information architecture from rigid to fluid – in other words, turn field labels into conversational UI strings. Let’s consider some accessible field labels and the tone of their related questions:
- Request: How can we help? Not sure? Do you mind if I ask a few questions?
- Timeline: How soon do we need to get started?
- Budget information: Can you give me a rough idea of your budget?
- Project description: OK, can you tell me a summary of the problem to be solved?
- Reference: Also, who referred you to us?
Next we need to convert the web form’s code into AI script, following RiveScript’s very learnable processing logic for two-way conversations:
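The exact script isn't shown here, but a hedged sketch of what such a two-way conversation looks like in RiveScript follows – the topic names and user variables are illustrative, and each topic captures the previous answer with `<star>` before asking the next question:

```
! version = 2.0

+ [*] request [a] project [*]
- How can we help? Not sure? Do you mind if I ask a few questions?{topic=timeline}

> topic timeline
  + *
  - <set request=<star>>How soon do we need to get started?{topic=budget}
< topic

> topic budget
  + *
  - <set timeline=<star>>Can you give me a rough idea of your budget?{topic=description}
< topic

> topic description
  + *
  - <set budget=<star>>OK, can you tell me a summary of the problem to be solved?{topic=referral}
< topic

> topic referral
  + *
  - <set description=<star>>Also, who referred you to us?{topic=random}
< topic
```

The final `{topic=random}` tag returns the user to the default topic once the intake questions are complete.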
Unlike a standard form, whose variables are sent to another page or service for processing, a chatbot can validate and submit the information the user enters in the chat window (or speaks) immediately, which also means users can easily revisit previously entered values. We needed to send the user’s request, entered in the chatbot UI, via the JSON REST API to an external project tasking server.
In RiveScript-js we are free to make use of an XMLHttpRequest object to submit the request almost simultaneously, as the data is entered by the user:
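The original listing isn't reproduced here, but the pattern looks roughly like this – the endpoint URL and field names are placeholders, not the team's real API:

```javascript
// Build the JSON payload from the values collected during the chat session.
function buildRequestPayload(session) {
  return JSON.stringify({
    request: session.request,
    timeline: session.timeline,
    budget: session.budget,
    description: session.description,
    referral: session.referral
  });
}

// Submit the payload to a (hypothetical) project tasking endpoint
// using an XMLHttpRequest, fired as soon as the data is complete.
function submitRequest(session) {
  var xhr = new XMLHttpRequest();
  xhr.open('POST', '/api/project-requests', true); // asynchronous
  xhr.setRequestHeader('Content-Type', 'application/json');
  xhr.onload = function () {
    if (xhr.status >= 200 && xhr.status < 300) {
      console.log('Request queued');
    }
  };
  xhr.send(buildRequestPayload(session));
}
```

Because the submission happens in the background, the bot can keep the conversation going while the request is queued.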
Soon, current ways of interacting with computers to obtain information will give way to AI-based technology like chatbots, where people simply issue voice commands or text queries. The web design community need not fear – we should all be embracing the added value of this new technology. It could be a game-changer for the companies that adopt it, offering fully scalable customer service and improved customer intelligence.
This article was originally featured in net magazine issue 290; buy it here.
If you want to find out more about designing chatbots, don’t miss Generate this year. The San Francisco conference on 9 June also features presentations from speakers at Netflix, Airbnb, Uber, NASA, Twitter, Microsoft and more.
Generate London, meanwhile, is preceded by a day of workshops covering design and content sprints, user experience strategy, building scalable responsive components, and how to sell your idea to stakeholders.