Making Conversations Human

Conversation (UI) Component Design

How Does One Sound 'Human'?


Designing for conversations comes down to the details: words.

Every word, phrase or sentence evokes emotion, and can drive engagement or cause drop-offs as a result.

As part of an experimental project at a bank's innovation lab, I was working on a chatbot that helps users discover the best offers in town, based on their likes and preferences.

Iterating the Conversation Design


I joined the project midway and was tasked with improving three functionalities, particularly around onboarding and user tasks.

The chatbot's main conversation flow (or primary user journey) asks users to choose from a set of parameters such as location, price and time. These choices effectively serve as filters for the results.

If users are not satisfied with the results, they can repeat the process and re-answer any of the parameters as they like.

Main conversation flow illustrated (For the change language functionality, there are two possible paths. You will see more on this later.)
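To make the flow concrete, here is a minimal sketch in Python of how those answers could be held as filters and re-answered one at a time. The parameter names and values are hypothetical, not the actual implementation:

```python
# Hypothetical sketch of the search filters built up during the main flow.
filters = {
    "location": "Central",  # chosen location (or shared current location)
    "price": "$$",          # chosen price range
    "time": "dinner",       # chosen time of day
}

def update_filter(filters, parameter, new_value):
    """Re-answer a single parameter without restarting the whole flow."""
    filters[parameter] = new_value
    return filters

# If the results are unsatisfying, the user re-answers just one parameter.
update_filter(filters, "location", "Causeway Bay")
```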



Comparing Conversation Components


Just as digital interfaces are composed of UI components, conversations are composed of conversation components.

These components vary by platform. Comparing Facebook Messenger and Slack components in their closest forms:

Facebook Messenger 'Quick Replies' vs Slack 'Interactive Message Buttons'


Facebook Messenger 'Templates' vs Slack 'Actions'
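To make the comparison concrete, here is a rough payload-level sketch of the first pairing: a Messenger 'Quick Replies' message next to a Slack 'Interactive Message Buttons' attachment. The text, payload and value fields are made up for illustration; only the field structure follows each platform's documentation.

```python
# Facebook Messenger: quick replies attached to a Send API message
messenger_quick_replies = {
    "recipient": {"id": "<USER_PSID>"},  # placeholder
    "message": {
        "text": "Pick a price range:",
        "quick_replies": [
            {"content_type": "text", "title": "$", "payload": "PRICE_LOW"},
            {"content_type": "text", "title": "$$", "payload": "PRICE_MID"},
        ],
    },
}

# Slack: interactive message buttons attached via (legacy) attachments
slack_message_buttons = {
    "text": "Pick a price range:",
    "attachments": [
        {
            "text": "Choose one",
            "callback_id": "price_range",
            "actions": [
                {"name": "price", "text": "$", "type": "button", "value": "low"},
                {"name": "price", "text": "$$", "type": "button", "value": "mid"},
            ],
        }
    ],
}
```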


For more information on conversation components, have a look at each platform's documentation; I also highly recommend the book Designing Bots.

It's important to first study the available components and know your constraints, such as character limits, interface layout and button styles.

For my project, the platform was Facebook Messenger.

User Consent Agreement

Given the importance of data privacy and security, user consent is typically required and designed as part of onboarding.

On Facebook Messenger, there are two components to onboarding: the welcome screen, with a 1,000-character limit, and the chatbot's introduction screen.


Facebook Messenger's welcome and introduction screen examples

Given these two component options, I could insert the user consent agreement in either the welcome or introduction screen.

I mocked up two versions and validated them with a small sample of users:

Version 1: getting user consent in the welcome screen


Version 2: getting user consent in the introduction screen

The final verdict was version 1: having the user consent agreement on the welcome screen made the experience feel less clunky and more upfront.
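As a sketch of how version 1 could be wired up, the welcome screen text (including the consent line) is set through Messenger's Profile API alongside the Get Started button. The wording, access token and API version below are placeholders, not the bank's actual copy:

```python
import requests

PAGE_ACCESS_TOKEN = "<PAGE_ACCESS_TOKEN>"  # placeholder

# Version 1: surface the consent wording on the welcome (greeting) screen,
# next to the Get Started button. The copy here is illustrative only.
profile = {
    "get_started": {"payload": "GET_STARTED"},
    "greeting": [
        {
            "locale": "default",
            "text": "Hi {{user_first_name}}! By tapping Get Started, "
                    "you agree to our terms and data policy.",
        }
    ],
}

requests.post(
    "https://graph.facebook.com/v2.6/me/messenger_profile",
    params={"access_token": PAGE_ACCESS_TOKEN},
    json=profile,
)
```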



Change Language

In most digital experiences, the 'change language' function is typically nested inside the menu or settings. To meet user expectations, this was the first place to put it.


When Language Default Fails

Hong Kong is a relatively bilingual market, but Cantonese (a dialect of Chinese) is still the primary language over English. As a result, global brands like Google default our online experiences to Cantonese.

My own Google account preferences are all set to English, but whenever I open a Google Maps link or log in with my Google account, the default language is Cantonese.

Luckily, I am bilingual. But how would English speakers know how to change the language when the setting itself appears in a language they cannot read?


An example of connecting with Google Login where the default language is Cantonese for Hong Kong

The first touch I introduced was making the change language option bilingual.

Change language setting is now available in two languages
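Here is a sketch of how that bilingual entry could sit in Messenger's persistent menu, set through the same Profile API (a Get Started button must already be configured). The menu title and payload are illustrative:

```python
import requests

PAGE_ACCESS_TOKEN = "<PAGE_ACCESS_TOKEN>"  # placeholder

# A bilingual 'change language' entry nested inside the persistent menu.
profile = {
    "persistent_menu": [
        {
            "locale": "default",
            "call_to_actions": [
                {
                    "type": "postback",
                    "title": "Change Language / 更改語言",
                    "payload": "CHANGE_LANGUAGE",
                }
            ],
        }
    ]
}

requests.post(
    "https://graph.facebook.com/v2.6/me/messenger_profile",
    params={"access_token": PAGE_ACCESS_TOKEN},
    json=profile,
)
```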

Another common path to changing languages is at the beginning of a conversation.

Making use of Facebook Messenger's 'Quick Reply' component, I tested the sequence of changing language either before or after user consent.


Testing the sequence of changing languages either before or after user consent

This was an easy one. Users preferred version 2, which was more straightforward: they could change languages via a Quick Reply and then agree to the user consent.
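A sketch of version 2's first step: right after Get Started, the bot sends the language choice as Quick Replies and only then moves on to the consent message. The text, payloads and token below are placeholders:

```python
import requests

PAGE_ACCESS_TOKEN = "<PAGE_ACCESS_TOKEN>"  # placeholder
user_psid = "<USER_PSID>"                  # placeholder

# Language choice offered as quick replies before the consent step.
language_prompt = {
    "recipient": {"id": user_psid},
    "message": {
        "text": "Choose your language / 請選擇語言",
        "quick_replies": [
            {"content_type": "text", "title": "English", "payload": "LANG_EN"},
            {"content_type": "text", "title": "中文", "payload": "LANG_ZH"},
        ],
    },
}

requests.post(
    "https://graph.facebook.com/v2.6/me/messages",
    params={"access_token": PAGE_ACCESS_TOKEN},
    json=language_prompt,
)
```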



Share Location


One of the search parameters asks users to share their current location to discover offers nearby.


User searching for offers nearby by sharing location

In mobile contexts, users tend to search on the go. It's easy to get distracted while crossing the street, avoiding pedestrians or replying to friends' messages.

After sharing, Messenger caches the location. But when the user is moving around, this cached location is no longer real-time and becomes inaccurate for any subsequent searches for directions or travel time.

To manage the misleading results of 'Get Directions', I asked our developer to change the Google Maps API call to return the offer's location only, and renamed the CTA to 'Locate Deal'.


View of the pinned location; it's up to users whether they want to search for directions

The result is just a pinned location for the offer, and users can tap 'Directions' to search from their current location if needed.
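A sketch of the change, assuming the offer's coordinates are stored with the deal: 'Locate Deal' links to a pin-only Google Maps URL, while directions are left to the user (omitting the origin should let Maps fall back to the device's live location rather than Messenger's cached one). The coordinates below are made up:

```python
from urllib.parse import urlencode

def locate_deal_url(lat, lon):
    """Pin the offer's stored location only (no origin, no route)."""
    return "https://www.google.com/maps/search/?api=1&" + urlencode(
        {"query": f"{lat},{lon}"}
    )

def directions_url(lat, lon):
    """Directions link; with no origin specified, Maps can use the
    device's live location instead of the stale cached one."""
    return "https://www.google.com/maps/dir/?api=1&" + urlencode(
        {"destination": f"{lat},{lon}"}
    )

# Example with hypothetical offer coordinates in Hong Kong
print(locate_deal_url(22.2819, 114.1582))
```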

Learnings

The most interesting part of this experience was learning to prototype contextually.

By traveling to the offers' locations, I was able to test their real-time accuracy, and that's how I really understood the context of the search experience: the relative chaos of typing on your phone while walking, replying to friends' messages or crossing at traffic lights.

Designing for conversations has really sparked my interest in UX writing and I look forward to taking on more similar challenges.



