(upbeat music) – All right, last session of the day, two amazing days. I just wanted to say thank you for those who decided to stick around.
I promise it’s engaging, if not then you can ask Josh for your money back, okay? (laughing) All right.
(upbeat music) – [Announcer] All of our representatives are currently busy. Please stay on the line.
And your call will be answered by the next available representative.
The estimated hold time is currently less than 96 minutes. (audience laughing) You are currently caller number 32 waiting to speak with a representative.
Thank you for your patience.
(soft music) – I imagine everyone in this room, at some point, whether it’s talking to your bank, your telco, moving house, your energy provider, has felt the kind of anxiety that a message like that puts you in.
It’s a really hard problem to solve and I just kind of wanted to jump in.
Look, I’m not advocating for removing call centres. Call centres, I think, are here to stay.
But I think call centres are being left behind; they’re the very last thing designers touch.
We looked at websites, apps, social media; these are the front lines of your businesses: your great onboarding experiences, great checkout experiences.
But when it comes to dealing with problems, we just throw customers into this bucket and say, “They can just call us,” or, to Amy’s point, virtual assistants can handle that kind of stuff. So I’m here to talk about how we can find balance. I think call centres will always be around, and as designers, I challenge you: how can we improve that experience? Now, meet Julie.
Julie is a modern day customer.
She is time poor, she is a digital native.
Lives her life on her phone.
Busy social life, busy working life.
And some things have changed from the traditional sort of customer. She expects an amazing, personal experience and values experience over chasing price.
Doesn’t stay with the same bank, with the same rate for 20 years like her parents did.
Trusts the opinions of her friends and other people online rather than just marketing talk that is rammed down our throats today.
And most importantly, because of the world that Google has set up, where you can search for an answer in 30 seconds or less, it’s an expectation economy. She demands instant resolution, today.
And this is a problem for businesses.
How can we deal with this customer, Julie, who’s expecting, you know, “I’m important, I’ve given money to your business, how are we going to solve this?” I work for a company called ARQ.
We solve human problems with technology.
And I’ve been working with Qantas for about three years now on how we can actually help them.
One of the world’s biggest brands.
Think about their call centres over Christmas: people want to go visit their families, and flights are always delayed.
It’s not always Qantas’s fault, and we say sorry. But, you know, we need to figure out a better way to handle this than just dumping someone into a virtual system path or adding a long wait queue.
And this is the story of our learnings over the last two to three years; I’ll take you through that.
So, Gartner released this research about two years ago, called the path to personalization: four pillars of customer experience. In order to deliver a really amazing, personal experience of your business, you need to do each of these four steps.
So, the first one should be pretty simple.
Customers are happy to give you their data as long as you do something interesting with it, something that actually gives them utility back.
So why not, as businesses, collect that data (this has been a theme over the last two days), store it, sync it, and then surface it in other parts of the business where it adds value? Obviously, don’t break your customers’ trust; data is important, privacy, that kind of stuff. Then there’s understanding the context of the channel customers come in on. If they call your business, it’s a very intimate thing when someone picks up the phone. How many people just text now? Calling someone on the phone is quite an intimate thing.
They want something resolved.
They might come through a voice channel, or through SMS, social media, WhatsApp. You’ve got to understand the channel.
Traditionally, most people still browse and purchase tickets online on desktop, because they compare prices on all these comparison sites. But for other parts of the user experience, whether it’s checking in to your flights or showing your boarding pass, people have now moved to mobile.
So understanding that the channel of choice is important here as well.
And now you move into the Know Me pillar.
This is where personalization starts to get exciting. You understand the customer’s needs and wants, and you can proactively present them with suitable messages.
At Qantas, we know that when a customer is in the first two-hour window after their flight has landed, and they’re not transferring to another flight, nine times out of 10 their contact has something to do with baggage: damaged baggage, lost baggage.
So, knowing this information, once the customer gets in touch, whether it’s in the app or calling us or whatever, let’s proactively surface the right message, especially when we’re looking at the data from the Sync Me piece.
And we can start surfacing things like starting their insurance claim early.
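That proactive rule is simple enough to sketch in code. This is a hypothetical illustration only; the field names (`landed_at`, `has_connection`) and the two-hour threshold are assumptions based on what’s described above, not Qantas’s actual system:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical sketch of the rule described above: within two hours of
# landing, with no onward connection, a contact is most likely about
# baggage, so surface that flow (and the insurance claim) first.
def suggest_topic(landed_at, has_connection, now=None):
    now = now or datetime.now(timezone.utc)
    recently_landed = timedelta(0) <= now - landed_at <= timedelta(hours=2)
    if recently_landed and not has_connection:
        return "baggage"   # proactively offer lost/damaged baggage help
    return "general"       # otherwise fall back to the standard menu
```

The point is that a channel-agnostic rule like this can run the same way whether the customer opens the app, calls, or messages.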
And this is kind of where the personalization starts to get, in my mind, amazing, exciting.
This is the wow factor.
This is AI and ML, and I know these are buzzwords, but when you start looking at patterns in the business, customers start trusting your business to act on their behalf.
Following the example of the lost baggage: someone has turned up, they’ve called Qantas or they’ve WhatsApped us, or whatever it is, and they’ve gone, “Hey, you’ve lost my bags.” It happens.
Of course, we’re going to acknowledge it.
Let’s solve the problem.
Why don’t you go to your hotel? By the time you get there, we’ve already raised your insurance claim, and there’s a $500 Visa gift card to buy some new clothes. You get to your hotel’s concierge and the rest of it, and we will find out where your bags are while you’re waiting comfortably in your room, not being anxious at the airport.
‘Cause flying, you know, when things go wrong, it is a terrible kind of experience.
So, we’re thinking of ways to improve it, and that’s where I see personalization really starting to take off: with the power of AI and ML, we can start doing this at scale. I just want to clarify, this applies to a lot of businesses, and there’s no call-out here, it’s a global thing.
Yes, the internet came along and, you know, people went, “Great, let’s put up bots, a sort of call-centre chat-to-us widget, on our website.”
And they go, “Let’s get people off the phones, let’s just divert them to our website,” and then you end up with this scenario.
And this happens so often.
That’s a terrible Band-Aid solution.
If you’re going to go down that path you need to think about the customer experience and where can you add value and where does it make sense.
So when we looked at this problem holistically, we knew that the call centres weren’t going away; people are always going to call Qantas.
And we go, great.
And we started looking at some of our data and unpacking it, and we found that some customers prefer to self-serve, others actually want to be guided to the answer, and it’s a nice blended version of both.
So we started experimenting with virtual assistants, and there’s a whole programme of work now at Qantas called Virtual Assistance.
We said, look, virtual assistants, because of technology constraints and everything, aren’t going to resolve every single problem. So depending on the customer’s channel of choice and the context of what they’re calling about, all we wanted to do is guide customers: either they self-serve, and the virtual assistant sends them down the right path, because we may have features like the baggage calculator that exist but are hidden somewhere in the website rather than on the homepage.
Or they can actually talk to a human where necessary. And the whole point is to free up all the commonly asked questions. The most commonly called-up question is “reset my Qantas PIN”. A crazy amount of people just call up and go, “I can’t reset my PIN.” Yes, you can: just go to the website, hit “forgot my password”, and go through those steps. The number of people who wait, sometimes for hours on really busy days, just to reset their PIN is mind-blowing.
So, if we can free up the agents and let customers handle the most common, easy-to-solve problems themselves, then the agents can handle the more complex queries: complex bookings, prams, complex seating, that kind of stuff.
So it’s really important to think about the context of where technology should replace the human and where the human should step in.
Now, some of these lessons, I know we’ve used this example already; I feel like we needed to double up on our slides. But I wanted to point out that designing for human conversation, especially with a virtual assistant, is really, really, really hard. When you look at the transcripts, even if you think you’ve designed every possible way the bot could handle the conversation, someone will come in overnight and ruin it, and they’ll get stuck in this sort of infinite loop of death.
I just wanted to point out: the blue is where the user is talking, and the gray is the bot. In the first response, the bot goes straight into this kind of gimmicky thing.
“Are you on a boat?” What if the user didn’t speak English as their first language?
Why can’t we just be simple and concise: “What is your suburb?” “What is your location?” You didn’t have to come up with this weird thing. “I’m on a boat”, what’s that got to do with the weather? To me, you need to get the basics right before you start adding personality.
Another example, and this one’s not Cody.
This is a guy who was using Siri to transcribe a text message to his wife, you know, about dinner or something. Bots are dumb; they’re inputs and outputs. He didn’t tell Siri to stop recording.
He picks up his trombone and starts playing. (laughing) And believe it or not, Siri actually transcribed the audio of the instrument into human words.
And that’s what we got there.
Since he published this tweet, if you look at the whole response thread, people got out every possible instrument and just started, you know, playing to Siri and Google Assistant, trying to work out which instruments work. Violins and bass guitars do not transcribe well. Trombones work quite well.
So, an interesting world, whatever they were thinking there, but I like that example.
Now, how do you design a conversation? How do you get started? This took us about six months to learn, and I’m happy to admit it, because we started with something as simple as flight status.
There are hundreds of different utterances someone could use to ask for a flight status, or in this case, to order a coffee.
You could say, “Can I have a coffee?” “Can I have a soy latte?” “Can I have a long black?” So we worked out that if you actually think about the outcome the user wants to get to and then work backwards, answers before questions, it makes it easier to get your head around all the possible ways, and to break it down into smaller bite-sized chunks to design on.
Some jargon that I always like to throw in: when you’re doing conversational UI, these are three things you need to learn.
You’ve got intents, entities and utterances. The intent is the outcome.
In this case, ordering coffee.
Utterances are what the user types in or asks over voice, and you give some examples of all the different versions; your entities are your variables.
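To make those three terms concrete, here is a minimal sketch of how they relate, using the coffee example. The matching is deliberately naive substring matching; real NLU platforms do far fuzzier matching, and all the example data is made up:

```python
# A minimal sketch of intents, utterances and entities.
# An intent is the outcome; utterances are example phrasings that map
# to it; entities are the variables inside those phrasings.
INTENTS = {
    "order_coffee": {
        "utterances": [
            "can i have a coffee",
            "can i have a soy latte",
            "can i have a long black",
        ],
        "entities": {"coffee_type": ["soy latte", "long black", "flat white"]},
    },
}

def match(text):
    """Return (intent_name, entities) for a user message, or (None, {})."""
    text = text.lower()
    for name, intent in INTENTS.items():
        if any(u in text for u in intent["utterances"]):
            found = {slot: value
                     for slot, values in intent["entities"].items()
                     for value in values if value in text}
            return name, found
    return None, {}
```

So “Can I have a long black?” resolves to the `order_coffee` intent with `coffee_type` set to `long black`, while anything unmatched falls through to a fallback.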
Now, think about ordering coffee, it’s pretty simple. “Hey, would you like a coffee?” You could respond, “Yes” or “No”.
What if you responded, “I’ve already had three today,” and the time of day is 9 PM? Would the bot know how to handle that kind of context? It becomes complex really quickly, to the point where, like I said before, it’s really hard to understand. You can guide the conversation using UI on the screen or follow-up questions.
So if someone goes, “Can I order a coffee?”, you go, “Great, what type of coffee?”, and you can actually present some buttons: long black, soy, et cetera. Then, “Do you want sugar?” “Yes, no,” and you can guide the user. It’s like a wizard.
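That guided, wizard-style flow can be sketched like this; the steps and button labels are illustrative only:

```python
# A sketch of guiding the user like a wizard: each step asks one
# question and offers buttons, narrowing the conversation instead of
# trying to parse arbitrary free text.
STEPS = [
    ("What type of coffee?", ["long black", "soy latte", "flat white"]),
    ("Do you want sugar?", ["yes", "no"]),
]

def run_wizard(answers):
    """Walk the steps, validating each answer against the buttons."""
    order = []
    for (question, buttons), answer in zip(STEPS, answers):
        if answer not in buttons:
            return "Sorry, please pick one of: " + ", ".join(buttons)
        order.append(answer)
    return f"Ordering a {order[0]}, sugar: {order[1]}"
```

Because every answer comes from a button, the bot never has to guess what the user meant at that step.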
Think about it like that, or like a choose-your-own-adventure. Again, another example that was used just before, but I wanted to show you.
This is the tweet that went out that ruined Cody for Telstra.
And I apologise to anyone who works there as well. As you can see here, the user, Paris, wants to talk to a human; he’s called it out multiple times. The bot goes, “Great, before I transfer you, please tell me your first name.” And he says, “Paris”, naturally.
And the bot goes, “Uh, Paris, international roaming,” and then he gets stuck.
And he goes, “No, I want to talk to a human.” And again, it says, “Before I transfer you, tell me your first name.” “Paris.” Stuck again, in this infinite loop.
What they should have done is codify what the input is in that response, in this case, a first name, so that whatever the response is gets handed straight over to the human live-chat software and does not interact with any of the other intents or utterances.
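A sketch of that fix: once the handover intent fires, the session is flagged so the very next message is captured as a raw value and passed to live chat, never routed back through intent matching. The session flag and trigger phrase here are assumptions for illustration:

```python
# Sketch of the fix for the "Paris" loop: when the user asks for a
# human, flag the session so the next message is treated as a raw
# first-name slot, not matched against other intents (like roaming).
def handle(session, message):
    if session.get("awaiting_name"):
        session["awaiting_name"] = False
        # hand the raw value to the live-chat software, skip NLU entirely
        return f"Thanks {message}, transferring you to an agent now."
    if "human" in message.lower():  # illustrative handover trigger phrase
        session["awaiting_name"] = True
        return "Before I transfer you, please tell me your first name."
    return "Sorry, I didn't get that."  # normal intent matching goes here
```

With the flag set, “Paris” can no longer collide with a destination or roaming intent.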
The training phrases are what trigger the handover. How did we get around that? We built what we call conversational trees, and these become quite complex and quite big. These are your mind maps, the user journeys that a user could flow through. So in this case, you come in, you get the welcome message, and we’ve presented the most commonly asked questions: questions around frequent flyer, baggage, manage booking, et cetera.
Users can free-type text, you know, around baggage, or they can just choose one of the options. They choose baggage.
You can imagine, at an airline like Qantas, there are dozens of questions around baggage, and we have answers for everything.
The question is, are those answers contextual? Can we surface them at the right time? We’ve again looked at the data, millions of previous live-chat transcripts, and organised it from the most popular to the least popular.
And then, depending on where you go from there, the options can go in any direction. And for a business as big as Qantas, there are answers to everything.
Sometimes the bot can resolve it, sometimes you’ve got to hand over by deep-linking to the app or the website, and sometimes you’ve got to transfer to a human. You need to explore all these conversational paths, and when you do this right, you avoid the issue where the user gets stuck in that infinite loop. Greetings are important.
Now, remember when the first iPhones came out, and how complex those UIs were to use? Go even one step further back.
When the first iPods came out with that circular navigation, it took a lot of people a long time to get used to a new way of dealing with a new technology. So when you’re dealing with virtual assistants, and again, to your point in your talk, we need to unlearn the behaviours of point and click.
We found that you should tell users up front what they’re dealing with, in this case a robot.
Tell them what it does, and then give them the options. This is the quickest way to get users to give us a high NPS rating.
And this works really well.
Having said that, we constantly iterate on the versions. And these are three of the different versions, I believe. The one on the far right is the current version. And then we go one step further.
Then, in the context of channel of choice: inside the app, because the people who use the app are more often frequent flyers,
we don’t need to give them such a complex intro; we divert them straight into the different problem areas. These are the three most common issues that always come up: before they fly, after they fly, or questions around frequent flyer. And depending on where they go, they can get handed to the different contact centre live-chat agents in the business, or to the different bots. And we’re currently building really sophisticated bots around one particular area.
Could be frequent flyer or baggage.
And that’s where we see the future: imagine an ecosystem of 20 little bots, each specialising in a particular area.
And they fit under a sort of umbrella organisation. That’s where we see this technology going.
Another painful lesson for us, though thankfully we jumped on this one early: never leave me hanging, or rather, never leave your customers hanging.
And it could be as simple as saying, “Is there anything else I can help you with?” You could say, “No thanks,” or “I need help, transfer me to a human.” We tried multiple different iterations on this; the simple message always works the best.
Going back to the Poncho weather app example: don’t go crazy with this “I’m on a boat” kind of stuff. Just make it concise.
He should have said, “What’s your location? What’s your suburb?” For us, we’ve just kept it simple: “Is there anything else I can help you with?” And that seems to resonate most with customers. Once you’ve got those basics right, you can add personality, you can add flair. Air New Zealand: “Lord of the Rings”, the All Blacks rugby team, and in this case they’ve added “Kia Ora”, a nice, simple welcome. But you have to get the basics right. Think of the amount of people who go to the Qantas Facebook page and go, “My meal arrived cold” or “What meals are going to be served on my flight?” We decided, okay, great, we’re going to add that into our bot.
Looks like they did the same thing here.
But in this case, it’s fallen short.
So users will come and find all the possible ways to chat to your bot.
Where it got exciting, and we spent about a couple of weeks on this when we first got started: I tasked my entire team, designers, developers, UX and QA testers, to go find 100 bots, put them all into a spreadsheet, and say everything to them.
Tell it I love you, start swearing at it.
See how other businesses with their tone of voice actually resolve different queries.
In the middle, you’ve got an example where we swore at CNN’s bot, and it just said “that’s unexpected”, but it didn’t follow up with the user.
But if you look in the top right, you’ve got Tigerair: there’s negative sentiment.
And then it goes, “I’m sorry to hear that, let me transfer you,” with the option to go to the contact centre. So they’ve got a nice experience for handover. When you go live with your virtual assistant and people realise they’re talking to it, they will type the weirdest stuff in there. Some people don’t realise they’re actually talking to a bot, but other people know, and they try to game the system.
You know the stories: you call a call centre and get “press one for this team, press two for that,” and people just press 1-1-1 or 0-0, and eventually some of these businesses built automated rules to transfer those callers to a human immediately. Other businesses have built similar sorts of rules where, if you throw a negative word like “bomb” at an airline bot, nine times out of 10 it will transfer you immediately to a human, because these are called danger words.
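A danger-word rule like that is just a pre-check that runs before any intent matching. The word list and repeated-zero threshold below are illustrative assumptions, not any airline’s real list:

```python
# Sketch of the "danger words" escalation described above: certain
# words, or repeated zero presses, bypass the bot entirely and go
# straight to a human. The word list here is purely illustrative.
DANGER_WORDS = {"bomb", "emergency", "lawyer"}

def should_escalate(message, zero_presses=0):
    """True if this contact should skip the bot and reach a human."""
    words = set(message.lower().split())
    return bool(words & DANGER_WORDS) or zero_presses >= 2
```

Running this check first means the bot never tries to small-talk its way through a genuinely urgent message.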
And then you have really nice users.
One user chats to the bot.
He’s a frequent flyer, and he tells it good night. We didn’t have a response for that, and we saw he said it about 200 times over six months. So we decided to build a custom good-night message just for him.
So you can kind of think about those experiences for your users.
Those nice little small talk options.
So we started with that terrible announcement, “You are 32nd in line, 96 minutes,” and we’ve come full circle. Qantas gives you channel of choice: you can call us, you can text us, you can go through the website or the app, it doesn’t matter. If you want to talk to a human, we can call you back, or hand you over to a human immediately.
What we found is that, especially when people call, and like I said, the call centre is not going away, we’ve given them the option to hand over to the app or SMS immediately, and either self-serve or be handed over to a live-chat agent.
Especially when you’re on the go.
You could be travelling on the train, and you don’t want to drop out of reception on the way to the airport.
But with persistent chat, chatting to a human agent over text, you can actually resolve some of those queries without the user having to sit on a call.
So they can continue the conversation over a period of hours if need be.
Life gets in the way.
And does it work? I’ve put up some of the stats that I can publicly talk about.
In the first six months, our Facebook Messenger bot not only became one of the most popular in the Facebook ecosystem, we had a million conversations on it.
So, with all that data, we’re constantly looking at how to improve the experience. We look at all the different transcripts, the most popular buttons or icons that are pressed, how the responses are working. You need to be constantly using your data to evolve the experience.
And then, when we went live with what was on the previous slide, handing people over from the call into the chat experience in the app,
we worked out that about one in four calls are now being diverted, and customers are self-serving.
And they love that experience.
So, as a business, we’re helping them double down on that and see what other features we can add into the live-chat experience.
Whether it’s still talking to a human, or adding other features and more personalization. So when you’re inside the app now, you can see your points balance.
When people land, they want to see their points balance and their status credits immediately. That was one of the biggest requests; people would land and sometimes call the Qantas hotline and go, “Am I gold now? Am I gold now?” So we put that into the ecosystem, into the app, and that frees up the human agents for more complex queries.
That’s it, thank you again.
I wanted to keep it short and sweet.
I know it’s a long two days, thank you.