(upbeat music) - Thanks everybody for your time today.
I've done a little bit of a switcheroo.
The original title of this talk was Minimising Bias To Create A Well Rounded Product. It's the same artist and the same topic, just a different title, so it's a little bit like Prince, formerly known as. This talk is going to highlight how research can be really powerful, but also how it can lead you astray.
Okay, so just a quick introduction about me and thanks for the introduction as well.
So I'm Anna Lee Anda, and I'm a senior user experience researcher at Zendesk. I have almost a decade of experience watching teams go astray due to mistakes in interpreting data.
So if you haven't heard of Zendesk before, we're a customer experience company, and we work with teams like support, sales, and engineering to improve the relationship between brands and their customers.
So bias: it's something that can cost a lot, and it can be really demoralising. Today I'm going to talk through a couple of examples from the industry, some of which you may have heard before, plus some Zendesk-specific examples, and then four phases in which you can minimise the bias in your work. So, the cost of bias on a large scale.
I'm gonna be firstly talking about Kodak versus Fujifilm.
So one adjusted along with the times and the other one didn't and is now defunct, I would say.
So firstly I'll start with Kodak.
So Kodak's main problem was myopia.
Essentially, they were really focused on creating a digital camera that had parity with their film camera experience. Retrospectively, we can say that the original motivation for adopting digital photography was more about sharing.
So it was less about the picture quality itself. Kodak had even explored digital photography; they had a prototype in 1975 and made various investments, but unfortunately they were still so focused on creating this ideal experience that they missed the opportunity.
If we contrast this with Fuji: Fuji used to be second to Kodak, and now they're a leader in digital photography. They were very purposeful in how they became the market leader they are today. One of the things they did was explore many products that supported, or were adjacent to, their film category.
So they looked at things like magnetic tape, optics, videotape, copiers, and office automation, and they even did a joint venture with Xerox. So we can see the difference between one company that focused on only one thing, and another that looked at all the opportunities and the different problems it could solve. Now let me tell you a story about Melinda Gates and her agriculture work with one of the world's largest private charitable organisations. So Melinda started off on this project.
They were looking at improving agriculture. It was a $360 million project in which they focused on only one type of user, in this case male farmers, instead of looking at all the different types of farmers. So they had to make a quick pivot in the project. A little bit about Melinda, if you don't know her: she's a philanthropist now, but she was formerly a GM at Microsoft, and in 2000 she founded the Bill and Melinda Gates Foundation.
And one of the reasons why they focused on agriculture was that 75% of the world's poorest people actually live in rural areas and so they rely heavily on agriculture.
So they thought this was a good problem to help with.
But originally, when the project kicked off, if you had asked Melinda to describe who she thought a farmer was, she would have said the farmer was male.
But as she did more research, she realised that there are many different types of farmers and that the problem wasn't as simple as she originally thought. So after reading this report, she realised they needed to spend time understanding the female farmer as well.
And what they found was that it wasn't that men were necessarily better at farming, it was about equal access to good land, to seeds, animals, help, tools, time, and know-how.
And some countries also prevent women from inheriting land.
Agricultural research in the past had focused on high-margin crops, which were typically farmed by men, so there was less innovation and investment in low-yield, or low-margin, crops.
And those were typically the crops farmed by women. So what did they do in this project? Apart from spending time understanding both male and female farmers, the Bill and Melinda Gates Foundation looked at implementing something called Farm Radio International.
It was a radio programme that taught women best practices for growing tomatoes. They had done research leading up to this to figure out what time women tended to listen to the radio, and to recognise when there was a good opportunity. For example, when men were home they might listen to something else, so the foundation looked at when to air these programmes. The overall aim was to make sure that women would get the information they needed to help them with their farming.
And this doesn't only happen on a big scale; it can even happen in our work at Zendesk. We have to make sure that we actively talk to the people who buy our product, but also to the consumers of our product as well.
The buyer may use Zendesk for an hour or two, whereas a customer support agent, that's their whole day inside our Zendesk products. And then the other thing that we have to be conscious of is that culture and location have an impact on the use of our product as well.
An example: we went to a BPO, a business process outsourcer, out in Manila, and we got a completely different impression of how our product is used compared to our users in the United States.
So now I'm going to be talking about the four areas to minimise bias and how you can take these away and apply them in your work.
The key thing, I would say, is to look at the problem from different angles. One of the things I mentioned before in our pre-chat was avoiding anchoring bias.
So really don't rely on that first piece of information you hear.
Really take the time to look at the problem space in different ways.
So these are the four phases that I'll be talking about. I'll be talking about where bias can happen in research recruitment, in interviewing, in surveying, and finally in the analysis phase.
So let's start with research recruitment.
There are three things or three areas to take note of when you're recruiting for research.
The first is finding the right people.
And as I mentioned before, as a B2B company it's all too tempting to go for the loudest customers, or the customers that are easiest to access.
The danger is that you may have your product direction go astray and you may actually be building something for a few rather than your whole customer base. So this is particularly pertinent when you're looking at building a new product versus a feature in an existing product.
You want to try to balance, as I mentioned before, speaking to the person who's actually buying the product and, if it's a different person, the one who's actually using it. Compared to the consumer industry, or B2C, I would say the risk there is slightly lower.
And then consider the opposite group.
And what I mean by this is you may have a particular idea in mind about who this product or feature is for, but it's always good to look at the opposite group. And I'll give you an example of this.
One of the features I was part of the team building is a feature that was originally for our enterprise customers, so larger scale.
It's called chat routing and essentially it helps to make sure that the right chat goes to the right person and because it's live it's very critical.
And originally we were only wanting to talk to the users of this feature.
We had released it as a beta and only wanted to talk to those people. But when we took the time to talk to the people who had tried it and then switched it off, we found a whole lot of reasons and opportunities to improve the feature. So it isn't only about talking to the people who are using it; non-users are just as important, and just as insightful, I would say.
And the third point, which is similar to the second: compare different groups. I'll give you another example of this.
So again, I was working on a project at Zendesk, with our support product and its integration with our chat product.
Again, we were originally thinking this would be only for our larger customers, but we took the time to talk to smaller customers as well.
So enterprise, and then SMB.
And again we found lots of different things and reasons, and they were just as important a user group as our large enterprise customers.
So find the right people, consider the opposite group and compare different groups.
Okay, the interviewing phase.
Two things I want to really emphasise.
One is don't lead the witness.
When you're talking to your users or potential users, you're doing it because you really want to find out what they need and understand them, rather than have them confirm what you want to hear or make you feel good about your product. So my tip there is to use open questions. What I mean by that is openers like tell me, explain to me, describe to me. The acronym is TED, and it really helps open up those questions so that you're not leading people and you're getting a very rich interview.
And I'm sure you hear this a lot in the changing room: does my butt look big in this? The problem with this question is that it focuses on the particular pair of pants, or skirt, that you may be trying on.
In this example you're missing out on finding out what makes you look good; instead you're focusing on this particular pair of pants, this particular solution. And what I really want to emphasise is that this question is not only leading, it's always going to get a bad answer. The person giving the answer is going to feel really uncomfortable, and the person asking may not appreciate the answer either.
So it's kind of a lose-lose situation.
So think about rephrasing, and about what you're trying to achieve. In this case you want to look good, so the question should be: what do I look good in? Rather than: does my butt look big in this? And then another one: make sure your survey is written correctly. Often you write a survey and send it out to your users, and they can't interpret what you're trying to find out. You have to make it so explicit that it can stand on its own.
There are a couple of problems with this question, and from the audience reaction I think you can tell.
One is that it's biased: you're asking people to tell you how much they liked the product, and they may not like the product, so it's already putting them in a particular position or frame of mind.
And the second issue is that the responses only run in one direction, so basically all you can say is how much you really like it.
So think about how you can have these open questions that really capture what you're trying to find out when you're sending out a survey.
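To make that one-sided-scale problem concrete, here's a minimal sketch. The function and the response labels are hypothetical examples, not anything from a Zendesk survey: the idea is simply that a balanced item gives dissatisfied respondents somewhere to go, instead of only offering degrees of liking.

```python
# A leading scale only lets respondents say how much they liked it:
leading = ["Liked it", "Liked it a lot", "Loved it"]

def balanced_scale(negative, neutral, positive):
    """Build a bipolar response scale: equal numbers of negative and
    positive options around a neutral midpoint."""
    assert len(negative) == len(positive), "scale must be symmetric"
    return list(negative) + [neutral] + list(positive)

# A balanced scale covers both directions of the attitude:
scale = balanced_scale(
    negative=["Very dissatisfied", "Somewhat dissatisfied"],
    neutral="Neither satisfied nor dissatisfied",
    positive=["Somewhat satisfied", "Very satisfied"],
)
print(scale)
```

Pairing a scale like this with an open follow-up question ("describe to me why you chose that") keeps the survey from only confirming what you hoped to hear.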
And then the analysis phase.
So three points that I want to share about the analysis phase, always consider other data sources.
So beyond the main way you're collecting data, think about things like churn data, voice of the customer, what your customer support team is saying or hearing, sales and success, and any feedback from a forum, for example.
So look at all the data and triangulate it and look at whether it's giving you that big picture or not. And similar to the research recruitment I mentioned before, compare different groups of user data.
Again, you want to triangulate and get the big picture.
You want to see how one user group compares to another. In this case, I feel more data is a little bit better.
And then the third point is, don't confirm what you think you already know. So don't look for the data to answer one particular thing. Have it build you that bigger picture.
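The triangulation idea can be sketched in a few lines of code. This is a toy illustration, and all the figures and metric names are invented for the example; the point is just that laying each segment's signals side by side makes it harder to cherry-pick the one source that confirms what you already believed.

```python
# Invented example data: the same question answered by several
# independent sources, broken down by user group.
sources = {
    "survey_satisfaction": {"enterprise": 0.72, "smb": 0.41},
    "monthly_churn":       {"enterprise": 0.05, "smb": 0.18},
    "tickets_per_100":     {"enterprise": 12,   "smb": 31},
}

def triangulate(sources):
    """Pivot source -> group data into group -> source, so each
    segment's signals can be read side by side rather than one
    metric in isolation."""
    by_group = {}
    for source, groups in sources.items():
        for group, value in groups.items():
            by_group.setdefault(group, {})[source] = value
    return by_group

for group, signals in triangulate(sources).items():
    print(group, signals)
```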
It's a combination of looking at the data and having it support what you're investigating, but also looking at other opportunities out there based on what you're seeing in that information. And then a bonus one, I would say: let others review your work.
So really rely on your team and extra points if they want you to be wrong, because they're probably looking for something that you've missed.
What I mean by this, here's an example: this is our team out of Singapore.
We have a combination of engineers, a product manager, a designer, and a researcher all participating. This is what we call our read-out session, and I'm happy to talk more about it afterwards, but essentially, by doing something like this, we all take what we heard from an interview and record it down.
It minimises the bias in what we're hearing, and it makes sure that we're very careful about interpreting what we're hearing.
And also there are added bonuses as well.
Everyone is involved, it's really a cross functional effort.
The knowledge is gained on the go, and also people get visibility about what you're learning and hearing.
So this is a great final stage as part of your analysis.
And one more thing I forgot to mention: this captures and records the data so you can keep it in perpetuity and, if done well, use it for your other projects as well.
So I wanted to recap the key takeaways. Find the right people; make sure you're not focusing on one small subset, but looking at different groups, maybe the opposite group, maybe large versus small.
Look at a combination of your users.
Don't lead the witness.
Make sure you're there to listen to what they have to say, rather than to have them confirm what you think you want to hear. Make sure the survey is written correctly.
Start with the right questions. Especially, as I mentioned before, if you're in enterprise, you may not have the chance to keep emailing your users over and over again, so a bad survey might be a wasted opportunity.
So make sure it's written correctly.
Don't confirm what you think you already know, otherwise the work may not be as valuable and you're missing out on opportunities.
Look at the data in different ways, triangulate it and see what that data has to offer. And then the bonus, let your team review your work, especially if they want you to be wrong.
And with that you can also have them review things like your interviews, you could pilot with them, and you can also have them test or try out your survey. So they all go together.
Thank you for your time.
(crowd applauds) (upbeat music)