On Mobile, Context is King

(upbeat music) - So I'm gonna talk about mobile and context today. First of all, what is context? Well, the definition of context is those things around us that define either the things that we do or the things we can do.

And now, after obviously asking you to stand up and stay awake I'd actually like you to close your eyes.

So while you're closing your eyes, think about the context that you're currently in. Where are you sitting now? In your chair, in a room, in a building, in Redfern, in Sydney.

The time is 3:15 and the date is the 10th of November. There have been events that led you here today, and there are things you've got planned for the coming days. Now open your eyes.

So thinking about context is not new.

We might not be aware of it, and sometimes it takes a moment of reflection to think about the things going on around us: why we're at a particular place at a particular point in time, or where we're going. But in terms of mobile, we've certainly been thinking about it for a long time. Jen and myself and a number of other people in this room have been in mobile now for, you know, 15, 17 years, and even 10 years ago, one of the key things about mobile was context.

We were now taking these computers out and about into the environment, and that changed the things that we were doing, and therefore the things around us had much more of an influence compared to a keyboard and a desktop display.

So 10 years ago, at MobileHCI in Singapore, Jarod Brodeman and the gang from Giant Ant produced this beautiful visualisation of a number of aspects of context.

So if we've been thinking about it for a while, why are we talking about it now? And of course that's because mobile has changed. The nature of mobile has changed significantly over the last 10 years.

The capabilities of the device, and the services that now feed into those devices, are so broad, cover such a wide range, and have been imbued with a lot more capability in and of themselves. Obviously Cas was just talking about context in terms of AI, and the definitions around that, and things like that.

And I think the key point, certainly from a design perspective is that context can add simplicity to interactions and create richness in experiences.

It's this ability to enhance.

And I think certainly most of us have got mobile services now: we've got the standard responsive site and we've got an app out there, whatever it might be. But we might not always be leveraging the power of mobile, and in the past, that could be quite tricky. It's this idea of: how do you design for the individual? It's always been complex, but now we're getting into things like machine learning, where we can create unique experiences, or service-based experiences, for individuals. We still need a framework to design those experiences, but they can be relatively unique.

Of course, there are three components of context. The first is the past: what have we done, what have we been doing, what does the system maybe know about us, and how can it then leverage that for the future? Then of course there's the now, which we've just talked about: what's the time, what's the date, where are we? And then there's the future.

And the future isn't always based on the past. We talked about modelling a minute ago, but the data that we have on our mobile sometimes also looks at what's coming up.

So a calendar is probably the easiest example of that: what are we gonna be doing, and when are we going to be there? Obviously, understanding context when we're designing for it requires customer research.

And I hope we're all in agreement around that. I've sat in the corner by the Quick Check kiosks at airports, and in the corners of business lounges. I've sat in the back of cars with Holden, going around a closed track, to watch how people interact with in-car systems and how they manage driving at the same time. I've hung out in retail stores and looked at how people use their mobile device to snap photos of an item they might want to purchase, or use an online reference service, or eBay, or whatever it might be, to look at the price and things like that.

And I've sat in homes and watched people, with their permission of course, go through the installation process of a broadband service or whatever it might be. So there's no doubt that to design for context, we need to understand what the customer's experience is. So today, let's look at a few examples.

And I really just wanna focus on three core aspects of context today.

The first one is, the contextual information that we get from the device itself.

Okay, so the mobile in and of itself, as an object.

The second one, where I think some really interesting stuff's happening, is the context of the mobile device relative to other smart devices: Internet of Things, smart home, whatever it might be. And the third one we'll look at today is mobile to environment.

So you know, what are we getting from our surrounding environment, where we are and when it is.

So let's start with mobile.

And there are some really easy examples. Of course, when you're a first party, on your iPhone, it knows your calendar.

Therefore when I open my maps app up, it knows where I'm supposed to be.

It knows where I'm going next, okay.

That's a really easy one, and again it's something that wasn't happening 10 years ago. For those of you who remember back that far, we used to have a Google Maps app where we'd have to go to our contacts app, get the actual address of the person we wanted to go and see, and then switch over to our maps app and type that address in.

Now we just tap on things or in this case it does it for us.

It prepares us for our next activity.

And again it's simplifying that interaction or that experience.
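
To make that mechanic concrete, here's a minimal Python sketch of the underlying idea: pick the next upcoming calendar event that has a location, and offer it as the destination. The event fields are made-up assumptions for illustration, not Apple's actual API.

```python
from datetime import datetime

def next_destination(events, now):
    """Suggest the address of the next upcoming event that has a location."""
    upcoming = [e for e in events if e["start"] > now and e.get("location")]
    if not upcoming:
        return None
    return min(upcoming, key=lambda e: e["start"])["location"]

events = [
    {"title": "Stand-up", "start": datetime(2017, 11, 10, 9, 30), "location": None},
    {"title": "Client meeting", "start": datetime(2017, 11, 10, 16, 0),
     "location": "123 George St, Sydney"},
]

# At 3:15pm, the maps app can pre-fill where we're supposed to be next.
print(next_destination(events, datetime(2017, 11, 10, 15, 15)))
# -> 123 George St, Sydney
```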

Not necessarily making it richer, but simplifying it. Now that's first party, that's Apple on the iPhone. A couple of months ago, I think it was back in June from memory, Uber offered the ability to integrate with your calendar.

Has anyone seen this or used this? It's very similar. Jen, there you go.

It's very similar, but by a third-party service of course. So you're joining two things, and that makes the whole experience more powerful. So now when I go to catch an Uber (and Lyft obviously isn't here in Australia), it can show me, again from my calendar, the places where I'm expecting to go.

So I don't have to type that address in.

Of course it's got your saved places and the places you might have been recently. But now it can also have the places that you're intending to go.

And that sounds really simple, but again it's just reducing that friction and making it a better experience.

Now during this talk, I wanna show you some interesting examples but I also wanna show you some examples where I think it's gone a bit wrong.

And I better check, is anyone from Qantas here? Hey, there we go, hi.

Say hi to the team for us.

This is an example where I don't think it's quite right. So we got the email the other day about how to join my Qantas account and my Uber account, and the steps you can see down the bottom there: open the latest version of the Qantas app, tap the book icon, tap Uber, and you'll be taken to the Uber app.

Now that doesn't feel as seamless.

When I'm going to catch an Uber to the airport, do I want to think about the other app that I have to open to then come back to the Uber experience? Maybe not, but I won't dwell on that.

This is a project that we've been working on recently with SANE, a mental health network, and it's for people with bipolar. We're currently going through a preclinical trial, looking at how we, or the phone I should say, might be able to provide people with bipolar with insights via their mobile phones. One of the key aspects of managing bipolar is apparently to look at things like sleep. So we ask those people: when would you like to be asleep? Now at the moment, we don't have the ability to tell whether they're asleep, but we do have the ability to tell that they're not asleep if they're using their phone.

And we're looking at general usage of various applications, so communication applications, games and things like that: the frequency and the volume of use of those applications, to look at how that might align with an onset of mania or similar.

So this is a tool for people with bipolar that gives them the ability to share that information with others, so that their carers or their family or their friends might be able to see activities or certain patterns of behaviour that could lead to this. And again, it's under their control. One of the key things that we wanna know, well certainly SANE wants to know, is: if I have bipolar, do I get frustrated with this app because it's giving insight that maybe I don't wanna see, and do I just delete it altogether? So we're very, very early on in this process, but you can imagine, if this goes forward, that later on we might look at other aspects of context. So not only how they use their mobile phone, but potentially information from a wearable, or other information from the smart home et cetera: how long they have been sleeping, or when they have been sleeping, and so on and so forth.
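
As a rough illustration of that "not asleep" signal, here's a Python sketch, assuming a made-up usage log rather than the trial's actual data:

```python
from collections import Counter
from datetime import datetime, time

# The insight: we can't tell that someone IS asleep, but phone use inside
# their declared sleep window tells us they AREN'T.
SLEEP_START = time(22, 30)   # user says: asleep by 10:30pm
SLEEP_END = time(6, 30)      # ...until 6:30am

def in_sleep_window(ts):
    t = ts.time()
    # The window crosses midnight, so "inside" means after start OR before end.
    return t >= SLEEP_START or t <= SLEEP_END

usage_events = [
    {"app": "messaging", "ts": datetime(2017, 11, 10, 23, 40)},
    {"app": "game",      "ts": datetime(2017, 11, 11, 2, 15)},
    {"app": "messaging", "ts": datetime(2017, 11, 11, 8, 5)},
]

night_use = [e for e in usage_events if in_sleep_window(e["ts"])]
print(f"{len(night_use)} usage events during the sleep window")
print(Counter(e["app"] for e in night_use))  # frequency per app category
```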

So that leads to the mobile and its relation to smart devices. I play around with this stuff, much to my wife's dismay sometimes, and we've got a Canary. I had it in the office first of all to play with it, but everyone got a bit annoyed, so I took it home.

And it can understand whether I'm at home because obviously my mobile is usually with me or whether my wife's at home and of course when neither of us are at home it changes its mode to monitor the environment and to observe it.

There is a condition, and you can see the third one down in the middle screen, where we might both be home but we might be in the bedroom asleep, and we still want that area monitored, okay. So you turn it to night mode, saying: the context is we're both here, but we do want it to monitor. And then obviously in the timeline it tells you when I arrive home and when I depart, and Amy as well.
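
That presence logic is simple enough to sketch in Python; the mode names and phone identifiers are illustrative assumptions, not Canary's actual behaviour:

```python
# Which phones are home decides the camera's mode, with a manual "night"
# override for when everyone is home but asleep.
def camera_mode(phones_home, night_override=False):
    if not phones_home:
        return "away"      # nobody home: monitor and alert
    if night_override:
        return "night"     # we're home but asleep: keep monitoring
    return "home"          # someone's up and about: stand down

print(camera_mode(set()))                               # -> away
print(camera_mode({"me", "amy"}))                       # -> home
print(camera_mode({"me", "amy"}, night_override=True))  # -> night
```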

I think there are times when it can go wrong when technology does create a bit more of a barrier than is required.

This one was put up the other day by One Bright Light: needing to download the app to go to the toilet. I don't think that's exactly a frictionless experience. Here's another project that we've been working on, a great little Australian startup called GOFAR, and it's an in-car experience.

So you buy the kit. The kit's got what we call the Ray, which sits on your dashboard, and it has a light on it that you can see, the blue light up the top. It then plugs into a dongle which plugs into your OBD port, which is on most cars after '96, '97. And essentially what happens is that while you're driving along nicely, you get a blue light, and if you press your accelerator too harshly, that light changes from blue to pink to purple to red. If you brake too harshly, it changes in intensity. You get this real-time visual feedback on how your driving is.

Now of course it's not trying to say you shouldn't brake hard for the dog if it runs out on the road, just 'cause you don't wanna negatively affect your points (you get points on braking and driving). The aim is to build up patterns of usage and to give you insight into that.
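
A toy Python version of that light, with made-up thresholds; the real device reads from the OBD port rather than raw g-forces:

```python
# Map how hard you accelerate or brake onto the colour ramp described above.
def light_colour(accel_g):
    """accel_g: magnitude of longitudinal acceleration, in g."""
    harshness = abs(accel_g)
    if harshness < 0.15:
        return "blue"    # smooth driving
    if harshness < 0.25:
        return "pink"
    if harshness < 0.35:
        return "purple"
    return "red"         # harsh acceleration or braking

for g in (0.05, 0.2, 0.3, 0.5):
    print(g, "->", light_colour(g))
```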

And we do provide context not only from the light but also you can use the app while you're driving, and we give you a simplified view of that.

So again, we change the information based on context. But then you can see where you drove, where you tend to brake harshly, or where you tend to accelerate in a different way as well. This is another one we did with a company called Wearable Experiments.

We've worked on a couple of projects for the Super Bowl last year, mainly a marketing sort of exercise, but the object itself was the white jersey, what we call the fan jersey, in the background there, and it's got a haptic ring around the neck. So a flat sort of rubber ring.

And the aim was to look at the role of a wearable when you're actually at the game.

So you might be watching it on TV or you might be actually at the game itself. So how can we enhance an experience of somebody in that context with information that's coming in via the mobile device? Now we worked on the app, but we don't expect people to look at the app during the game.

You've got a big score board up there that tells you the score.

And you've got a better sense of what's going on in the game, so how can we help you relate to what's going on on the field? They did a version of this earlier as well with the AFL. And the aim is that you can pick your favourite team of course, 'cause you wanna follow your team. And with the AFL one, what they were able to do was let people pick their favourite player.

And then they'd be able to get their favourite player's heartbeat.

So as their favourite player engages with the ball or whatever it might be, they would get a feeling of what that player was going through.
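
As a sketch of that heartbeat-to-haptics idea in Python: convert beats per minute into pulse timing. The vibrate() function is a stand-in for whatever the garment's firmware actually exposes.

```python
import time

def vibrate(duration_s=0.1):
    # Stand-in for the haptic ring; here we just log the pulse.
    print(f"buzz ({duration_s:.2f}s)")

def pulse_heartbeat(bpm, beats=5):
    """Play a streamed heart rate back as haptic pulses."""
    interval = 60.0 / bpm               # seconds between beats
    for _ in range(beats):
        vibrate()
        time.sleep(max(interval - 0.1, 0))  # wait out the rest of the beat

pulse_heartbeat(bpm=150)  # a player sprinting toward the ball
```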

And obviously you can use a whole range of data inputs. How do we engage people within that context in a very different way, to make them feel more part of the game potentially? I was at CHI in Denver this year in May, and this is one of the sensors that was being shown off and really showcased. Now it's a really interesting one.

I'm fascinated by all the different sensors we have: smartphones, smart home, IoT, et cetera.

This was one that you could actually put in your home. You put in a single one, or one per room, et cetera. And you essentially train it.

So you would put the sensor there and connect it to your device, and then you turn on the vacuum cleaner. It notices a change in the electrical signals and a whole range of inputs; then you turn it off, and then it understands that that is the vacuum cleaner. So this one sensor can monitor everything, if it's trained over a period of time to learn what those devices are.

So you don't need to build a sensor into all those objects to tell you when they're on or off.

You can have one sensor that understands, hopefully, all of them.
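
A heavily simplified Python sketch of that train-then-classify loop, matching new readings to stored signatures by nearest neighbour; the feature vectors are made up, and the real system uses far richer features:

```python
import math

signatures = {}  # label -> feature vector recorded during training

def train(label, features):
    """Record what the room 'looks like' while this appliance runs."""
    signatures[label] = features

def classify(features):
    """Match a new reading to the closest trained signature."""
    return min(signatures, key=lambda lbl: math.dist(signatures[lbl], features))

# Training: summarised readings (e.g. electrical/acoustic) per appliance.
train("vacuum cleaner", [0.9, 0.7, 0.1])
train("kettle",         [0.2, 0.1, 0.8])
train("idle",           [0.05, 0.05, 0.05])

print(classify([0.85, 0.65, 0.15]))  # -> vacuum cleaner
```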

And then lastly mobile to environment.

And again, I tend to try and make myself a bit of a guinea pig around this stuff so for probably three and a half years now I've tracked all my movement around using an app called Moves, and it's pretty much failing now since Facebook bought it.

It's sort of really fallen over.

But this is one of the services that visualises the information from Moves, called Move-O-Scope.

And you can see, you can pick a place.

So our office is in Surry Hills, and you can see I've been there 469 times.

You can see the days of the week that I tend to be in there. And you can see the time that I'm generally in the office as well and where I've come from.

So usually I've come from home, and then usually I go home.

But you can see the different paths that you've taken and so on.
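
The aggregation behind a view like that is straightforward to sketch in Python; the record format is an assumption about what a movement log might contain:

```python
from collections import Counter

# Count visits per place, then days and times for one place.
visits = [
    {"place": "office", "weekday": "Tue", "hour": 9},
    {"place": "office", "weekday": "Wed", "hour": 10},
    {"place": "home",   "weekday": "Tue", "hour": 18},
    {"place": "office", "weekday": "Tue", "hour": 9},
]

print(Counter(v["place"] for v in visits))        # how often I'm at each place
office = [v for v in visits if v["place"] == "office"]
print(Counter(v["weekday"] for v in office))      # which days I tend to be in
print(Counter(v["hour"] for v in office))         # what time I tend to arrive
```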

And again, this is information that could potentially feed into a system, and location is obviously a key aspect of context. Is Camie in the room? Hey Camie.

This is an interesting experience.

We did some work with IAG, and this is not IAG's fault, because it's the sign-in service that they happen to use.

But every time I walk up to the building, my watch beeps and says: would you like to check in to the building using that service? And of course Jones and other people around there use it too, which is great, because it makes it really easy: I just hit that and I'm good to go.

However, when I go to the Apple Store, when I go to the Optus store that's just down the road, when I go to a research facility around the corner, it still thinks that I'm in that zone and want to go to IAG. So it pops up on my watch a lot.

So for those of you who were in Rob Manson's talk: that location aspect isn't quite tight enough yet that we can really narrow it down, but Rob is saying that it is coming.

And so that's highly beneficial if I'm in the lobby of the building or just outside it, but not when I'm wandering down George Street.
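
A Python sketch of what a tighter prompt radius could look like, with illustrative coordinates and radii rather than the real geofence:

```python
import math

def distance_m(lat1, lon1, lat2, lon2):
    """Equirectangular approximation; fine at building scale."""
    x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    y = math.radians(lat2 - lat1)
    return 6371000 * math.hypot(x, y)

LOBBY = (-33.8688, 151.2093)
PROMPT_RADIUS_M = 50   # prompt only right at the lobby, rather than a
                       # big zone that also covers the stores down the road

def should_prompt(lat, lon):
    return distance_m(lat, lon, *LOBBY) <= PROMPT_RADIUS_M

print(should_prompt(-33.8689, 151.2094))  # at the lobby: True
print(should_prompt(-33.8720, 151.2060))  # down George Street: False
```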

Is John in the room? No, okay. So one of the things, and if you've ever seen Derek Featherstone talk, I've sort of leveraged his concept for this: one of the key aspects of context is time. So here we are.

It's the end of the second day, thank you for persisting. And you wanna check out what talk is next of course, or when the break is, or when I'm gonna finish up. But if you're on your mobile, you've gotta scroll down that far to see what's happening. Why don't we use time? When we're at a conference, the device knows what time it is, so why don't we just jump down to that part of the schedule?

Just make that little tweak to the customer experience and make it easier.

Now there are cases where people might want to see what happened earlier in the day, but I would suggest that most people are looking at the next thing that's happening.
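
That tweak is only a few lines of logic. A Python sketch with a made-up schedule:

```python
from datetime import time

schedule = [
    {"title": "Opening keynote", "start": time(9, 0),   "end": time(10, 0)},
    {"title": "AR in practice",  "start": time(14, 0),  "end": time(15, 0)},
    {"title": "Context is king", "start": time(15, 15), "end": time(16, 0)},
    {"title": "Closing",         "start": time(16, 15), "end": time(17, 0)},
]

def scroll_index(now):
    """Open the schedule at the first session that hasn't finished yet."""
    for i, session in enumerate(schedule):
        if session["end"] > now:
            return i          # first current-or-upcoming session
    return len(schedule) - 1  # day's over: show the last slot

print(schedule[scroll_index(time(15, 15))]["title"])  # -> Context is king
```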

Context, of course: now we've got the iPhone X, and Warby Parker put out an app the other day that will do a face scan and then recommend glasses for you. And interestingly, Warby Parker have also got the AR thing where it shows you the glasses.

They haven't quite connected the two experiences, but obviously the aim would be that you would scan your face and then see an AR experience of the glasses on you.

But that's gonna be fascinating to see where that whole space goes.

And of course we've seen a number of talks at the conference about AR.

So Ika and Rob both talked about that, and again, it's this other element of context, right? I can see my potential IKEA lounge in my room, and that extra information, that positioning of where it might go, or will it actually fit: all of those things give me more context and inform my purchase. One of the projects that we are looking at, more from an AR perspective, with HoloLens and the like, is actually looking at how we can overlay very rich data sets and give information to people, potentially farmers, about what's happening to their crop currently, and what they can play around with in terms of forecasting where the rain might fall and how things might change across the landscape and their environment.

And what if they change certain parameters on that? So what if they water this field, or that field? How will that influence how things happen over the next week? So again, giving you this context of the actual physical environment around you makes it much more meaningful to look at data in the place where it should be.

So thank you for hopefully staying awake.

And the key thing that I want you to reflect on from today is: how can you improve your services by just adding a bit of information about context, to make those experiences either richer, or to simplify the interaction?

So my last pitch is, Mobile HCI is next year in Barcelona which is great if you're interested in mobile and you like Barcelona, and it's also great if you're interested in Barcelona and you like mobile.

So for those who might be interested, I'm the industry chair, so please approach me if you wanna do a talk there. Thank you for your time.

(upbeat music)