There are many examples of how invisibility in society is increasing. More and more is happening behind the scenes: data generation, algorithmic computation, and decision-making. It’s becoming harder to grasp the wider effects of individual and corporate actions based on things that are invisible to us. At the same time, the level of trust in society is decreasing.
Fake news and scandals about companies that misuse data hit us every day.
What is fake and what is true? And who should we trust?
Because trust is essential: Human interactions, relationships and systems are all based on trust.
Every day, we put our trust into people, organisations and services.
It’s fundamental to our communication – a vital currency.
Fictional entities like brands, communities, democracy and trade only exist if we all trust them.
We need to design for trust.
This session explores how we can design for trust in the things that we can’t see. Trusting Invisibility!
— Alex ✨ Webdirections Summit week (@skougs) October 31, 2019
Katja is talking about design ethics and trust… how designers use the incredible power and opportunity they have.
We are in a trust crisis.
When we are vulnerable we let people have power over us, because we trust they will do the right thing. But what we currently see in industry and politics is that our vulnerability is used and abused.
As designers, we are not having enough "just because we can, doesn’t mean we should" conversations. People may not be asking the question; they may not even know there is a question to be asked.
Examples of companies that have abused our trust:
- Facebook → took our data → gave it to Cambridge Analytica → swung an election. The terms and conditions allow it, but a bunch of designers have been involved in CA and other scandals.
- VW → dieselgate. This was a massive breach of trust from a company promoting itself as a green option.
- Boeing → two planes had crashed, but it took a presidential order to ground the planes. They had known about the problem for years.
We as designers have a responsibility to push back.
Then there’s Trump. “Fake news” is not a new thing. It’s been a thing throughout history. But what is new is robots creating new networks of fake news and pumping it out at an incredibly high rate. So fast that it will take an AI to detect this kind of misinformation campaign and counter it.
Example: The Spinner, which lets you pay $29 to present targeted information to someone and manipulate them subliminally. Packages include "get your wife to initiate sex more often", "get your parents to buy you a pet", "get back together with your ex". Gaslighting As A Service.
Anything we design will face questions of trustworthiness.
Yet more and more of what we design will be invisible. Recommendations for content, algorithmic content, these things are invisible – who’s making the decisions on how they work? Developers. Their job and training is to create great code, not necessarily the human design side, or the ethical side.
Invisibility is increasing.
- 2.5 quintillion bytes of data created every day
- 200 billion smart devices by 2020
- 85% of customer interactions will be managed by machines by 2020
Trust is decreasing. Edelman trust barometer:
- Trust declines year on year in 10/15 sectors
- In 20/28 countries there is a general distrust in institutions
- The US experienced the biggest drop, from 52% to 43%
2 500 000 000 000 000 000 bytes of data are created every day.
By 2020, 85% of customer interactions will be managed by machines. By 2025, that number will be 95% pic.twitter.com/8k7TXPi8qK
— Ivy Hornibrook (@ivyhornibrook) October 31, 2019
Who do you trust any more?
A quick experiment at Designit showed a non-technologist could create a credible deep fake video in half an hour. What does that do to trust?
All of our human interactions are based on trust. If we can’t trust each other, society can’t function.
We are the only species that trusts like this. Theory of Mind: parts of our brain let us do an amazing trick – we can project ourselves into someone else’s mind. Oxytocin, dopamine and empathy allow and encourage us to share people’s emotions.
Trust can be designed for… but how?
There are three basic ways to decide if you will act in trust:
- Risk – you don’t entirely trust it but you must choose, so you accept risk
- Awareness – you are aware of the risks, and you choose to proceed
- Trust – you can proceed with confidence. You know what will happen and you trust the parties involved.
Most services stop at risk or awareness – here are the T&Cs, here’s the risk – take it or leave it. People don’t use those services out of a sense of trust. You need a social contract to enable a trust leap.
Interfaces are disappearing and the decision making systems are hidden away in backends. How can you trust something you can’t see or even understand?
— Ivy Hornibrook (@ivyhornibrook) October 31, 2019
Driverless cars have a bad rap after some accidents. Trying to regain and build ways to trust an inanimate object like a car is difficult. A prototype has been created with big cartoon eyes that “look at” pedestrians, simulating eye contact to demonstrate and reassure that the car has spotted them.
Looking at the history of trust, we’ve moved from small communities where you could know everyone; to very large cities where you cannot possibly know everyone. That means we moved from individual trust to systemic trust. We don’t just place trust in individuals, we also trust institutions. We trust our banks, governments…aged care facilities? Do we truly trust these institutions?
We have a new trust paradigm. It’s an exciting time in history, and for designers within it. Designers are responsible for designing how things work. Trust has been eroded – younger generations are more suspicious and less likely to trust – but new technologies like blockchain are arriving that can codify transactions, codifying the basis of some forms of trust. There are great opportunities for using these technologies in some surprising sectors, like food provenance (counterfeit food is a significant problem).
This changes the shape of trust networks: from centralised, to platform-facilitated, to fully distributed.
Distributed trust is a new paradigm.
- (telecoms company) Growing and developing with fun – a legitimate use of gamification! Gaming interactions built up a profile that let people trust that someone else was an expert in a certain area.
- (aged care facility) A project to engage the community to spend time with the elderly – it created a buddy system. It was valuable for people to meet someone new, and valuable for the elderly, who often battle loneliness.
- You can only really design trustworthiness, not trust. Show that you are worthy and people can choose to trust you.
- You can’t control your brand, people’s perceptions, or their trust in your brand.
- Just because someone engages with you doesn’t mean they trust you. They may not have options, they may have other priorities, or they may simply be accepting the risk.
- Trust does not equal transparency. Trust is still a leap into the unknown.
- Transparency doesn’t mean clarity – would the average person understand the deep details of how an algorithm really works?
The relationship between trust and transparency is a confidence threshold, where the balance tips and people feel confident.
Trust is not absolute. We talk about trust in an absolute sense – we do or don’t trust someone – but when you dig further you find out why they do or don’t trust. As design practitioners this should be familiar and encouraging. The why is more revealing and useful than the absolute.
You can break trust down into more nuanced areas:
- ability – competence; are they capable of doing the job?
- benevolence – is this entity’s intention good? Do they mean well?
- integrity – are they being honest? Are they upholding promises and telling the truth?
There are things you can include in your interview script to explore these areas.

Trust:

- takes time to build
- is very quick and easy to lose
- will constantly be questioned
- needs to be reinforced
Ways to build trust:

- partner with a company that already has a good trust basis – eg. Volvo, which everyone sees as safe. This works provided you are not deceiving people.
- authority – eg. B Corps, establishing expertise
Manipulation is not our purpose in designing for trust. Do not take this and use it for evil. Do not engage in trustwashing.
Take on the human perspective when shaping futures. If you are working on machine learning or AI projects, you are at the forefront of this issue.
What matters tomorrow is designed by us today. The runway is very short. You need to have your engine started now.