Designing responsibility during rapid technological change.
(upbeat music) - Today I wanna share with you a personal story that led me to think about the responsibilities we hold as designers and technologists in the face of the rapid shifts that technology is driving in society today.
Four years ago, I received a text message informing me that I was one of over half a million blood donors whose personal information had been compromised by the Australian Red Cross Blood Service.
A backup of the appointment booking database had been saved to a public-facing server and was exposed for up to three months.
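The talk doesn't go into the infrastructure behind the breach, but the failure mode itself is easy to test for. Here's a minimal sketch of the kind of pre-release check a team could automate; the host and file paths below are hypothetical, not details from the actual incident.

```python
# Minimal sketch: probe a public-facing host for backup files that
# should never be reachable without authentication.
# The host and paths are hypothetical examples.
import requests

PUBLIC_HOST = "https://bookings.example.org"  # hypothetical service URL
SUSPECT_PATHS = [
    "/backups/bookings.sql",
    "/db/backup.sql.gz",
    "/bookings_backup.zip",
]

def find_exposed_backups(host: str, paths: list[str]) -> list[str]:
    """Return any paths an unauthenticated client can fetch."""
    exposed = []
    for path in paths:
        try:
            # HEAD is enough: we only care whether the file is reachable.
            response = requests.head(host + path, timeout=5, allow_redirects=True)
        except requests.RequestException:
            continue  # unreachable is fine; we're hunting for 200s
        if response.status_code == 200:
            exposed.append(path)
    return exposed

if __name__ == "__main__":
    for path in find_exposed_backups(PUBLIC_HOST, SUSPECT_PATHS):
        print(f"WARNING: backup reachable without auth: {path}")
```

A check like this, run on every deployment, turns "we hope nothing leaked" into an observable property of the system.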
The exposed data included very personal information about people who were literally willing to give their own blood to help others.
So I thought back to the questions I'd answered when I'd filled out the form to donate blood. I thought about the usual questions that exclude me from donating.
I'm a consultant and an avid traveller, so questions such as whether or not I had been travelling in the previous four months are usually ones that exclude me.
There were also questions of a much more personal nature, such as whether I had ever injected intravenous drugs, and other questions about my sexual behaviour or my partner's.
I thought about what that meant not just for me, but for someone whose sexuality was not aligned with their religious community, their family or an authoritarian government in their home country. How might they be feeling today?
Do they wonder if this information will come back to haunt them? Will they ever be outed, and what would the ramifications be? Perhaps they already have been.
I also thought about the local design studio in Melbourne that had designed and developed the online booking service. This is the kind of agency that I had aspired to work for as a design graduate.
And I started to think about what that team must have been thinking that day, and what they wished they had done better.
Other high-profile incidents of technology coming back to bite us are scattered throughout the media today.
Data breaches, technology contributing to unfair working conditions through the gig economy.
Social polarisation, bullying and harassment are now becoming commonplace. Assaults are taking place via dating apps.
Technology today has even become a threat to global democracy.
It's no wonder we've started to see the "techlash" defining the technology market over the past few years.
So after this data breach, the questions I started to examine, along with people like Kat Zhou, were: for those of us who have the privilege of creating products, how can we take more responsibility for defining and forecasting their ethical effects, and for ensuring they pose no significant harm? COVID-19 has pushed more and more consumers online just to go about their day-to-day activities, which makes our responsibility even greater than before. So today I wanna share with you three things that I think we can do better as technologists and designers.
First, I wanna go back to when I studied graphic design. We were taught to be creative in expression, form, function, layout, typography and concept generation. And we would always try to listen and respond primarily to our clients.
But since then, my career as a designer has naturally evolved along with the shifts in the market: from subjective creativity to problem solving and human-centred design practises in software. Like many of us, moving from designing brands, brochures and campaigns into an ever more complex technology environment has meant we've had to change our practises and think about far wider considerations.
So when designing, we are often met with a blank sheet of paper. Our job is to envision what could be and what change will occur.
In thinking about the impacts of our design decisions, I'm going to use the definition of design from veteran user experience designer Jared Spool. He describes design as the rendering of intention. As designers, we help to discover, define and articulate the intention behind the technology that we're creating. This might be the target audience, user needs and abilities, the brand we're trying to express, the features we need to adopt, the aesthetic, and also how we think we're gonna make money.
With a lot of hard work and dedication that intention is rendered into the technology that we create.
And we hope that it's useful, usable, valuable, delightful, beautiful and also profitable.
What we're also starting to see are the unintended consequences.
These might be discrimination, inaccessibility, vulnerabilities, misinformation or exploitation. So what I think we need to start doing is actually think about the values that we want inherent in our technology in that intention phase, and think about how we're gonna mitigate those unintended consequences. So the first thing that I think we can do better as designers and technologists is to examine and evolve our methods.
I believe it's our responsibility to evolve our methods and take responsibility for the technology and the software products that we design throughout their lifecycle.
It's not just about articulating our intent; it's also about observing and owning the consequences. Luckily, we don't have to reinvent the wheel: there's already a movement of organisations, designers and technologists developing methods for us to use.
EthicalOS helps teams to examine eight potential risk zones throughout their product development lifecycle. Tarot Cards of Tech is another example of beautifully designed prompts that open up perspectives and points of view in our teams.
They pair really well with creating "How might we" statements in the ideation stage.
And these sensible security conversation cards are a great way to workshop and facilitate a conversation with your team about security, and to come up with an action plan to mitigate the risk of security incidents and data breaches.
So I've started to look at these methods and to curate a list of ones that we can use to develop technology more responsibly.
But it's up to all of us to try and adopt these methods, evolve them, make them better, and incorporate them into everything that we do. Besides deficiencies in protecting people's privacy and security, we now know that inherent bias also causes adverse effects for many in the world who rely on technology today.
Bias sits not only with the people who are creating software, but also in the data that they use to make decisions, with certain examples of systems that predict criminality based on race.
Google's own voice recognition software, for example, was 70% more likely to understand a man than a woman when it was first released.
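The methodology behind that figure isn't detailed in the talk, but the underlying check is simple to express: measure the system's accuracy separately for each demographic group on a labelled test set and compare. A minimal sketch, with hypothetical data and field names:

```python
# Minimal sketch: compare a recogniser's accuracy across demographic
# groups on a labelled test set. All records here are made up.
from collections import defaultdict

# Each record: (group, reference_transcript, system_transcript)
results = [
    ("woman", "book a donation", "book a donation"),
    ("woman", "change my appointment", "chain my appointment"),
    ("man", "book a donation", "book a donation"),
    ("man", "change my appointment", "change my appointment"),
]

def accuracy_by_group(records):
    """Exact-match accuracy per group. Real speech evaluations would
    use word error rate, but the disparity check has the same shape."""
    correct, total = defaultdict(int), defaultdict(int)
    for group, reference, hypothesis in records:
        total[group] += 1
        correct[group] += int(reference == hypothesis)
    return {group: correct[group] / total[group] for group in total}

print(accuracy_by_group(results))
# A large gap between groups is the signal to dig into the
# training data before shipping, not after.
```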
In a book called "Invisible Women", Caroline Criado Perez outlines examples of transportation systems, medical devices, smartphones and other technologies that we use every day that are designed with less consideration of women than men. And it's no wonder, when women today make up just 11% of software developers globally and 25% of Silicon Valley employees.
It's a pretty saddening statistic given that the first computer programmers were women, who emerged during the Second World War.
So to make better decisions, and to have more chance of foreseeing unintended consequences, we need to think not only about how we design, but who we're designing with.
We need to be far more inclusive and collaborative. So the second point I wanna talk to you about today is that we need to be inclusive of far more than just gender: in expertise, ethnic background, perspective, age, education and experience.
Rather than rely on the opinion of a single designer, we need to work closely with our team members, users, stakeholders and communities, and be highly collaborative with data engineers, legal and cyber security experts. Diverse perspectives in technology enable us to create better and safer products that take everyone into consideration.
Not just one section of our community, such as hoodie-wearing tech bros.
I'm glad to say that we are seeing change in the industry around diversity.
Many organisations are introducing policies and programmes to increase diversity in their teams and in leadership. And I'm glad to say that I work for a company that is now more than 50% non-male, with over 50% representation of non-men in our leadership teams.
We're also one of the few companies I know of in Australia that offers benefits such as transition leave for our transgender colleagues.
But there are still many organisations lagging behind. And I think all of us need to look around at who we're working with. If our team thinks, looks and acts like we do, we probably have a diversity problem.
The rate of change in technology is enormous, and trying to predict the future news story that your product could create is inherently difficult. Even with improved methods and a highly diverse team, history has shown us that humans are notoriously bad at predicting the future.
It's only natural that we have a bias towards imagining the ways our products can benefit society, and not focus as much on how they might harm it. And there are some effects that are just nearly impossible to foresee.
So the third thing that I think we can do is to experiment more, observe the impacts and then adapt as we learn.
And to demonstrate my point, I wanna share with you quite an extreme example of how experimenting with emerging technology can help us to predict some of the unintended consequences we might encounter.
ThoughtWorks runs an arts programme that puts us on the cutting edge of technology and its interaction with society.
The director of the arts programme, Andy McWilliams, describes their interest in art as a form of emerging technology research.
Working with artists helps us to look forward and to build a view of what's coming.
Neil Harbisson was born without the ability to perceive colour.
Neil has implanted an antenna, with a webcam at the end of it, into the back of his skull. It translates colours into various vibrations that Neil perceives as sound.
And importantly, he cannot turn the device off, which means that his brain has adapted to the incoming stream of vibrations and data that he now perceives as an extra sense.
So Neil now describes himself as a cyborg.
This is technology that exists today.
And by exploring the ways that a data feed can become a new human sense, we're able to look ahead at possible futures to examine the benefits and the unintended consequences and to start creating a dialogue about it.
But luckily we don't have to become cyborgs to run experiments.
We can treat every design decision as an experiment to observe the outcomes and to have a conversation about what the implications might be.
Rather than design being a one-way interaction, we need to start thinking about how it can be far more cyclical.
Where we're continuously observing the results and course-correcting carefully to ensure that what we're creating truly reflects our human values.
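To make that loop concrete, here's a minimal sketch of what a cyclical design decision could look like in practice; the metrics, names and thresholds are all hypothetical illustrations, not a prescribed framework.

```python
# Minimal sketch: treat a design change as an experiment with an
# explicit guardrail, and course-correct when the guardrail is breached.
# Metric names and thresholds are hypothetical.
from dataclasses import dataclass

@dataclass
class Experiment:
    name: str
    success_metric: float    # e.g. completed bookings per visit
    guardrail_metric: float  # e.g. complaints per 1,000 users
    guardrail_limit: float   # harm we've agreed not to exceed

def decide(exp: Experiment, baseline_success: float) -> str:
    """Keep a change only if it helps AND doesn't breach the guardrail."""
    if exp.guardrail_metric > exp.guardrail_limit:
        return "roll back"           # harm observed: course-correct
    if exp.success_metric <= baseline_success:
        return "iterate"             # no harm, but no benefit yet
    return "keep, and keep watching" # the design stays under observation

print(decide(Experiment("one-tap rebooking", 0.42, 1.3, 2.0),
             baseline_success=0.38))
```

The point is less the code than the shape of it: the harm metric is declared up front, alongside the success metric, so observing consequences is part of the design rather than an afterthought.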
So in conclusion, it's clear that as a tech industry, we have far more work to do to humanise technology. If we don't have the ability to prevent even a simple unintended consequence such as a data breach, we have a long way to go.
I hope these small strategies will motivate you to examine what you can do to take more responsibility for the consequences of your work.
And I hope that they help you reflect both your values and the values of your team.
(upbeat music) Thank you.
(upbeat music)