Designing responsibility during rapid technological change

It’s no wonder we are seeing a rising public backlash toward tech companies, or ‘techlash’, defining the state of the tech world in 2020. Data breaches, unfair working conditions, social polarisation, and bullying and harassment on social media are becoming the norm. Technology has even become a threat to global democracy.

It’s clear that there is an urgent need for designers to evolve our design practice in the face of these seismic technology shifts. So what can we do better?

This talk will give you three strategies to help you create better, safer products that take everyone into consideration, and to take more responsibility for creating technology that reflects our human values and intentions.

Designing responsibility during rapid technological change

Meg Blake – Product and Design Consultant, ThoughtWorks

Keywords: techlash, graphic design, human values, technological and sociological change, collaborative design.

TL;DR: Meg explores the roles and responsibilities that designers and technologists hold in ensuring their values are explicitly integrated into the design process, particularly around informed consent, diversity, and inclusivity. She shares personal examples from her own life and career, and addresses broader examples of the ramifications when unintended consequences of our creations transpire, expanding on the conception of design as a ‘rendering of intention’ to highlight the importance of observing and owning the consequences of our designs throughout their lifecycle. She closes by highlighting a range of cutting-edge practices, resources, and tools that can help creatives make their practices inherently collaborative and inclusive, and thus better equipped to co-evolve with rapidly changing technological and societal shifts.

Meg begins with a personal story that led her to think about the responsibilities that designers and technologists hold in the face of rapid shifts driven by social and technological change.

Four years ago, Meg received a text informing her that she was one of over half a million blood donors whose personal information had been compromised by the Australian Red Cross Blood Service: a backup of the appointment booking database had been saved to a public-facing server, where it sat exposed for up to three months.

This was deeply personal information about people who were literally willing to give their own blood to help others. Meg thought back to the questions she’d answered when first filling out the consent form, and to the usual questions that exclude her from donating blood.

Meg is a consultant and avid traveller, so questions such as “Have you travelled overseas in the four months before your donation?” typically exclude her. More personal questions included “Have you ever injected intravenous drugs?” as well as questions about her sexual behaviour. She thought about what the breach would mean not only for her, but for someone whose sexuality wasn’t accepted by their religious community, family, or authoritarian government. How might they be feeling today? Do they wonder if the information will come back to haunt them, or if they’ll be outed? What are the potential ramifications for them? Or perhaps there have already been some?

Meg also thought about the local design studio in Melbourne that had designed and developed the online booking service: exactly the kind of agency Meg had aspired to work for as a young design graduate.

What must that team be thinking today, and what must they wish they had done better? Other high-profile incidents coming back to bite us are scattered through the media today: data breaches; technology contributing to unfair working conditions through the gig economy; social polarisation, bullying, and harassment becoming commonplace; assaults taking place via dating apps. Technology has even become a threat to global democracy.

It’s no wonder we are seeing the techlash that has defined the technology market over the past few years. After this data breach, Meg and others were left pondering what responsibility creators bear. As Kat Zhou of Spotify puts it:

Those who have the privilege of creating products have the responsibility of defining ethical effects, as well as forecasting effects and ensuring that they pose no significant harm. – Kat Zhou, Spotify

Covid-19 has pushed more and more consumers online just to go about day-to-day activity, which heightens our responsibilities even further.

When Meg was studying graphic design, students were trained in creative expression, form, function, layout, typography, and concept generation, and to always listen and respond primarily to their clients. Since then, her career has evolved along with shifts in the market, from subjective creativity to problem solving and human-centred design practices in software. The transition from designing brands, brochures, and campaigns to working in an increasingly complex tech environment has meant changing practices and thinking about much broader considerations.

In designing, we are often met with a blank sheet of paper. Our job is to envision what could be and what change will occur. Jared Spool describes design as “the rendering of intention”. As designers, we help to discover, define, and articulate the intention behind the technology we’re creating. This might include:

  • Target audience
  • User needs and abilities
  • Brand
  • Features
  • Aesthetic
  • Revenue
With hard work and dedication, that intention is rendered into the technology we create. We aspire for it to be:

  • Useful
  • Usable
  • Valuable
  • Desirable
  • Beautiful
  • Profitable

But we are also starting to see unintended consequences: discrimination, inaccessibility, vulnerabilities, misinformation, exploitation.

We need to think about the values we want to be inherent in our technology while we’re in the intention phase, and how we can mitigate unintended consequences.

What can we do better?

1. Examine and evolve our design methods. Take responsibility for the software and tech products we design throughout their lifecycle. This is not just about articulating intent, but about observing and owning the consequences. We don’t have to reinvent the wheel! A host of organisations, technologists, and designers have developed methods to help:

  • Ethical OS is a model that identifies eight potential risk zones.
  • The Tarot Cards of Tech use well-designed prompts to open up perspectives and points of view in our teams (these pair well with creating “How might we” statements in the ideation phase).
  • Sensible Security Conversations cards help facilitate team conversations about security and build action plans to mitigate the risks of security incidents and data breaches.

ThoughtWorks is starting to look at these methods and curate a list, but it’s up to all of us to adopt, evolve, improve, and incorporate them into our work. Besides deficiencies in privacy protection, inherent bias also causes adverse effects. This bias lives not only in the people creating the software, but also in the data they use to make decisions. Examples include systems that predict criminality based on race, and Google’s voice recognition software, which upon first release was 70% more likely to understand a man than a woman.
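Disparities like that last one are measurable, and measuring them is one concrete way to own the consequences of a design. Below is a minimal, hypothetical sketch in Python: the sample data and the word-error-rate audit are illustrative only (not the talk’s material or Google’s actual evaluation), but they show how a team might quantify a per-group accuracy gap in a speech recogniser:

```python
from collections import defaultdict

# Hypothetical evaluation records: (speaker group, reference text, model transcript).
samples = [
    ("male",   "turn on the lights", "turn on the lights"),
    ("male",   "call my office",     "call my office"),
    ("female", "turn on the lights", "turn on the light"),
    ("female", "call my office",     "fall my office"),
]

def word_error_rate(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance divided by reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # Classic dynamic-programming edit distance over word tokens.
    dist = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dist[i][0] = i
    for j in range(len(hyp) + 1):
        dist[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dist[i][j] = min(dist[i - 1][j] + 1,         # deletion
                             dist[i][j - 1] + 1,         # insertion
                             dist[i - 1][j - 1] + cost)  # substitution
    return dist[len(ref)][len(hyp)] / max(len(ref), 1)

# Aggregate error rates per group; a large gap signals a biased model or dataset.
rates = defaultdict(list)
for group, reference, transcript in samples:
    rates[group].append(word_error_rate(reference, transcript))

for group, group_rates in rates.items():
    print(f"{group}: mean WER = {sum(group_rates) / len(group_rates):.2f}")
```

Run on real evaluation data sliced by demographic group, a report like this turns “the model seems worse for women” into a number a team can track and fix.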

Caroline Criado Perez’s book Invisible Women outlines example after example: transportation systems, medical devices, smartphones, and other everyday technologies designed with less consideration for women than for men. She writes:

No one meant to deliberately exclude women. It’s just that what may seem objective is actually highly male-biased. – Caroline Criado Perez

This is not surprising, considering women make up only 11% of software developers globally and 25% of Silicon Valley employees: a startling statistic given that the first computer programmers were women.

To make better decisions and mitigate unintended consequences, we need to think not only about how we design but who we design with. We need to be more inclusive and collaborative.

2. Be more inclusive. Not just of gender, but of expertise, ethnic background, perspective, age, education, and experience. Rather than going solo, work with team members, users, stakeholders, and communities. Be highly collaborative with data engineers, legal, and cybersecurity experts.

Diverse perspectives enable us to create better, safer, more inclusive products, and we are seeing change around diversity across the tech industry. Many companies and organisations are implementing policies and programs specifically designed to address inequities: Meg’s company is 50% non-male, including within leadership, and ThoughtWorks also offers transition leave for transgender employees. Many organisations lag in this regard, however. Take a look around: if your team thinks, looks, and acts like you, you probably have a diversity problem.

The rate of change in tech is enormous. Trying to predict the future news story that your product could create is inherently difficult. Even with evolved methods and a diverse team, history shows that humans are notoriously bad at predicting the future. It’s only natural that we have a bias towards the ways our products could benefit society rather than the ways they could harm it.

We’re living in a world where cause and effect can only be understood with hindsight, if it can be understood at all. – Rebecca Parsons, CTO, ThoughtWorks

3. Experiment, observe, adapt. Let’s look at an extreme example of how experimenting with emerging tech can help predict and avoid unintended consequences. ThoughtWorks runs a program with artists working at the cutting edge of technology and its interaction with society. Andy McWilliams, director of the program, describes its interest in art as emerging technology research: working with artists helps us to look forward and to build a view of what’s coming.

Neil Harbisson was born without the ability to perceive colour. He has had an antenna, with a webcam at the end of it, implanted into the back of his skull; it translates colours into vibrations that Neil perceives as sound. He can’t turn the device off, meaning his brain has adapted to the incoming data to the point where he now perceives an extra sense. He describes himself as a cyborg.
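As a toy illustration of how a colour-to-sound translation might work, here is a hedged sketch in Python. It is not Neil’s actual device or mapping; it simply assumes the frequency of visible light is transposed down, octave by octave, into the human hearing range:

```python
SPEED_OF_LIGHT = 3.0e8  # metres per second

def light_to_audible_hz(wavelength_nm: float) -> float:
    """Map a light wavelength (nm) to an audible frequency (Hz) by octave-shifting."""
    freq_hz = SPEED_OF_LIGHT / (wavelength_nm * 1e-9)  # light frequency, ~10^14 Hz
    while freq_hz > 20_000:  # halve (drop an octave) until within human hearing
        freq_hz /= 2
    return freq_hz

# Shorter wavelengths (blue) come out as higher pitches than longer ones (red).
for name, wavelength in [("red", 700), ("green", 530), ("blue", 470)]:
    print(f"{name}: {light_to_audible_hz(wavelength):,.0f} Hz")
```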

This is technology that exists today. By exploring the ways that a data feed can become a new human sense, we’re able to look ahead to possible futures, to examine the benefits and the possible unintended consequences, and to start creating a dialogue about them.

Luckily, we don’t have to become cyborgs to run experiments. We can treat every design decision as an experiment: observe the outcomes and have a conversation about what the implications might be. Rather than design being a one-way interaction, we need to think about how it can be far more reciprocal, continuously observing the results and course-correcting carefully to ensure that what we’re creating truly reflects our human values.

To conclude: it’s clear that as a tech industry we have far more work to do to humanise technology. If we can’t prevent a simple unintended consequence such as a data breach, we have a long way to go. Meg hopes these small strategies will motivate you to examine what you can do to take more responsibility for the consequences of your work, and help your work reflect both your values and the values of your team. Thank you.