Designing Civility

For much of the history of the Web, conversations and other digital analogues of real-world social interaction have been a key part of user engagement. From Usenet, IRC and AOL chatrooms to GitHub, Reddit, and Twitter, design patterns like threaded commenting, liking and favoriting, and up- and down-voting lie at the heart of the social media and online platforms – Stack Overflow and GitHub among them – that make up our communication tools. With digital literacy at an all-time high, new communication models are starting to supplant SMS.

Digital interactions are so utterly commonplace that, given little additional thought, they can seem broadly benign. But in the era of the Arab Spring, the current US election, Gamergate, and the Panama Papers, digital tools are the catalyst and common ground of “robust discussions”. Do they really give rise to civil behavior and benign outcomes, or are they fundamentally problematic? Can we design our way to better, healthier online behavior? Can empathy really be designed for the web?

Xavier Ho @Xavier_Ho

The important question of the day, then #direction16

1:56 PM – 10 Nov 2016


The definition of Gamergate depends on who you ask: to those in the group, it’s political activism; to those who are not – particularly its victims – it’s a hate group. Caroline Sinders spent two years on a deep ethnography of Gamergate and how Gamergaters communicate.

Fandoms, activism and online harassment create emotional spaces inside infrastructure. In some senses, the nature of fandom is the same regardless of what it’s applied to. If a Bieber fan talks in a similar pattern to a Gamergater, how can harassment be identified in a large system?

Twitter has UI issues when trying to handle hate actions. A retweet can be used to incite a ‘dog pile’ attack, where large numbers of people send a high volume of hate messages. The volume is too much to handle; all you can do is walk away. Twitter’s TOS don’t really cover this, and the interaction model for hate is the same as the interaction model for legitimate messages. Retweeting also lets attackers circumvent blocking: you can block the original poster, but you’ll still get messages from their followers.

This isn’t a freedom of speech issue. This is a design problem.

There are significant technical issues with blocking content by keyword across the planet (the Scunthorpe problem: naive substring filters also block innocent words that happen to contain an offensive string).
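A minimal sketch of why this is hard, assuming a hypothetical one-word blocklist (“ass” is a classic Scunthorpe-style substring that appears inside innocent words like “class”):

```python
import re

# Hypothetical blocklist entry for illustration only.
BLOCKLIST = ["ass"]

def naive_filter(text: str) -> bool:
    """Substring matching: flags innocent words (false positives)."""
    lower = text.lower()
    return any(word in lower for word in BLOCKLIST)

def word_boundary_filter(text: str) -> bool:
    """Whole-word matching avoids the Scunthorpe problem, but still
    misses deliberate obfuscation (spacing, lookalike characters)."""
    lower = text.lower()
    return any(re.search(r"\b" + re.escape(word) + r"\b", lower)
               for word in BLOCKLIST)

print(naive_filter("Sign up for the class"))          # True: false positive
print(word_boundary_filter("Sign up for the class"))  # False
print(word_boundary_filter("You ass"))                # True
```

Neither version handles misspellings, other languages, or context, which is why a single planet-wide keyword list cannot work.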

The flow for reporting a photo on Facebook demonstrates why ‘feeling uncomfortable’ is a shorthand. People sometimes argue that ‘discomfort’ shouldn’t lead to ‘censorship’; but discomfort can stand for deep issues around being, and feeling, safe in the real world.

The action is to request “remove photo”. Why? Because I feel uncomfortable.

  • I don’t like the photo
  • I’m afraid I’ll lose my job
  • I’m afraid I’ll upset my family
  • I’m afraid I’ll upset my peers
  • I feel unsafe

Think of every post as an individual ecosystem. There are four broad types of post:

  • Town hall – open, public, uncontrolled. This talk, for example, is a town hall.
  • Front porch – it’s not entirely public, but it’s a little private.
  • Living room – it’s private, but not the most private.
  • Bedroom – the most private.

What if Twitter allowed users to choose distribution and privacy levels per post? This is not out of the question; many other systems do or have done this already, with Periscope a contemporary example.
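The four-level taxonomy above could be modelled as a per-post audience setting. This is a hypothetical sketch, not any platform’s actual data model; all names and the visibility rules are assumptions:

```python
from dataclasses import dataclass
from enum import Enum

class Audience(Enum):
    # Hypothetical per-post privacy levels mirroring the four types above.
    TOWN_HALL = 0    # open, public, uncontrolled
    FRONT_PORCH = 1  # visible to followers
    LIVING_ROOM = 2  # visible to mutual follows only
    BEDROOM = 3      # visible only to named recipients

@dataclass
class Post:
    author: str
    text: str
    audience: Audience = Audience.TOWN_HALL
    recipients: frozenset = frozenset()

def can_view(post: Post, viewer: str, followers: set, mutuals: set) -> bool:
    """Assumed visibility check: each level narrows who can see the post."""
    if viewer == post.author or post.audience is Audience.TOWN_HALL:
        return True
    if post.audience is Audience.FRONT_PORCH:
        return viewer in followers
    if post.audience is Audience.LIVING_ROOM:
        return viewer in mutuals
    return viewer in post.recipients  # BEDROOM
```

The design point is that the distribution decision moves from the platform to the author, one post at a time.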

Mockup: a Twitter screen allowing filtering according to criteria such as the age of the account, as that would stop the instantly created troll accounts so popular with some groups.
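The account-age criterion from the mockup is trivial to express in code. This is a sketch only; the 30-day threshold is an assumed value, not something from the talk:

```python
from datetime import datetime, timedelta

def passes_age_filter(account_created: datetime,
                      min_age_days: int = 30,
                      now: datetime = None) -> bool:
    """Hypothetical filter: hide mentions from accounts younger than
    min_age_days. Threshold is an assumption for illustration."""
    now = now or datetime.utcnow()
    return now - account_created >= timedelta(days=min_age_days)

# A throwaway account created yesterday would be filtered out:
now = datetime(2016, 11, 10)
print(passes_age_filter(datetime(2016, 11, 9), now=now))  # False
print(passes_age_filter(datetime(2016, 1, 1), now=now))   # True
```

Combined with the audience levels above, filters like this give targets of dog-piles a way to mute freshly minted accounts without silencing anyone else.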


Patima™ ✨ @the_patima

So much to think about in this talk by @carolinesinders #Direction16

4:21 PM – 10 Nov 2016


“I read the comments.”

How do newsrooms handle comments, and harassment in the comments? Caroline worked with the Washington Post on their comments section.

Some things were simple – like keyword filters that could be tuned according to the author and the abuse they faced. Comments could also be set to require logins and other barriers to anonymous posting.
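A per-author policy of that kind might look like the following. This is a hypothetical sketch, not the Washington Post’s actual system; all names are illustrative:

```python
from dataclasses import dataclass, field

@dataclass
class CommentPolicy:
    """Assumed per-author moderation settings."""
    blocked_keywords: set = field(default_factory=set)  # tuned per author
    require_login: bool = False

def allow_comment(policy: CommentPolicy, text: str, logged_in: bool) -> bool:
    # Barrier to anonymous posting.
    if policy.require_login and not logged_in:
        return False
    # Author-specific keyword filter.
    lower = text.lower()
    return not any(word in lower for word in policy.blocked_keywords)

# An author facing a particular kind of abuse tunes their own list:
policy = CommentPolicy(blocked_keywords={"troll"}, require_login=True)
print(allow_comment(policy, "Great article!", logged_in=True))   # True
print(allow_comment(policy, "Great article!", logged_in=False))  # False
print(allow_comment(policy, "you troll", logged_in=True))        # False
```

The key idea is that the filter belongs to the author under attack, rather than being one global list for every article.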

The internet is still wonderful. How do we keep the things we love while giving people a choice to remove things they shouldn’t have to deal with? Allow the cat gifs but block the trolls?

We need to think of design as not just a skill but a fluency to solve hard, complex problems. Design is political.