What is Abusability Testing and Why is it Necessary? | Hacker Noon


Nicole Chi (@nchi)

Co-founder of the Mobius Project. Interested in civic tech, the future of social, and fixing bad things online.

What does the future of online social networks look like? We have seen how the most powerful social media networks have endangered democracy and public health. While there are interesting conversations today about what decentralized social networks might bring to our online social lives, I want to raise a new approach that is critical to any product team — particularly those building social features.

Introducing abusability testing

To build better and safer, tech teams need abusability testing: a comprehensive process to test products’ vulnerability to being abused. 

Abusability testing draws from established approaches such as threat modeling and human-centered design. In addition to thinking about security vulnerabilities, it asks product teams to consider users and their human vulnerabilities to psychological, social, and physical harms. It extends human-centered design and value-sensitive design to prioritize vulnerable communities – people usually considered “edge cases.” Abusability testing challenges product teams to ask: “How might we be under-serving or even harming some segments of the population?”

Abusability testing is a standardized approach to combating platform abuse, a term that we (the team behind the Mobius Project) define as ways in which technology platforms can be used intentionally or unintentionally to harm individuals or society. Check out PlatformAbuse.org, our knowledge source of technological harms and mitigations to guide safer product development, to see examples of features mapped to abuses under our framework of abusability testing.

Learning and iterating

Consider Yik Yak, a hyperlocal, Twitter-like forum that let anonymous users connect with people nearby. Yik Yak was a promising new platform that required no profiles and no passwords — an attempt to make the internet a place where people could safely and privately connect with those around them. Yet it was completely derailed within two years by inadequate moderation of hate speech, physical and sexual threats, and racist rhetoric.

Another good example is Strava, which has been repeatedly critiqued for a social feature that broadcasts users’ full names, photos, and running routes to strangers. This feature puts people at higher risk of physical and financial harm. To date, Strava’s attempts to make its app safer have failed to address the underlying abusability of the feature (read more about why here).

Abusability testing helps product teams look at specific features that are shared among technologies across a wide range of industries (not just their own), and learn from how they have been abused before. This way, technologists can perform risk assessments and understand the consequences of what they are building.

Why is abusability testing necessary?

Consumer trust affects your user acquisition, engagement, and retention — probably more than you think. 88% of people say that the amount of data they share with a company depends on how much they trust it, and more than a third of people who have been harassed online change their online behavior.

Psychological, social, and other harms facilitated by technology disproportionately affect women, people of color, and those who engage in political conversations — not a small minority of users. Successful product teams today also need to think about the vectors of harm that could make or break their business.

Abusability testing helps you determine: What do I need to fix immediately? What mitigations will I tackle later as my platform and audience grow? It can also help you differentiate your product in a crowded market, as trust is the most important brand asset you manage.

Ask us anything!

At Mobius Project, we hope to make it easier for you to work on platform abuse through our services. Our team has collective experience in privacy, security, civic tech, ethical design, and research, with a focus on the unintended consequences of technology on marginalized communities. Whether you have a trust and safety team or not, we can help you conduct abusability tests on your products and provide tangible mitigations so you can build a safer, better product. 

We are also gradually adding new features and abuses to our database to provide a public educational resource for all. 

Have a question about how your product might be vulnerable to platform abuse, or need help squashing bad actors? Send your questions to [email protected] and we’ll answer them in our next Hackernoon post!
