Vicious attacks against women on Twitter have reached crisis levels. A recent Pew Research Center study found that women overall are disproportionately targeted by the most severe forms of online abuse. Women, Action, & the Media (WAM!) has released “Reporting, Reviewing, and Responding to Harassment on Twitter,” a new report that examines gendered harassment and abuse on the platform. WAM!’s report finds that additional policy solutions and tool updates are needed to more effectively address tweet-and-delete harassment, dogpiling, and doxxing. View WAM!’s Report
Open participation is power. While Twitter is a platform that allows tens of millions of people to express themselves, the prevalence of harassment and abuse, especially targeting women, women of color, and gender non-conforming people, severely limits whose voices are elevated and heard. Examples of these attacks, the personal and psychological harm they cause, and their impact on women’s lives are everywhere.
Last October, WAM! began to work closely with Twitter to identify problems and find solutions. In November, Twitter granted WAM! authorized reporter status, allowing WAM! to gather complaints to help address gendered online harassment. As an authorized reporter, WAM! escalated validated reports of gendered online harassment and abuse directly to Twitter, and tracked Twitter’s responses.
Our new report is the outcome of that project. The report provides in-depth analyses of data collected during this pilot initiative. It is a detailed examination of how online harassment functions on Twitter’s platform, and of whether instances of abuse were effectively and efficiently addressed by Twitter during the period of our engagement in November 2014. Our goal in releasing this report and our recommendations is to inspire constructive discourse and structural change.
We value Twitter as a platform, and recognize the efforts Twitter has made to address online harassment and abuse since December 2014. We hope that this report provides a useful baseline of those efforts. However, while we appreciate Twitter’s public acknowledgement of the problem, and the steps they have taken to address some barriers to equality, our research and the reported experiences of our community show that there’s more work to do.
WAM! has successfully worked with technology platforms in the past. This is exemplified by our #FBrape campaign, which resulted in Facebook’s decision to ban gender-based hate speech and to partner with WAM! on that initiative.
Twitter and companies like it have a basic responsibility to their users. We urge Twitter and other tech leaders to use this research as a benchmark for measuring change. We hope they will take the vital and urgent steps needed to make social media platforms safer and freer for all people.
Building on the experiences of the people who reported harassment to WAM!, WAM!’s observations, and the report findings, WAM! recommends that Twitter:
- More broadly and clearly define what constitutes online harassment and abuse, beyond “direct, specific threats of violence against others” or “threats of violence against others, or promoting threats of violence against others,” to increase accountability for more kinds of threatening behavior. 19% of reports involved harassment that was “too complex to enter in a single radio button.” See “Summary of Findings” and page 15 of the report.
- Update the abuse reporting interface, using researched and tested trauma-response design methods. Twitter should acknowledge the potential trauma that targets may experience; additionally, connecting users to support resources would go a long way toward mitigating that harm. See page 34 of the report.
- Develop new policies that recognize and address the methods harassers currently use to manipulate and evade Twitter’s evidence requirements. These policies should focus particularly on the “tweet and delete” technique, where harassers share, but quickly delete, abusive comments and information. Without evidence, reports cannot be comprehensively acknowledged, validated, or resolved. See page 35 of the report.
- Expand the ability for users to filter out abusive mentions that contain “threats, offensive or abusive language, duplicate content, or are sent from suspicious accounts,” to counter the effect of a harassment tactic known as dogpiling, where dozens, hundreds, or even thousands of Twitter users converge on one target to overwhelm their mentions and notifications. This kind of filtering would be opt-in only, enabling users to decide whether to use it. See “Summary of Findings: Dogpiling” in the report.
- Hold online abusers accountable for the gravity of their actions: suspensions for harassment or abuse are currently indistinguishable from suspensions for spam, trademark infringement, and other violations. This needs to change. Ongoing harassment was a concern in 29% of reports, in which reporters said the harassment had started more than three weeks before the report. See page 15 of the report.
- Diversify Twitter’s leadership. Twitter’s own 2014 diversity report reveals that its company leadership is 79% male and 72% white. Systemic changes in the hiring and retention of diverse leaders will likely broaden internal perspectives on harassment, since women and women of color, who are disturbingly absent from that leadership, are disproportionately targeted online.
Developing the aforementioned policy solutions, updates, and tools will help Twitter (and potentially other platforms) more effectively address online harassment and abuse.
We invite Twitter to lead on combating gendered harassment by adopting these recommendations, and by continuing to communicate publicly and openly about online abuse (including gendered harassment) as a mission-critical priority.
We also hope that readers of this report will take into account how online harassment and abuse impact women and gender non-conforming people every day, and will work with us to fight it.
Join the conversation on Twitter. Use the hashtag #HarassStats.
Executive Director, Women, Action & the Media