Article

21 Dec 2018

Author:
Amnesty International

Troll Patrol findings: Using crowdsourcing, data science & machine learning to measure violence & abuse against women on Twitter

These findings are the result of a collaboration between Amnesty International and Element AI, a global artificial intelligence software product company. Together, we surveyed millions of tweets received by 778 journalists and politicians from the UK and US throughout 2017, representing a variety of political views and media spanning the ideological spectrum...

Amnesty International has repeatedly urged Twitter to publicly share comprehensive and meaningful information about reports of violence and abuse against women, as well as other groups, on the platform, and how it responds to them. On 12 December 2018 Twitter released an updated Transparency Report, in which it included for the first time a section on 'Twitter Rules Enforcement'. This was one of Amnesty International’s key recommendations to Twitter and we see the inclusion of this data as an encouraging step. We are disappointed, however, that the information provided in the transparency report does not go far enough...

Our study found that 7.1% of tweets sent to the women in the study were problematic or abusive. This amounts to 1.1 million problematic or abusive mentions of these 778 women across the year, or one every 30 seconds on average. Women of colour were more likely to be impacted, with black women disproportionately targeted with problematic or abusive tweets.
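The "one every 30 seconds" rate follows directly from the reported totals. The short Python sketch below is only an illustrative back-of-envelope check using the 1.1 million mentions and a 365-day year; it is not part of the study's methodology.

# Back-of-envelope check of the "one every 30 seconds" figure,
# using only the totals reported in the study.
total_mentions = 1_100_000            # problematic or abusive mentions across 2017
seconds_in_year = 365 * 24 * 60 * 60  # 31,536,000 seconds

seconds_per_mention = seconds_in_year / total_mentions
print(f"One problematic or abusive mention every {seconds_per_mention:.0f} seconds")
# Prints roughly 29 seconds, consistent with "one every 30 seconds on average".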

... Amnesty International and Element AI’s experience using machine learning to detect online abuse against women highlights the risks of leaving it to algorithms to determine what constitutes abuse... Human judgement by trained moderators remains crucial for contextual interpretation... Amnesty International’s full set of recommendations to Twitter is available here
