Twitter allows moderators to report tweets (including multiple tweets at once), direct messages, individual tweets within a conversation or entire conversations (threads, i.e. the replies below a tweet), Moments, and accounts engaged in abusive behavior.
When would audience moderators normally report?
Audience moderators managing media outlets’ official Twitter accounts say they report tweets that clearly violate both Twitter’s community standards and their own, especially when the tweets contain violent imagery.
Audience moderators believe that, in some cases, users who tag the media organization in a tweet containing violent images do not aim to intimidate or threaten the outlet – as opposed to targeted attacks against an individual journalist – but rather to draw the moderator’s attention, get the newsroom to cover a specific story, or disseminate the often extreme ideology behind the tweets.
It must be noted that Twitter has faced criticism over its response times and its decision-making processes for reported content. Moderators say that Twitter’s removal policy is inconsistent. For example, a removal request might first be declined by Twitter on the grounds that the platform does not consider the content a breach of its safety policies, yet when the moderator reports the same tweet again, the deletion request is accepted.