To combat toxicity on the platform, social media website Twitter is reportedly developing a feature to make users reconsider sending messages that its system detects as “mean” or offensive. In initial tests, 34 percent of users who received the warning chose to revise their message or not send it at all.
NPR reports that Twitter is attempting to make its users more conscientious about the language they use on the platform and is encouraging positivity by implementing a new feature that will detect “mean” replies on its service before a user presses send.
When Twitter detects that a user is about to post something that could be seen as offensive, an automatic prompt will appear asking: “Want to review this before tweeting?” The user is then offered three choices: tweet, edit, or delete.