Facebook has rolled out a new feature to Messenger that allows users to report offensive messages from others directly through the Android app. The same reporting tool was previously only accessible through the Facebook reporting section or on the desktop version of Messenger. With the latest update, Messenger users will now be able to bring Facebook's attention to conversations that potentially break the social networking site's community standards.
Messenger's new reporting tool on mobile takes the form of a new section of the app, accessed by tapping the name of the contact a user is chatting with. The tool appears as a "Something's Wrong" tab, under which a number of categories can be selected for the conversation being reported: harassment, hate speech, suicide or self-injury, sharing inappropriate things, unauthorized sales, or a suspected fake account. Aside from reporting conversations on Messenger for mobile, users also have the option to block the person being reported. Hadi Michel, Product Manager for Messenger, said in a blog post that Facebook's Community Operations team will review reports from around the world in more than 50 languages. The goal is to speed up how Facebook's team addresses a range of issues on its messaging platform.
The new reporting options for Messenger on mobile devices are part of a broader effort by Facebook to more effectively police its instant messaging service and improve the overall user experience. In April, Facebook released the internal guidelines its 7,500 content reviewers use to determine which posts and other content to remove from the platform. The framework is also understood to have been created in response to concerns about inconsistency in Facebook's content review practices. The Menlo Park, California-based company vowed to continue improving its policies in an effort to make Facebook a safer place for everyone.