Facebook has increased its annual budget for content review initiatives for 2018 by hundreds of millions of dollars, according to a recent report from The Wall Street Journal citing sources familiar with the matter. For context, Facebook spent approximately $150 million in 2016 on its community operations unit, which oversees its content moderators, according to WSJ. An industry source claims the Menlo Park, California-based company raised funding for community operations to $220 million last year, growing the budget by almost half. For 2018, the community operations team reportedly asked for a total budget of $770 million.
The social networking site is reportedly using portions of that money to hire as many as 10,000 content reviewers by the end of 2018, part of a broader push to eliminate content that violates the site's community standards, such as hate speech and violence. The long-term goal is to remove the sources and perpetrators of such offensive content. As part of its policing efforts, Facebook also reportedly appointed two executives to manage its content review operations and develop solutions that more effectively measure how widespread hate speech has become on the platform and monitor how consistently moderators apply the site's content standards. The community operations division is tasked with developing automated tools that help users report offensive content and route those reports to reviewers. The policing effort also extends to Facebook's instant messaging platform, Messenger, which recently received a feature that lets users flag offensive messages directly through the Android app, drawing Facebook's attention to conversations that potentially break its community standards.
Facebook's ongoing battle with problematic content may stem partly from the fact that its content reviewers speak just over 50 languages, while the site's services are available in more than 100. That gap allows offensive content to persist on the platform for long periods, especially in regions where Facebook has little command of the local language. Facebook reportedly acknowledged late last year that its content review methods are inconsistent, with reviewers having failed to take down a number of posts considered hateful or sexist.