Facebook has long struggled with content moderation and privacy-related issues. A new report discusses how the company knew its algorithms were sowing division among users yet did nothing to fix it.
The Wall Street Journal cites internal documents and presentations dating back to 2016. Company officials were reportedly aware that Facebook’s recommendation algorithms caused divisiveness. However, the report adds, executives brushed aside potential solutions. The report names Facebook’s VP of Global Public Policy, Joel Kaplan, as a strong voice against the proposed reforms.
Kaplan reportedly argued that policy changes could harm benign groups such as the Girl Scouts of America. Fear of pushback from conservative voices was also cited as a reason for his hesitancy.
Facebook executives brushed aside recommendations citing fears of alienating the other side
In 2017, Facebook’s then Chief Product Officer, Chris Cox, set up and led a new task force called Common Ground. He also created what were known as “Integrity Teams” within the company. These groups of engineers and researchers explored common pain points on the platform and sought effective remedies.
Separately, a 2018 document reveals that the teams would not try to change users’ minds but instead help with the “humanization of the ‘other side.'” The teams recommended that moderators place people engaged in intense arguments into a subgroup, limiting the exchange’s visibility to other users. They also began working on a feature that would cap the number of replies to a thread.
Meanwhile, the Integrity Teams focused on the News Feed. They discovered that most harmful behavior patterns came from a small portion of extreme elements on both ends of the political spectrum. They also found that restricting clickbait topics would disproportionately affect conservative sites, which raised fears of alienating a large portion of the platform’s users.
A 2016 internal presentation attributes a surge in German users joining extremist groups to Facebook’s recommendation algorithms and Russian bots. The presentation adds that 64% of those users joined the groups through the “Groups You Should Join” and “Discover” sections.
WSJ notes that the “partial victories” here came in the form of new penalties for publishers sharing misinformation or directing users to ad-filled pages. However, the report makes clear that Facebook knew how divisive its platform could be and yet did very little about it.