YouTube and Twitter have had their hands full already in 2019 in their respective fights to remove terrorist and terrorism-related content, according to recent letters published by the US House Committee on Homeland Security. The letters were responses to requests from several representatives, including Max Rose, Chairman of the Subcommittee on Intelligence and Counterterrorism, and Bennie Thompson, Chairman of the Committee on Homeland Security, asking the companies to divulge details about their efforts to stop terrorism and the amount of funding allocated to those efforts.
YouTube parent company Google says it has spent hundreds of millions of dollars annually to review and take down content that violates its policies on the matter. For YouTube specifically, over a million videos were reviewed in the first three months of 2019 alone, and approximately 90,000 of those were ultimately found to violate terrorism policies.
Twitter, for its part, provided no dollar amount and claimed to have no data to share for 2019. The company did note, however, that more than 1.4 million accounts were suspended between August 2015 and June 2018, with 205,156 of those suspended between January and the end of June 2018.
This was not a satisfactory response
Facebook and Microsoft were also among the companies asked to explain their efforts to combat terrorism, but neither appears to have made an official response to the inquiry. Moreover, the representatives lamented that none of the companies' responses satisfied the committee's request.
Pointing directly to Facebook, the representatives went further, noting that the company has not only failed in its handling of ongoing terroristic threats -- as was the case with the New Zealand mass shootings in March -- but has "admitted as much." Facebook allowed the tragedy to be live-streamed on its site, only ending the stream after the attack was over.
According to the representatives, a "full accounting" of efforts to suppress the promotion and spread of terrorism is needed. Not only did two companies fail to respond, but those that did offered only "broad platitudes and vague explanations." The ubiquity of the tech giants, the officials say, places the onus on those companies to respond appropriately.
This isn't a new fight for YouTube or anyone else
While the representatives requesting details from the tech giants are clearly disappointed with the responses, or lack thereof, this isn't an entirely new fight for any of the companies involved.
Each of the companies in question has taken its own approach to ending the spread and promotion of such content. That is perhaps most evident in YouTube and Google's ongoing struggle to define the limits of what kinds of 'free speech' they'll allow. Setting aside the debate over whether the company is cracking down too hard on any number of other topics or viewpoints, its battle against extreme or dangerous content has been ongoing for several years now.
In 2017, for instance, the company began implementing a then-new method for diverting views away from 'extremist' content by default. Dubbed the "Redirect Method," the approach effectively redirects users seeking out extremist content toward videos that explicitly highlight counterarguments to the promoted views.
The effort doesn't stop with terrorism, either. More recently, in early 2019, the company clarified and redefined policy terms in a bid to stave off a wave of dangerous pranks and "challenges."