Web giant Google has come under fire before for allowing fake news and misinformation to spread on its services, and now its subsidiary, video giant YouTube, appears to be the newest arm of the company to have the same problem. YouTube's Trending section is curated by algorithms, which could explain how a conspiracy video about the recent Marjory Stoneman Douglas school shooting made it into the section. The factual content of the video is largely in dispute, as is just about anything surrounding a tragedy in the news, but the video uses news footage from a credible outlet as evidence of an unproven conspiracy theory. Teen survivor David Hogg, because of a previous news appearance in California, is accused of being a paid crisis actor, a claim commonly leveled at survivors of mass shootings and often used to argue that the shootings themselves were either orchestrated or entirely fake.
The crisis actor angle, applied to a prominent tragedy fresh in the national consciousness, makes this incident sting more than it would otherwise, but the core problem is the same regardless of the content. Fake news, defined here as unreliable reporting, intentional misinformation, or content likely to misinform by accident when surfaced in a prominent place where people may encounter it without seeking it out, got into Trending and amassed a huge number of views before YouTube found it and took it down.
Both Google and YouTube have had issues with content recently, and both have promised to bring on more human reviewers to vet content before it reaches viewers at large. In YouTube's case, incidents ranging from Logan Paul's infamous "suicide forest" video to the rash of "Elsagate" videos targeting children forced the platform to raise the bar for entry into its partner program and crack down on bad behavior by top stars. Google has yet to issue any statement on the matter as of this writing, but press coverage and public outcry will almost certainly prompt it to say something in the near future. YouTube is a complex platform facing a very wide range of content moderation challenges, and the fact that a global corporation is struggling to keep up with the massive tide of user-generated content is a testament to that.