Readers likely remember when, in early December 2020, Google-owned social media platform YouTube began enforcing a new policy against what it considered “misinformation” about the results of the 2020 presidential election. The platform admitted that, under the policy, it took down user videos numbering in the “tens of thousands.”
Now, the Big Tech company has announced it’s doing an about-face on the policy—beginning immediately. YouTube wrote in a blog post Friday that it will stop its policy of removing the videos from the platform:
As of June 2, 2023, YouTube has reversed that decision: The video giant announced that it “will stop removing content that advances false claims that widespread fraud, errors or glitches occurred in the 2020 and other past U.S. Presidential elections.”
The post continued, giving the company’s reasoning:
The Google-owned service, in an unsigned blog post Friday, tried to explain it this way: “In the current environment, we find that while removing this content does curb some misinformation, it could also have the unintended effect of curtailing political speech without meaningfully reducing the risk of violence or other real-world harm.”
YouTube also noted, “As with any update to our policies, we carefully deliberated this change.”
Looking towards the 2024 election season that is just getting underway, the company wrote:
“The ability to openly debate political ideas, even those that are controversial or based on disproven assumptions, is core to a functioning democratic society — especially in the midst of election season.”
YouTube, perhaps in an attempt to assuage criticism from leftists and their media allies, cautioned that not all of its “misinformation” rules are being wiped away by this move:
…YouTube said its other election misinformation policies remain in place, including those that prohibit content “aiming to mislead voters about the time, place, means, or eligibility requirements for voting” as well as false claims that could “materially discourage voting, including those disputing the validity of voting by mail” and videos that encourage others to interfere with “democratic processes.”
Here’s what isn’t changing: We are ensuring that when people come to YouTube looking for news and information about elections, they see content from authoritative sources prominently in search and recommendations.
This part, like the earlier insistence on retaining other policies on “misinformation” about voting by mail and ideas that clash with YouTube’s definition of “interfer[ing] with democratic processes,” seems like an attempt to give the company wiggle room to keep making political decisions and picking sides during an election. Who decides which sources are “authoritative”—and which ones are spreading “disinformation”? Who decides which articles get preferred ranking and which ones are buried many pages back in search results? I don’t have to explain why I’m asking these questions, of course.
One reason this is concerning is that we’ve seen the company in action with our own eyes. Remember when Project Veritas busted Google and YouTube in mid-October of 2020 for what we described as the company’s “corrupt attempts to sway the results of the U.S. election”? Only time will tell whether that behavior will change as we move toward 2024.