YouTube Backs Down in a Big Way on 2020 Election 'Misinformation'
YouTube issued a statement Friday reversing a policy that had for more than two years all but silenced debate on the video platform regarding the integrity of the 2020 election.
In December 2020, following the certification of the presidential election results by enough states to ensure that Joe Biden was declared the winner, YouTube banned most claims of irregularities in the election, including claims of fraud or simple human or machine error, Axios reported.
YouTube said it had removed “tens of thousands” of videos from its platform since enacting the policy.
The company promised additional information in the future about how it would handle videos related to the upcoming 2024 election, but said leaving the previous policy in place could have the effect of “curtailing political speech without meaningfully reducing the risk of violence or other real-world harm.”
YouTube offered no specifics regarding how it had come to this decision, but said the change was effective immediately.
Axios said the change exemplified a Big Tech company “wrestling with how to balance curbing misinformation with freedom of speech ahead of the 2024 election.”
Forbes noted that both Twitter and Facebook had recently reinstated the accounts of former President Donald Trump, who has made claims of election fraud a central part of his public messaging since November 2020.
In its statement, the video platform stressed that other portions of its “election misinformation policies” would remain unchanged, at least for the time being.
YouTube’s statement appears here in its entirety:
Today we are announcing updates to our approach to elections on YouTube, with the goal of protecting our community while providing a home for open discussion and debate during the ongoing election season.
When we craft our policies, we always keep two goals in mind: protecting our community, and providing a home for open discussion and debate. These goals are sometimes in tension with each other, and there is perhaps no area where striking a balance is more complex than political speech. The ability to openly debate political ideas, even those that are controversial or based on disproven assumptions, is core to a functioning democratic society, especially in the midst of election season.
We first instituted a provision of our elections misinformation policy focused on the integrity of past US Presidential elections in December 2020, once the states’ safe harbor date for certification had passed. Two years, tens of thousands of video removals, and one election cycle later, we recognized it was time to reevaluate the effects of this policy in today’s changed landscape. In the current environment, we find that while removing this content does curb some misinformation, it could also have the unintended effect of curtailing political speech without meaningfully reducing the risk of violence or other real-world harm. With that in mind, and with 2024 campaigns well underway, we will stop removing content that advances false claims that widespread fraud, errors, or glitches occurred in the 2020 and other past US Presidential elections. This goes into effect today, Friday, June 2. As with any update to our policies, we carefully deliberated this change.
This specific aspect of our elections misinformation policy represents just one piece of a broad, holistic approach towards supporting elections on YouTube. Here’s what isn’t changing:
We are ensuring that when people come to YouTube looking for news and information about elections, they see content from authoritative sources prominently in search and recommendations. For example, following the 2020 US election, we found that videos from authoritative sources like news outlets represented the most viewed and most recommended election videos on YouTube. And our 2020 election information panels, with relevant context ranging from voting locations to live election results, were collectively shown over 4.5 billion times.
All of our election misinformation policies remain in place, including those that disallow content aiming to mislead voters about the time, place, means, or eligibility requirements for voting; false claims that could materially discourage voting, including those disputing the validity of voting by mail; and content that encourages others to interfere with democratic processes.
The rest of our policies continue to apply to everyone on YouTube, for all types of content, including elections. This includes policies against hate speech, harassment, and incitement to violence.
We know citizens take the integrity of the democratic process incredibly seriously, and so do we. We’ll remain vigilant as the election unfolds, as we did in 2020, and again in 2022. And we have an elections-focused team, including members of our Intelligence Desk, Trust & Safety and product teams, monitoring real-time developments and making adjustments to our strategy as needed. We’ll have more details to share about our approach towards the 2024 election in the months to come.