YouTube said in a blog post that it made the decision to balance its twin goals of “protecting our community and providing a home for open discussion and debate.” The decision, which comes ahead of the 2024 presidential race, undoes a policy implemented in December 2020, after President Joe Biden won the election.
“Two years, tens of thousands of video removals, and one election cycle later, we recognized it was time to reevaluate the effects of this policy in today’s changed landscape,” YouTube wrote. “In the current environment, we find that while removing this content does curb some misinformation, it could also have the unintended effect of curtailing political speech without meaningfully reducing the risk of violence or other real-world harm.”
The new rule will go into effect on Friday, June 2.
During the 2020 election cycle, the company faced backlash for its delayed action in labeling and removing videos that spread misinformation or falsely claimed widespread voter fraud. After the Jan. 6, 2021, attack on the U.S. Capitol, YouTube said it would begin suspending channels that made false claims about widespread voter fraud.
As of March 2023, YouTube had already lifted restrictions placed on Trump’s account following the Jan. 6 insurrection.
YouTube said other aspects of its election misinformation policy remain in place, including highlighting authoritative sources in search and recommendations and prohibiting posts that aim to mislead voters about where and how to vote.