YouTube will remove content alleging fraud or errors changed the outcome of the U.S. presidential election starting today, according to a statement by the company.
The video-sharing platform said the move follows Tuesday’s safe harbor deadline, by which enough states had certified their election results to determine the president-elect.
YouTube will remove content uploaded from Wednesday onward.
“As always, news coverage and commentary on these issues can remain on our site if there’s sufficient education, documentary, scientific or artistic context,” the statement reads.
As an example, the company said, it will remove videos claiming that a presidential candidate won the election because of widespread software glitches or counting errors.
Since September, the platform has removed more than 8,000 channels, along with misleading election-related videos.
According to YouTube’s policies, users are not allowed to post content aiming to mislead voters about the time, place, means or eligibility requirements for voting, or false claims that could materially discourage voting.
Tech giants such as Facebook and Twitter have also implemented measures to tackle election misinformation. Facebook said it would demote content on Facebook and Instagram, including debunked claims about voting, as well as limit the distribution of live videos that may relate to the election. Twitter introduced prompts to U.S. users “that preemptively address topics that are likely to be the subject of election misinformation.”