More than a month after the U.S. presidential election, YouTube says it will start removing newly uploaded material that claims widespread voter fraud or errors changed the outcome.
The Google-owned video service said Wednesday that this is in line with how it has handled past elections. That's because Tuesday was the "safe harbor" deadline for the election, and YouTube said enough states have certified their results to confirm Joe Biden as the winner.
But this election was different from past elections, and YouTube has been widely criticized for not doing more to prevent misinformation from spreading on its platform. Unlike Twitter and Facebook, which put measures in place with some success, YouTube had until Wednesday stood by its decision to allow baseless claims about election fraud to stay up.
There is no evidence of widespread fraud in the 2020 election. Election officials confirmed there were no serious irregularities and that the election went smoothly. Attorney General William Barr said last week that the Justice Department has not identified voter fraud that would change the outcome of the presidential election.
That hasn't stopped President Donald Trump and his supporters from claiming otherwise. Conservative news sites and YouTube accounts have been instrumental in spreading these claims, like a 90-second cellphone video that showed a man closing the doors of a white van and then rolling a wagon with a large box into a Detroit election center. It was presented as evidence of fraud but was quickly discredited by news organizations and public officials: the man was a photojournalist hauling camera equipment, not illegal ballots.
Still, the damage was done, reinforcing voters' belief that the election was marred by fraud and irregularities.
YouTube said it is trying to strike a balance between “allowing for a broad range of political speech and making sure our platform isn’t abused to incite real-world harm or broadly spread harmful misinformation.”