YouTube Disinformation Policy As Election Season Begins

As the election year kicked off on Monday with the Iowa caucuses, pressure is mounting on social media platforms to combat disinformation. YouTube has been watched closely as “deepfake” videos make the rounds in the political realm. Yes, the deepfakes of Tom Selleck as Indiana Jones are entertaining, as is Bill Hader’s impression of Al Pacino with Pacino’s face over Hader’s own. Political disinformation, however, is a different beast altogether.

A new blog post issued by Leslie Miller, vice president of government affairs and public policy at YouTube, reiterated some of the company’s key policies. None of the policies are new, but the company is hoping to present the details of its various investments in fighting disinformation related to the election. The blog post essentially acts as an overview of everything YouTube has worked toward over the last couple of years, including its approach to manipulated videos and voter suppression campaigns.

When it comes to manipulated videos, YouTube will remove “content that has been technically manipulated or doctored in a way that misleads users (beyond clips taken out of context) and may pose a serious risk of egregious harm.”

Social media has become a greater part of the election landscape, particularly in the election of Donald Trump. The widespread misinformation about Hillary Clinton’s health, the promotion of conspiracy theories leading to Pizzagate, and countless other campaigns do have an impact on uninformed and uneducated voters. Let’s not forget that President Trump loves uneducated voters: people who are unable or unwilling to discern what is true and what is not.

However, YouTube doesn’t go into specifics about what actually constitutes a violation; it seems to be handled on a video-by-video basis. What the policy does spell out: videos that give people incorrect information about voting, including attempts to mislead them with an incorrect voting date, are not allowed. Neither are videos that advance “false claims related to the technical eligibility requirements for current political candidates and sitting elected government officials to serve in office.” YouTube will also terminate channels that attempt to impersonate another person or channel, or that artificially inflate the number of views, likes, and comments on a video.

“YouTube remains committed to maintaining the balance of openness and responsibility, before, during and after the 2020 U.S. election,” Miller wrote. “We’ll have even more to share on this work in the coming months.”

This brings us back to a “who watches the watchers” question. Since their decisions are not infallible, we need to tell them when they cross the line. I’m not talking about the entire Twitter universe, since that makes up only a fraction of the voting bloc; to be honest, I don’t think many within the Twitterverse actually vote. Still, it is up to us to decide what is right and what is wrong: not to be spoon-fed the information, but to inform ourselves about what is true and what is a deepfake.