Is Zuckerberg right to allow political ads to run with misinformation?
- Misinformation is defined as “false information that is spread, regardless of whether there is intent to mislead.”
- Mark Zuckerberg launched the groundbreaking social media platform Facebook on February 4, 2004. He was a Harvard sophomore at the time.
- Zuckerberg announced Thursday the company’s “New Steps to Protect the U.S. Elections,” which include denying new political ads in the week before the election, removing COVID-related ads that aim to deter voting over health risks, attaching informational labels to posts discussing the legitimacy of voting methods, and directing victory-related posts from either candidate to the official results from Reuters and the National Election Pool.
- This isn’t the first time Facebook or its founder has been in the national conversation. In 2018, Congress brought Zuckerberg in to testify over Facebook’s privacy policies.
- Zuckerberg and his wife, Priscilla Chan, are donating $300 million to American elections to encourage democracy. Critics mention the irony of the donation given Facebook’s “past failures in protecting the integrity of elections.”
Facebook is a private company and has an absolute right to set terms for its users. Facebook has the authority to ban political ads containing misinformation, or any information for that matter, yet it should not exercise this authority. It should allow users to determine for themselves what information to believe. Political ads frequently contain more than hard facts such as politicians' voting records. They make value judgments and assert opinions about the propriety of specific political actions. These are matters of personal interpretation, not clear-cut questions of right and wrong, and therefore cannot easily be branded 'true' or 'untrue.'
Distinguishing between 'fact' and 'opinion' can be difficult. When the line is unclear, labeling speech as 'misinformation' and banning it risks giving greater legitimacy to the moderators' viewpoints than to those they're censoring. This can lead to the suppression of particular views, making them seem less popular than they are; to moderators delegitimizing 'non-approved' viewpoints, which risks interfering with election outcomes; or to moderators claiming ads by political opponents are outright false. Mark Zuckerberg himself seems to understand this.
Banning these ads also suggests Facebook users can't independently analyze information and decide for themselves what is misleading and should be disregarded, which hardly bodes well for democratic participation in elections. Facebook already employs fact-checkers to review news articles, many of which cover the same topics as political ads. This alerts users to sources that might contain nonfactual information but leaves the decision over whether to consume and believe them up to the reader. And that's as it should be.
Facebook is an entity of a size and scope never seen before; any organization with the reach to deliver content to such a high percentage of the population has a duty to ensure it is not spreading misinformation. This duty is particularly acute in Facebook's case, as the platform has swallowed up local news organizations in many American towns. Citizens living in these regions get their news almost exclusively from their Facebook pages and feeds.
As Facebook eliminates competition from vetted news sources, it has a duty to ensure the void is not filled with blatant misinformation. The 'free speech argument' (that Facebook should not moderate content) has already been decided by Facebook itself: the company will get involved, as it has already outlined a series of steps it takes to clean up hate speech and other incendiary content. By recognizing it has a duty to combat online hate, Facebook has acknowledged it must actively engage with and moderate the content it distributes. It follows that Facebook should not allow misinformation on something as important as national elections.
Moreover, allowing misinformation on Facebook would detrimentally impact the company's financial stability. Over 400 companies have temporarily or permanently pulled their advertisements over concerns about misinformation, and according to the New York Times, advertising makes up approximately 98% of Facebook's revenue. Americans have made a deal with Facebook, allowing it to achieve enormous success while overrunning traditional media outlets. Mark Zuckerberg and Facebook owe it to the population to own that responsibility and actively remove misinformation.