Keeping up with social media platforms’ mis- and disinformation policies was difficult this election season, causing some intellectual whiplash that still isn’t over. When I discussed these issues with students, most seemed to agree that social media companies have the right to make their own policies related to content curation (it’s not as if they were altering the Bill of Rights).
Because major platforms like Twitter, YouTube, Facebook, and Instagram committed themselves to limiting false information related to the election and its aftermath, the public saw half of President Trump’s tweets flagged in the span of a single day because they were, well, false. Consider this piece in which Twitter reflects on the success of labeling election misinformation and which measures it will keep in place.
Only 0.2% of tweets related to the election from Oct. 27 to Nov. 11 were restricted, revealing that free speech at large is not under attack.
Only in 2020 would we see this headline from Politico, outlining “How we know Joe Biden won the 2020 presidential election,” which makes me both appreciate its conciseness and feel despondent about its necessity.
However, we’ve entered some gray information areas: if information is proven to be false, you have the right to believe it, but do you have the right to share it?
Twitter says no, according to its civic integrity policy. It’s not just about free speech; it’s about spreading malicious disinformation designed to undermine how our democracy functions at fundamental levels.
Enter Parler, marketed as the free speech deus ex machina saving us from the free speech fascists (although the Wall Street Journal reported on Facebook’s algorithms limiting left-leaning sites).
It’s “the world’s town square. Speak freely and express yourself openly, without fear of being ‘deplatformed’ for your views. Engage with real people, not bots.”
My Facebook feed has featured a mass exodus of people looking for platforms that don’t censor information. Given the timing of this exodus, I assume they want a safe place in which to spread election disinformation, and they are enamored of Parler. (View this short informational video from First Draft and a recent NPR article.)
Here are the community guidelines (linked as a typed and scanned Word document), and more detailed guidelines. Defamation guideline: “We will remove parleys or comments found to be defamatory by a court of law having jurisdiction over Parler. Otherwise, we will avoid making our own determinations about the truth or falsity of statements posted on Parler.”
Just that statement presents a straw man fallacy: Twitter itself does not determine the falsity of tweets; the tweets themselves are factually incorrect, independent of who affixes the label.
One of Parler’s stated values is “content curation exacerbates hate.” Again, this places the blame on the curators rather than on the false information itself. Get mad at Twitter for censorship, Parler encourages, not at the divisive, false information itself.
Even Parler users face their share of work in determining what’s real and what’s false, particularly since so little is moderated, causing confusion and anger (Is George Soros a majority owner of Parler? Nope.). So it’s just a different type of disinformation.
Wired.com recently published this piece, finding that Parler seems intended more to amplify the voices of already-prominent right-wing pundits (specifically, those moderated elsewhere) than to foster the true town-square democracy it claims to be.
Ultimately, the question remains whether free speech guarantees the right to spread information that can be proven false and is damaging to people or institutions. People who know what they’re doing (more so than conspiracy theorists on social media) have gone to great lengths to verify that the election process was legitimate: see the Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency.
None of this concern is hyperbolic. A perfectly secure, legitimate election has now been cast into doubt for millions of Americans. As I was writing this post on Tuesday, November 17, President Trump fired Chris Krebs, Director of the Cybersecurity and Infrastructure Security Agency, because, in Trump’s words, Krebs ignored “massive improprieties and fraud–dead people voting, Poll Watchers not allowed into polling locations, ‘glitches’ in the voting machines which changed votes from Trump to Biden, late voting, and many more.”
There is a difference between partisan debate on issues and the proliferation of verifiably false information. The attempt to flag mis- and disinformation does not seem to unnecessarily restrict free speech on social media platforms when considered in conjunction with what is currently happening in the Trump administration.
Jamie M. Gregory is a National Board Certified Teacher in Library Media working as the Upper School Librarian and journalism/newspaper teacher at Christ Church Episcopal School in Greenville, SC. She is the recipient of the 2021 Media Literacy Teacher Award from the National Association for Media Literacy Education.