Policy and Public Pressure: Technology Companies Respond to Charlottesville

Censorship, First Amendment, Policies, Political Viewpoint

By: guest blogger Kate Thompson

Charlottesville, Virginia, saw violence erupt at a white supremacist march on Aug. 12, 2017. The “Unite the Right” rally was organized to protest the removal of a statue of Confederate General Robert E. Lee. During the rally, James Alex Fields, Jr., ran his car into a group of counter-protesters, killing one and wounding at least 19. The events in Charlottesville have heightened public awareness of white supremacist organizations and their music, merchandise and online presence. There has also been a renewed interest in leading technology company platforms and the ways in which they host and profit from the activities of groups that identify with white supremacy.

The online presence of such groups is not a new phenomenon, nor is the need for tech companies to grapple with the extent to which they will support such organizations. The Southern Poverty Law Center published a 2014 report on “Music, Money and Hate,” identifying white supremacist bands and calling on major music providers to have them removed. In response to the report, Apple removed the listed bands from its iTunes platform.

Spotify, however, did not follow suit. After failing to take action after the 2014 report, Spotify was again the subject of criticism when Digital Music News published an Aug. 14 article titled “I Just Found 37 White Supremacist Hate Bands on Spotify.” Less than a day later, Spotify had removed most of the artists’ music and released a statement saying “Spotify takes immediate action to remove any such material as soon as it has been brought to our attention.”

Companies that host the websites of white supremacist groups or facilitate their merchandise sales have also reexamined their positions. Apple’s CEO Tim Cook stated in an email to employees, “I disagree with the president and others who believe that there is a moral equivalence between white supremacists and Nazis, and those who oppose them by standing up for human rights.” The company’s Apple Pay program stopped supporting merchandise sales on websites selling clothing with Nazi and white supremacist themes.

Matthew Prince, CEO of Cloudflare, an internet security provider, had previously taken the position that content decisions should not be made by security providers, as removing a site from their service would not remove it from view but merely make it more vulnerable. However, Cloudflare stopped supporting the white supremacist site the Daily Stormer on Aug. 16, 2017. It seems the potential for negative optics weighed as heavily on his mind as First Amendment concerns. As he stated on Cloudflare’s blog, “the tipping point for us making the decision was that the team behind the Daily Stormer made the claim that we were secretly supporters of their ideology.”

While defending the company’s right to withdraw its support for the Daily Stormer, Prince notes the complications that arise from taking a stand on content, saying that “after today, make no mistake, it will be a little bit harder for us to argue against a government somewhere pressuring us into taking down a site they don’t like.”

Technology companies that provide platforms are in an unenviable position with regard to the content of their customers’ music, merchandise and websites. Striving to remain neutral while maintaining a healthy brand presence puts important issues of intellectual freedom in the hands of administrators who may have little background in free speech issues. Dave Willner, content manager for Facebook and the primary decision-maker with regard to what constitutes prohibited hate speech on the social media site, noted in the New Republic, “this is probably what a lot of people who go to law school want to do … I ended up doing it by accident.” The need to protect the company’s image can also leave it vulnerable to public pressure, particularly after divisive events such as those in Charlottesville.

Furthermore, private companies are not bound by the terms of the First Amendment but by the murkier terms of their service agreements.

The service agreement (often referred to as the “Terms and Conditions of Use”) outlines prohibited conduct on the part of the consumer. Just as a carefully crafted collection policy can help a library withstand challenges to materials, a carefully crafted agreement can help a company justify the removal or retention of material under debate. Apple Pay prohibits “a website that … promotes hate, violence, or intolerance based on race, age, gender, gender identity, ethnicity, religion, or sexual orientation.” Spotify’s policy is more vague, prohibiting material that “is offensive, abusive, defamatory, pornographic, threatening, or obscene.” Cloudflare simply reserves the right to suspend a customer’s service “at any time if such Application negatively impacts the goodwill related to our Service or our User community as determined in Cloudflare’s sole and exclusive discretion.”

While the flexibility inherent in these agreements may be of service to tech companies who wish to protect their image in the aftermath of a crisis, the ability of a small number of companies to control what the public sees is a significant responsibility. Removal of a website, for example, makes it harder for the public to create an informed response to debated material, and archival considerations arise as well if such removal impacts future historians’ work on the subject.

The American Library Association’s Office for Intellectual Freedom suggests a policy creation model that might be instructive for tech companies when it notes that policies “should be (1) written, (2) objective, (3) consistently and equitably enforced without regard to the identity or beliefs of the user, and (4) reasonable … ” A technology company policy that is transparent and consistently enforced is an important protection in heated moments, allowing a company to craft a policy that reflects its values and can be used to guide content decisions.


Kate Thompson is a graduate student in the Master of Library and Information Studies program at the University of Alabama. She is interested in matters of inclusivity with regard to individuals with sensory needs, and researches public libraries and their programming for Autism Spectrum Disorder. She is the recipient of the 2016 Iowa Association of College and Research Libraries Conference Scholarship, the Iowa Library Association’s Susan Lerdal Memorial Scholarship and the Judith F. Krug Memorial Fund Intellectual Freedom Course Scholarship.