The Electronic Closet: Investigating Accusations of Queer Suppression on Facebook and Instagram
By: Guest contributor Kevin Talmer Whiteneir Jr.
Social media platforms like Facebook, Instagram, and Twitter have become global sites of discovery. Millions of users engage one another, and the platforms’ parent corporations, through text, image, sound, and other media, forging interpersonal connections, finding others with shared interests, and discovering new things about themselves.
Users can find potential friends, public figures, and businesses through search and recommendation systems that rely on signals like hashtags and ranking algorithms. These tools determine how visible those accounts and interests are. That power has recently incited accusations that Facebook’s and Instagram’s systems are actively and covertly suppressing content produced by queer and other marginalized users.
In April 2018, President Trump signed into law the Stop Enabling Sex Traffickers Act (SESTA) and the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA), intended to fight human trafficking and sexual exploitation by making websites legally culpable for user content. In an article for Vox, Aja Romano writes:
… the bills also poke a huge hole in a famous and longstanding “safe harbor” rule of the internet: Section 230 of the 1996 Communications Decency Act … it holds that “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
… But FOSTA-SESTA creates an exception to Section 230 that means website publishers would be responsible if third parties are found to be posting ads for prostitution — including consensual sex work — on their platforms. The goal of this is supposed to be that policing online prostitution rings gets easier.
However, opponents of FOSTA-SESTA argue that it potentially endangers those it was intended to protect by removing spaces that allowed sex workers to screen potentially dangerous clients. Other critics argue that the bills’ sweeping language promotes censorship and threatens the open access and democratic infrastructure that many believe define the internet itself.
The suppression that followed, especially on social media and content-sharing sites like Facebook, Instagram, Tumblr, and Craigslist, justified these fears. Romano writes that Craigslist removed its personal ads section in anticipation of FOSTA-SESTA’s enforcement, fearing allegations of advertising illicit sex work. In December 2018, Tumblr instituted an adult content ban that purged profiles and made posts undiscoverable, visible only to the poster, if not removed altogether. Within the last year, accusations have cropped up that Facebook’s and Instagram’s algorithms are weaponized to suppress content produced by queer, female, non-white, and other marginalized users without removing it — a move colloquially called “shadowbanning.”
Many of Instagram’s queer users report that their engagement has suffered. Instagram and Facebook commonly flag and remove posts, or threaten to permanently delete or ban profiles that break ambiguous “community guidelines.” Anecdotally, queer users report that their audience engagement has dipped significantly without explanation: despite having many thousands of engaged followers, their posts typically reach fewer than 10% of that audience, as measured through Instagram’s Insights tool.
An artist who requested anonymity and uses Instagram as a promotional tool for selling her work states (her figures are worked through in the sketch below): “This censorship thing is [bad] and I get no views on anything … I’m lucky if I get over 300 likes on a post … I have over 55k [followers]! It’s outrageous. I get like 1-3k VIEWS on posts now vs the 13-20k I used to get … I wouldn’t care as much if it wasn’t really affecting my income hardcore.” When asked why her content may have been suppressed, she states: “P[retty] sure I’m shadow banned b/c I use the word d*ke all the time.” While the artist’s comment provides some insight, Instagram’s policy on slurs and charged language is subjective and ambiguous, and does little to inform this discourse:
We remove content that contains credible threats or hate speech . . . When hate speech is being shared to challenge it or to raise awareness, we may allow it. In those instances, we ask that you express your intent clearly.

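As a rough illustration of the drop the artist describes, her own figures can be worked through directly. This is a minimal sketch in Python; the simple ratios below are illustrative arithmetic, not Instagram’s Insights metrics.

```python
# Illustrative arithmetic based on the figures the artist quotes above.
# Simple ratios only -- not Instagram's internal engagement metrics.

followers = 55_000        # her stated follower count ("over 55k")

views_before = 13_000     # low end of the 13-20k views she used to get
views_after = 1_000       # low end of the 1-3k views she gets now

reach_before = views_before / followers   # share of followers reached before
reach_after = views_after / followers     # share of followers reached now

print(f"Reach before: {reach_before:.1%}")                     # Reach before: 23.6%
print(f"Reach after:  {reach_after:.1%}")                      # Reach after:  1.8%
print(f"Drop in views: {1 - views_after / views_before:.0%}")  # Drop in views: 92%
```

Even taking the low end of both ranges, her per-post reach falls from roughly a quarter of her followers to under 2%, which is consistent with the “less than 10%” stagnation other users describe.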
In October 2019, Salty — a digital newsletter that describes itself as a “100% independent, volunteer-run newsletter, buoyed by the voices of women, trans, and non-binary contributors from all over the world” — released an investigation into algorithmic bias on Instagram, collecting data from 118 survey respondents who Salty says predominantly identify as “LGBTQIA+, people of color, plus-sized, and sex workers or educators.” According to a chart produced by Salty, roughly 67% identified as LGBTQIA+. Thirty-seven percent of the 118 respondents indicated “community guidelines” violations as the reason given for content removal, while 16% claim no reason was provided at all. In addition to quantified data, Salty included commentary from respondents who felt their content was unfairly removed or censored, many stating that Instagram’s rules were unclear, ambiguous, “met with silence” upon inquiry, and bigoted.
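For concreteness, Salty’s percentages translate into approximate headcounts as follows. This is a back-of-the-envelope conversion; the rounded counts are my arithmetic, not figures published in the report.

```python
# Convert Salty's reported survey percentages into approximate
# respondent counts. The survey had 118 respondents in total.

total_respondents = 118

reported_shares = {
    "identified as LGBTQIA+": 0.67,
    "were told they violated 'community guidelines'": 0.37,
    "were given no reason for content removal": 0.16,
}

for description, share in reported_shares.items():
    count = round(total_respondents * share)
    print(f"~{count} of {total_respondents} respondents {description}")

# ~79 of 118 respondents identified as LGBTQIA+
# ~44 of 118 respondents were told they violated 'community guidelines'
# ~19 of 118 respondents were given no reason for content removal
```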
Neither Facebook nor Instagram has confirmed the existence of a tactic explicitly termed “shadowbanning,” but both have released statements implying that content can be suppressed even when it does not violate community guidelines.
Facebook and Instagram have provided no official resolution. Salty petitioned Facebook for community engagement on policy development, but as of the publication of its algorithmic bias study in October 2019, Salty states: “Facebook [has] shown no indication that they plan (or ever planned) on actually meeting with us to discuss policy development.” Salty, however, intends to keep pressing the issue on behalf of the communities that continue to face content suppression.
Suppression of media and representative content is a familiar battle for librarians. Libraries face constant challenges over what material is “appropriate” for their collections. As with all such challenges, it is critical that librarians and other information professionals continue to push back against censorship and other methods of restricting information. Addressing the problem takes several forms.
In the face of information suppression, awareness is crucial. Publishing information like this ensures that issues faced by queer and other marginalized people do not go unnoticed or underreported. Content suppression, like other forms of censorship, must be recognized for what it is: a tool frequently used to disenfranchise already marginalized people. By sharing this news, librarians help people affected by the issue understand that their experience is not isolated.
This in turn helps catalyze change. Affected users can produce quantifiable metrics, data, and statistics that better illustrate both the problem and its impact on them. Further, they are building expanding networks that work together to identify platforms and services that serve them better.
Libraries with the resources and capacity can provide spaces that let content flow more easily from creators to their various publics. Digital and physical spaces that destigmatize queer content and invite users to promote their social and artistic practices can serve as repositories for raising awareness. Additionally, programming that centers queer content and events can provide alternative publicity and physical sites where creators can actively engage their audiences. Librarians can also push their institutions’ collection practices to ensure engagement with and representation of disenfranchised voices.
By not only making issues of content suppression known, but also creating opportunities for disenfranchised voices to flourish in community, print, and digital spaces, librarians can play a meaningful role in challenging censorship.

Kevin Talmer Whiteneir Jr. is an interdisciplinary artist and art historian whose work discusses the relationships between gender and queer experiences as they relate to race, the effects of (neo)colonialism, and its parallels with magic, religion, and witchcraft. Whiteneir holds a master’s degree in Modern and Contemporary Art History, Theory, and Criticism from the School of the Art Institute of Chicago and is a Master of Library and Information Science candidate at the University of Illinois at Urbana-Champaign.
In his work, Whiteneir addresses nuanced issues of social and institutional inequity and works to provide a platform for marginalized and disenfranchised voices. His art historical scholarship often focuses on relationships between gender and queerness as they relate to religion, spirituality, race, and (neo)colonialism in national and international contexts. Whiteneir often threads this into his work illuminating practices of censorship and marginalization in galleries, libraries, archives, and museums. He is experienced in collections and archival management, institutional practices in diversity, inclusion, and equity, and varied genres of contemporary art history.
For more information visit: queerheresies.com