Surveillance. Censorship. Disinformation. Distrust. The information war marches on, whether you’re ready to admit it or not.
Information warfare poses a unique challenge to everyday civilians, who are actively recruited as cybercombatants and also unwittingly serve as PSYOP assets. This post follows “An Invitation to Danger: Perspectives on Intellectual Freedom and Information War” with specific suggestions for safeguarding one’s own mind and seeking clarity in the “fog and friction” of information warfare.
Defend Your Privacy
Psychological operations (PSYOPs) are a strategic element of information warfare. PSYOPs seek to influence the thoughts, beliefs, and behaviors of their targets through the directed dissemination of information and propaganda. Recent research on PSYOP techniques includes geofencing, in which propaganda is targeted based on location data; neurocognitive hacking, the use of targeted subliminal triggers to induce state anxiety; and the broader approach of sentiment manipulation, in which curated feeds and notifications are used to influence people’s emotional states.
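To make the geofencing mechanism concrete, here is a minimal sketch of the kind of location check that underlies such targeting: content is served only when a user’s reported coordinates fall inside a radius around a point of interest. All coordinates, names, and the radius below are hypothetical illustrations, not drawn from any actual system.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two points on Earth, in kilometers.
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def in_geofence(user_lat, user_lon, fence_lat, fence_lon, radius_km):
    # True when the user's reported location falls inside the fence --
    # i.e., when this user would be served the location-targeted content.
    return haversine_km(user_lat, user_lon, fence_lat, fence_lon) <= radius_km

# Hypothetical fence: 5 km around a city square.
fence = (40.7484, -73.9857, 5.0)
print(in_geofence(40.7580, -73.9855, *fence))  # a nearby user
print(in_geofence(51.5074, -0.1278, *fence))   # a distant user
```

The point of the sketch is how little is needed: a stream of location data plus a simple distance test is enough to decide who sees a given message.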
“The more that’s known about you, the softer a target you are.”
In our four-part Creative Commons-licensed Privacy Workshop Series, my collaborator Alexandria Chisholm and I explore the positive case for privacy in the human experience. In the Digital Wellness Workshop, we take a closer look at attention engineering through persuasive design, sentiment manipulation, and destructive behaviors like doomscrolling, while detailing strategies to defend privacy for personal wellbeing across the lifespan.
Practice Ladder Reading, Not Just Lateral Reading
Lateral reading, the practice of using third-party information to verify claims in a source, emerged as a strategy to detect fake news following the 2016 US election. As a complement to vertical reading – reading a source from beginning to end – lateral reading strategies seek to validate sources and establish consensus among additional news stories, fact checkers, and reference sites.
Lateral reading alone does not offer sufficient protection from information warfare. First, contemporary journalism practices like news aggregation contribute to a false sense of consensus, as multiple news platforms carry the same story while contributing little-to-no original reporting or verification. Furthermore, authoritative information sources like news media are themselves targeted as critical assets in information war.
Second, as fact checkers wade into a polarized political terrain, they are critiqued as ideologically biased by both liberals and conservatives. And third, the go-to open-access reference source, Wikipedia, has been the target of manipulation campaigns ranging from low-key trolling to information suppression by powerful corporate and political entities.
There are no shortcuts to sound information-seeking; like protecting your privacy, it is a dynamic cat-and-mouse game between those working for truth, transparency, and accountability, and those working to create confusion, division, and chaos. The advice I offer my students is,
“Never let anything come between you and the primary source.”
Practice ladder reading, not just lateral reading – navigating upstream from the news report to the primary interview recording, transcript, event livestream, original research, or government document. This approach is directly inspired by the Society of Professional Journalists’ own Code of Ethics, which details steps journalists should take to “seek truth and report it.”
“Be a Fox, Not a Hedgehog”
Writing for Heterodox Blog, sociologist Ilana Redstone recalls an observation by the Greek poet Archilochus, later popularized by Isaiah Berlin:
“a fox knows many things, but a hedgehog one big thing.”
Redstone describes the fox-like attribute of open-mindedness, including the intellectual humility to recognize the limits of one’s own knowledge, the intellectual diligence to seek out additional information and opposing viewpoints, and the intellectual courage to change one’s mind in the face of indisputable facts.
In her 2019 paper for the FIRE Conference, Redstone suggests a simple strategy for mastering many ways of knowing:
“explicitly distinguish interpretations from the truth.”
As advocates for intellectual freedom, library workers have a dual allegiance: to the provision of reliable information sources and to the protection of individuals’ rights to freely interpret those sources. In the context of information warfare, preserving a contested sphere of interpretation is more protective of the truth than yielding to powerful information authorities who seek to manipulate the knowledge record through suppression, censorship, and propaganda.
But how should we know when to change our minds? Be alert for symptoms of cognitive dissonance – a sense of discomfort, unease, or frustration that occurs when our deeply held thoughts, values, and beliefs are challenged by new information.
It is tempting to respond to cognitive dissonance with coping strategies – avoidance, denial, rationalization, or compartmentalization – that allow us to maintain our existing beliefs despite new evidence. But an approach more resilient to information warfare restores cognitive consonance by seeking to understand the new information, with a willingness to change one’s mind. Practice the Philip K. Dick reality check:
“Reality is that which, when you stop believing in it, doesn’t go away.”
Ask the Right Questions
In his genre-defying novel, Gravity’s Rainbow, Thomas Pynchon puts forth “Proverbs for Paranoids,” including:
“If they can get you to ask the wrong questions, they don’t have to worry about the answers.”
Among the most effective, and hardest to detect, information warfare tactics is to simply omit something from the record. This is why a robust fourth estate, protected both by First Amendment law and by a popular culture of free speech, is so critical to holding the powerful to account.
The ‘right’ questions are often relative, but some universal inquiries worth considering are:
- What do I believe or value deeply about this issue, and how might that affect the way I receive this information?
- What evidence would convince me this is true (or untrue)?
- What evidence, perspective, or analysis is missing?
- Is this a cause-and-effect relationship, or a correlation? What alternate explanations could account for this?
- What rhetorical strategies are being used? Am I being persuaded with evidence and reasoning, or manipulated with logical fallacies?
- Am I being called to action? If so, is that action consistent with my values?
- Am I willing to change my mind? If not, can I accept that I might be wrong?
Take a Trust Pause
In her book, Who Can You Trust? How Technology Brought Us Together and Why It Might Drive Us Apart, Rachel Botsman addresses the rise of misinformation and the decline of trust in institutions by advocating a “trust pause”:
“Is this person, product, company or piece of information worthy of my trust?”
In life as in information warfare, this question is always worth asking.
Sarah Hartman-Caverly, MS(LIS), MSIS, is a reference and instruction librarian at Penn State Berks, where she liaises with Engineering, Business and Computing programs. Prior to her current appointment, Sarah was a reference and instruction librarian at a community college, and was an electronic resources manager and library system administrator in both community and small liberal arts college settings. Sarah’s research examines the compatibility of human and machine autonomy from the perspective of intellectual freedom. Recent contributions include “Version Control” (ACRL 2017), “Our ‘Special Obligation’: Library Assessment, Learning Analytics, and Intellectual Freedom” (ACRL 2018), and “Human Nature is Not a Machine: On Liberty, Attention Engineering, and Learning Analytics” (Library Trends, 2019). She earned her MS(LIS) and MSIS from Drexel University in 2011.