By: Rebecca Hill
An Interview with Sarah McGrew, Co-Director, Civic Online Reasoning Project, Stanford University
If you think that students are proficient in evaluating online information just because they are always on the Internet and social media, you’d be dead wrong. In fact, says Sarah McGrew, Co-Director of the Civic Online Reasoning Project, they are not that good at it.
The Project is part of Stanford’s History Education Group (SHEG). In a 2016 study, the Project found that students were largely unable to evaluate online information. Since then, SHEG has continued to run assessments and pilots measuring how well students evaluate what they find online.
In the 2016 study and since, middle school, high school, and college students were asked to 1) identify who was behind the online information presented, 2) evaluate that information, and 3) investigate what other sources said about it. Across the tasks, students struggled to evaluate what they were shown. When middle school students were asked to distinguish a traditional ad from a news story, more than 80% believed an ad labeled “sponsored content” was a real news story.
In another instance, students were asked to consult multiple sources to verify a claim from minimumwage.com that “paying American workers more would result in increased food prices and unemployment.” Students could use any online source to determine whether the website was reliable. Fifty-eight college students and ninety-five AP U.S. History students completed the task, yet only 6% of the college students and 9% of the high school students identified the source of the claim. A majority accepted the website as reliable even though it was a “cloaked website,” i.e., one that conceals its real identity.
Unfortunately, McGrew says, students in the continuing assessments and pilots have not improved. “We continue to see problematic reasoning from students,” said McGrew, “even though we are still seeing that students are receptive to the strategies that came out of the study.” Why are they receptive? “Because students are like anyone else. They are not interested in being duped by information online,” says McGrew. What needs to change is how we teach the evaluation process.
Many teachers use checklists to help students evaluate information. These lists, says McGrew, fail to account for just how sophisticated the web has become and how clever those who exploit it are. So, to develop better evaluation strategies for students, McGrew’s group turned to the experts: professional factcheckers. What they found was fascinating. Where most students relied on a search engine like Google to rank their results and then clicked the first link on the list, professional factcheckers used a process McGrew calls “lateral reading.”
“Factcheckers, when they landed on a site or post that they weren’t familiar with, immediately prioritized who was behind the website or who sponsored it,” said McGrew. “So instead of staying within that website, reading its articles or looking at the site’s page design, something we see students do all the time, factcheckers left the site and opened additional tabs to search the organization itself.” Most factcheckers considered reading a website’s articles at that stage a waste of time. Why? Because they did not yet know whether the website itself was a reliable source.
Factcheckers also weighed the shifting information landscape and the deceptive design tricks websites use to entice people to click and to trust their content regardless of the source. Why? Because “the information landscape has dramatically changed. Our problems have amplified because our political climate is more controversial,” said McGrew. As a result, people engage in what McGrew called “motivated reasoning”: accepting only the views that align with their existing opinions.
The landscape has changed in another way as well: social media algorithms often prioritize what we see based on what we have viewed in the past, so we tend to encounter only information we already agree with. The real problem with current strategies, says McGrew, is that Internet technology has advanced too rapidly for education to keep up. As a result, evaluation strategies must become more rigorous.
Another strategy that professional factcheckers practiced was “click restraint.” Students often assume that search engines like Google sort the most reliable websites to the top of the page, says McGrew. Factcheckers, however, knew that this was not the case. “So, they scrolled down the page of search results, often clicking on the second or third page of search results, using the summary snippets to help them determine what site was most authoritative,” said McGrew, “doing all this before they clicked on a link to a site.”
The last strategy factcheckers employed was mining Wikipedia for reliable sources. Wikipedia is often dismissed as unreliable for primary sourcing, but factcheckers used its references section, where they found more authoritative sources that held up under scrutiny.
Unfortunately, students have reported mixed messages, says McGrew: some teachers show them how to use Wikipedia wisely, while other teachers hold conflicting views about using it at all. This creates a dilemma for students. While Wikipedia remains a “sticky wicket,” sourcing from its references section, once evaluated, may be helpful.
In a recent Science Advances study, researchers suggested that a lack of media literacy may explain why fake news was shared so heavily by social media users over age 65, who shared nearly seven times as many articles from fake news domains as the youngest users. As a result, the study advocated greater educational efforts “targeted at people in different age groups and with varying level of technological skill” as something that might reduce the spread of online misinformation.
So perhaps a good first step, if you aren’t doing it already, is to teach online media literacy for all age levels at the library. Librarians play a critical role in educating students and patrons on strategies for evaluating online information. “An informed citizenry is the bedrock of democracy, and over the short term, online information can impact our decisions and actions,” said McGrew. “They can gradually shape the way we view an issue that can ultimately impact our decision.” We must remember, says McGrew, that the quality of our decisions affects not only ourselves and our communities but society as a whole.
Right now, McGrew’s team is developing lesson plans on evaluating online information for all content areas. The plans will be available free to teachers and librarians on the Project’s website.
Rebecca Hill is a freelance writer who covers libraries, literacy, science education, and other topics for a variety of online and national magazines. She currently writes a science education column for VOYA magazine. She holds an MLS from Indiana University Purdue University and a JD from Valparaiso University. Her interest in intellectual freedom has been piqued by the rise of technologies such as artificial intelligence and social media. She serves on the Indiana Library Federation Board of Directors, the Purdue University Libraries Dean’s Council, and the Library Board of Trustees for her local library. A longtime advocate of libraries, she is passionate about reading, writing, and all things words.