Fake news, the buzzword that makes composition teachers shudder, seems to be taking over our digital spaces. The prevalence of fake news, and the ease with which it is shared on social media, make digital literacy more important now than ever.
Buzzfeed reported that from August through November 2016, fake news stories generated more engagement on social media than mainstream news stories did (Silverman, 2016). A 2016 Pew Research poll found that 64% of Americans say fake news is causing “a great deal of confusion” (Barthel, Mitchell, & Holcomb, 2016).
While this information is frightening enough, our shock is compounded by the news that our students, our “digital natives,” are not immune. A 2016 Stanford study of students’ ability to evaluate and reason about information found online revealed, with “stunning and dismaying consistency,” that students at all grade levels were unable to determine the veracity of an online source. Lead researchers Wineburg and McGrew call the study’s findings “bleak” (Stanford History Education Group, 2016).
Though the skill of evaluating print and online sources is a common subject in first-year writing courses, our methods are often dated and primitive, particularly in terms of digital media evaluation. Our textbooks tell students that they can determine whether a website is valid simply by checking (1) the site’s domain, (2) its “About” page, (3) the date it was last updated, (4) whether it contains a “Contact Us” link, and (5) whether it refrains from excessive exclamation marks and capitalization.
These criteria are covered in first-year writing courses so thoroughly that even my current students can easily recite them when asked what they know about evaluating online sources. Yet these same criteria led my students astray a few years ago: applying them, they determined that a webpage I supplied in class was accurate, when in fact it was a Holocaust-denial page created by the white supremacist group Stormfront. I was shocked by my students’ seeming blindness to the facts, but upon further reflection, I realized that these criteria are nowhere near enough.
What can teachers do to improve how we train students to be critical consumers of digital media? The first step, discussed by Wineburg and McGrew (2016), is to avoid making assumptions about our students’ abilities online. The term “digital native” is misleading: although students have grown up in the Internet age, they are not naturally internet-savvy. The authors suggest borrowing techniques from professional fact-checkers when teaching students how to become better investigators of online content.
The Stanford study found that students tend to read a site “vertically,” as if it were a printed text, whereas fact-checkers read “laterally,” opening new tabs to investigate the author, the sponsor, and other links from the page. Students frequently treat the “About” page as their only source of information about a site’s creators, while fact-checkers ignore it almost entirely, preferring to cross-check content on other sites.
Additionally, students often choose a site from Google search results based on its placement, erroneously assuming that links at the top of a results page are more credible, while fact-checkers ignore the order of results entirely (Wineburg & McGrew, 2016).
In ENGL 17000 Research & Argumentation, a second-semester first-year writing course, I start our study of media literacy by asking students to rank the following URLs in order of authority:
- http://www.sitename.com.co (“The credibility challenge”)
From this exercise, we begin to discuss the different purposes websites serve, and how a website can appear credible while in fact pursuing an agenda that compromises the quality of its content.
From here, students can see that simply looking at the site’s “About” or “Contact Us” page is insufficient. We then discuss the validity of Google search results, and do a practice search on the projector screen to illustrate that the top results are not necessarily the “best” results.
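The trick behind the `.com.co` address in the ranking exercise can be made concrete. The sketch below, in Python using only the standard library, parses the placeholder URL from the exercise (the `sitename` host is a stand-in, not a real site) and shows that the “.com” is buried inside the hostname while the actual top-level domain is `.co`:

```python
from urllib.parse import urlparse

# Placeholder URL modeled on the exercise above: the ".com.co" ending
# mimics a familiar ".com" address, but the registered TLD is ".co".
url = "http://www.sitename.com.co/article"

host = urlparse(url).hostname     # "www.sitename.com.co"
tld = host.rsplit(".", 1)[-1]     # "co" -- the actual top-level domain
disguised = ".com." in host       # True: ".com" sits mid-hostname, not at the end

print(f"host={host} tld={tld} disguised={disguised}")
```

Note that reliably extracting the registered domain for multi-part suffixes such as `.co.uk` requires a public-suffix list rather than a simple split; the point here is only that the visible “.com” is not the domain students think it is.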
I then ask students to get into groups and view selected websites, delivered through a page on our university’s course management system, Canvas. I ask students to read through each link carefully and determine whether it presents:
- fake information, trying to mislead the reader;
- biased information, trying to push an agenda on the reader while not representing alternate perspectives accurately;
- misleading information, not using sufficient information to support the point being made, leaving its validity unclear;
- satirical information, trying to use humor to make a point about a topic; or
- valid information, using sufficient factual information and representing all perspectives accurately.
For students to do this successfully, I ask them to consider the following:
- Assess your emotional reaction. Did the source make you angry? Did it make you hope it was true? Often, sources that elicit powerful emotional reactions are questionable.
- Consider the source. What information can you find out—on other sites—about the author? The sponsoring organization? The sources cited by and linked from the site? Avoid looking just at the “About” page, as anyone can make their “About” page sound credible.
- Cross-check information. Can you find the same information on other sites, reported in the same way? If so, it is more likely to be valid. If no other site is reporting it, or if other sites report it from a very different perspective, the site is questionable.
- Use a fact-checking website. Find out what professional fact-checkers say about this information. Use any of the following sites:
- Other fact-checking resources run by professional journalism organizations (NPR, Washington Post, etc.) (“Ten questions for fake news detection”)
Then, in their written response about each source, I ask students to identify which of the five categories above the link falls into, and to provide at least two pieces of evidence supporting that claim. This keeps students from settling for a gut reaction and instead pushes them to actively investigate the source.
While I plan to continually update these sources each time I teach this lesson, here are a few sites I have used:
When designing a lesson on fake news detection, it is important to remember that our students need guidance in using our most powerful information resource. Simply growing up online does not make one adept at spotting questionable content; encouraging students to challenge a digital text and to question its validity in several ways can help us train students to be media literate in the age of online fake news.
Barthel, Michael; Mitchell, Amy; & Holcomb, Jesse. (2016, December 16). Many Americans believe fake news is sowing confusion. Pew Research Center. Retrieved 18 Feb. 2017 at http://www.journalism.org/2016/12/15/many-americans-believe-fake-news-is-sowing-confusion/
Silverman, Craig. (2016, November 16). This analysis shows how viral fake election news stories outperformed real news on Facebook. BuzzFeed News. Retrieved 18 Feb. 2017 at https://www.buzzfeed.com/craigsilverman/viral-fake-election-news-outperformed-real-news-on-facebook
Stanford History Education Group. (2016). Evaluating information: The cornerstone of civic online reasoning. Retrieved 18 Feb. 2017 at https://sheg.stanford.edu/upload/V3LessonPlans/Executive%20Summary%2011.21.16.pdf
Ten questions for fake news detection. (n.d.). The News Literacy Project. Retrieved 18 Feb. 2017 at http://www.thenewsliteracyproject.org/sites/default/files/GO-TenQuestionsForFakeNewsFINAL.pdf
Wineburg, Sam, & McGrew, Sarah. (2016, November 1). Why students can’t Google their way to the truth. Education Week. Retrieved 18 Feb. 2017 at http://www.edweek.org/ew/articles/2016/11/02/why-students-cant-google-their-way-to.html