Session I-9: Impacts and Challenges in Online Writing Instruction


Presenters: Sarah Young (Erasmus University Rotterdam) and Charles Woods (Illinois State University)

In this session, presenters explore students’ attitudes toward, and expectations for, privacy in online writing instruction (OWI). 

Sarah Young, “Student/User Privacy Expectations and the Online Classroom?”

In her talk, Sarah Young imagines her students not just as learners but also as users who hold certain expectations about how technology will or will not be used in their online courses. The possibility of events like Zoombombing in online meetings indicates that, as Young puts it, “the disconnect between privacy expectations and privacy realities can result in privacy crises” that disrupt students’ experiences of online courses. Young thus frames privacy as a “multidimensional practice” linked to “surveillance practices” that can cause real harm. For example, an instructor’s requirement that students keep their cameras on during class meetings might cause distress for students who don’t want classmates or their teacher to see their home spaces. 

Young poses the following questions for attendees to consider: 

Are online spaces more or less private? Are there concerns about recording sessions for students not present? What are concerns about posting online project[s] to spaces like YouTube or other public platforms? What are overall/miscellaneous concerns for online spaces?

She offers the following recommendations, grounded in her experience as an online instructor: 

  • Instructors should rethink their technology policies, especially ones that require students to keep cameras on during synchronous class sessions.
  • Instructors should be wary of exam proctoring software, which surveils students through microphone and camera functions and alerts instructors to student movements that the software marks as indicative of potential cheating behavior. Though Young doesn’t mention it, such software is also more likely to flag and report the movements of BIPOC, transgender/gender nonconforming, and disabled students (see “Anti-Cheating Software Drawing Criticism at Universities” and “Software that Monitors Students During Tests Perpetuates Inequality and Violates their Privacy” for more). 
  • If instructors are recording lectures, students may prefer that those lectures be shared internally (i.e., on the class LMS) rather than on a public forum like YouTube, especially if students’ images are in the background.
  • Instructors should develop alternatives that don’t require students to post their projects on public forums.
  • Instructors should tell students where their work will be stored on the LMS, for how long, etc.
  • Instructors should teach students about how they can protect their privacy in their online courses. For example, instructors can tell students how to blur their backgrounds when they’re on camera.

In Young’s experience, students are more concerned about the personal information that technology companies can access about them online, and less concerned about what teachers can learn about their online behaviors. Nevertheless, she suggests that teachers should be transparent and tell students what kinds of information they can access about students through the LMS (e.g., the amount of time students spend in the LMS, the date of students’ last access, etc.). 

Charles Woods, “Interrogating Digital Rhetorical Privacy Using DTC-Genetics Privacy Policies”

In his talk, Charles Woods argues that privacy is a rhetorical construct, and one that is worth having students study, too. Woods describes how he teaches students about “digital rhetorical privacy (DRP)” using direct-to-consumer genetics testing (DTC genetics) privacy policies. He defines DRP as a “state of being when a user is confid[e]nt their digital data is free from unauthorized observances by nefarious computer technologies and other users.” To Woods, DTC genetics privacy policies present opportunities for thinking about “the relationship between digital privacy and power.” These policies function to obscure what companies do with users’ data (including sharing users’ data with law enforcement to help solve crimes) as well as the relationships these companies have with religious groups and other organizations. 

Building on the work of scholars like Estee Beck and sociologist Gary T. Marx, Woods describes six privacy-related concepts, all of which he sees as rhetorical. For several of these concepts, Woods offers questions for analysts (students and instructors) of privacy policies to consider. In the table below, I have listed these six concepts, as well as corresponding questions for analysis. (Quotations indicate questions posed verbatim by Woods in his presentation.)

Concept | Questions for Analysis
------- | ----------------------
Temporality | “When was the privacy policy published? When was the last time the policy was updated? What technological or policy changes have occurred in our society that now supersede the privacy policy?”
Transparency | “What methods of data collection are inaccessible to users? What relationships does the company that publishes the privacy [policy] maintain with other companies, and state actors, and government entities? What are the potential trajectories of user data once it’s collected?”
Language/rhetoric | “What rhetorical arguments are formed by the policy? How are rhetorical appeals incorporated in the document? What complex jargon is used and why? And what rhetorical implications do these words and concepts [have] for users and non-users?”
Digital surveillance | What information is collected about users? Where/how is it stored?
Data usage | Who can use users’ data? What entities (corporate, etc.) profit from users’ data? Where can/does users’ data travel? With what consequences?
Meaningful access | How do “oppressive strategies extend beyond actions online and into users’ everyday lives?” How can privacy policies be revised to “help users better understand how power is distributed?”

After discussing these concepts and questions, and sharing excerpts from DTC genetics privacy policies, Woods then describes two potential activities for students: 

  • Students might “remediate” particular privacy policies. He recommends that students first attend to the genre conventions of privacy policy documents and their impact on (non)users before noting the changes they intend to make to “make the policy more transparent and readable.” Woods mentions that reading these lengthy privacy documents is at first a “daunting” task for a lot of students, but that they end up finding it worthwhile. 
  • Students might write a public-facing argument (for a blog or campus publication) based on their analyses of these privacy policies. 


Taken together, these presentations offer three key takeaways for online instructors: 

  • Privacy is a multidimensional rhetorical concept with material consequences. 
  • Teachers have a responsibility to protect students’ privacy in their online courses and to inform students about how to protect their privacy online. 
  • Privacy policies present opportunities for students and teachers to think through the relationship between access, power, and user experience.

Readers who are interested in learning more about privacy and online instruction might find this new book helpful for thinking through these issues: Privacy Matters: Conversations about Surveillance within and beyond the Classroom (2021).

This session is available for viewing in the 2021 CCCC conference portal through May 30, 2021.

About the Author

Charlotte Asmuth

Charlotte Asmuth is a PhD candidate in Rhetoric and Composition at the University of Louisville. Their dissertation explores how modes and standards of assessment transform as they move and are mobilized by different institutions and other entities.