C&W 2019 Saturday Keynote: “Click *Here* If You Agree: Opting Out of Oppressive Systems”

As we think about how the technologies we use extract our data, some of us might consider how our browsing in one space produces a curated advertising experience in another. As irritating as that experience is, it is only the tip of a much larger and deeper iceberg. Chris Gilliard’s Saturday keynote at Computers & Writing is a sobering and timely reminder that many technologies can use our data to create and maintain systems of oppression. For writing teachers looking to use technologies in the classroom, conversations about data extraction have never been more critical, yet the realities of data extraction often fly under the radar, or we emphasize the rhetorical opportunities of certain technologies without interrogating the risks and ethical considerations of using them. Even more troubling is the role that extractive technologies can play in education; as Gilliard intimates, if corporations can use data for gatekeeping and filtering, so can universities. As an audience member, I was struck not only by what technology could do with our data but also by what it already does.

Gilliard, a Professor of English at Macomb Community College, begins his keynote with a look at Facebook’s patent for facial recognition. This technology, he explains, can scan customers’ faces and determine whether they are worthy of purchasing certain items based on metrics such as credit scores and social media scores. To contextualize the oppressive potential of this technology, Gilliard pairs this introduction with a quote from Ta-Nehisi Coates about someone who reacts defensively to an approaching African American man, and he compares the patent to the racist practice of locking car doors to secure oneself against a perceived threat. In essence, Facebook’s patent similarly tries to create a way to secure oneself (or one’s products) against people who are perceived as threats. For writing teachers who use social media in the classroom, this insight should give pause to assess the present and future risks they ask their students to undertake by sharing data in these spaces. It should also give scholars pause to consider the implications of extractive technology in higher education. As Gilliard notes, one university has created a public facial recognition repository based on surveillance video of students walking on campus, which companies and governments can also use to improve their own datasets. The implications such a repository has for profiling, gatekeeping, and standardization are truly frightening.

To these ends, Gilliard asks us to adopt a “guiding ethic” for using technology in the classroom. He begins by asking us to think about what our own guiding ethics are and to share them with the people at our tables. I found the attendees’ responses compelling; many were thoughtful about the consequences of technology and mindful of their most vulnerable students. Gilliard also shares his own guiding ethic: “Whenever possible, I don’t ask students to use extractive technologies.” As a writing instructor, I find Gilliard’s statement important for two reasons. First, it models for students an awareness that technologies do something with their data whenever they use them. Second, and I think more importantly, it gives students a means to opt out if they can. As instructors, we want technology to be useful and rhetorically purposeful for our students, but Gilliard’s keynote invites us to think more deeply about the repercussions of feeding the data beast, especially for students and users who are already marginalized and oppressed.

Gilliard ends his keynote by sharing a short story he wrote, titled “Markus,” about a conference device with a CFP reader. The story shows how even academic institutions, many of which strive for equality and inclusion, could use facial recognition or some other form of extractive technology to include certain people while excluding others. It also demonstrates the harm technology can do even when it aims to be inclusive, and how deeply our conceptions of technology are entrenched in oppressive systems. Informing students about how technologies extract their data and giving them the option to opt out (if they can) are important steps instructors should consider, but Gilliard’s keynote also encourages us to have these conversations at all levels of academia and research.

Author

  • Jathan Day

    Jathan Day is the Graduate Administrative and Editorial Associate for the DRC. He is a PhD candidate in the Joint Program in English and Education at the University of Michigan. Jathan's research interests include course management systems, digital literacies, online pedagogy, disability studies, and reading practices.
