My Algorithmic Community: Considering the Role of Personalizing Algorithms in Online Community Formation

On October 31, 2019, I received some of the happiest news I’d gotten in a while: after a long hiatus, emo rock band My Chemical Romance (the band that guided me through my awkward adolescent years) announced, via Twitter, their plans to reunite. Of course, I immediately tweeted my excitement and made favoriting and retweeting similar content a top priority on Twitter. I wasn’t the only one. My feed, usually a steady blend of political news and NBA updates, was now dominated by the conversations of fans like myself coming together to collectively express our excitement over the news.

While I appreciated bonding with other MCR fans over our memories of their music and our excitement for things to come, I didn’t seek these folks out. By that, I mean that I, as the user, did not actively pursue or consciously locate the accounts of other fans. Yet there they were on my Twitter feed, ready to reminisce. They, and the online community they represent, came to me. Or, to be more precise, Twitter’s personalizing algorithms brought them to me.

WHAT ARE ONLINE COMMUNITIES?

The term “community” carries with it a rich and robust history. While it may traditionally invoke visions of a close-knit township where everyone knows everyone, the internet has demonstrated that community isn’t and shouldn’t be restricted to geographical boundaries. Raymond Williams suggests in Keywords that we can understand community as “the quality of holding something in common [or]… a sense of common identity and characteristics” (75). Online communities are not simply built around interest in a particular topic; rather, like communities of physical space, they are equally bound by certain values and ideologies. This better ensures that contributions will be read in ways accepted by community members. Howard Rheingold writes that these values emerge from an observed collective good that the community believes binds all isolated users into community members (xxviii). Discussions, content, and information shared in online communities, then, contribute to that collective good, however the community decides to define it.

WHAT ARE ALGORITHMS?

In a general sense, algorithms can best be explained as “a set of instructions for performing a task or solving a problem” (Gallagher 25). In the days following the dot-com crash of the early 2000s, the problem was maintaining a website that produced a steady income. In response, sites like Amazon experimented with data-tracking software (a.k.a. “cookies”) to better understand each individual user and suggest products and services to buy. Using this as a springboard, other websites, including Google, Facebook, Twitter, and YouTube, began selling user-provided data to outside advertisers. In addition to commodifying user data, websites used this data to categorize individuals into what John Cheney-Lippold calls “measurable types.” Ranging in specificity from broad categories like gender, age, and location to more specific categories like political affiliation, family dynamics, and (even) music preferences, measurable types inform what content a personalizing algorithm prioritizes on an individual user’s feed. For social media sites especially, this can greatly impact the accounts and pages a user is encouraged to join or interact with. Returning to my earlier example: while there were other music-related stories emerging around this time, I wasn’t tweeting about them, so at no point was I prompted or invited to participate in the discussions occurring in those online communities. You could say that I was assigned a new measurable type: My Chemical Romance fan. And my Twitter feed was proof.
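
The underlying mechanics can be sketched in a few lines of code. The Python below is a deliberately simplified, hypothetical illustration, not any platform’s actual system: the labels, signals, and threshold are all invented. But the flow (tally tracked behavior, assign labels, reorder the feed) mirrors the measurable-type logic described above.

    # A toy sketch of "measurable types": tracked interactions are tallied,
    # and the user is assigned whichever topic labels cross a threshold.
    # All labels, signals, and thresholds here are invented for illustration.
    from collections import Counter

    def assign_measurable_types(interactions, threshold=3):
        """Map a log of (action, topic) pairs to a set of inferred labels."""
        tallies = Counter(topic for _action, topic in interactions)
        return {topic for topic, count in tallies.items() if count >= threshold}

    def personalize_feed(posts, user_types):
        """Reorder posts so those matching the user's inferred types come first."""
        return sorted(posts, key=lambda post: post["topic"] in user_types, reverse=True)

    # One retweet-heavy evening is enough to earn a new label.
    log = [("retweet", "my_chemical_romance")] * 4 + [("like", "nba")] * 2
    types = assign_measurable_types(log)  # {'my_chemical_romance'}
    feed = personalize_feed(
        [{"topic": "nba", "text": "Trade deadline rumors"},
         {"topic": "my_chemical_romance", "text": "Reunion tour dates!"}],
        types,
    )
    print(feed)  # the MCR post now sits at the top of the feed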

SHAPING COMMUNITIES ALGORITHMICALLY

In the presence of personalizing algorithms, users may still have access to communities (meaning that they can likely join a community, assuming it’s open); however, their initial exposure proves limited, as algorithms will likely filter out communities not aligned with how the algorithm perceives the user. We could even go so far as to suggest that, under the guiding hand of personalizing algorithms, online communities have the potential to transform into what Eli Pariser calls a “filter bubble” (Pariser). And though these algorithm-designed filter bubbles permit some content while excluding other content, this goes beyond the simple presentation of articles in a trending feed: it shapes which communities a user may see or engage with at all. Rune Vejby and D.E. Wittkower even warn that such networked technology, including personalizing algorithms and the filter bubbles they create, “tricks users into [believing] that he controls his own life while it actually involves only virtual control, virtual agency, and virtual community” (107). Like physical communities, online communities too are influenced by factors outside the community’s control, as well as outside the control of (potential) members.
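
Pariser’s metaphor can be rendered as a one-line filter. In this minimal sketch (the community names and data structure are mine, purely illustrative), the platform hosts many communities, but a user typed a certain way only ever sees the matching slice:

    # A minimal sketch of a "filter bubble": the platform hosts many
    # communities, but each user is shown only those matching the labels
    # already assigned to them. All names here are illustrative.
    def visible_communities(all_communities, user_types):
        """Filter the full set of communities down to the user's bubble."""
        return [c for c in all_communities if c["topic"] in user_types]

    communities = [
        {"name": "MCR Fans", "topic": "my_chemical_romance"},
        {"name": "NBA Talk", "topic": "nba"},
        {"name": "Local Politics", "topic": "politics"},
    ]

    # A user typed only as an MCR fan never encounters the other two groups,
    # even though nothing formally prevents them from joining.
    print(visible_communities(communities, {"my_chemical_romance"}))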

WHY THE ALGORITHMICALLY-FORMED COMMUNITY MATTERS

We, as instructors of rhetoric and composition, must address the algorithmic formation of online communities to fully realize social justice pedagogy in the 21st-century classroom. Natasha Jones reminds us that social justice pedagogy strives to address, head on and in a collaborative setting, how “communication broadly defined can amplify the agency of oppressed people –those who are materially, socially, politically, and/or economically under-resourced” (347). Though she crafts this definition through the lens of technical and professional communication pedagogy, her ideas apply just as strongly to other composition-based classrooms. We can (and I believe must) view the classroom as a space to critically investigate how our language (used broadly to account for both the individual user and users as a collective body) influences the ways individuals are perceived and ultimately sorted into communities online through the guiding code of personalizing algorithms.

Not only can algorithms complicate the communities a user is prompted to join; they also limit the other communities (those the user may have less interest in, based on the site’s collected data) a user can see and potentially engage with. Elenore Long emphasizes that, in traditional understandings of the term, a community is likely to interact with others when advocating for its interests regarding an issue that impacts several communities in some way (9). Algorithms, though, filter out necessary exposure to other existing communities, making it difficult for users to appreciate how something may impact individuals outside of their immediate online community. Rather than providing users, as both users and members, with all affected communities regarding a shared topic of interest, these sites choose to prioritize the communities the algorithm determines “worthy of this attention” from a user (Butera 209). What proves frustrating about the changing nature of community as it takes residence online is that online platforms theoretically provide members “extraordinary and ever-growing opportunities for exposure to diverse points of view, and indeed increased opportunities for shared experiences and substantive discussion” (Sunstein 214). Not only, then, do users have the opportunity to become members of various online communities, but they should be encouraged to put those community-fostered understandings to use when engaging with members of other communities around a shared topic of interest; because of algorithm-imposed limitations, however, this critical engagement with other communities becomes rather inaccessible.
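
One way to picture this limitation: even communities discussing the same shared issue can be filtered apart. The sketch below is hypothetical (the scoring function, weights, and cutoff are invented, not Butera’s or any platform’s), but it shows how an engagement-weighted notion of “worthy of attention” can surface only one side of a multi-community issue:

    # Hypothetical attention scoring: both communities below discuss the
    # same venue-policy issue, but only the one the model scores highly
    # for this user ever surfaces. Weights and cutoff are invented.
    def attention_score(community, user_types):
        overlap = len(community["topics"] & user_types)
        return overlap * community["engagement_rate"]

    def surfaced(communities, user_types, cutoff=0.5):
        return [c["name"] for c in communities
                if attention_score(c, user_types) >= cutoff]

    issue_communities = [
        {"name": "Fans for Venue Safety",
         "topics": {"my_chemical_romance", "venue_policy"},
         "engagement_rate": 0.9},
        {"name": "Neighborhood Association",
         "topics": {"venue_policy", "local_politics"},
         "engagement_rate": 0.7},
    ]

    # The user typed only as a music fan sees just one of the two affected
    # communities, despite their shared stake in the issue.
    print(surfaced(issue_communities, {"my_chemical_romance"}))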

Algorithms do not operate objectively in isolation. As Safiya Noble brilliantly observes, personalizing algorithms draw from the aggregate of online behavior and habits provided by all users. This means that the content a user sees, from search results on Google to suggested accounts to follow on Twitter, is based, in part, on existing social perceptions. Noble warns that this method of privileging certain content perpetuates harmful racist and sexist stereotypes. Her concerns equally apply to the formation of online communities, as the accounts, pages, and groups suggested to users on most online platforms reflect not only existing biases and understandings but destructive hegemonic perceptions as well. On one hand, online platforms can (and do) provide spaces for members of underserved, oppressed, and disenfranchised communities; on the other, they can foster community among those embracing and perpetuating concerning social and political ideologies.
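
The aggregation Noble describes also compounds over time. The toy loop below (the numbers are invented, and real ranking systems are vastly more complex) shows the feedback dynamic: ranking purely by accumulated clicks means whatever the aggregate already favors gathers still more clicks, entrenching the initial preference, and whatever biases produced it, as the default every new user sees.

    # A toy popularity feedback loop: only the top-ranked suggestion is
    # shown, so only it accumulates new clicks. Numbers are invented.
    def rank_by_clicks(items):
        return sorted(items, key=lambda item: item["clicks"], reverse=True)

    suggestions = [{"name": "Group A", "clicks": 1000},
                   {"name": "Group B", "clicks": 10}]

    for _round in range(3):
        top = rank_by_clicks(suggestions)[0]  # only the top result is shown...
        top["clicks"] += 100                  # ...so only it gathers new clicks

    # Group A's head start, whatever social dynamics produced it, is now
    # locked in as the community every new user is steered toward.
    print(rank_by_clicks(suggestions))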

As rhetoricians and compositionists of the 21st century, many of us constantly work toward thoughtfully examining the intersections of technology, writing, and culture. Our students (and we too, let’s not forget) do not assemble or locate online communities on our own. The sites we visit, even when we’re not actively on them, carefully track our online behaviors to categorize us as users into actionable measurable types that privilege the presence of specific online communities we’re then encouraged to join and engage with. Examining the role personalizing algorithms play in online community formation (including the discourse practices and general nature of these communities) provides an opportunity for students to more critically consider how lines of code can ultimately reinforce and/or resist hegemonic social constructs as they occur both online and IRL.

WORKS CITED

Butera, Michael V. “Gatekeeper, Moderator, Synthesizer.” Facebook and Philosophy: What’s on Your Mind?, edited by D.E. Wittkower, Chicago: Open Court, 2010. Print.

Cheney-Lippold, John. We Are Data: Algorithms and the Making of Our Digital Selves. New York: New York UP, 2017. Print.

Gallagher, John R. “Writing for Algorithmic Audiences.” Computers and Composition, vol. 45, 2017, pp. 25-35.

Jones, Natasha N. “The Technical Communicator as Advocate: Integrating a Social Justice Approach in Technical Communication.” Journal of Technical Writing and Communication, vol. 46, 2016, pp. 342-361.

Long, Elenore. Community Literacy and the Rhetoric of Local Publics. West Lafayette: Parlor Press, 2008. Print.

Noble, Safiya Umoja. Algorithms of Oppression: How Search Engines Reinforce Racism. New York: New York UP, 2018. Print.

Pariser, Eli. The Filter Bubble: What the Internet Is Hiding from You. New York: Penguin Press, 2011. Print.

Rheingold, Howard. The Virtual Community: Homesteading on the Electronic Frontier. Cambridge: MIT Press, 1993. Print.

Sunstein, Cass. #Republic: Divided Democracy in the Age of Social Media. Princeton: Princeton UP, 2017. Print.

Vejby, Rune, and D.E. Wittkower. Facebook and Philosophy: What’s on Your Mind?, edited by D.E. Wittkower, Chicago: Open Court, 2010. Print.

Williams, Raymond. Keywords: A Vocabulary of Culture and Society. Oxford UP, 1976. Print.

Author

Lacy Hope is a PhD candidate at Washington State University, where she examines the intersections of digital technology, public discourse, and capitalism.
