Composing Algorithms: Writing (with) Rhetorical Machines

CALL FOR PROPOSALS – Special Issue of Computers and Composition

Guest Editors: Aaron Beveridge (UNC-Greensboro), Sergio C. Figueiredo (Kennesaw State University),
and Steven K. Holmes (George Mason University)

This special issue of Computers and Composition opens a conversation about the importance and centrality
of computer algorithms in digital writing environments. As an extension of the work already underway in
procedural rhetorics and software studies, exemplified in scholarship such as James J. Brown Jr.’s Ethical
Programs and Wendy Chun’s Programmed Visions, this special issue prompts researchers to examine not
“The Algorithm” in composition studies but specific network affects/effects for the many algorithms that
already reconfigure how students invent, write, deliver, and receive meaning in digital networks. As
Brown Jr. explains, “Machinic understandings of narratives and arguments allow us to gain insight into
the robot writers that have joined our networked conversations and also present us with strategies for
mediating the worldviews of narrative and database” (“Introduction: The Swarm”). Updating Kenneth
Burke’s mantra, we might now declare, “Wherever there is algorithm, there is rhetoric.”

Algorithms offer complications for many of our models of digital writing. While scholarship in digital
and visual rhetoric challenges “instrumental” definitions of rhetoric and writing (Gries, “Mapping Obama
Hope”) to include the ways in which digital rhetoric is circulated and remixed online (Ridolfo and
DeVoss, “Rhetorical Velocity”), more recent work interrogates the very platforms and systems themselves
(Edwards and Gelms, “The Rhetorics of Platforms”). Just as there is a broad diversity in the types and
uses of algorithms, there is also a wide range of consequences and possibilities that result from their
mediating function. It is well known that many “socially consequential mechanisms of classification and
ranking, such as spam filters, credit card fraud detection, search engines, news trends, market
segmentation and advertising, insurance or loan qualification, and credit scoring” rely on computational
algorithms. While such algorithms often rely on “machine learning,” they also differ dramatically from
more common algorithms that do not “learn” from network interactions (Burrell, “How the Machine
‘Thinks’”). In what would be an ethical rhetorical heuristic by another name, Jenna Burrell identifies three
different dimensions to algorithmic opacity: “(1) opacity as intentional corporate or state secrecy, (2)
opacity as technical illiteracy, and (3) an opacity that arises from the characteristics of machine learning
algorithms and the scale required to apply them usefully.” Here, we propose adding a dimension of
“algorithmic literacy,” not as an external or supplementary concern but as a fundamental aspect of writing
and rhetorical scholarship. For instance, as Kevin Brock and Dawn Shepherd argue in their Winter 2016
Computers and Composition article, “Understanding How Algorithms Work Persuasively Through the
Procedural Enthymeme,” the role of this procedural rhetoric needs to be expanded if composition scholars
“are to realize how complex human-computer rhetor systems function in diverse contexts,” which
persuade “audience agents to action through the apparent logic of a given system.” Similarly, in his Fall
2017 article for Computers and Composition, “Writing for Algorithmic Audiences,” John Gallagher
explores the potential of re-thinking the role of “audience” in writing pedagogy by helping students “write
for audiences beyond the instructor from within the confines of the classroom.”

In many ways, the issues raised when composing with/through algorithms build on the broad reopening
of methodological possibilities in rhetoric and composition. We need to better understand the problems
posed by algorithmic mediation, but we also need to get involved in making algorithms and studying them
through computational and digital methods. This approach encourages an overlap with more traditional
taxonomies of writing (ethos, pathos, logos, kairos, etc.), and therefore takes up Collin Gifford Brooke’s
claim in Lingua Fracta that the encounter between new media and rhetoric should be “mutually
transformative.” Algorithms (1) demonstrate the enduring usefulness of our historic frameworks and, in
turn, (2) provide an opportunity to consider new terminologies and emerging methodologies. This special
issue will attend to both concerns. When digital rhetors engage the posthuman writing of algorithms, new
avenues open for rhetoric and writing research.

Potential topics include, but are not limited to, the following:
• The relationship among various definitions of ‘the algorithm’ (e.g., formulas, procedures, etc.) and
possible applications of those conceptualizations to literacy.
• The (mis)uses of algorithm-based research (e.g., big data) methods in argumentation, critique,
and/or experimental composition practices.
• New and emerging approaches to rhetorical concepts in light of algorithmic discourse practices
related to composition studies (e.g., computational composition).
• Composition and programming literacy (A. Vee, 2017), including bots, games, etc.
• Algorithmic composition and multimodal writing, including virtual, augmented, and mixed reality.
• Computation and poetics (T. Choi, 2017), particularly as it relates to creative (and) professional writing.
• The roles algorithm-based composition plays in contemporary practices of social justice and
activism.
• The effects of algorithms on social media feeds and content management, including the increased
attention given to advertising, the role algorithms play in propagating “fake news,” and the
opportunities that algorithms create for “click-farms” and non-human writers (bots).
• How algorithms contribute to the surveillance economy and the continued erosion of privacy
rights (E. Beck, 2015), as well as how public awareness of these issues might affect participation
in public discourse.

Proposals due: October 31, 2018
Preliminary decision on authors: December 1, 2018
Drafts of 6,000-7,000 words due: June 1, 2019
Article Revisions due: December 1, 2019
Publication: 2020

Submission and Contact Details:
Individuals or co-authors should submit a 300-500 word proposal that gives an overview of the piece,
including its impetus and focus and its contribution to the field(s). Proposals should be submitted as .doc
or .docx files to Aaron Beveridge, Sergio Figueiredo, and Steven Holmes. The subject line of the email
submission should read “Special Issue Proposal: Composing Algorithms.” For more information or
queries, email Aaron Beveridge, Sergio Figueiredo, or Steven Holmes.