REMINDER: CFP – Intersectional Automations: Robotics, AI, Algorithms, and Equity (Abstracts due 1 April 19)

Intersectional Automations: Robotics, AI, Algorithms, and Equity

Edited Collection (Abstracts Due 1 April 2019)

(pdf version here: http://complexsingularities.net/cfp-intersectional-automations-edited-collection-abstract-deadline-1-april-19/)


This collection will explore a range of situations where robotics, biotechnological enhancement, artificial intelligence (AI), and algorithmic culture collide with intersectional social justice issues, such as race, class, gender, sexuality, ability, and citizenship.

Some call it the 4th industrial revolution (Brinded, 2016; Kaplan, 2015). Robots, AI, and algorithms have grown from their early uptake in some industries (such as robots in manufacturing) to an accelerating presence in other spheres, ranging from customer service roles (for example, reception, check-outs, food service, driving) to professional and creative roles previously unheard-of and un-thought-of (for example, expert legal and medical systems, automated journalism, and musical and artistic production) (Kaplan, 2016; Ramalho, 2017; Hirsch, 2017). The World Economic Forum warns that “this will lead to a net loss of over 5 million jobs in 15 major developed and emerging economies by 2020” (Brinded, 2016), a serious challenge to ethical labour practices and a potential looming crisis that has led some to consider alternative societal models, such as Universal Basic Income (Frase, 2016) or a robot tax (Walker, 2017), to compensate.

Meanwhile, there is marked evidence that robots, AI, biotechnology, and algorithms are becoming more integrated into human societies in general, over and above their employment roles. Human-machine communication (HMC) has moved from an important yet somewhat-marginal field to lodge itself at the centre of societal workings and visions for the future. From autonomous vehicles (Bowles, 2016), to the algorithmic filtering of search results (Noble, 2018) and social media content (Gillespie, 2018), from online harassment and political boosterism via bots (Dewey, 2016; Woolley, Shorey, & Howard, 2018), to sex robots (Levy, 2007; Danaher & MacArthur, 2017), from ubiquitous AI assistants in our homes and smart devices (Guzman, 2019), to wearable tech that tracks and shares our biometric data (Forlano, 2019) and/or extends our biological capacities (Brooks, 2003; Jones, 2019), such technologies are rapidly mapping themselves onto almost every conceivable realm of human experience.

And yet, there is mounting evidence that the creation and programming of robots, AI, and algorithms, being artifacts of human culture, do not escape that context, sometimes carrying stereotypes, biases, exclusions, and other forms of privilege into their computational logics, platforms, and/or embodiments. One can think of True Companion’s Roxxxy sex robots, whose personality types some argue are based on racist and sexist stereotypes of womanhood: for example, the Barely-18 “Young Yoko” and the resistant “Frigid Farah,” which, as Gildea and Richardson (2017) note, seem to fetishize underage girls and sexual assault. Or one could think of the abandoned Amazon HR algorithm which, after being fed years of resumes and hiring decisions with a view to automating part of the hiring process, used computational logic to identify traits historically associated with Amazon hiring decisions, and in doing so encoded a preexisting sexism from the HR data: applicants whose work experience or activities included the word “Women’s,” or who were educated at all-women colleges, were often not hired (Jones, 2018). Finally, one could contemplate how polities using data aggregation and predictive algorithms to manage and make decisions about social programs, resource allocation, or policing can end up targeting and profiling poor or racialized populations, with occasionally terrifying results, such as any mistake on an online application being interpreted by an automated system as “failure to cooperate” (Eubanks, 2017).

This edited collection will draw an analytical circle around these interconnected and adjacent issues, lending a critical eye to what is at stake due to the automation of aspects of culture. How do equity issues intersect with these fields? Are the pronouncements always already dire, or are there also lines of flight towards more equitable futures in which agentic artefacts and extensions can play an active part? Chapters may address one or multiple equity issues, and submissions that address emergent intersections between them will be given special consideration.

Proposed chapters may address topics such as, but not limited to:

– Algorithmic classism, ableism, racism, and sexism, including issues surrounding content moderation on social media (e.g., Gillespie, 2018), redlining/weblining (e.g., Eubanks, 2017), business (e.g., Jones, 2018), big data (e.g., Ferguson, 2017), or military practices such as Google’s controversial Project Maven (e.g., Holt, 2018), as well as uses of algorithms to address these same issues.

– Issues around robotic labour and poverty, Universal Basic Income, and robotic utopias/dystopias (e.g., Frase, 2016; Kaplan, 2016).

– Issues around the use of deadly autonomous or semi-autonomous robots by the military or non-state actors, such as work surrounding the Campaign to Stop Killer Robots (e.g., Anderson & Waxman, 2012; Crootof, 2015; Gregory, 2011; Karppi, Bolen, & Granata, 2016).

– Issues surrounding sex robotics, teledildonics, VR, and AI sexuality, including stereotypical and sexist sex-robot “personalities” and embodiments (e.g., Gildea & Richardson, 2017); sex robots based on real people without consent (e.g., Gee, 2017); the Campaign Against Sex Robots (e.g., Richardson, 2015; Danaher, Earp, & Sandberg, 2017); teledildonic/VR stream hacking and consent (e.g., Rambukkana & Gautier, 2017; Belamire, 2016); the interplay between robotic brothels and sex worker rights and protests (e.g., Morrish, 2017; Trayner, 2017; Danaher, Earp, & Sandberg, 2017); bots masquerading as real people on dating sites (e.g., Light, 2016; Karppi, 2018); and deepfakes and pornography (e.g., Maras & Alexandrou, 2018). Progressive steps to address such issues or to create new sexual futures.

– The politics and ethics of the singularity (e.g., Korb & Nicholson, 2012) and the future status of robotic and AI workers with respect to labour, citizenship, and human rights—for example, work on Hanson Robotics’ Sophia as Saudi citizen (e.g., Weller, 2017), robotic servitude (e.g., Green, 2016), as well as the rights of humans interacting with AI (e.g., Shepherd, 2019).

– Assumptions, representations, and discourse surrounding dis/ability and human augmentics, including “supercrip” and “cyborg” discourses and the potential tensions between feminist technoscience (e.g., Haraway, 1990) and critical disability studies (e.g., Allan, 2016; Cascais, 2013).

– How any of these or other issues are depicted in popular or fringe fictions that contain robotic or AI characters (for example, Humans, Neuromancer, Extant, Westworld, Her, Blade Runner, Ex Machina, Ghost in the Shell, Altered Carbon, Black Mirror, Speak, Neon Genesis Evangelion, Questionable Content, etc.)

My goal is to assemble a collection of exemplary abstracts and then approach some top-tier academic publishers with relevant series.

If interested, please send a 750-word abstract, collection of keywords, and a 150-word bio to the editor, Dr. Nathan Rambukkana (n_rambukkana@complexsingularities.net), by 1 April 2019. Drafts will be due 1 October 2019 and final versions 1 April 2020. Please also email Nathan at the above address if you have any questions and feel free to repost this CFP to your networks.

References

Allan, K. (2016). Categories of disability in science fiction. Retrieved from http://www.academiceditingcanada.ca/blog/item/317-disability-in-sf-article

Anderson, K., & Waxman, M. (2012, December). Law and ethics of robot soldiers. Policy Review. Columbia Public Law Research Paper No. 12-313; American University, WCL Research Paper No. 2012-32.

Belamire, J. (2016, October 21). My first virtual reality groping. Mic. Retrieved from https://mic.com/articles/157415/my-first-virtual-reality-groping-sexual-assault-in-vr-harassment-in-tech-jordan-belamire#.FubnFVP5F

Bowles, N. (2016, March 1). Google self-driving car collides with bus in California, accident report says. Guardian. Retrieved from https://www.theguardian.com/technology/2016/feb/29/google-self-driving-car-accident-california

Brinded, L. (2016, January 19). WEF: Robots, automation, and AI will replace 5 million human jobs by 2020. Business Insider. Retrieved from https://www.businessinsider.com.au/wef-davos-report-on-robots-replacing-human-jobs-2016-1

Brooks, R. (2003). Flesh and machines: How robots will change us. New York: Vintage.

Cascais, A. F. (2013). The metamorphic body in science fiction: From prosthetic correction to utopian enhancement. In K. Allan (Ed.), Disability in science fiction (pp. 61–72). New York: Palgrave Macmillan.

Crootof, R. (2015). War, responsibility, and killer robots. North Carolina Journal of International Law and Commercial Regulation, 40(4), 909–932.

Danaher, J., Earp, B. & Sandberg, A. (2017). Should we campaign against sex robots? In J. Danaher & N. MacArthur (Eds.), Robot sex: Social and ethical implications (pp. 47–72). Cambridge, MA: MIT Press.

Danaher, J, & MacArthur, N. (Eds.). (2017). Robot sex: Social and ethical implications. Cambridge, MA: MIT Press.

Dewey, C. (2016, October 19). One in four debate tweets comes from a bot. Here’s how to spot them. Washington Post. Retrieved from https://www.washingtonpost.com/news/the-intersect/wp/2016/10/19/one-in-four-debate-tweets-comes-from-a-bot-heres-how-to-spot-them/?utm_term=.a757c59bc072

Eubanks, V. (2017). Automating inequality: How high-tech tools profile, police, and punish the poor. New York: St. Martin’s Press.

Ferguson, A. G. (2017). The rise of big data policing: Surveillance, race, and the future of law enforcement. New York: New York University Press.

Forlano, L. (2019). Posthuman futures: Connecting/disconnecting the networked (medical) self. In Z. Papacharissi (Ed.), A networked self and human augmentics, artificial intelligence, sentience (pp. 39–50). New York: Routledge.

Frase, P. (2016). Four futures: Life after capitalism. London: Verso.

Gee, T. B. (2017, April 28). Why female sex robots are more dangerous than you think. Telegraph. Retrieved from https://www.telegraph.co.uk/women/life/female-robots-why-this-scarlett-johansson-bot-is-more-dangerous/

Gildea, F., & Richardson, K. (2017, May 12). Sex robots: Why we should be concerned. Campaign Against Sex Robots. Retrieved from https://campaignagainstsexrobots.org/2017/05/12/sex-robots-why-we-should-be-concerned-by-florence-gildea-and-kathleen-richardson/

Green, S. M. (2016). Bina48: Gender, race, and queer artificial life. Ada: A Journal of Gender, New Media & Technology, 9. Retrieved from https://adanewmedia.org/2016/05/issue9-greene/

Gregory, D. (2011). From a view to a kill: Drones and late modern war. Theory, Culture & Society, 28(7-8), 188–215. DOI: 10.1177/0263276411423027

Guzman, A. L. (2019). Beyond extraordinary: Theorizing artificial intelligence and the self in daily life. In Z. Papacharissi (Ed.), A networked self and human augmentics, artificial intelligence, sentience (pp. 84–96). New York: Routledge.

Haraway, D. (1990). A manifesto for cyborgs: Science, technology, and socialist feminism in the 1980s. In L. J. Nicholson (Ed.), Feminism/postmodernism (pp. 190–233). New York: Routledge.

Hirsch, P. B. (2017). The robot in the window seat. Journal of Business Strategy, 38 (4), 47–51. Retrieved from https://doi.org/10.1108/JBS-04-2017-0050

Holt, K. (2018, May 14). Google employees reportedly quit over military drone AI project. Engadget. Retrieved from https://www.engadget.com/2018/05/14/google-project-maven-employee-protest/

Jones, R. (2018, October 10). Amazon’s secret AI hiring tool reportedly ‘penalized’ resumes with the word ‘women’s’. Gizmodo. Retrieved from https://gizmodo.com/amazons-secret-ai-hiring-tool-reportedly-penalized-resu-1829649346

Jones, S. (2019). Untitled, no. 1 (human augmentics). In Z. Papacharissi (Ed.), A networked self and human augmentics, artificial intelligence, sentience (pp. 201–205). New York: Routledge.

Kaplan, J. (2015, August 23). Robots are coming for your job: We must fix income inequality, volatile job markets now — or face sustained turmoil. Salon. Retrieved from https://www.salon.com/2015/08/23/robots_are_coming_for_your_job_we_must_fix_income_inequality_volatile_job_markets_now_or_face_sustained_turmoil/

Karppi, T. (2018). “How angels are made”: Ashley Madison and the social bot affair. In Z. Papacharissi (Ed.), A networked self and love (pp. 173–188). New York: Routledge.

Karppi, T., Bolen, M., & Granata, Y. (2016, October 16). Killer robots as cultural techniques. International Journal of Cultural Studies. https://doi.org/10.1177%2F1367877916671425

Korb, K. B., & Nicholson, A. E. (2012, March). Ethics of the singularity. Issues. Retrieved from http://www.issuesmagazine.com.au/article/issue-march-2012/ethics-singularity.html 

Levy, D. (2007). Love + sex with robots: The evolution of human–robot relationships. New York: Harper.

Light, B. (2016). The rise of speculative robots: Hooking up with the bots of Ashley Madison. First Monday: Peer-Reviewed Journal on the Internet, 6(6). Retrieved from http://journals.uic.edu/ojs/index.php/fm/article/view/6426

Maras, M.-H., & Alexandrou, A. (2018, October 28). Determining authenticity of video evidence in the age of artificial intelligence and in the wake of Deepfake videos. International Journal of Evidence & Proof. https://doi.org/10.1177/1365712718807226

Morrish, L. (2017, April 28). A sex doll brothel is set to open in the UK. Konbini. Retrieved from http://www.konbini.com/en/lifestyle/sex-doll-brothel-uk/

Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. New York: NYU Press.

Ramalho, A. (2017). Will robots rule the (artistic) world?: A proposed model for the legal status of creations by artificial intelligence systems. Journal of Internet Law, 21(1), 11–25.

Rambukkana, N., & Gautier, M. (2017). L’adultère à l’ère numérique : Une discussion sur la non/monogamie et le développement des technologies numériques à partir du cas Ashley Madison. Genre, sexualité et société, 17. Retrieved from https://journals.openedition.org/gss/3981

Richardson, K. (2015, September 15). Welcome to the Campaign Against Sex Robots. Campaign Against Sex Robots. Retrieved from https://campaignagainstsexrobots.org/2015/09/15/welcome-to-the-campaign-against-sex-robots/

Shepherd, T. (2019). AI, the persona, and rights. In Z. Papacharissi (Ed.), A networked self and human augmentics, artificial intelligence, sentience (pp. 187–200). New York: Routledge.

Trayner, D. (2017, March 17). First sex doll brothel in Europe shut down one month after opening before police raid. Daily Star. Retrieved from https://www.dailystar.co.uk/news/latest-news/597431/lumidolls-europe-first-sex-robot-love-doll-brothel-barelona-spain-closed-shut-down-police

Walker, J. (2017, October 24). Robot tax – A summary of arguments “for” and “against”. Tech Emergence. Retrieved from https://emerj.com/ai-sector-overviews/robot-tax-summary-arguments/

Weller, C. (2017, October 26). A robot that once said it would 'destroy humans' just became the first robot citizen. Business Insider. Retrieved from https://www.businessinsider.com/sophia-robot-citizenship-in-saudi-arabia-the-first-of-its-kind-2017-10

Woolley, S., Shorey, S., & Howard, P. (2018). The bot proxy: Designing automated self expression. In Z. Papacharissi (Ed.), A networked self and platforms, stories, connections (pp. 59–76). New York: Routledge.