Calendar

The class schedule is detailed below and is subject to change. We will communicate any changes in class.

All readings are required, unless they are tagged with Optional.

Reading responses are also required, unless explicitly noted.

Reading responses are due by 6pm the day before class.

Milestones are due by 9am on the morning of class.

Sept 30: Overview and Groundwork

No readings or responses

Oct 5: Ethical Foundations and Tools

Quinn, M. J. (2017). Ethics for the information age. Pearson. (Sections 2.1-2.11)

No reading responses

Oct 7: Ethics Codes

ACM Code of Ethics and Professional Conduct (skim)

“Be Careful What You Code For”, Danah Boyd (2016)

Washington, A. L., & Kuo, R. (2020, January). Whose side are ethics codes on? Power, responsibility and the social good. In Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency (pp. 230-240).

Optional “Are we having an ethical crisis in computing?” by Moshe Y. Vardi, Communications of the ACM, 62(1), 7-7, 2019.

Optional Bietti, E. (2020, January). From ethics washing to ethics bashing: a view on tech ethics from within moral philosophy. In Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency (pp. 210-219).

Oct 12: Idea Fair

Milestone Idea Fair

No readings or responses, but get started on those for Thursday!

Oct 14: Data Politics

Winner, L. (1980). Do artifacts have politics? Daedalus. (15 pages)

Green, B. (2020). Data science as political action: grounding data science in a politics of justice. Available at SSRN 3658431. (~20 pages)

Parvin, N., & Pollock, A. (2020). Unintended by Design: On the Political Uses of “Unintended Consequences”. Engaging Science, Technology, and Society, 6, 320-327. (7 pages)

Optional Green, B., & Chen, Y. (2021). Algorithmic risk assessments can alter human decision-making processes in high-stakes government contexts. Proc. ACM Hum.-Comput. Interact. 5, CSCW2, Article 418. (33 pages)

Oct 19: Environment

Milestone Project Proposal

“Anatomy of an AI System” by Kate Crawford et al., 2018 (14 pages)

Schwartz, R., Dodge, J., Smith, N. A., & Etzioni, O. (2019). Green AI. arXiv preprint arXiv:1907.10597. (9 pages)

Optional Borning, A., Friedman, B., & Logler, N. (2020). The ‘invisible’ materiality of information technology. Communications of the ACM, 63(6), 57-64.

Optional Rolnick, D., Donti, P. L., Kaack, L. H., Kochanski, K., Lacoste, A., Sankaran, K., … & Luccioni, A. (2019). Tackling climate change with machine learning. arXiv preprint arXiv:1906.05433.

Optional “Open letter to Jeff Bezos and the Amazon Board of Directors” by Amazon Employees for Climate Justice, 2019

Optional “The Cloud is Not the Territory” by Ingrid Burrington, 2014

Oct 21: Feminism and Power

D’Ignazio, C., & Klein, L. F. (2020). Data feminism. MIT Press. (Read Introduction and The Power Chapter)

“Technically Female: Women, Machines, and Hyperemployment” by Helen Hester, 2016 (10 pages)

Oct 26: Postcolonial Computing and Technological Solutionism

Guest Speaker Kentaro Toyama (author of Geek Heresy)

Irani, L., Vertesi, J., Dourish, P., Philip, K., & Grinter, R. E. (2010, April). Postcolonial computing: a lens on design and development. In Proceedings of the SIGCHI conference on human factors in computing systems (pp. 1311-1320).

Kentaro Toyama (2015). Geek Heresy: Rescuing Social Change from the Cult of Technology, Chapter 2. Public Affairs.

Optional The Wubi Effect – a Radiolab episode on how Chinese characters didn’t fit on a keyboard

Optional Nigini Oliveira, Michael Muller, Nazareno Andrade, and Katharina Reinecke, “The Exchange in StackExchange: Divergences between Stack Overflow and its Culturally Diverse Participants”, Proceedings of ACM Conference on Computer Supported Cooperative Work and Social Computing (CSCW), 2018. (read intro and discussion if you’re short on time)

Optional Hong Shen, Cori Faklaris, Haojian Jin, Laura Dabbish, and Jason I. Hong. 2020. ‘I Can’t Even Buy Apples If I Don’t Use Mobile Pay?’: When Mobile Payments Become Infrastructural in China. Proc. ACM Hum.-Comput. Interact. 4, CSCW2.

Oct 28: Race

Milestone Methods Section (or similar)

Benjamin, R. (2019). Race after technology: Abolitionist tools for the new Jim Code. Polity. (read pages 1-17)

Ogbonnaya-Ogburu, I. F., Smith, A. D., To, A., & Toyama, K. (2020, April). Critical Race Theory for HCI. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (pp. 1-16).

Field, A., Blodgett, S. L., Waseem, Z., & Tsvetkov, Y. (2021). A Survey of Race, Racism, and Anti-Racism in NLP. arXiv preprint arXiv:2106.11410.

Optional Hankerson, D., Marshall, A. R., Booker, J., El Mimouni, H., Walker, I., & Rode, J. A. (2016, May). Does technology have race?. In Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems (pp. 473-486).

Optional Schlesinger, A., O’Hara, K. P., & Taylor, A. S. (2018, April). Let’s talk about race: Identity, chatbots, and AI. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (pp. 1-14).

Nov 2: Exclusive and Evil by Design

“Can you make an AI that isn’t Ableist?” by Shari Trewin, MIT Technology Review

Gray, C. M., Kou, Y., Battles, B., Hoggatt, J., & Toombs, A. L. (2018, April). The dark (patterns) side of UX design. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (pp. 1-14).

Optional Gray, C. M., Chivukula, S. S., & Lee, A. (2020, July). What Kind of Work Do “Asshole Designers” Create? Describing Properties of Ethical Concern on Reddit. In Proceedings of the 2020 ACM Designing Interactive Systems Conference (pp. 61-73).

Optional Dark Patterns case study from the ACM Code of Ethics.

Nov 4: Mis-/Disinformation or Free Speech, Platforms or Publisher?

Gillespie, T. (2018). Custodians of the Internet: Platforms, content moderation, and the hidden decisions that shape social media. Yale University Press. (read Chapter 1, pp. 1-24)

“Everything You Need to Know About Section 230”, The Verge, 2020

Bruckman, Amy. “Should you believe Wikipedia?” Chapter from the forthcoming book Should You Believe Wikipedia? (Cambridge University Press). (17 pages)

Optional “Burnout, splinter factions and deleted posts: Unpaid online moderators struggle to manage divided communities” by Heather Kelly, The Washington Post, 2020

Optional “Blue Feed, Red Feed” by Jon Keegan, 2016

Optional “Why Facebook can’t fix itself” by Andrew Marantz, The New Yorker, 2020

Nov 9: Privacy

Acquisti, A., Brandimarte, L., & Loewenstein, G. (2015). Privacy and human behavior in the age of information. Science, 347(6221), 509–514.

“It’s Not Privacy, and It’s Not Fair” by Cynthia Dwork et al., 2013

“Scroogled” by Cory Doctorow

Optional “Think You’re Discreet Online? Think Again” by Zeynep Tufekci, The New York Times, 2019

Nov 11: Veterans Day

No class!

Nov 16: Project Fair Round 1

Milestone Project Fair Round 1

No readings or responses, but get started on those for Thursday!

Nov 18: Data Collection and Crowdsourcing

Jo, E. S., & Gebru, T. (2020, January). Lessons from archives: Strategies for collecting sociocultural data in machine learning. In Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency (pp. 306-316).

Barbosa, N. M., & Chen, M. (2019, May). Rehumanized crowdsourcing: a labeling framework addressing bias and ethics in machine learning. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (pp. 1-12).

Optional Kittur, A., Nickerson, J. V., Bernstein, M., Gerber, E., Shaw, A., Zimmerman, J., … & Horton, J. (2013, February). The future of crowd work. In CSCW 2013.

Nov 23: Classification Bias

Buolamwini, J., & Gebru, T. (2018, January). Gender shades: Intersectional accuracy disparities in commercial gender classification. In Conference on fairness, accountability and transparency (pp. 77-91).

“Do algorithms reveal sexual orientation or just expose our stereotypes?” by Blaise Agüera y Arcas et al., 2018

Crawford, K. (2019). Regulate facial-recognition technology. Nature, 572(7771), 565-565. (1 page)

Optional “Excavating AI: The Politics of Training Sets for Machine Learning” by Kate Crawford et al., 2019 (14 pages)

Nov 25: Thanksgiving

No class, Happy Thanksgiving!

Nov 30: Accountability in AI

Guest Speaker Sabelo Mhlambi on Global South perspectives in AI policy and on Ubuntu ethics.

Diakopoulos, Nicholas, Sorelle Friedler, Marcelo Arenas, Solon Barocas, Michael Hay, Bill Howe, Hosagrahar Visvesvaraya Jagadish et al. Principles for accountable algorithms and a social impact statement for algorithms. FAT/ML (2017).

Madaio, M. A., Stark, L., Wortman Vaughan, J., & Wallach, H. (2020, April). Co-designing checklists to understand organizational challenges and opportunities around fairness in AI. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (pp. 1-14).

Elish, Madeleine Clare. “Moral crumple zones: Cautionary tales in human-robot interaction.” Engaging Science, Technology, and Society 5 (2019): 40-60.

Optional “Computer says no: why making AIs fair, accountable and transparent is crucial” by Ian Sample, The Guardian, 2017

Optional “From Rationality to Relationality: Ubuntu as an Ethical & Human Rights Framework for Artificial Intelligence Governance” by Sabelo Mhlambi, 2020

Dec 2: Roles and Responsibilities

Abebe, R., Barocas, S., Kleinberg, J., Levy, K., Raghavan, M., & Robinson, D. G. (2020, January). Roles for computing in social change. In Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency (pp. 252-260).

Bruckman, A. (2020). ‘Have you thought about…’ talking about ethical implications of research. Communications of the ACM, 63(9), 38-40.

Optional “Why Stanford Researchers Tried to Create a ‘Gaydar’ Machine”, by Heather Murphy, The New York Times, 2017

Dec 7: Project Fair Round 2

Presentations + last thought exercise. No readings or responses

Dec 9: Project Fair Round 2

Milestone Project Fair Round 2

Presentations + wrap-up. No readings or responses