Point: Tech Ethics Are Needed Now More Than Ever
By Ryan Golemme ’23
When I took CS50: Introduction to Computer Science in the fall of 2020, there was a tacked-on ethics lesson in the final week as part of Harvard’s Embedded EthiCS program. Though it was merely two people flatly talking over Zoom, and occurred after we had learned all of the actual coding, it was nevertheless a step in the right direction for educating people about the ethics of technology.
Some may exaggerate the effects of the internet on its users, but there’s no denying that there are lasting issues with the design and use of many current computer technologies. The Wall Street Journal’s reports on The Facebook Files in October highlighted the platform’s tendency to spread divisive content. On YouTube, algorithms often arbitrarily favor some videos over others—clickbait at best, and controversy and creepy pregnant Disney Elsa videos at worst. Online mobbing follows an almost cyclical pattern, with people accustomed to tagging employers and launching unconfirmed accusations that spread like wildfire. Just this week, Harvard students fell for a scam romantic matching website that got them to reveal their personal opinions and sexual preferences, all as a prank organized by MIT students.
As computing and the internet become increasingly embedded in our daily lives, the power of their designers and users to manipulate the largest online sites only grows. Harvard University, with its prestige and resources, should take the lead in imagining and inspiring improvements in tech ethics.
Through this program, launched in 2017 by Professor of Natural Sciences Barbara Grosz and Professor of Philosophy Alison Simmons, CS students can learn how to engineer the fastest and most efficient algorithms while also studying the practical consequences of how and what these algorithms can amplify. An increased focus on data collection can help people find better ways to balance usefulness against the dignity and privacy of users. The initial run of the program also covered areas like universal design and discriminatory biases in machine learning, topics whose study can help redress long-standing social injustices. The program focuses on relevant and impactful areas of CS design, and pilot hiccups do not diminish its importance.
In fact, tech ethics lessons would benefit all students, not only CS concentrators. Learning proper online conduct—not spreading potentially false statements, avoiding outrage mobs, and remaining skeptical of unconfirmed websites that ask for personal information—would better equip everyone to use the internet.
It’s difficult to teach ethics without being heavy-handed or making students defensive, and it is true that many students have their own pet ethical issues they would love to see become mandatory instruction material. However, the expansion of computer and online technology has facilitated rapid and dramatic transformations in human living, infrastructure, and culture, and those transformations look only to intensify as machine learning expands. Though Harvard has largely abandoned mandating a core curriculum, expanding the Embedded EthiCS program would benefit designers and users as the digital revolution continues to transform the world.
Counterpoint: Computer Scientists Aren’t That Scary
By Michael Kielstra ’22
I regularly meet people who are scared of computers. It isn’t that they’ve given up on ever understanding them, although I meet people like that, too. They are viscerally scared that, if they do the wrong thing, the entire system will blow up in their face. In response, they memorize exact sequences of steps to carry out very specific operations and call IT if anything goes wrong. They do not understand how their equipment works, and their response to any unexpected behavior is to freeze and tremble. Into this category of people I would place whoever came up with Embedded EthiCS, a program envisioned by Harvard philosophers and computer scientists in which one lecture in participating CS classes is given over to the discussion of an ethical case study related to the material.
There is nothing wrong with Embedded EthiCS insofar as there is nothing wrong with the idea that everyone should learn ethics. My issue is with the other parts of the program’s name: the embedding, and the CS.
The first problem is smaller. Embedding ethical considerations in otherwise CS-focused courses will probably not help students actually be more ethical in the moment. While ethics-in-CS training is new enough that little research has been done on it, we can easily look at a similar case of people trying to teach potential abusers to do better. A recent study (Dobbin and Kalev, 2019) found that sexual harassment training, such as the Title IX videos that we watch before every semester, did nothing overall to stop sexual harassment and, in fact, made men more likely to blame victims. Telling people that their behavior is wrong, especially if there is no meaningful follow-up, makes them defensive, not open to change. A large, public classroom is the wrong environment in which to teach life lessons.
The more serious issue with Embedded EthiCS is that it is restricted to CS. This sends a very dangerous message to students of all concentrations. CS concentrators learn that they must be taught ethics; non-CS concentrators, that they need not be. There is only one possible conclusion to draw from this: that the program’s administrators believe that computer scientists can do enough damage if they behave unethically that they must have their worst excesses reined in, while historians, chemists, and poets alike are weak and feeble compared to the terrible, overwhelming power of the algorithm. CS concentrators wind up with the weight of the world on their shoulders, and non-CS concentrators, already uncomfortable in a university — and a student body — that adores the tech industry and the money it brings in, are slighted once again.
It is good to behave ethically, and it is good to study ethics so that one may know how best to respond to difficult ethical situations. However, forcing the study of ethics on anyone helps nobody, and forcing it only on a selected few helps even fewer. The fears that led to the Embedded EthiCS program are, in the wake first of the Snowden revelations and then the backlash against Big Tech, tragically easy to understand. As always, though, understanding a wrong enables us only to forgive, not to condone.
Ryan Golemme ’23 (ryangolemme@college.harvard.edu) is a certified Discord dweller.
Michael Kielstra ’22 (pmkielstra@college.harvard.edu) tries to be ethical.