Can an Online Course Help Big Tech Find Its Soul?

The Foundations of Humane Technology is an eight-hour class for Silicon Valley’s disillusioned workers.

“Few technologists show up to work every day thinking about the harms technology may create,” the Foundations of Humane Technology begins. “This course is an invitation to embrace the agency you already have.” It’s exactly what Brandon Read needed to hear.

Read had joined Lyft as a senior software engineer in May 2019, expecting to make a difference in the world. The ride-sharing company was preparing to fully electrify its vehicle fleet by 2030, and Read’s team, which managed the fleet, would help to make that possible. “It felt more values-oriented than some of my past roles in tech,” he says. “It was like, if I turn this knob, it has an impact.”

Lyft had gone public that spring, and later that year Read’s team upped the pricing of its car-rental program. It made him uncomfortable—how would that affect Lyft’s drivers?—but his managers reminded him that they needed to focus on profitability. Then, in 2020, the company spent millions fighting a ballot initiative that would reclassify its drivers as employees. “I had some concern around that,” says Read, who noticed that Lyft was advertising against the proposition in its own app. Eventually, he decided his values weren’t aligned with the company’s. He left at the end of 2020.

It was around that time that Read started asking himself the big existential questions: Who am I? What have I been doing with my life? Being an engineer was part of his identity, but so was being in touch with social and environmental issues. He took some time off, considered going back to school, and wondered whether there was a place for him in tech. Then a few months ago, while Googling around for answers, Read discovered an online course that seemed poised to answer all of these questions: the Foundations of Humane Technology.

The course, which comes out of beta today, was developed by the Center for Humane Technology. The nonprofit’s purpose has mainly been to give language to the uneasiness around technology’s impact on society, popularizing terms like “time well spent” (a metric to replace engagement on screens) and “human downgrading” (to describe the cumulative negative effect of technology on people’s cognition). The organization’s cofounder is Tristan Harris, an ex-Googler who called attention to the search giant’s extractive features in 2013. He has since left the industry and made a career out of rehabilitating it.

On the Center for Humane Technology’s website, Read saw a few of its recent projects, including the 2020 Netflix documentary The Social Dilemma, which memeified the idea that “if you're not paying for the product, then you are the product.” Then he clicked to learn more about the course on offer, marketed to technologists who wanted to “treat attention and intention as sacred, protect well-being, minimize unforeseen consequences, and ultimately build our collective capacity to address the most urgent challenges facing humanity.” Read was intrigued, and unemployed. He enrolled.

Most of the Center for Humane Technology’s work to date has focused on identifying the big problems in tech. But after The Social Dilemma aired, Randima Fernando, the center’s executive director, says a wave of people in the tech industry approached him and demanded solutions. “They were like, great, I get it, we’re worried. What do we do now?”

The new course is meant, in part, to answer that question, speaking directly to rehabilitated techies like Read. It contains eight modules and is intended to take about eight hours total, plus additional time spent on worksheets, reflection exercises, and optional discussion groups over Zoom. Read, who “binged” the course, says he completed it in about two weeks.

For people who have spent years studying the harmful externalities of the tech industry, the course might feel short on insight. Yes, social media companies exploit human weaknesses—what’s new? But for those just arriving at those ideas, it provides some useful jumping-off points. One module focuses on the psychology of persuasive tech and includes a “humane design guide” for creating more respectful products. Another encourages technologists to identify their highest values and the ways those values interact with their work. At the end of the lesson, a worksheet invites them to imagine sipping tea at age 70, looking back on their life. “What’s the career you look back on? What are the ways you’ve influenced the world?”

Subtle? Not exactly. Even so, Fernando believes the tech industry is so badly in need of a wake-up call that these worksheets and journal prompts might give tech workers a moment to consider what they’re building. Suparna Chhibber, who left a job at Amazon in 2020, says the pace of the tech industry doesn't always leave room for people to reflect on their purpose or values. “People get paid a lot to push things through, and if you’re not doing that, then you’re basically failing,” she says.

Chhibber enrolled in the Foundations of Humane Technology around the same time as Read and found a community of like-minded people waiting to discuss the material over Zoom. (The Center for Humane Technology leads the sessions, and plans to continue them.) Read likened these sessions to group therapy: “You get to know people who you feel safe exploring these topics with. You can open up.” Critically, they reminded him that, although many people don’t understand why he left his prestigious job, he is not alone.

The Center for Humane Technology is not the first organization to make a tool kit for concerned tech workers. The Tech and Society Solutions Lab has released two, in 2018 and 2020, designed to encourage more ethical conversations within tech companies and startups. But the center’s new course is novel in the way that it tries to create community out of the burgeoning “humane tech” movement. A single concerned engineer is unlikely to change a company’s business model or practices. Together, though, a group of concerned engineers might make a difference.

The Center for Humane Technology says that more than 3,600 tech workers have already started the course, and several hundred have completed it. “This is by far the biggest effort we’ve made to convene humane technologists,” says David Jay, the center’s head of mobilization. The center says it has amassed a long list of concerned technologists over the years and plans to promote the course directly to them. It also plans to get the word out through a few partner organizations and through its “allies inside of a wide range of technology companies, including many of the major social media platforms.”

If there ever was a moment for the tech industry to band together and reconstitute its values, it would be now: Tech workers are in high demand, and companies are increasingly beholden to their demands. Still, workers who have tried to raise flags haven’t always been heard. It seems unlikely that these companies will reorient their business incentives—away from profits and toward social consciousness—without greater pressures, like regulation. Chhibber, who says she tried to infuse “humane tech” principles into her teams at Amazon, didn’t find that it was enough to change the company’s overall culture. “If you have the business model breathing down your back,” she says, “it’s going to impact what you do.”

The last module of Foundations of Humane Technology tries to address this, with a lesson called “Changing the Culture of Technology.” It encourages tech workers to find collaborators, start with small changes, and push for new definitions of “success” in their work. It also asks them to reflect on a worksheet they filled out earlier in the course, which measured “the gap between your life today and what you most care about and believe.”

But for many tech workers, that gap may be too large to close. It’s no surprise that some of the industry’s loudest critics, including Tristan Harris, have decided to leave big tech altogether. Those who try to reform it from within often get swept aside. For Read, who plans to enroll in a master’s program in design engineering this fall, there wasn’t an obvious way to continue working ethically in tech. Rather than inspire him to get another tech job, the Foundations of Humane Technology reminded him which “side of history” he wanted to be on.