Leaning into EA Disillusionment, by Helen

Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: Leaning into EA Disillusionment, published by Helen on July 21, 2022 on The Effective Altruism Forum.

1. Intro

In this post I describe a phenomenon that I think is more common than we give it credit for: “EA disillusionment.” By this, I basically mean a process where someone comes into EA, engages heavily, but ends up feeling negatively toward the movement. I know at least a handful of people who have experienced this (and I’m sure there are many more I don’t know)—people who I think are incredibly smart, thoughtful, caring, and hard-working, as well as being independent thinkers. In other words, exactly the kind of people EA needs. Typically, they throw themselves into EA, invest years of their life and tons of their energy into the movement, but gradually become disillusioned and then fade away without having the energy or motivation to articulate why.

I think this dynamic is bad for EA in multiple ways, some obvious, some less so. Obviously, people getting disillusioned and leaving is not fun (to put it mildly) for them, and obviously it’s bad for EA if promising people stop contributing to the movement. But I think the most important downside here is actually that it results in a major blindspot for EA: at present, the way people tend to become disillusioned means that they are (a) unusually likely to have the exact kinds of serious, thoughtful critiques that the EA community most wants to hear, but (b) unusually unlikely to offer them. The result is that EA stays blind to major problems that it could otherwise try to improve on.

Why would this be true?
(a) The kind of people I mean are unusually likely to have useful, major critiques to offer because they have spent years immersing themselves in the EA world, often changing careers for EA reasons, developing EA social circles, spending time on community building, and so on.

(b) But they’re unusually unlikely to offer these critiques, because by the time they have developed them, they have already spent years pouring time and energy into EA spaces, and have usually ended up despairing of the state of the community’s epistemics, social dynamics, error correction processes, etc. This makes the prospect of pouring even more time into trying to articulate complicated or nuanced thoughts especially unappealing, relative to the alternative of getting some distance and figuring out what they want to be doing post-EA.

I believe a healthier EA movement would be one where more people are able to go through a gentler version of the “disillusionment pipeline” described below, so that they come out the other side with useful perspectives on EA that they are more willing, able, and encouraged to share.

This post aims to do 4 things:

1. Acknowledge the existence of a significant group of people who engage heavily with EA but end up feeling negatively toward the movement (“disillusioned EAs”), who tend to fade away quietly rather than making their concerns known.
2. Describe a 3-phase pipeline of EA disillusionment: infatuation, doubt, and distancing.
3. Point out some of the (in my view, real and important) problems that drive people through these three stages.
4. Make some suggestions for individuals at various stages of this pipeline, including the possibility that it’s valuable to lean into feelings of doubt/distance/disillusionment, rather than trying to avoid them.

This intro has covered point (1); the rest of the post covers (2)-(4).

A core idea of this post is that going through an extreme version of the first (“infatuation”) phase of the disillusionment pipeline can be very harmful.
Infatuation causes people to reorient huge parts of their lives—careers, social circles, worldviews, motivational structures—around EA, making it somewhere between painful and impossible to contemplate the idea that EA might be wrong in important ways.

2. ...
