Longtermism

Longtermism is the view that we should be doing much more to protect future generations.

Longtermism is based on three ideas: that future people have moral worth, that there could be very large numbers of future people, and that what we do today can affect how well or poorly their lives go. Let’s take these points one at a time.

First, future people have moral worth. The fact that people will be born in the future makes their experiences no less real or important. To see this, we can put ourselves in our ancestors’ shoes and ask whether they would have been right to consider people today morally irrelevant by the mere fact of not having yet been born. Another way to look at this is through our ability to harm future people. Consider how we store nuclear waste. We do not simply set it out in the desert without further precautions, because it would begin to leak within a few centuries. Instead, we carefully store it and mark it for future generations, because we recognize that it would be wrong to cause future people foreseeable harm.

Second, there could be very large numbers of future people. Humanity might last for a very long time: if we survive as long as the typical mammalian species, there are hundreds of thousands of years still ahead of us. If history were a novel, we might be living on its very first page. Barring catastrophe, the vast majority of people who will ever live have not yet been born. These people could have stunningly good lives, or incredibly bad ones.

Third, what we do today can affect the lives of future people in the long run. Some might argue that the future is hard or impossible to predict, so that even if future people are morally important, and even if there will be many of them, we cannot predictably benefit them beyond a hundred years’ time. While the long-run effects of many actions are indeed difficult to foresee, some things we can predict. For example, if humanity suffered a catastrophe that caused it to go extinct, we can predict how that would affect future people: there wouldn’t be any. This is why a particular focus of longtermism has been on existential risks: risks that threaten the destruction of humanity’s long-term potential. Risks highlighted by longtermist researchers include those from advanced artificial intelligence, engineered pathogens, nuclear war, extreme climate change, and global totalitarianism. Besides mitigating existential risks, we can also predictably shape the long-term future by changing humanity’s trajectory in a persistent way, for instance by changing what it values.

William has a book on longtermism, What We Owe The Future, published in August 2022 in the United States and September 2022 in the United Kingdom.

Learn more about longtermism in an excerpt from What We Owe The Future in The New York Times, an introductory article from the BBC, and a long-form piece in Foreign Affairs. The links below are also helpful: