A utopian vision or something darker?
Ars chats with physicist and science journalist Adam Becker about his new book, More Everything Forever.
It's long been the stuff of science fiction: humans achieving immortality by uploading their consciousness into a silicon virtual paradise, ruled over by a benevolent super-intelligent AI. Or maybe one dreams of leaving a dying Earth to colonize Mars or other distant planets. It's a tantalizing vision of the future, one embraced by tech billionaires in particular. But is that future truly the utopian ideal, or something potentially darker? And are those goals even scientifically feasible?
These are the kinds of questions astrophysicist and science journalist Adam Becker poses in his new book, More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity. Becker's widely praised first book, What Is Real?, focused on competing interpretations of quantum mechanics and questioned the long dominance of the so-called Copenhagen interpretation championed by Niels Bohr, among other luminaries. This time around, he's tackling Silicon Valley's far-reaching ideas about the future, which have moved out of online subcultures and into mainstream culture, including our political discourse.
"It seemed like it was only going to become more relevant and someone needed to speak out about it, and I didn't see enough people connecting the dots in a way that looked right to me," Becker told Ars. "One current critique of Silicon Valley is that they moved fast and broke democracy and institutional norms. That's true. Another is that they're contemptuous of government, and I think that's true, too. But there wasn't much critique of their visions of the future, maybe because not enough people realized they meant it. Even among Silicon Valley critics, there was this idea that at the very least, you could trust that the statements they made about science and technology were true because they were experts in science and technology. That's not the case."
More Everything Forever covers the promise and potential pitfalls of AI, effective altruism, transhumanism, the space race to colonize Mars, human biodiversity, and the singularity, among many other topics—name-checking along the way such technological thought leaders as Eliezer Yudkowsky, Sam Altman, William MacAskill, Peter Singer, Marc Andreessen, Ray Kurzweil, Peter Thiel, Curtis Yarvin, Jeff Bezos, and yes, Elon Musk. It all boils down to what Becker calls the "ideology of technological salvation," and while its uber-rich adherents routinely cite science to justify their speculative claims, Becker notes that "actual scientific concerns about the plausibility of these claims" are largely dismissed. For Becker, this ideology represents a profound threat, not the promise of a utopian future.
"More than anything, these visions of the future promise control by the billionaires over the rest of us," Becker writes in his introduction. "But that control isn't limited to the future—it's here, now. Their visions of the future are news; they inform the limits of public imagination and political debate. Setting the terms of such conversations about the future carries power in the present. If we don't want tech billionaires setting those terms, we need to understand their ideas about the future: their curious origins, their horrifying consequences, and their panoply of ethical gaps and scientific flaws."
Ars caught up with Becker to learn more.
Ars Technica: The title of your book is More Everything Forever. Speaking as a physicist, is there such a thing?
Adam Becker: No, of course not. The one thing we know that's absolutely always true about exponential growth is that it ends. If something is growing exponentially, you can just say, "Oh, well, that's not going to last." The classic example from nature is growth of a bacterial colony. If you've got a couple of bacteria and a really nice, happy growth medium and a Petri dish, they're going to grow exponentially until they fill the Petri dish, eat all of the agar, and die, and then the growth ends.
We even take advantage of this in so many things in our everyday lives. This is how you make beer and wine. You put the yeast in the growth medium, it eats all the sugar and grows exponentially and excretes alcohol, and then it dies due to a combination of its own waste products and lack of food. Once it's done, then we drink it.
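Becker's point about exponential growth hitting a resource ceiling can be sketched in a few lines of code. The numbers below (starting colony size, nutrient budget, one unit of food per cell per step) are illustrative assumptions, not figures from the interview; the shape of the result is what matters.

```python
def grow(population, nutrients, growth_rate=2.0, cost_per_cell=1.0):
    """Double the colony each step until the growth medium is exhausted.

    Returns the population at each step; growth is exponential right up
    until the food runs out, then stops abruptly -- there is no gentle
    leveling-off in this simple model.
    """
    history = [population]
    while nutrients >= population * cost_per_cell:
        nutrients -= population * cost_per_cell     # each cell eats its share
        population = int(population * growth_rate)  # unchecked doubling...
        history.append(population)
    return history                                  # ...until the dish is empty

print(grow(population=2, nutrients=1000))
# The colony doubles every step, then growth ends the moment demand
# exceeds the remaining food -- "more everything" cannot run forever.
```

With 2 starting cells and 1,000 units of food, the colony doubles only eight times before the medium is spent, despite the seemingly explosive trajectory.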
Ars Technica: If nothing else, eventually one always comes up against the laws of thermodynamics, especially entropy.
Adam Becker: I've got a magnet on my fridge right now that says the heat death is coming. Certain Silicon Valley visionaries hate the laws of thermodynamics. Others claim that their ideas are thermodynamically inevitable because they've misunderstood thermodynamics. But either way, they've got to grapple with it because it's the ultimate source of these limits. If nothing else stops you, thermodynamics will stop you because entropy is always going to increase.
Extropy is just a term for low-entropy forms of energy, and it's getting smaller all the time. This is just a fact about how the universe works, and it's also a fact that makes life possible. As [Erwin] Schrödinger said, life intercepts flows of low-entropy energy and turns them into higher-entropy energy, and that's one of the things that life does. But to these guys, the second law of thermodynamics is a cosmic crisis. That's not only silly, but it sounds like a really unpleasant way to go through life. To achieve what they want to achieve requires so much energy, but they still won't be able to do it because a lot of what they want is just not possible.
Ars Technica: One doesn't usually think of this community as being anti-science. Yet in your book you argue that these futuristic visions aren't based on sound science at all, which seems paradoxical.
Adam Becker: They actually have a great contempt for expertise. They don't see it as necessary because they think that they're the smartest people who've ever lived, because they're the wealthiest people who've ever lived. If they were wrong about anything, then why would they have been so financially successful? This is also where you get the obsession with things like prediction markets. They believe that there are super predictors, that expertise is not necessary to understand or predict what's going to happen in the world, and that they themselves must be experts because they have enormous amounts of money.
There's a Homer Simpson quote that I like for thinking about this. It's just something that Homer blurts out when someone's talking about someone who lived 150 years ago. He says, "If he's so smart, how come he's dead?" When faced with expertise, they'd be like, "Well, if he's so smart, how come he's poor?" They believe that everything can be quantified, like the utilitarian dream, the eugenicist's dream, a person's IQ, and that money is a good measure of how much someone is worth. So they must be the best and smartest and greatest people who have ever lived.
Ars Technica: Your AI take is that neither the dystopian view of sentient AI nor the utopian view is necessarily correct. What is a realistic picture of what AI can do and be?
Adam Becker: There was a really good article about this in The New York Times by Tressie McMillan Cottom, where she said that AI is "mid." It's not great. It's not terrible. It's just mediocre. The AI art is a great example of that: It is somewhere between mediocre [and] terrible, and so is the writing. My general line about it is these AI systems are using the cultural heritage of humanity to spit out a smeared-out, averaged voice of all of that, but it doesn't have a conception of what's in the world and is incapable of doing most of the things that humans do.
A large language model is never going to do a job that a human does as well as they could do it, but that doesn't mean that they're never going to replace humans, because, of course, decisions about whether or not to replace a human with a machine aren't based on the actual performance of the human or the machine. They're based on what the people making those decisions believe to be true about those humans and those machines. So they are already taking people's jobs, not because they can do them as well as the people can, but because the executive class is in the grip of a mass delusion about them.
Ars Technica: You also have some sharp criticisms of the dream of colonizing Mars, which echoes in many respects the critiques raised by Zach and Kelly Weinersmith in their 2023 book, A City on Mars.
Adam Becker: I cite that book generously. Douglas Adams put it really well. He said space is big, really, really big. You may think it's a long way down to the chemist, but that's just peanuts compared to space. Space is big, and it's not filled with very much. We live in a weird place. There's more stuff here. The average density of matter in space is something like one atom per cubic meter. We live in a place that has a ridiculous over-density of stuff.
Also, all of the interesting places in space are really far apart. Living on Mars sucks. Mars isn't even mid. Mars is just crappy. The gravity is too low. The radiation is too high. There's no air. The dirt is made of poison. There's very little water. It gets hit with asteroids more often than Earth does because it's closer to the asteroid belt. And the prospects for terraforming technology in any meaningful way are not great. Making Mars as habitable as Antarctica during the polar night would be the greatest technological undertaking humanity has ever attempted, by many orders of magnitude, in order to create a place that nobody would want to live, and where the gravity would still be too low. It's a deeply unpleasant place.
But it is the next most habitable place in the solar system after Earth, at least if we're looking at solid surfaces. There's a spot in the atmosphere of Venus that is arguably nicer, but it's 60 miles up and you're surrounded by sulfuric acid at all times. There's still no air to breathe. Mars is also one of the closest planets to Earth. Going to Mars is roughly equivalent to taking a flight across the country when you compare it to getting to Alpha Centauri. Sure, we've been to the Moon. The Moon is a lot closer than anything else. It only takes three days to get there. We still haven't colonized the Moon, and there's very good reasons why. It's just very difficult, there's not a lot there, and it's a pretty unpleasant place.
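Becker's flight-across-the-country analogy holds up to a back-of-the-envelope check. The figures below are rough standard astronomical values I'm supplying for illustration, not numbers from the interview.

```python
# Rough scale check: Mars at close approach vs. the Alpha Centauri system.
# All figures are approximate, supplied for illustration.
KM_PER_AU = 1.496e8           # one astronomical unit in kilometers
KM_PER_LIGHT_YEAR = 9.461e12  # one light-year in kilometers

mars_close_approach_km = 0.52 * KM_PER_AU     # ~78 million km at a close approach
alpha_centauri_km = 4.37 * KM_PER_LIGHT_YEAR  # ~41 trillion km

ratio = alpha_centauri_km / mars_close_approach_km
print(f"Alpha Centauri is roughly {ratio:,.0f} times farther than Mars")
```

The ratio comes out around half a million to one: if the interstellar trip were a journey around the Earth many thousands of times over, reaching Mars really is the equivalent of a domestic flight.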
Ars Technica: What is it about these futuristic narratives that is so seductive?
Adam Becker: I think part of it is that it lines up with ideas about the future that have been sold to us for many, many years by, among other things, science fiction. I say this as an ardent science fiction fan. I am not saying that science fiction is the problem. I'm saying that taking science fiction seriously as a vision of the future is the problem.
I love Star Trek. I don't believe that we can build warp drive just because I like Star Trek. I also think that Star Trek isn't really about warp drive. It's about people and how we relate to each other in the same way that all great art is. It's about being here now in the world and what it means to be a person. Yet someone like Peter Thiel looks at Dune and says that the message of Dune is "develop the deserts," which is not even close to what Frank Herbert had in mind or what any reading of the text could support.
These ideas resonate because there's this idea that the future inevitably lies in space, that the future inevitably involves super-intelligent machines. There is nothing inevitable about those visions and a great deal of reason to doubt them. I also think that in a time of great turmoil, the idea of a guaranteed utopia coming from these movers and shakers in the world, these very powerful, wealthy billionaires—it sounds hopeful until you scratch at it a little bit and realize that there's nothing there.
Ars Technica: Yet there is a long history of science fiction inspiring people to become scientists and invent things like the flip phone, which was inspired by Star Trek's handheld communicators. Science fiction can inspire and help us dream. It's not just about dystopian nightmares.
Adam Becker: At the end of the book, I quote Ursula K. Le Guin—who herself said science fiction isn't about predicting the future—saying that we need better science fiction. There's some tension there. I don't think there's an easy answer. I think it's always going to be easier to build a dystopia than a utopia and easier to build a utopia than something realistic. I think realistic, hopeful visions of the future are something that it would be good to have more of. But I also think the problem isn't really science fiction, it's the reading of science fiction. There's incredibly problematic science fiction out there, especially some of the older stuff. I have a lot of problematic faves in science fiction, but I think it's okay as long as you say, "Okay, it's problematic, and here's how. I still like it." But I'm not going to use Robert Heinlein as a political roadmap because his politics were trash.
I don't think there's anything wrong with taking science fiction as a source of inspiration to a point. The problem is when you get people saying science fiction should not remain fiction forever. It's not a roadmap. You are not Hari Seldon. This is not Foundation, which is a cautionary tale. Foundation is the decline and fall of the Roman Empire in space. Something like Civilization is a lot of what's going on here as well—the idea of a tech tree that's predefined by the game, and you just choose how quickly you're going to advance along that tree. That's just not how anything in human history has ever worked. It's a dangerous belief.
Generally, more careful and critical readings of science fiction would be the way to go. The humanities are deeply undervalued, and I think it's good to learn to read with a more critical eye. But I also think some people are never going to be able to get it. You don't need a critical eye in order to understand that Star Trek is not about space, it is actually about society. Just look at The Original Series, like the episode with the guys with the faces that are half black and half white. Some people are upset about "woke Star Trek." In what world was Star Trek ever not woke? I don't think that we can lay the blame for those people at the feet of science fiction. It's just those people being how they're going to be.
Ars Technica: Ray Kurzweil's influential ideas about the singularity have been around for a long time. I found your section on Kurzweil losing his father and wanting to build a "dad bot" to maintain that vital connection quite moving. There's something so human about that desire, a longing for immortality, our existential dread of death—of not being. We might think immortality is the answer, but I'm not convinced.
Adam Becker: Kurzweil tries to get around this by saying that you're not going to be immortal, but you can live as long as you want to. Sure, that gets around some of it. But Kurzweil also thinks that we're going to find a way around the second law of thermodynamics, which we're not. I do think that fear of death is at the root of a lot of this, if not all of it. I don't know if I would go as far as to say that death is what gives life meaning. I would say that the human experience is defined by the limitations that death imposes, the fact that our time is limited. If you remove that constraint, that would fundamentally alter the human condition in ways that very well might not be pleasant.
I realize that's kind of a mild statement, and it's deliberately mild. I don't outright condemn anyone in the book for seeking to evade death. I think it's totally natural to be afraid of death, but I also don't think we have any choice about it. Even if you had phenomenally advanced technology, ultimately, you would still end up dying at some point, thanks to the second law of thermodynamics. I mean, the universe is going to die.
There's this very dualistic idea that the mind can somehow be separated from our bodies, and I don't think it can be. There's very good evidence that's not true. There's also good evidence that these human bodies of ours can only be extended so long. I think it is unhealthy to have such an overweening fear of death that you make it the single motivating thing in your life, because you're not going to find a way around death, and it might not be a good thing if you did.
Jennifer is a senior writer at Ars Technica with a particular focus on where science meets culture, covering everything from physics and related interdisciplinary topics to her favorite films and TV series. Jennifer lives in Baltimore with her spouse, physicist Sean M. Carroll, and their two cats, Ariel and Caliban.