Astral Codex Ten - Contra DeBoer On Temporal Copernicanism
Freddie deBoer has a post on what he calls “the temporal Copernican principle.” He argues we shouldn’t expect a singularity, apocalypse, or any other out-of-distribution thing in our lifetimes. Discussing celebrity transhumanist Yuval Harari, he writes:
(I think there might be a math error here - 100 years out of 300,000 is 0.033%, not 0.33% - but this isn't my main objection.)

He then condemns a wide range of people, including me, for failing to understand this:
I deny misunderstanding this. Freddie is wrong.

Since we don't know when a future apocalypse might happen, we can sanity-check ourselves by looking at past apocalyptic near-misses. The closest that humanity has come to annihilation in the past 300,000 years was probably the Petrov nuclear incident in 1983¹, ie within Freddie's lifetime. Pretty weird that out of 300,000 years, this would be only 41 years ago!

Maybe you're more worried about environmental devastation than nuclear war? The biggest climate shock of the past 300,000 years is . . . also during Freddie's lifetime². Man, these one-in-three-thousand coincidences keep adding up!

"Temporal Copernicanism", as described, fails basic sanity checks. But we shouldn't have even needed sanity checks as specific as these: common sense already tells us that new apocalyptic weapons and environmental disasters were more likely to arise during the 20th century than, say, the century between 184,500 BC and 184,400 BC.

What's Freddie doing wrong, and how can we do better? The following argument is loosely based on one by Toby Ord. Consider three types of events:

First, those uniformly distributed across calendar time. For example, asteroid strikes are like this. Here Freddie is completely right: if there are 300,000 years of human history, and you live 100 years, there's a 0.03% chance that the biggest asteroid in human history strikes during your lifetime. Because of this, most people who think about existential risk don't take asteroid strikes too seriously as a potential cause of near-term apocalypses.

Second, those uniformly distributed across humans. This is what you might use to solve Sam Bankman-Fried's Shakespeare problem - what's the chance that the greatest playwright in human history is alive during a given period? Freddie sort of gets this far³, and provides a number: 7% of humans who ever lived are alive today⁴.

Third, those uniformly distributed across techno-economic advances. You'd use this to answer questions like "how likely is it that the most important technological advance in history thus far happens during my lifetime?" This seems like the right way to predict things like nuclear weapons, global warming, or the singularity. But it's harder to measure than the previous two.

You could try using GDP growth. At the beginning of Freddie's life, world GDP (measured in real dollars) was about $40 trillion per year. Now it's about $120 trillion. So on this metric, about 66% of absolute techno-economic progress has happened during Freddie's lifetime.

But we might be more interested in relative techno-economic progress. That is, the Agricultural Revolution might have increased yields from 10 bushels to 100 bushels of corn. And some new tractor design invented yesterday might increase it from 10,000 bushels to 10,100 bushels. But that doesn't mean the new tractor design was more important than the Agricultural Revolution. Here I think the right measure is log GDP growth; by this metric, about 20% of techno-economic progress has happened during Freddie's lifetime.

Freddie sort of starts thinking in this direction⁵, but shuts it down on the grounds that some people think technological growth rates have slowed down since the mid-20th century. Usually the metric that gets brought out to support this is changes in total factor productivity, which do show the mid-20th century as a more dynamic period than today. So fine, let's do the same calculation with total factor productivity.
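Here's the share-of-growth arithmetic as a quick sketch. The $40 trillion and $120 trillion figures are the ones quoted above; the $0.4 trillion "long-run baseline" is a placeholder I'm assuming purely for illustration, and the log answer is sensitive to that choice. The same function works for the TFP version of the calculation if you swap in a TFP series.

```python
import math

def share_of_growth(baseline, at_birth, now, log_scale=False):
    """Fraction of all growth (baseline -> now) that happened
    between at_birth and now, on an absolute or a log scale."""
    if log_scale:
        baseline, at_birth, now = map(math.log, (baseline, at_birth, now))
    return (now - at_birth) / (now - baseline)

# World GDP in trillions of real dollars. The $0.4T baseline is an
# assumption for illustration only; the log share moves a lot with it.
baseline, at_birth, now = 0.4, 40, 120

print(f"absolute share: {share_of_growth(baseline, at_birth, now):.0%}")
print(f"log share: {share_of_growth(baseline, at_birth, now, log_scale=True):.0%}")
```

Plugging in those numbers gives roughly two-thirds on the absolute scale and roughly 20% on the log scale, though the log figure shifts depending on what you treat as the starting point.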
My impression from eyeballing this paper is that about 35% of the total increase in TFP and 15% of the increase in log TFP have still happened during Freddie's lifetime.

So what's our prior that the most exciting single technological advance in history thus far will happen during Freddie's lifetime? I think a conservative number would be 15%⁶.

How do we move from "most exciting advance in history" to questions about the singularity or the apocalypse?

Robin Hanson cashes out "the singularity" as an economic phase change of equal magnitude to the Agricultural or Industrial Revolutions. If we stick to that definition, we can do a little better at predicting it: it's a change of a size such that it's happened twice before. Using our previous number, we estimate a ~30% chance (roughly twice the 15%) that such a change happens in our lifetime. (Sanity check: the last such earth-shattering change was the Industrial Revolution, about 3 - 4 lifetimes ago.)

What about the apocalypse? This one is tougher. Freddie tries to make an argument from absurdity: suppose the apocalypse happened tomorrow. Wouldn't it be crazy that you, of all the humans who have ever existed, were correct when you thought the apocalypse was nigh?

No, it's not crazy at all. If the apocalypse happens tomorrow, then 7% of humans throughout history would have been right to predict an apocalypse in their lifetime. That's not such a low percentage - your probability of being born in the final generation is about the same as (eg) your probability of being born in North America.

Here's a question I don't know how to answer - the number above (7%) is about how surprised you should be if the apocalypse happens in your lifetime. But I don't think it's the overall chance that the apocalypse happens in your lifetime, because the apocalypse could be millions of years away, after there had been trillions of humans, and then retroactively it would seem much less likely that the apocalypse happened during the 21st century. So: is it possible to calculate this chance? I think there ought to be a way to leverage the Carter Doomsday Argument here, but I'm not quite sure of the details.

Speaking of the Carter Doomsday Argument…

…Freddie is re-inventing anthropic reasoning, a well-known philosophical concept. The reason the hundreds of academics who have written books and papers about anthropics have never noticed that it disproves transhumanism and the singularity is that Freddie's version has obvious mistakes that a sophomore philosophy student would know better than to make. (Local Substacker Bentham's Bulldog is a sophomore philosophy student, and his anthropics mistakes are much more interesting.)

The world's leading expert on anthropic reasoning is probably Oxford philosophy professor Nick Bostrom, who literally wrote the book on the subject. Awkwardly for Freddie, Bostrom is also one of the founders of the modern singularity movement. This is because, understood correctly, anthropics provides no argument against a singularity or any other transhumanist idea, and might (weakly) support them.

I think if you use anthropic reasoning correctly, you end up with a prior probability of something like 30% that the singularity (defined as a technological revolution as momentous as agriculture or industry) happens⁷ during your lifetime, and a smaller percent that I'm not sure about (maybe 7%?) that the apocalypse happens during your lifetime.
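For concreteness, here's where the ~7% ballpark comes from. The inputs (roughly 117 billion humans ever born, about 8 billion alive today, around 600 million of them in North America) are standard rough estimates I'm assuming for illustration, not numbers from Freddie's post or mine, so treat the outputs as ballpark figures only.

```python
# Rough demographic inputs, in billions of people. Ballpark estimates
# assumed for illustration, not figures from the post.
humans_ever_born = 117   # common estimate of everyone who has ever lived
alive_today = 8          # current world population
north_america = 0.6      # rough current population of North America

# If the apocalypse happened tomorrow, this is the share of all humans in
# history who would have been alive to see it.
final_generation_share = alive_today / humans_ever_born

# For comparison: the chance that a person alive today lives in North America.
north_america_share = north_america / alive_today

print(f"alive at the (hypothetical) end: {final_generation_share:.1%}")
print(f"alive today and in North America: {north_america_share:.1%}")
```

Both come out in the same high-single-digit range, which is the only point the comparison is meant to make.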
None of these probabilities are lower than the probability that you're born in North America, so people should stop acting like they're so small as to be absurd or impossible.

But also, prior probabilities are easy-come, easy-go. The prior probability that you're born in Los Angeles is only 0.05%. But if you look out your maternity ward window and see the Hollywood sign, ditch that number immediately and update to near certainty. No part of anthropics should be able to prevent you from updating on your observations about the world around you, and on your common sense.

(except maybe the part about how you're in a simulation, or the part about how there's definitely a God who created an infinite number of universes, or how there must be thousands of US states, or how the world must end before 10,000 AD, or how the Biblical Adam could use his reproductive decisions as a shortcut to supercomputation, or several other things along these same lines. I actually hate anthropic reasoning. I just think that if you're going to do it, you should do it right.)

1 The Toba supervolcano is over-rated. You could argue the Cuban Missile Crisis was worse than Petrov, but that just brings us back 60 years instead of 40, which I think still proves my point.

2 Something called "the Eemian" 130,000 years ago was larger in magnitude, but happened gradually over several thousand years.

3 If he got this far halfway down, why did he even present the obviously-wrong 0.03% number as his headline result? Was he hoping we wouldn't read the rest of his post?

4 This is slightly wrong for the exact framing of the question; your life is a span rather than a point, so probably by the time you die, about 10% of humans will have been alive during your lifespan. The exact way you think about this depends on how old you are, and I'll stick with the 7% number for the rest of the essay.

5 Again, I don't understand why he bothered giving the earlier obviously-wrong-for-this-problem numbers, vaguely half-alluded to the existence of this one in order to complain that someone could miscalculate it, and then put no effort into calculating it correctly or at least admitting that he couldn't calculate the number that mattered.

6 Some of these numbers depend on how you're thinking of "lifespan" vs. "lifespan so far" and how much of your actually-existing foreknowledge about the part of your life you've already lived you're using. I'm just going to handwave all of that away since it depends on how you're framing the question and doesn't change results by more than a factor of two or three.

7 Realistically the Agricultural and Industrial Revolutions were long processes instead of point events. I think the singularity will be shorter (just as the Industrial Revolution was shorter than the Agricultural), but if this bothers you, imagine we're talking about the start (or peak) of each.