Astral Codex Ten - Trying Again On Fideism
[apologies for an issue encountered when sending out this post; some of you may have gotten it twice] Thanks to Chris Kavanagh, who wrote an extremely kind and reasonable comment in response to my Contra Kavanagh On Fideism and made me feel bad for yelling at him. I’m sorry for my tone, even though I'm never going to get a proper beef at this rate. Now that I've calmed down, do I disagree with anything I wrote when I was angrier? Chris was too nice to really defend himself, but a few other people posted what I think of as partial arguments for the position I mocked as "fideism". For example, Scott Aaronson:
In thinking about these kinds of questions, I find it helpful to consider three reflexive naive positions towards conspiracy theories (and cults, and misinformation, and general false ideas). All of these are caricatures, but hopefully they’ll help refine the borders of the debate:

Idiocy: Conspiracy theories are a thing dumb people sometimes fall for. If you understand that facts require evidence, and you’re not a Nazi trying to explain why the Jews caused 9/11, then there’s basically no chance you fall for them. You mostly have to stay away from outright lies - for example, someone making up a story about a Jew admitting to causing 9/11 - which is easy to do, because you can just fact-check these.

Intellect: There is no difference between conspiracy theories and any other theory, except that the conspiracy theories are worse. There are some theories that the smartest experts give 50-50 odds of being true, like “high wages caused the Industrial Revolution”. There are some theories that the smartest experts give 10-90 odds of being true, like “endocrine disruptors are a major cause of rising LGBTQ identification”. And there are some theories that the smartest experts give 0.001-99.999 odds of being true, like “the Illuminati singlehandedly caused the French Revolution”. All of these theories should be treated approximately the same way, with intellectuals discussing difficult questions and sometimes, if they’re not smart enough to be up to the task, coming to the wrong answer.

Infohazard: Conspiracy theories are deadly traps that lie in wait for everyone, including smart people. If you stumble on one unprepared, it will eat you up, turn you into an adherent, and leave you and society worse off. You should exercise standard infohazard precautions around them, like putting wax in your ears if you’re passing through somewhere you might hear them discussed, or tying yourself to the mast if you’re out of wax. If you have neither wax nor a mast, you can usually come out unscathed by reciting “trust experts . . . trust experts . . . trust experts” over and over as a mantra.

One advantage of the Idiocy perspective is that it makes conspiracy theories low status. Most people don’t want to seem like idiots; if their friends think anyone who believes in conspiracy theories is an idiot, they’ll take extra care to stay away from them. But a disadvantage - one I find overwhelming - is that when you do come across a conspiracy theory, you’re totally blindsided by it. Since you “know” conspiracy theories only sound convincing to idiots, and you “know” you’re not an idiot, this convincing thing you just heard can’t be a conspiracy theory! It must be a legitimately true thing that Big Pharma is suppressing! Everyone knows Big Pharma sometimes suppresses stuff, that’s not a . . .

This is why I stress, again and again, that good conspiracy theories have lots of convincing-sounding evidence in their favor, and may sound totally plausible to a smart person reasoning normally. When people shrug off conspiracy theories easily, it’s either because the conspiracy theory isn’t aimed at them - the equivalent of an English speaker feeling smug for rejecting a sales pitch given entirely in Chinese - or because they’re biased against the conspiracy theory, with a level of bias which would also be sufficient to reject true theories.
Sure, everything went well this time - they were able to resist believing the theory - but one day they’ll encounter a sales pitch in English, on a topic where it accords with their biases. Then they’ll be extra-double-doomed because of how sure they are that they’re immune to propaganda.

When people criticize me, they act like I’m 100% taking the Intellect perspective. I admit I have some sympathies in that direction. Ivermectin is an especially clear case: for a while, most doctors and epidemiologists suspected that it worked, because there were impressive studies in favor. Then those impressive studies were gradually found to be flawed or fraudulent, better studies gradually came out showing that it didn’t work, and the experts gradually shifted to doubting it. At what point in this process - which second of which day - did it switch from plausible-but-false scientific theory to conspiracy theory? Obviously there’s no single moment (cf. philosophy of science’s long failure to solve the demarcation problem). So the difference between a good scientific theory and a conspiracy theory is definitely a spectrum.

But I think this meshes just fine with the Infohazard perspective. There are many arguments, very closely resembling correct arguments, that play on various biases and subtle errors of reasoning, and end out unfairly convincing. I like to call biases “cognitive illusions”, by analogy to optical illusions, which can also be unfairly convincing:

[image: the chess set contrast illusion]

This is my favorite illusion. The top and bottom chess sets are the same color, and only look black vs. white because of contrast effects. This one is harmless, because it affects everyone equally, nobody cares about it too much, and you can easily check via Paint or Photoshop or something. The Infohazard perspective claims conspiracy theories are potentially this convincing, but in a much more pernicious way: they only hit some people (not necessarily the dumb ones!), and they subvert the checking process so that it appears to give pro-conspiracy results (see Trapped Priors).

All factual claims can become the basis for emotional/social coalitions. I wrote here about how an extremely pointless question - whether Abu Bakr or Ali should have been political leader of the Arabian empire in 632 AD - produced the Sunni/Shia split, whose different sides went on to develop different political systems, aesthetics, and philosophies, and to hate each other even today. It’s easy for a scissor statement like “is the chess set black or white?” to become the basis for a social/political movement, which then evolves the anti-epistemology necessary to protect its own existence (I’m still in awe of the way ivermectin advocates have made “small studies are more trustworthy than big studies” sound like a completely reasonable and naturally-arrived-at position). I agree that everyone (including smart people) needs to be constantly vigilant about this possibility, and that any suggestion otherwise risks placing a stumbling block before the blind.

II.

Where I differ from Aaronson is something like - quick analogy - there used to be a thing where some therapists would avoid asking patients if they were suicidal, because they didn’t want to “plant the idea” in their head. People would argue that you shouldn’t talk at length about the reasons for and against suicide, because that was highlighting it as an option, or dignifying it with a response. Most studies have since weighed in against this perspective. Depressed people aren’t idiots.
They are aware that committing suicide is an option. You will never be able to suppress all knowledge of suicide’s existence, and “suddenly triggering the latent knowledge” isn’t a thing. Talking about it openly just means . . . it can be talked about openly.

We currently live in a world where critical care doctors, senators, and guideline-makers publicly take ivermectin seriously.
Consider the possibility that the cat is already out of the bag, and that me writing a negative article against ivermectin on ACX isn’t going to extract the cat any further. “C’mon, bro, just one more chance, bro, denying it oxygen will totally work this time, bro, please, just one more chance!” At some point, you have to acknowledge that people who want to hold up examples of people taking ivermectin seriously can already point to the critical care doctors and senators and guideline-makers, and that maybe the time has come to start arguing against it in some way.

Eliezer Yudkowsky’s position is Let Them Debate College Students. I’m not a college student, but I’m not Anthony Fauci either, and I am known for blogging about extremely dignified ideas like the possibility that the terrible Harry Potter fanfiction My Immortal is secretly an alchemical allegory. I haven’t seen ivermectin advocates using “Scott takes this seriously enough to argue against it!” as an argument, and I have seen them getting angry about it and writing long responses trying to prove me wrong. Sometimes they have used me getting some points wrong as a positive argument, and I would be open to the argument that I failed by not arguing against it well enough that they couldn’t do that - but nobody has been making that argument, and if they did, it would imply that people who are smarter than me should take over the job, which I endorse.

III.

I worry Scott Aaronson thinks I’m saying you shouldn’t trust the experts, and instead you should always think for yourself. I’m definitely not trying to say that. I’ve tried to be pretty clear that I think experts are right remarkably often, by some standards basically 100% of the time - I realize how crazy that sounds, and “by some standards” is doing a lot of the work there, but see Learning To Love Scientific Consensus for more. Bounded Distrust also helps explain what I mean here.

I also try to be pretty clear that reasoning is extremely hard, it’s very easy to get everything wrong, and if you try to do it then a default option is to get everything wrong and humiliate yourself. I describe that happening to me here, and presumably it also happens to other people sometimes.

What I do think is that “trust the experts” is an extremely exploitable heuristic, which leads everyone to put up a veneer of “being the experts” and demand that you trust them. I come back to this example again and again, but only because it’s so blatant: the New York Times ran an article saying that only 36% of economists supported school vouchers, with a strong implication that the profession was majority against. If you checked their sources, you would find that actually, it was 36% in favor, 19% against, 46% unsure or not responding - that is, among economists who took a position at all, supporters outnumbered opponents nearly two to one. If you are too quick to seek epistemic closure because “you have to trust the experts”, you will be easy prey to people misrepresenting what the experts are saying.

I come back to this example less often, because it could get me in trouble, but when people do formal anonymous surveys of IQ scientists, they find that most of them believe different races have different IQs and that a substantial portion of the difference is genetic. I don’t think most New York Times readers would identify this as the scientific consensus.
So either the surveys - which are pretty official and published in peer-reviewed journals - have managed to compellingly misrepresent expert consensus, or the impressions people get from the media have, or “expert consensus” is extremely variable and complicated and can’t be reflected by a single number or position.

And I genuinely think this is part of why ivermectin conspiracies took off in the first place. We say “trust science” and “trust experts”. But there were lots of studies that showed ivermectin worked - aren’t those science? And Pierre Kory MD, a specialist in severe respiratory illnesses who wrote a well-regarded textbook, supports it - isn’t he an expert? Isn’t it plausible that the science and the experts are right, and the media and the government and Big Pharma are wrong? This is part of what happens when people reify the mantras instead of using them as pointers to more complicated concepts like “reasoning is hard” and “here are the 28,491 rules you need to keep in mind when reading a scientific study.”

IV.

All of this still feels rambly and like it’s failing to connect. Instead, let me try describing exactly what advice I would give young people opening an Internet connection for the first time:
I hope something like this is more useful than any of the three naive positions I mentioned earlier.