Against Shooting Yourself in the Foot
Somehow, someone is going to horribly misuse all the advice that is contained within this book.
Nothing I know how to say will prevent this, and all I can do is advise you not to shoot your own foot off; have some common sense; pay more attention to observation than to theory in cases where you’re lucky enough to have both and they happen to conflict; put yourself and your skills on trial in every accessible instance where you’re likely to get an answer within the next minute or the next week; and update hard on single pieces of evidence if you don’t already have twenty others.
I expect this book to be of much more use to the underconfident than the overconfident, and I have considered cunning plots to route printed copies of this book to only the former class of people. I’m not sure reading this book will actually harm the overconfident, since I don’t know of a single case where any previously overconfident person was actually rescued by modest epistemology and thereafter became a more effective member of society. If anything, it might give them a principled epistemology that actually makes sense by which to judge those contexts in which they are, in fact, unlikely to outperform. Insofar as I have an emotional personality type myself, it’s more disposed to iconoclasm than conformity, and inadequacy analysis is what I use to direct that impulse in productive directions.
But for those certain folk who cannot be saved, the terminology in this book will become only their next set of excuses; and this, too, is predictable.
If you were never disposed to conformity in the first place, and you read this anyway… then I won’t tell you not to think highly of yourself before you’ve already accomplished significant things. Advice like that wouldn’t have actually been of much use to me at age 15, nor would the universe have been a better place if Eliezer-1995 had made the mistake of listening to it. But you might talk to people who have tried to reform the US medical system from within, and hear what things went wrong and why.1 You might remember the Free Energy Fallacy, and that it’s much easier to save yourself than your country. You might remember that an aspect of society can fall well short of a liquid market price, and still be far above an amateur’s reach.
I don’t have good, repeatable exercises for training your skill in this field, and that’s one reason I worry about the results. But I can tell you this much: bet on everything. Bet on everything where you can or will find out the answer. Even if you’re only testing yourself against one other person, it’s a way of calibrating yourself to avoid both overconfidence and underconfidence, which will stand you in good stead emotionally when you try to do inadequacy reasoning. Or so I hope.
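One simple way to keep score on this kind of betting, not from the book but a common calibration technique, is to record each bet as a stated probability plus an outcome and compute a Brier score over the record. The sketch below is a minimal illustration with hypothetical bets; lower scores are better, and always answering “50%” scores 0.25:

```python
# Toy calibration tracker: each bet is (stated probability, outcome),
# where outcome is 1 if the event happened and 0 if it didn't.
def brier_score(bets):
    """Mean squared gap between stated probability and actual outcome."""
    return sum((p - outcome) ** 2 for p, outcome in bets) / len(bets)

# Hypothetical betting record (illustrative numbers only).
bets = [(0.9, 1), (0.8, 1), (0.7, 0), (0.6, 1), (0.9, 1)]
print(round(brier_score(bets), 3))
```

A consistently high score on confident bets suggests overconfidence; a low score earned only by never venturing past 60% suggests underconfidence, which is the pairing of errors the betting habit is meant to surface.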
Beyond this, other skills that feed into inadequacy analysis include “see if the explanation feels stretched,” “figure out the further consequences,” “consider alternative hypotheses for the same observation,” “don’t hold up a mirror to life and cut off the parts of life that don’t fit,” and a general acquaintance with microeconomics and behavioral economics.
The policy of saying only what will do no harm is a policy of total silence for anyone who’s even slightly imaginative about foreseeable consequences. I hope this book does more good than harm; that is the most I can hope for it.
For yourself, dear reader, try not to be part of the harm. And if you end up doing something that hurts you: stop doing it.
Beyond that, though: if you’re trying to do something unusually well (a common enough goal for ambitious scientists, entrepreneurs, and effective altruists), then this will often mean that you need to seek out the most neglected problems. You’ll have to make use of information that isn’t widely known or accepted, and pass into relatively uncharted waters. And modesty is especially detrimental for that kind of work, because it discourages acting on private information, making less-than-certain bets, and breaking new ground. I worry that my arguments in this book could cause an overcorrection; but I have other, competing worries.
The world isn’t mysteriously doomed to its current level of inadequacy. Incentive structures have parts, and can be reengineered in some cases, worked around in others.
Similarly, human bias is not inherently mysterious. You can come to understand your own strengths and weaknesses through careful observation, and scholarship, and the generation and testing of many hypotheses. You can avoid overconfidence and underconfidence in an even-handed way, and recognize when a system is inadequate at doing X for cost Y without being exploitable in X, or when it is exploitable-to-someone but not exploitable-to-you.
Modesty and immodesty are bad heuristics because even where they’re correcting for a real problem, you’re liable to overcorrect.
Better, I think, to not worry quite so much about how lowly or impressive you are. Better to meditate on the details of what you can do, what there is to be done, and how one might do it.
- As an example, see Zvi Mowshowitz’s “The Thing and the Symbolic Representation of The Thing,” on MetaMed, a failed medical consulting firm that tried to produce unusually high-quality personalized medical reports. ↩