The Standing Invitation

Posts Tagged ‘Taleb’

We Are Bad At Science


One of the problems scientists face in the lab is that human beings are really, really bad at science. There are fundamental mechanisms in the way we think that help us enormously in our everyday lives, but are the exact opposite of what a scientist needs to understand the world.

Let’s play a game.

I have a rule in my head that generates numbers, three at a time. What I’ll do is give you the first three numbers. You have to guess what the next three numbers generated by the rule will be. You will ask me “Are the next three numbers x, y, and z?” and I will say yes or no. And we can repeat this process as many times as you like until you feel you know what the rule is. The game ends when you tell me what you think the rule is, and I tell you if you’re right or wrong.

All set? Okay, let’s go.

The first three numbers are 2, 4, 6.

Which numbers do you think come next? Unfortunately we can’t play this live, but you might try it with a friend afterwards; in the meantime, I’ll provide a transcript of the time I played this game with my old friend Clint.

The SI: “The first three numbers are 2, 4, 6.”

Clint: “Are the next three numbers 8, 10, 12?”

The SI: “Yes.”

Clint: “Are the next three numbers 14, 16, 18?”

The SI: “Yes.”

Clint: “Then 20, 22, 24, 26, 28…”

The SI: “Yes, to all.”

Clint: “Okay, then I know what the rule is. The rule is, generate the next number by adding 2 to the last one. That’s it, right?”

Actually, it doesn’t matter. Clint might well be right: his theory does account for all the observed data, and has successfully predicted future results. But he has approached the problem in completely the wrong way. Yes, the n+2 rule is a viable one. But so is “the numbers always go up”. Had that been the rule, and had Clint gone on to ask about 29, 30, 31, the answer would still have been yes; but Clint didn’t ask that. Most people don’t.

What Clint did – what most people do – is come up with a theory and seek confirmation. This is a good rule of thumb that works most of the time, but it is fragile and dangerous because it can lead you in the wrong direction. You might go a considerable distance, wasting time and precious resources, before realising you’ve had it wrong all along.

The correct way to approach the problem, indeed all problems in science, is to seek disconfirmation. You have an idea: how do you prove it wrong? What are the minimum standards of scrutiny to which your theory must be subjected? What wringer can you put it through to see if it comes out unharmed?

This is how science is done, but it is not a natural way of thinking: our brains just aren’t wired up to work in this way. We try our best, but often get fooled; and if you take it out of the context of a lab and present it as just a game about numbers, it’s amazing to see how badly we fare.
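The number game makes the point concrete enough to sketch in code. This is a hypothetical illustration – the function names and the probes are mine, not from the post – but it shows the asymmetry: confirmatory probes can never separate Clint’s theory from the simpler hidden rule, while a single disconfirming probe does.

```python
# Hypothetical sketch of the 2, 4, 6 game. Suppose the hidden rule is
# simply "the numbers always go up", while Clint's theory is "add 2".
def hidden_rule(triple):
    a, b, c = triple
    return a < b < c  # any ascending triple gets a "yes"

def clints_theory(triple):
    a, b, c = triple
    return b == a + 2 and c == b + 2  # "add 2 to the last number"

# Confirmatory probes: chosen precisely because the theory predicts "yes".
confirming = [(8, 10, 12), (14, 16, 18), (20, 22, 24)]
for probe in confirming:
    # Both rules agree on every confirming probe, so these probes
    # can never tell the two rules apart.
    assert hidden_rule(probe) == clints_theory(probe)

# A disconfirming probe: Clint's theory predicts "no". If the answer
# comes back "yes", the theory is dead -- which is the whole point.
disconfirming = (29, 30, 31)
print(hidden_rule(disconfirming))    # True: the hidden rule accepts it
print(clints_theory(disconfirming))  # False: Clint's theory rejects it
```

Only the probe that the theory predicts will fail carries any information about whether the theory is wrong.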


Taleb, The Black Swan

Written by The S I

September 15, 2011 at 11:59 pm

The Set of Everyone


A few years ago, the abdominal ticking timebomb that was my appendix started to rebel against the system, and I was rushed to hospital. The offending organ was removed, and my life was saved. When I left the hospital, I did not receive a bill; the doctors didn’t give me a receipt for my bursting bits. Thanks to my country’s healthcare being publicly funded, I was spared the unpleasant knowledge of the exact monetary value of my life.

But in a very real sense, I did pay for the operation. The tens of thousands of pounds my surgery cost were taken from me, a bit at a time, through taxation. My kamikaze appendix finally delivered a return on that investment; from birth to the moment I walked into the doctor’s office, it had been money lost.

If the illness justifies the money I lost, would I have considered myself cheated if I never once got sick?

The problem is that we never know in advance that we will get ill. The money deducted from my income was spent on a probability. How likely am I to fall ill in the next year? In the next ten years?

A hypothetical situation: you know for certain that there is a fifty percent chance you will catch a fatal illness in the next ten years. The illness is treatable, but the treatment is expensive – and you have to pay for it in advance.

You have two choices: you don’t pay, and hope you get lucky; or you do pay, but risk having spent your money on nothing. You decide to pay, just to be on the safe side. But ten years later, you are still in perfect health. You were lucky. Do you feel cheated? If yes, it is only because you are unaware of the other you, the probabilistic ghost of you, that could have been you.

Your decision to pay for treatment could be seen as paying for a probability, but it can also be seen as paying for a certainty – the certainty of health for the set of all possible yous.
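The hypothetical can be put in a few lines of code. This is a minimal sketch of my own: the fifty-percent probability comes from the scenario above, while the treatment cost figure and the function name are assumptions for illustration.

```python
# Sketch of the hypothetical: pay in advance, or gamble on staying healthy.
p_illness = 0.5          # chance of the fatal illness within ten years
treatment_cost = 30_000  # assumed figure, payable only in advance

def survival_probability(paid_in_advance: bool) -> float:
    """Chance that a randomly chosen 'possible you' is alive in ten years."""
    if paid_in_advance:
        return 1.0           # the illness is treatable, so every you survives
    return 1 - p_illness     # otherwise you survive only if you got lucky

print(survival_probability(True))   # 1.0: certainty, bought for 30,000
print(survival_probability(False))  # 0.5: half the set of possible yous is gone
```

Seen one future at a time, paying can look like wasted money; averaged over the whole set of possible yous, it is the only option that guarantees the outcome.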

In order to act self-interestedly in an uncertain world, you need to consider not just who you are now, but who you might be later – and, by extension, all the people you might have been. It’s impossible to know which of the set of possible yous you will be in ten years’ time; it is in your interest to have a healthcare system that will take you in, whoever you happen to be.

I have no children, but someday I would like to. I have no idea what they will be like: boys or girls, healthy or unhealthy. But I know that I would want them to be born into a world where they are taken care of regardless of what bodies they were born into. I consider it the mark of a functioning society that the set of all possible mes, the set of all possible people everywhere, is looked after. It’s looking after Number One; but it’s doing so while recognising that Number One is bigger than you are.


This is all very Rawlsian; google the Veil of Ignorance. The ideas of probability and certainty are derived from Taleb’s wonderful The Black Swan.

Written by The S I

August 15, 2011 at 11:59 pm