The Standing Invitation

Posts Tagged ‘Uncertainty’

Arrogance and Respect


Is it always arrogant to tell people they are wrong?

Arrogance is a common accusation levied at scientists when they declare that, actually, x is the case, and anyone who believes otherwise is simply incorrect. And, yes, there is an arrogance associated with any kind of absolute certainty, since there is very little, if anything, of which anyone can be truly certain. This is something that scientists know (or should know) better than anyone. Scientists are generally very candid about what they do not know, about what their areas of expertise are and what lies outside them, and their claims are always tempered by error bars, confidence levels, and the fact that correlation does not imply causation; and that’s before you get down to the real philosophy of science stuff: the problem of induction, the unreliability of the evidence of the senses, and so on.

Nevertheless, there are some things about which scientists’ feelings come so close to certainty that there isn’t much point in calling it anything else – certainty in the existence of atoms, or that the Earth is an oblate spheroid orbiting a main-sequence star, or that humans and broccoli share a common ancestor. The evidence for these things is overwhelmingly good, and anyone who believes otherwise is wrong.

But is it arrogant to say so?

The concept of arrogance is bound inextricably to the idea of respect: to be arrogant is to not respect another person’s opinions.

Now I’m just going to come right out and say it: some people’s opinions are pretty dumb. The idea that the Earth is 6000 years old deserves no respect whatsoever. But then, neither does the idea that the Earth is 4.5 billion years old. No opinion deserves respect, or protection from criticism.

But is disrespecting an opinion the same as disrespecting the person who holds it? Sometimes, if it’s not done properly. And here lies the meaning of arrogance.

Respect, as applied to an intellectual, means that, if this person says something that is totally opposed to your own opinions, you still listen to hear what she has to say. It’s tempting to dismiss people who say that trial by jury should be abandoned; but when Richard Dawkins says it, I sit up and pay attention, because I know he’s thought hard about it. I respect the man, and so I listen.

To respect someone means to assume that his opinion is founded on careful thought that is worth taking on board; it also means to assume that he is amenable to rational argument, and is not so inflexible that he cannot be persuaded otherwise, if he is wrong. One should always make this assumption, and frame one’s arguments as though to someone who will listen to them; if nothing else, it is good exercise. To treat one’s opponent as unreachable by logical discourse is arrogant in the extreme.

So next time you see a conversation in which one debater calls the other arrogant, ask yourself this question: who is showing the least respect? The one who hears a deeply-held belief and demands evidence for it? Or the one who’s deploying the A-word as a get-out-of-argument-free card and hoping to stop the debate in its tracks?

REFERENCES

Actually, Dawkins makes good points about trial by jury. Worth reading.


Written by The S I

October 29, 2011 at 11:59 pm

Quieten Down


I dislike noise. When I go to a pub I go there to listen to people, and there is nothing I hate more in a night out than not being able to hear what they are saying.

When you listen to someone speak, you are trying to detect with your ears the audible signals they produce with their mouths; noise is everything you hear that is not a signal. How well you can hear a person depends on how clearly their words stand out against the background: the signal-to-noise ratio.

Signal-to-noise ratios appear everywhere in science where a precise measurement must be taken. In order to understand measurement – in order to comment on what it is we can ever hope to know about the world – we have to have a working knowledge of the properties of noise.

Now, noise from a Shannon-information perspective is anything that disrupts a flow of information, but generally speaking it is useful to separate noise into two different types: intrinsic noise (or thermal noise), and extrinsic noise (or interference).

Let’s say you want to use a microphone to measure some very faint sound – the sound of an ant chewing a leaf, say. Plug in your headphones and the first thing you’ll hear will be your neighbour’s washing machine, or traffic on the street outside. This is extrinsic noise, and it can be reduced by shielding. People build anechoic chambers to reduce extrinsic noise: carefully insulated and coated with foam pads to deaden echoes, they are some of the quietest places on Earth. According to John Cage, his piece 4’33” was inspired by his experience in an anechoic chamber. Expecting to hear nothing but peaceful silence, he was surprised to hear

two sounds, one high and one low. Afterward I asked the engineer in charge why, if the room was so silent, I had heard two sounds. He said, ‘Describe them.’ I did. He said, ‘The high one was your nervous system in operation. The low one was your blood in circulation.’

What Cage experienced was intrinsic noise, noise that no soundproofing can remove because it originates inside the thing doing the measurement. In his case, the noise came from inside his body, but even an electronic microphone will hiss and crackle from the random motion of electrons in its wiring.

No analytical tool is safe from intrinsic noise, not even a simple ruler, whose length fluctuates randomly on a scale too small for us to notice, but is sufficient to preclude its use for measuring things on the molecular scale. This is all because everything, ultimately, is made of particles that are always in motion – tiny, incessant, random movement caused by the ambient temperature.

So that background hiss interfering with your measurement will never go away. It is also totally random – and yet, as a consequence of this randomness, it is in some ways completely predictable. Some maths shows, for example, that you can increase the accuracy of your measurement simply by taking lots of measurements and averaging them out: the more measurements you take, the more your signal stands out against the background noise. More exactly, if you take four times as many measurements, your signal-to-noise ratio doubles.
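That square-root scaling can be checked with a toy simulation (my own sketch, not from the original post): a constant “signal” is buried in Gaussian noise, measured N times, and averaged; the signal-to-noise ratio of the average grows with the square root of N.

```python
import math
import random

def snr_of_average(signal, noise_sd, n, trials=2000):
    """Estimate the signal-to-noise ratio obtained by averaging
    n noisy measurements of a constant signal."""
    means = []
    for _ in range(trials):
        samples = [signal + random.gauss(0, noise_sd) for _ in range(n)]
        means.append(sum(samples) / n)
    mu = sum(means) / trials
    # Residual noise = spread of the averaged estimates
    sd = math.sqrt(sum((m - mu) ** 2 for m in means) / trials)
    return signal / sd

random.seed(1)
ratio = snr_of_average(1.0, 5.0, 4) / snr_of_average(1.0, 5.0, 1)
print(round(ratio, 2))  # close to 2: four times the measurements, double the SNR
```

Four measurements instead of one, and the ratio comes out near 2, just as the maths promises.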

But measurements can be costly; noise can never be removed completely; and there is a law of diminishing returns. If a measurement with precision x costs you £10, twice as much precision would cost £40; twice as much again would cost £160. Sometimes the size of the error bars depends on how much you can afford.
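The diminishing returns follow directly from the square-root law: if precision grows as the square root of the number of measurements, and each measurement costs the same, then every doubling of precision quadruples the bill. A quick illustration, using the post’s own £10 starting figure:

```python
def cost_of_precision(doublings, base_cost=10):
    """Cost after doubling the precision 'doublings' times, given
    precision ~ sqrt(number of samples) and cost ~ number of samples."""
    return base_cost * 4 ** doublings

costs = [cost_of_precision(k) for k in range(3)]
print(costs)  # [10, 40, 160]
```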

REFERENCES

http://cm.bell-labs.com/cm/ms/what/shannonday/shannon1948.pdf

http://www.physics.utoronto.ca/~phy225h/experiments/thermal-noise/Thermal-Noise.pdf

http://www.lichtensteiger.de/anechoic.html

Written by The S I

October 9, 2011 at 11:59 pm

The Set of Everyone


A few years ago, the abdominal ticking timebomb that was my appendix started to rebel against the system, and I was rushed to hospital. The offending organ was removed, and my life was saved. When I left the hospital, I did not receive a bill; the doctors didn’t give me a receipt for my bursting bits. Thanks to my country’s healthcare being publicly funded, I was spared the unpleasant knowledge of the exact monetary value of my life.

But in a very real sense, I did pay for the operation. The tens of thousands of pounds my surgery cost were taken from me, a bit at a time, through taxation. My kamikaze appendix represented a return on my investment, since from birth to the moment I walked into the doctor’s office, this was money lost.

If the illness justifies the money I lost, would I have considered myself cheated if I never once got sick?

The problem is that we never know in advance that we will get ill. The money deducted from my income was spent on a probability. How likely am I to fall ill in the next year? In the next ten years?

A hypothetical situation: you know for certain that there is a fifty percent chance of catching a fatal illness in the next ten years. The illness is treatable, but the treatment is expensive – and you have to pay for it in advance.

You have two choices: you don’t pay, and hope you get lucky; or you do pay, but risk having spent your money on nothing. You decide to pay, just to be on the safe side. But ten years later, you are still in perfect health. You were lucky. Do you feel cheated? If yes, it is only because you are unaware of the other you, the probabilistic ghost of you, that could have been you.

Your decision to pay for treatment could be seen as paying for a probability, but it can also be seen as paying for a certainty – the certainty of health for the set of all possible yous.

In order to act self-interestedly in an uncertain world, you need to consider not just who you are now, but who you might be later – and, by extension, all the people you might have been. It’s impossible to know which of the set of possible yous you will be in ten years’ time; it is in your interest to have a healthcare system that will take you in, whoever you happen to be.

I have no children, but someday I would like to. I have no idea what they will be like: boys or girls, healthy or unhealthy. But I know that I would want them to be born into a world where they are taken care of regardless of what bodies they were born into. I consider it the mark of a functioning society that the set of all possible mes, the set of all possible people everywhere, is looked after. It’s looking after Number One; but it’s doing so while recognising that Number One is bigger than you are.

REFERENCES

This is all very Rawlsian; google the Veil of Ignorance. The ideas of probability and certainty are derived from Taleb’s wonderful The Black Swan.

Written by The S I

August 15, 2011 at 11:59 pm