This assumption--that understanding a system’s constituent parts means we also understand the causes within the system--is not limited to the pharmaceutical industry or even to biology. It defines modern science.
The problem with this assumption, however, is that causes are a strange kind of knowledge. This was first pointed out by David Hume, the 18th-century Scottish philosopher. Hume realized that, although people talk about causes as if they are real facts--tangible things that can be discovered--they’re actually not at all factual. Instead, Hume said, every cause is just a slippery story, a catchy conjecture, a “lively conception produced by habit.” When an apple falls from a tree, the cause is obvious: gravity. Hume’s skeptical insight was that we don’t see gravity--we see only an object tugged toward the earth. We look at X and then at Y, and invent a story about what happened in between. We can measure facts, but a cause is not a fact--it’s a fiction that helps us make sense of facts.

The truth is, our stories about causation are shadowed by all sorts of mental shortcuts. Most of the time, these shortcuts work well enough. They allow us to hit fastballs, discover the law of gravity, and design wondrous technologies. However, when it comes to reasoning about complex systems--say, the human body--these shortcuts go from being slickly efficient to outright misleading.
While correlations help us track the relationship between independent measurements, such as the link between smoking and cancer, they are much less effective at making sense of systems in which the variables cannot be isolated. Such situations require that we understand every interaction before we can reliably understand any of them.

The trouble with science is that people are the ones doing it. Any time anyone tells you "it's science!" you should mentally replace that with "it's *scientists*"--especially when political policy is involved. Via "Trials and Errors: Why Science Is Failing Us," Wired Magazine.