The Speculist: The End of Theory?



The End of Theory?

Chris Anderson suggests that it's time to chuck the scientific method in favor of a new methodology that serves up facts the way Google serves up ads -- through calculations on massive sets of data:

But faced with massive data, this approach to science — hypothesize, model, test — is becoming obsolete. Consider physics: Newtonian models were crude approximations of the truth (wrong at the atomic level, but still useful). A hundred years ago, statistically based quantum mechanics offered a better picture — but quantum mechanics is yet another model, and as such it, too, is flawed, no doubt a caricature of a more complex underlying reality. The reason physics has drifted into theoretical speculation about n-dimensional grand unified models over the past few decades (the "beautiful story" phase of a discipline starved of data) is that we don't know how to run the experiments that would falsify the hypotheses — the energies are too high, the accelerators too expensive, and so on.

There is now a better way. Petabytes allow us to say: "Correlation is enough." We can stop looking for models. We can analyze the data without hypotheses about what it might show. We can throw the numbers into the biggest computing clusters the world has ever seen and let statistical algorithms find patterns where science cannot.

I think there's a lot to be learned from statistical analysis of data in the cloud, but I'm not sure that theory and models can be put away so quickly. There has to be a framework of questions we are asking, and we need to interpret the data once we have it. The theory may be moved to the algorithms or the interpretive methodology, but it still has to be in there somewhere.
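To make that concrete, here is a minimal sketch (in Python, with made-up data and variable names) of the kind of blind correlation mining Anderson describes. Notice how much theory sneaks back in through the methodology: the choice of correlation measure, the significance cutoff, and the correction for running thousands of tests at once are all modeling decisions, just relocated.

```python
# A minimal sketch of "correlation is enough": scan every pair of variables
# in a table for association, with no prior hypothesis about which pairs matter.
# The data and column names here are hypothetical stand-ins.
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(0)
data = pd.DataFrame(rng.normal(size=(10_000, 50)),
                    columns=[f"var_{i}" for i in range(50)])

results = []
cols = list(data.columns)
for i in range(len(cols)):
    for j in range(i + 1, len(cols)):
        r, p = stats.pearsonr(data[cols[i]], data[cols[j]])
        results.append((cols[i], cols[j], r, p))

# The theory hasn't disappeared -- it has moved into choices like these:
# Pearson vs. rank correlation, the significance threshold, and the
# Bonferroni correction for running over a thousand tests at once.
alpha = 0.05 / len(results)
hits = [(a, b, r) for a, b, r, p in results if p < alpha]
print(f"{len(hits)} of {len(results)} variable pairs survive correction")
```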

Comments

Agreed. At the very least theory will inform scientists as to what questions to ask of the data.

But I think Anderson is right that this represents a fundamental shift in how science is done.

Arthur C. Clarke's distinguished but elderly scientists have inadvertently slowed progress in the past. You can't blame them, really. Like everything else, science has limited resources. Those distinguished old guys want to make sure to spend those resources wisely. They don't want to waste effort on fool's errands.

But Petabyte power reduces the cost of inquiry to the point that it won't hurt to ask "foolish" questions. The distinguished old guys won't be able to slow progress, but their power to contribute (through good theory) will be amplified.

If, for example, you think that treating mitochondrial damage could extend life, maybe a huge double-blind study isn't necessary to get a rough answer. Just consult the data. If the answer is positive, then you set up the huge double-blind study.
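Something like the following rough sketch (with a hypothetical records file and made-up column names) is all that first pass would take. It's a cheap observational check, not a replacement for the controlled study, since observational data can be confounded.

```python
# A rough sketch of "just consult the data": compare outcomes for records
# with and without the treatment of interest before committing to a trial.
# The file and column names are hypothetical.
import pandas as pd
from scipy import stats

records = pd.read_csv("health_records.csv")  # hypothetical dataset

treated = records.loc[records["mito_treatment"] == 1, "lifespan_years"]
untreated = records.loc[records["mito_treatment"] == 0, "lifespan_years"]

# Welch's t-test: a cheap first look, not a substitute for a double-blind
# trial, since (for example) healthier people may be more likely to seek
# out the treatment in the first place.
t_stat, p_value = stats.ttest_ind(treated, untreated, equal_var=False)
print(f"mean difference: {treated.mean() - untreated.mean():.2f} years, "
      f"p = {p_value:.3g}")
```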

This reminded me of the Asgard from Stargate SG-1, whose long history of powerful computing left them unable to think of lowly (human-like) solutions. In that case, it took the relatively dumb idea of chemical projectile weapons to prevail where high-energy beam weapons failed.

If we've always managed to leverage reasoning where brute-force power was unavailable, then increased power should give elegant reasoning even greater utility...
