Sunday, January 11, 2015

It's the 1920s

And you are deciding what research to fund. Your decision is heavily weighted toward promoting the growth of important technology, on the (possibly erroneous) theory that this will promote human well-being. What do you fund? What will have the largest payoff? In retrospect, the answer is pretty clear: you would fund logic work on the foundations of mathematics. Without Frege, Russell and Whitehead, Hilbert, Gödel, Turing, and von Neumann there would be no computer, no World Wide Web, no computational biology or neuroscience, etc. Here is a history of this period that I just finished reading (what is it with those Dysons?). It is not hard to imagine that the fundamental work that made our modern world possible (and here Gödel and Turing are the linchpins) would have been considered too irrelevant and recondite to be taken seriously by today's science bureaucrats. But without this work on the foundations of mathematics, modern computation (computers, the web, apps, you name it) would not exist today. So the next time the NSF or NIH or whoever asks you about the wider relevance of your work (actually asking for techno payoffs in a roundabout way), remind them how hard it is to predict the relevance of basic research, and remind them that the payoffs can be bigger than anyone can imagine, let alone envisage.


  1. The Golden Goose Awards were established to highlight exactly this point.

    I don't know about the other folks on this list, but wasn't the importance of Turing's work appreciated fairly quickly? That's why he was recruited for cryptography even before WWII started.

    1. Yes, the importance of his work for code-cracking was recognized. But I don't think that anyone anticipated what his paper on computation would lead to. From what I can gather from the Dyson book, the real push toward computers came from building the H-bomb: they needed the computing power to simulate the blasts. What we would consider the great technological contributions (the web, apps, etc.) were beyond conception. Indeed, the lag time between Turing's original paper and our computing period was about 60 years. From little intellectual acorns massive oaks do grow. Thanks for the Golden Goose link.