The Oscar-winning movie The Imitation Game opened with a scene that perfectly captured an age-old conflict between brilliant science and bureaucratic sensibilities.
In the movie, math and science genius Alan Turing professed no particular knowledge of the crisis at hand (cracking the German codes). He asked the skeptical commander in charge, in effect, to trust him to bring the power of ideas to bear. The commander was looking for a specific skill-set and pedigree. To his credit, he trusted Turing. (Had he not, the outcome of World War Two might well have been quite different.)
By and large, history shows this is how science works. A lot of trust is needed to fund open-ended exploration. That kind of curiosity-driven basic research has been an American hallmark throughout modern history. And ironically, even though unpredictable in advance, it has in hindsight proven both revolutionary and productive.
Case in point:
Twenty-six years ago this March, a British scientist had an idea that changed the world.
Scientists at the European Organization for Nuclear Research, or CERN, were struggling to find a way to link global teams to track complex research about the nature of subatomic particles. This is the kind of research that routinely fails the test politicians apply when deciding whether to open coffers for basic science: is it, as one policymaker recently said, “news you can use”?
To solve the information-sharing problem for particle physicists, a British scientist working at CERN wrote a memo in March 1989 detailing a revolutionary communications architecture. Tim Berners-Lee would, one year later, name that design the World Wide Web.
The CERN research, it bears noting the obvious, was not funded to create the World Wide Web, nor for that matter to do anything “useful” at all. As with so much basic research, it was about foundational, open-ended, largely undirected exploration of ideas.
It is in this kind of basic research that America has been, and remains, the world leader — at least for now. Spending and leadership on such basic science in the U.S. have been eroding and are now at risk of being decimated by collateral damage from the ongoing budget wars in Washington, D.C. This is aggravated by the seemingly irresistible pressure to, as Science magazine recently put it, “accentuate the practical.” But what could be more practical than ensuring the very future of the American economy?
Edmund Phelps, winner of the 2006 Nobel in economics, lucidly summarized what many people believe but politicians have trouble accepting: in modern economies, “the visible ‘goods and services’ of the national income statistics are mostly embodiments of past ideas.” History is replete with examples.
The musings of mathematicians in the 17th, 18th, and 19th centuries led to the now ubiquitous tools of probability and statistics, as well as the first computer algorithms. In 1905, with no particular interest in revolutionizing the energy industry, Albert Einstein explained the photoelectric effect (for which he was awarded the 1921 Nobel Prize). Biologist Watson and physicist Crick deduced the structure of DNA in 1953 (receiving a Nobel in 1962); they weren’t seeking a better way to improve either crops or the justice system. Researchers a half-century ago studying the botulinum toxin weren’t pursuing treatments for neuromuscular diseases (or frowns). Chemist Edward Taylor’s curiosity about butterfly wings led directly to the development of a cancer therapeutic (the royalties from which funded a new building at Princeton University). The list goes on.
Of course, it takes investments in applied research, and yet more spending on development and deployment, for basic ideas to become, to use Phelps’s language, “embodied” as innovative products and services. But the foundational and sometimes world-changing ideas themselves typically don’t emerge from research directly aimed at solving practical problems, whether pertaining to solar cells, specific diseases, smartphones, or self-driving cars.
The state and future of R&D in America is an enormous and complicated topic. You can see the recently published results of my foray down that rabbit hole for the Manhattan Institute here: Basic Research and the Innovation Frontier. In the meantime, permit me to summarize.
There’s no shortage of money going into either applied research or the “development” side of R&D. From Google and GM to Intel and Boeing, corporations collectively spend nearly $350 billion annually on such work. Less than 5 percent of that goes to basic research.
By contrast, the federal government supplies 95 percent of all basic-research funding. But that total comprises barely 10 percent of all R&D money spent in the economy. While it may be difficult, even impossible, to calculate the perfect ratio, there’s no doubt that basic-science spending in America is on the decline. And radical decline looms as the pressure mounts for federal sources to “accentuate the practical.”
It is no surprise that government is where we find the most financial support for long-term and open-ended scientific inquiry. The modern roots of government support for basic research took hold around World War One and blossomed after the Second World War. Since then, the United States has been the world’s dominant idea factory, home to 34 of the world’s top 50 universities. American residents or citizens have been awarded more than half of all Nobel prizes in science, medicine, and economics. And America remains the world’s biggest economy, by a huge margin.
To justify precious budgets in times of fiscal crisis and political conflict, politicians naturally retreat to practicalities, near-term needs, and urgent problems. But the private sector already spends 400 percent more on near-term R&D than the federal government does. This is not a competition government can, or should, try to win.
Federal funding of corporate-class R&D not only means that the government is, in effect, competing with markets; it also creates perverse disincentives for private spending. On the one hand, those who receive taxpayer money will spend less of their own. On the other hand, the competition (those who didn’t “win” the federal largesse lottery) will often be reluctant to compete with the federal government, and will thus reduce their own spending in the same area. A lose-lose both for taxpayers and for science.
In fact, during times of fiscal constraint, federal spending priorities should shift away from what private markets are already willing to do (spend on applied R&D) and toward what the private sector does less well: the basic research that is demonstrably foundational to the nation’s long-term future.
Would the basic physics research that led indirectly to Tim Berners-Lee’s 26-year-old idea pass muster as worthy of pursuit in today’s political climate? What could be less “practical” than a handful of scientists connecting an even smaller handful of supercomputers? Private markets did not, and would not, have funded the kind of basic research underway at CERN that hatched, however unexpectedly, the World Wide Web.
In the 19th century, the great physicist Michael Faraday’s curiosity led to the discovery that a changing magnetic field could induce an electrical current in a wire (the principle underlying motors and generators). Popular history records that the British Chancellor of the Exchequer, on seeing the experiment, inquired, “Of what possible use is it?” To which Faraday is apocryphally reported to have quipped, “Why, one day, you shall tax it, sir.” Good thing the British Chancellor wasn’t deciding what research to fund at the Royal Society.
This article was written by Mark P. Mills from Forbes and was legally licensed through the NewsCred publisher network.