In the last 80 years, the U.S. has become a powerhouse of science and technology, largely through concerted investment on the part of the government. In fact, that scientific investment drove the economy. But in recent years that investment has slipped, and many believe the U.S. is at risk of falling behind the rest of the world in scientific innovation, with potential repercussions for the economy.
Simon Johnson, the Kurtz Professor of Entrepreneurship at the Massachusetts Institute of Technology (MIT) and former chief economist of the IMF, and Jonathan Gruber, the Ford Professor of Economics at MIT, recently wrote a book, “Jump-Starting America: How Breakthrough Science Can Revive Economic Growth and the American Dream.” One premise of the book is that the U.S. made a concerted effort, particularly at the beginning of World War II, to become a technological powerhouse, and that for years this drove the economy. The authors argue that although the U.S. is still the most scientifically advanced country in the world, in some areas it is falling behind, and in others it is already behind.
In June 1940, shortly after Germany attacked the Netherlands, Belgium and France, it was obvious that Germany had done something different with its technology, allowing it to dominate those countries in a very short period of time.
At the time, the U.S. was not involved in the conflict, and had “a competent but small navy, an air force that had fallen behind its potential adversaries, and an army that was so short of rifles that soldiers had to practice with brooms instead. In all of 1939, the United States built only six medium tanks.”
And much of the military technology the U.S. did have was ineffective. But four years later, the U.S. “transformed warfare” through its technology—and through its approach to bacterial infections and malaria control.
In a recent interview with James Pethokoukis on AEIdeas, Gruber said, “What’s quite striking, and we have some examples in the book, are areas which are American born, American perfected, but that America has backed off of and other countries are taking the lead — most strikingly in medical research and biotechnology research. Leading experts predict that within 10 years, we will fall behind China in what has been a U.S.-dominated area.”
Gruber says his intent is not simply to bash China; rather, he notes that China learned a major lesson from U.S. history: “that in area after area when it comes to fundamental scientific advancement, the private sector will not do the research involved, or the basic science involved. The public sector has to lead.”
For example, in the mid-1960s, Gruber says, the U.S. spent 2% of GDP financing research. Now, that’s fallen to 0.7% of GDP. Meanwhile, China, Gruber says, “is ramping up toward 2.5% of GDP.”
But Gruber also points out that it’s not just China. “I’d be somewhat worried if it was just us versus China, but it’s not just China. The U.S. used to be far and away the world leader in terms of government investment in basic science and R&D. We’re now barely in the top 10 in terms of share of GDP.”
Pethokoukis asks Gruber a question that many might wonder about: the government seems very good at funding goal-driven efforts like getting to the moon, the atomic bomb and military technology, but is it really as effective at funding science that isn’t so clear-cut?
Gruber breaks his answer down into two parts. One is politics, where goals really matter. “I think goals do matter, and that’s where understanding what China is up to matters.”
In terms of actual investment, he doesn’t think goals matter as much. Obviously, when you look at World War II, there was a major national priority, but that’s not always the case. “If you look at some of the most important investments we’ve made in technology,” Gruber said, “in pharmaceuticals and health, they weren’t about some big national priority. At the right time, people just made smart decisions.”
For example, he brings up the Human Genome Project, in which the federal government invested $3 billion over 13 years. It had a goal, but it was not a national priority. Yet as an economic investment the strategy paid off: the U.S. dominates the genomics industry, which is responsible for 280,000 U.S. jobs with an average annual pay of $70,000. “And in one year alone,” Gruber said, “genomics-based companies pay $6 billion in taxes to the federal and state governments. That is twice our 13-year investment.”
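Gruber’s return-on-investment claim can be checked with simple arithmetic. The figures below are the ones cited in the article; the calculation itself is only an illustrative back-of-the-envelope sketch.

```python
# Back-of-the-envelope check of the genomics figures cited above.
# All dollar amounts come from the article; this is illustrative arithmetic only.
investment_total = 3e9   # Human Genome Project: $3 billion over 13 years
annual_taxes = 6e9       # taxes paid by genomics-based companies in one year
jobs = 280_000           # U.S. genomics jobs
avg_pay = 70_000         # average annual pay per job

# One year of tax revenue relative to the entire 13-year investment
print(annual_taxes / investment_total)  # 2.0 — "twice our 13-year investment"

# Total annual wages supported by the industry
print(jobs * avg_pay)  # 19600000000 — about $19.6 billion per year
```

Note that the wage figure alone, roughly $19.6 billion per year, already dwarfs the original $3 billion outlay, which is the substance of Gruber’s point.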
But one thing Gruber notes is that after World War II was over, Vannevar Bush, a former Dean at MIT who convinced President Roosevelt to push science for the war effort, thought the U.S. should try to win the peace with science, and proposed the federal government continue science funding. It didn’t quite work out—politicians and scientists didn’t tend to see eye-to-eye when there wasn’t a clear-cut goal, like a war.
Or, as they write in their book, “If you speak truth to power, power will cut your funding.”
Priorities changed as well: first the Vietnam War, then the growth of Great Society programs, followed by Reagan’s anti-tax agenda, and most recently debt-ceiling battles. Gruber said, “The ramp-up of science funding is bipartisan, and the ramp down of science funding is bipartisan, and it continues to fall to this day.”
But rather than waiting for some catalyst—like fear of China becoming the dominant science player in the world—Gruber and Johnson believe one big point was never explained to the American public: science policy is economic policy. Investments in science create jobs and growth. “I think,” Gruber said, “we need to recognize that science investments are not just about competing with China and inventing cool stuff, but actually about renewing American growth back toward the levels we saw when we were pushing the technology frontier in a way we’re not doing today.”
So their proposal involves three steps. One is for the government to spend more money on science and technology research. The second is to spread it around the country, “recognizing the opportunities elsewhere in America besides the six coastal cities that have dominated economic growth in the last 30 years.”
And, in fact, they analyzed the country using several criteria: at least 100,000 people of working age, at least 25% of them with a college education, and affordable housing, with an average price below $265,000 (which excludes the two biggest biotech centers in the U.S., Boston and San Francisco). They identified 102 places in 36 states, home to 80 million people, that fit the criteria.
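The screening the authors describe can be sketched as a simple filter. The field names, threshold encoding, and sample cities below are illustrative assumptions, not the authors’ actual dataset or methodology; only the three thresholds come from the article.

```python
# A minimal sketch of the screening criteria described above, applied to
# hypothetical city data. Sample cities are invented for illustration.
from dataclasses import dataclass

@dataclass
class City:
    name: str
    working_age_pop: int      # people of working age
    college_share: float      # fraction with a college education
    avg_home_price: float     # average housing price, in dollars

def qualifies(c: City) -> bool:
    """Apply the three thresholds cited in the article."""
    return (c.working_age_pop >= 100_000
            and c.college_share >= 0.25
            and c.avg_home_price < 265_000)

candidates = [
    City("Hypothetical A", 250_000, 0.31, 180_000),
    City("Hypothetical B", 80_000, 0.40, 150_000),   # too few working-age people
    City("Hypothetical C", 300_000, 0.35, 600_000),  # housing too expensive
]
print([c.name for c in candidates if qualifies(c)])  # ['Hypothetical A']
```

The affordability cutoff is what knocks out Boston and San Francisco in the authors’ analysis, as the third sample city illustrates.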
The third part of the policy is what they call an “innovation dividend.” This is because, for example, every single new pharmaceutical entrant from 2010 to 2016 was based on NIH-funded research, yet most of the returns don’t go to the U.S. taxpayer; they go to corporations and to highly compensated executives.
Gruber says, “Here’s how that would work: the government would capture some of the returns to this new technology, invest it in a fund, and then redistribute that to every American as a flat dividend, just like Alaska does with their oil revenues.”
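The mechanics Gruber describes reduce to a per-capita division. The fund size and population below are hypothetical assumptions chosen only to make the arithmetic concrete; the book does not specify these numbers.

```python
# A minimal sketch of the "innovation dividend" mechanism: captured returns
# pooled in a fund and paid out flat per person, as with Alaska's oil fund.
# The fund return and population figures are hypothetical assumptions.
fund_returns = 50e9        # hypothetical annual returns captured by the fund
population = 330_000_000   # approximate U.S. population

dividend_per_person = fund_returns / population
print(round(dividend_per_person, 2))  # 151.52 — dollars per person per year
```

The key design choice is that the payout is flat rather than means-tested, mirroring the Alaska Permanent Fund model the authors cite.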
Would it happen? Would it work?
History suggests it would work—at the least, the investment would pay economic dividends that would benefit the economy. Will it happen?
Given modern politics, that seems like much longer odds.