The innovation gap is real, all right

Recently Judy Estrin, former Cisco CTO and current Silicon Valley luminary, published a book called Closing the Innovation Gap. I haven't read it (yet), but she reportedly argues that the United States has what she calls a "national innovation deficit" — specifically, a shortage of overall investment in science and engineering. According to a recent article, Google's Vint Cerf, one of the primary inventors of the Internet, agrees.

They're right. In 2005, the National Academies noted that federal financing of research in the physical sciences was 45% lower in 2004 than in 1976. More recently, a report released in June by the American Society for Engineering Education (ASEE) found that engineering bachelor's and master's degrees are on the decline, particularly in electrical engineering and computer science.

Some argue that this isn't a problem, given the robust structure of venture capitalism in Silicon Valley and elsewhere.

They're wrong. It's not commonly acknowledged, but federal investment was key to the dramatic growth in technology innovation of the '70s, '80s and '90s. The Internet itself grew out of federally financed projects: Both the Defense Advanced Research Projects Agency, or DARPA, and the National Science Foundation funded the research and engineering that went into its design (as late as the early 1990s, the NSF was funding the Internet backbone to the tune of $10 million per year).

Moreover, the much-vaunted Silicon Valley machine is actually, in the simplest terms, a mechanism for transforming public investment dollars into personal profits. Here's how: In the '60s, '70s and even into the '80s, the feds funded universities and other not-for-profit groups to do long-term "pure" research. If and when researchers uncovered potentially profitable ideas, they were wooed into start-ups or established businesses, where they converted those ideas into products or companies.

This model worked well, but declining funding in academia, coupled with increased opportunities for fame and fortune in industry, led to a mass exodus of scientists and engineers from academia in the 1980s and 1990s (including yours truly). Now, not only is funding limited; there are also fewer and fewer researchers left to take advantage of what does exist.

At the same time, Wall Street began to punish public companies (like Microsoft, IBM and AT&T) for continuing to invest in primary research, on the theory that such investments weren't in the best interests of shareholders. (Fair enough. Companies exist to generate profits, not pure research.)

The upshot? The well is beginning to run dry. We're lacking both talent and investment dollars in the early stages of innovation, the place where ideas are generated before they become investment-worthy.

The good news: We know how to fix this. Federal investment in primary scientific and engineering research works. And encouraging youngsters to get engineering degrees is a win-win: not only does it improve their chances of finding interesting and lucrative work, it also helps the United States maintain its competitive edge.

If you agree, let your representatives in Washington know how you feel. And tell your kids to study engineering.
