Network World - To borrow from John Lennon: Imagine there's no latency, no spam or phishing, a community of trust. Imagine all the people, able to get online.
This is the kind of utopian network architecture that leading Internet engineers are dreaming about today.
As they imagine the Internet of 2020, computer scientists across the country are starting from scratch and rethinking everything, from IP addresses and DNS to routing tables and Internet security in general. They're envisioning how the Internet might work without some of the most fundamental features of today's ISP and enterprise networks.
Their goal is audacious: To create an Internet without so many security breaches, with better trust and built-in identity management. Researchers are trying to build an Internet that's more reliable, higher performing and better able to manage exabytes of content. And they're hoping to build an Internet that extends connectivity to the most remote regions of the world, perhaps to other planets.
This high-risk, long-range Internet research will kick into high gear in 2010, as the U.S. federal government ramps up funding to allow a handful of projects to move out of the lab and into prototype. Indeed, the United States is building the world's largest virtual network lab across 14 college campuses and two nationwide backbone networks so that it can engage thousands – perhaps millions – of end users in its experiments.
"We're constantly trying to push research 20 years out," says Darleen Fisher, program director of the National Science Foundation's Network Technology and Systems (NeTS) program. "My job is to get people to think creatively potentially with high risk but high payoff. They need to think about how their ideas get implemented, and if implemented how it's going to [affect] the marketplace of ideas and economics."
The stakes are high. Some experts fear that unless a new network architecture is developed, the Internet will collapse under the weight of ever-increasing cyberattacks, surging demand for multimedia content and the requirements of new mobile applications.
The research comes at a critical juncture for the Internet, which is now so closely intertwined with the global economy that its failure is inconceivable. As more critical infrastructure — such as the banking system, the electric grid and government-to-citizen communications — migrates to the Internet, there's a consensus that the network needs an overhaul.
At the heart of all of this research is a desire to make the Internet more secure.
"The security is so utterly broken that it's time to wake up now and do it a better way," says Van Jacobson, a Research Fellow at PARC who is pitching a novel approach dubbed content-centric networking. "The model we're using today is just wrong. It can't be made to work. We need a much more information-oriented view of security, where the context of information and the trust of information have to be much more central."
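Jacobson's content-centric model secures the data itself rather than the connection it travels over: content is requested by name, signed by its publisher, and verified by the consumer no matter which cache delivered it. The toy sketch below illustrates that idea; the names and the keyed-digest "signature" are invented for illustration and are not PARC's actual CCNx design, which uses public-key signatures.

```python
import hashlib
import hmac

# Toy illustration of content-centric networking: data is requested
# by NAME and trusted because it is SIGNED, not because of which
# host or connection delivered it. All names here are hypothetical.

PUBLISHER_KEY = b"publisher-secret"  # stand-in for a real signing key


def sign(name: str, data: bytes) -> bytes:
    """Bind the name to the data with a keyed digest (a stand-in for
    the public-key signature a real CCN design would use)."""
    return hmac.new(PUBLISHER_KEY, name.encode() + data, hashlib.sha256).digest()


class Cache:
    """Any node in the network can cache and serve named content."""

    def __init__(self):
        self.store = {}  # name -> (data, signature)

    def publish(self, name, data, signature):
        self.store[name] = (data, signature)

    def fetch(self, name):
        return self.store.get(name)


def consume(cache, name):
    """Retrieve content by name and verify the content itself."""
    found = cache.fetch(name)
    if found is None:
        return None
    data, signature = found
    # Trust the information, not the channel it arrived over.
    if not hmac.compare_digest(signature, sign(name, data)):
        raise ValueError("content failed verification")
    return data


# A publisher signs once; an untrusted intermediate cache serves it.
name = "/parc/papers/ccn-overview"
data = b"Networking named content"
cache = Cache()
cache.publish(name, data, sign(name, data))
print(consume(cache, name))  # b'Networking named content'
```

Because trust travels with the data, any router or neighbor can cache and re-serve popular content without weakening security, which is the "information-oriented view" Jacobson describes.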
Futuristic Internet research will reach a major milestone as it moves from theory to prototype in 2010.
NSF plans to select anywhere from two to four large-scale research projects to receive grants worth as much as $9 million each to prototype future Internet architectures. Bids will be due in the first quarter of 2010, with awards expected in June.
"We would like to see over-arching, full-scale network architectures," Fisher says. "The proposals can be fairly simple with small, but profound changes from the current Internet, or they can be really radical changes."
NSF is challenging researchers to come up with ideas for creating an Internet that's more secure and more available than today's. They've asked researchers to develop more efficient ways to disseminate information and manage users' identities while taking into account emerging wireless and optical technologies. Researchers also must consider the societal impacts of changing the Internet's architecture.
NSF wants bidders to consider "economic viability and demonstrate a deep understanding of the social values that are preserved or enabled by whatever future architecture people propose so they don't just think as technicians," Fisher says. "They need to think about the intended and unintended consequences of their design."
Key to these proposals is how researchers address Internet security problems.
"One of the things we're really concerned about is trustworthiness because all of our critical infrastructure is on the Internet," Fisher says. "The telephone systems are moving from circuits to IP. Our banking system is dependent on IP. And the Internet is vulnerable."
NSF says it won't repeat the mistake made when the Internet was invented, when security was bolted onto the architecture after the fact instead of being designed in from the beginning.
"We are not going to fund any proposals that don't have security expertise on their teams because we think security is so important," Fisher says. "Typically, network architects design and security people say after the fact how to secure the design. We're trying to get both of these communities to stretch the way they do things and to become better team players."
The latest NSF funding is a follow-on to the NSF's Future Internet Design (FIND) efforts, which asked researchers to conduct research as if they were designing the Internet from scratch. Launched in 2006, NSF's FIND program has funded around 50 research projects, with each project receiving $500,000 to $1 million over three to four years. Now, the NSF is narrowing these 50 research projects down to a handful of leading contenders.
The Internet research projects chosen for prototyping will run on a new virtual networking lab being built by BBN Technologies. The lab is dubbed GENI for the Global Environment for Network Innovations.
The GENI program has developed experimental network infrastructure that's being installed in U.S. universities. This infrastructure will allow researchers to run large-scale experiments of new Internet architectures in parallel with -- but separated from -- the day-to-day traffic running on today's Internet.
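The key abstraction behind running experiments "in parallel with, but separated from" production traffic is the slice: each experiment reserves its own virtual share of the shared physical testbed, isolated from every other slice. The sketch below models that resource-reservation idea in miniature; the class and campus names are invented for illustration and do not reflect GENI's actual control framework.

```python
# Toy model of GENI-style slicing: experiments share physical nodes,
# but each runs inside an isolated virtual slice with its own
# reserved resources. All names here are hypothetical.

class Testbed:
    def __init__(self, capacity_per_node):
        self.capacity = dict(capacity_per_node)  # node -> free units
        self.slices = {}                         # slice name -> reservations

    def create_slice(self, name, demands):
        """Reserve resources on every requested node, or fail entirely
        so a partial reservation never blocks other experiments."""
        if any(self.capacity.get(n, 0) < units for n, units in demands.items()):
            raise RuntimeError(f"insufficient capacity for slice {name!r}")
        for node, units in demands.items():
            self.capacity[node] -= units
        self.slices[name] = dict(demands)
        return name

    def delete_slice(self, name):
        """Tear down an experiment and return its resources to the pool."""
        for node, units in self.slices.pop(name).items():
            self.capacity[node] += units


testbed = Testbed({"campus-a": 10, "campus-b": 10})
testbed.create_slice("new-routing-experiment", {"campus-a": 4, "campus-b": 2})
testbed.create_slice("clean-slate-security", {"campus-a": 3})
# Each slice sees only its own reserved share of the infrastructure.
print(testbed.capacity)  # {'campus-a': 3, 'campus-b': 8}
```

Because slices only ever touch their own reservations, a radical clean-slate experiment can crash without disturbing either a neighboring experiment or the day-to-day Internet traffic the article describes.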