WASHINGTON, D.C. - People use the telephone network for voice and some IP data; they use the Internet for data and some voice. There will be one network someday. But today, there's considerable debate on what that next-generation network will be - an outgrowth of the public switched telephone network or the Internet.
That debate took center stage in Washington last week at the Next Generation Networks 2005 conference, an annual gathering of industry players and pundits.
"There's considerable uncertainty right now on what the next-generation network will look like," said Dave Passmore, conference chair and research director of Burton Group.
That uncertainty creates standoffs between standards organizations such as the International Telecommunication Union (ITU) and the IETF. It forces the industry to weigh the effect of wireless, video, message-based routing and peer-to-peer applications on foundational IP routing structures that were developed when those technologies could scarcely be imagined. And it prompts questions about whether the Internet should be rebuilt from a "clean slate."
But most critical of all, the uncertainty could reach into the pockets of carriers and their customers. The direction of next-generation networks could determine how tightly customers are tied to carriers, and who will be forced to reinvent their business models to stay alive.
"We need to acknowledge the fact that something is wrong with the business model of the public network," says Tom Nolle, president of consultancy CIMI Corp.
Passmore added, "Providers want to take back control of the Internet. They feel it's out of control."
Some consider the IP Multimedia Subsystem (IMS) architecture, endorsed by the telecom-heavy ITU for next-generation networks, the key for operators to regain that control. Conceived by the Third Generation Partnership Project (3GPP) - a collaboration of telecom standards bodies initially chartered to define specifications for 3G mobile wireless systems - IMS essentially replaces the control infrastructure of the traditional circuit-switched telephone network, separating services from the underlying networks that carry them.
IMS enables services, such as text messaging, voice mail and file sharing, to reside on application servers anywhere and be delivered by multiple wired and wireless service providers.
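That separation - service logic in one place, interchangeable access networks underneath - can be illustrated with a toy sketch. All names here are invented for illustration; this is a conceptual model, not IMS software or any real IMS API.

```python
# Toy model of the separation IMS aims for: services live on
# application servers, independent of the access network that
# delivers them. Names are invented; this is not IMS code.

SERVICES = {
    "voicemail": lambda user: f"voicemail for {user}",
    "text_messaging": lambda user: f"messages for {user}",
}

# The same service can be reached over any of several transports.
ACCESS_NETWORKS = {"wired-dsl", "3g-wireless", "cable"}

def deliver(service: str, user: str, network: str) -> str:
    """Route a service request: service logic is reused unchanged
    no matter which wired or wireless network carries it."""
    if network not in ACCESS_NETWORKS:
        raise ValueError(f"unknown access network: {network}")
    return f"{SERVICES[service](user)} via {network}"

print(deliver("voicemail", "alice", "3g-wireless"))
print(deliver("voicemail", "alice", "wired-dsl"))
```

The point of the sketch is only that swapping the `network` argument never touches the service definition - the decoupling the article describes.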
Yet some are skeptical of IMS. While it enables the migration of the PSTN to IP while maintaining telephony-borne features such as emergency services, wiretaps, call handoff and billing, critics - such as the IETF - say it gives carriers too much control over the customer experience.
"We see an attempt to completely control quality of service by IMS," said Scott Brim, Cisco senior consulting engineer, who spoke at the conference on behalf of the IETF. "The IETF is concerned about this."
One large corporate user is not.
"We're looking specifically at what they mean by control," says the user, a director of network architecture at a $40 billion company, who asked to remain anonymous. "But I do think IMS is a good direction. I want to know what carriers are doing" technologically.
IMS skeptics described the architecture as a "walled garden" around customers. Proponents said it provides a secure, reliable, high-quality service experience for customers.
The Internet, on the other hand, provides only "best effort" QoS, is prone to security breaches, denial-of-service and other attacks, and is generally less reliable than the PSTN. This prompted discussion among academics and researchers at the event as to whether the Internet should be rebuilt from scratch.
The most important reason to rethink the Internet is security, said David Clark, senior research scientist at the Massachusetts Institute of Technology.
"We are suffering a success disaster," Clark said, referring to the popularity and ubiquitous use of the Internet. "The first question is, 'isn't today's network good enough?' Which applications can you not build because of today's Internet? We do not have framework, architecture or a set of rules. We need to focus on resistance to attack and resiliency in the face of attack."
Reliability is right up there with security as a reason to re-architect the Internet, according to Larry Peterson, professor and chair of computer science at Princeton University. The industry norm for reliability is five nines - 99.999% reliable - but the Internet is "a long way from five nines," he said.
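For a sense of scale, a back-of-the-envelope calculation (ours, not Peterson's) shows how little downtime five nines actually permits per year:

```python
# Back-of-the-envelope: downtime per year allowed at a given availability.
MINUTES_PER_YEAR = 365.25 * 24 * 60  # ~525,960 minutes

def downtime_minutes(availability: float) -> float:
    """Minutes of downtime per year permitted at the given availability."""
    return MINUTES_PER_YEAR * (1 - availability)

for nines, availability in [(3, 0.999), (4, 0.9999), (5, 0.99999)]:
    print(f"{nines} nines: {downtime_minutes(availability):.1f} min/year")
# Five nines works out to roughly 5.3 minutes of downtime per year.
```

By that yardstick, even a single multi-hour routing incident blows a year's budget - which is what "a long way from five nines" means in practice.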
Yet incremental change might be as hard as an overhaul because individuals and organizations that rely on the Internet have become too accustomed to the way it currently operates, Peterson suggested. Flexibility might even be too disruptive to these users.
"The Internet has become ossified," he said.
Peterson recommended the industry consider the Global Environment for Network Investigations (GENI) as a model for facilitating change in the 'Net. GENI, a component of the National Science Foundation's Future Internet Design Initiative (FIND), is a prototype network for deploying research innovations conceived by FIND participants.
GENI is designed to support both research and deployment, helping to bridge the gap between small-scale lab experiments and commercial rollout. That, in turn, could make the Internet and its users more receptive to change, Peterson said.
"There's no competitive advantage to deploying a new architecture," Peterson said. "GENI supports experimental validation of new architectures."
To simulate real-world deployment, GENI supports virtualization and user opt-in. Virtualization allows the GENI testbed to be partitioned into slices, each running a given service or architecture. Users can opt in to services and applications on a per-user, per-application basis, which makes it possible to attract the users necessary to validate new designs.
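The slice-plus-opt-in idea can be sketched as a toy data model. Class and method names below are invented for illustration; this is not GENI software.

```python
# Toy model of GENI-style virtualization: one shared testbed carved
# into isolated slices, each running its own architecture, with
# per-user opt-in. All names are invented; this is not GENI code.

class Slice:
    def __init__(self, name: str, architecture: str):
        self.name = name
        self.architecture = architecture
        self.users: set[str] = set()

    def opt_in(self, user: str) -> None:
        # Users join individual slices, per-user and per-application,
        # rather than committing to the whole testbed.
        self.users.add(user)

class Testbed:
    def __init__(self):
        self.slices: dict[str, Slice] = {}

    def partition(self, name: str, architecture: str) -> Slice:
        # Virtualization: each slice behaves as an isolated network
        # on the shared substrate.
        s = Slice(name, architecture)
        self.slices[name] = s
        return s

geni = Testbed()
experiment = geni.partition("clean-slate-routing", "content-addressed")
experiment.opt_in("alice")
print(sorted(experiment.users))
```

The design choice the model captures is that validation traffic comes from real users who chose a slice, rather than from synthetic load on a lab network.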
Attracting participants to GENI is the challenge, others said.
"How is the best way to get industry involved with GENI?" asks Paul Mockapetris, chairman and chief scientist of IP address management firm Nominum and creator of the Internet's DNS. "Can we figure out what might be the raw material of the future Internet?"
The last point is crucial because, right now, the Internet's most aggressive innovators are its attackers.
"The bad guys are being hugely innovative, and if you don't believe that I encourage you to attend a DEFCON conference," Mockapetris says.
As the industry mulls its options for next-generation networks, the business models of network operators - those from the PSTN and the Internet - hang in the balance. Business customers are beginning to substitute public Internet connectivity for frame relay, ATM and private lines as the performance of the Internet improves; wireless threatens to make wireline services and service providers obsolete; bandwidth-intensive video puts new strains on the public network; and peer-to-peer applications put more power and control into the hands of users.
"The disruptive impact of IP and digital multimedia is finally here," Passmore said. "Who owns the customer? How relevant will wireline operators be in the future? We're seeing the consolidation of previously separate markets."