Access wars

Last year, the Associated Press reported that a test of BitTorrent peer-to-peer traffic on Comcast’s network showed that the cable giant was somehow interfering with P2P flow. This year, there’s a story that AT&T and Verizon have at least looked at restraining P2P. Whatever the truth in these rumors, the industry is surely poised on the edge of the most important debate in the history of the Internet. Sadly, like most political debates, this technical one probably won’t expose too many facts.

P2P is not the problem that Internet access providers face; it's a symptom of a larger problem: how to fund the kind of public data and content services we all expect in the future. Despite the public expectation that the Internet will keep getting faster and more powerful, the future of public networking is murky, and the industry itself is the root cause.

Data traffic of any sort isn't a steady stream; it's a series of bursts, and most consumer traffic flows out of the network to the customer rather than the other way around. Consumer data access has long taken advantage of this by providing more capacity downstream than upstream. DSL and cable services today are asymmetrical in this sense, typically offering five or more times as much capacity downstream as upstream. DSL and cable services also assume that customers won't be using all of the capacity of their access connection all of the time. Cable customers share bandwidth with all the others on the cable span, and DSL customers share capacity with others on their fiber remote. These sharing assumptions combine to create affordable broadband access by spreading the cost of fiber bandwidth across many homes. The result is that a homeowner can get anywhere from 6Mbps to 30Mbps of bandwidth for less than one-tenth the cost per bit of the dedicated, symmetrical, full-time broadband services that corporations buy.
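To see how sharing drives that cost-per-bit gap, here is a minimal back-of-the-envelope sketch. Every figure in it (homes per span, oversubscription ratio, transport cost per Mbps) is an illustrative assumption, not a number from this article; only the rough one-tenth ratio it arrives at reflects the claim above.

```python
# Back-of-the-envelope model of shared vs. dedicated access cost.
# All numbers are illustrative assumptions, not operator figures.

PEAK_RATE_MBPS = 10       # assumed advertised peak rate per home
HOMES_ON_SPAN = 200       # assumed subscribers sharing one cable span / fiber remote
OVERSUBSCRIPTION = 10     # assumed ratio of sold peak capacity to provisioned capacity
COST_PER_MBPS = 50.0      # assumed monthly transport cost per Mbps, in dollars

# The operator provisions only a fraction of the sum of advertised peaks,
# betting that bursty users won't all transmit at once:
provisioned_mbps = PEAK_RATE_MBPS * HOMES_ON_SPAN / OVERSUBSCRIPTION

# Shared cost: the provisioned capacity's cost spread across every home.
shared_cost_per_home = provisioned_mbps * COST_PER_MBPS / HOMES_ON_SPAN

# Dedicated cost: a full-time, symmetrical line at the same peak rate.
dedicated_cost = PEAK_RATE_MBPS * COST_PER_MBPS

print(f"shared access:    ${shared_cost_per_home:.2f}/month per home")
print(f"dedicated access: ${dedicated_cost:.2f}/month per home")
print(f"dedicated is {dedicated_cost / shared_cost_per_home:.0f}x the cost per bit")
```

The punch line of the arithmetic is that the cost advantage equals the oversubscription ratio, which is exactly why a user who runs flat-out (a P2P seeder, for example) breaks the model: the ratio, and the price, were set on the assumption that nobody does.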

The problem with sharing is that not everybody shares nicely. Any user who operates outside the usage patterns operators have assumed in designing their networks puts that design at risk, and with it the quality of service that users overall can expect. Five years ago, I dropped cable modem data service because some users on my span were serving data in high volume, congesting the uplink so badly that service was impacted for everyone, including me. The reason all broadband operators include usage agreements in their terms of service is that some usage patterns cannot be sustained without either reducing service quality for others or invalidating the economic assumptions of the network. Cable networks, which share capacity to a greater degree than most telcos do, are especially vulnerable to P2P, so it's not surprising the problem appeared there first.

But it's not just P2P, as I said before. If everybody starts streaming video, even the faster downlink to the consumer can become congested. Most network users recognize the performance trough when school lets out and kids return home to game or stream video. Unlike P2P, the problem of content streaming or downlink overload is created not by individual user decisions but by corporate marketing. Every company built to deliver personal, ad-supported content over the Internet encourages the consumption of bandwidth in its business model without creating any. So does every company that uses rich content on Web pages, delivers software via download or sells online music.


Is it wrong to want these things? No, but neither is it right to expect they'll happen automatically. There is plenty of public pressure to somehow promise that the Internet will do more without costing more, but what do you think would happen if the government told Ford or GM to sell new cars for twenty bucks? There'd be no cars sold, because none would be produced. The same holds for bits: if access providers can't produce them at a reasonable profit, we won't see many bits sold either.

The Internet is an ecosystem that contains consumers and producers of everything, from content to bandwidth. All players have to be rewarded according to their participation or they won't play their role, and the P2P debate is an indication that the access providers who hold the key to all of our online experiences are becoming uncomfortable with their niche in the food chain.

Who's at fault for the problem? If there's a simple answer, it's Wall Street. An online ad market worth $2 billion a year would represent 1/15th of Comcast's revenue, 1/50th of Verizon's and 1/60th of AT&T's; it would probably not move their stocks at all. On the other hand, that same $2 billion would launch another Google. Investors would rather have the new Google than a marginal bump in carrier revenue, even if it means the dynamic between over-the-top and infrastructure players falls out of balance, and even if it creates the same kind of bubble mindset that devastated the industry seven years ago.

Will we have another bubble, a fallout between the access players and the rest of the market that will strangle broadband for a decade? The way that the P2P and “reasonable usage” debates play out will probably decide.

Learn more about this topic

Comcast's defense of P2P traffic management practices meets skepticism
