While television and the Internet have both reshaped the way we consume information, there has been very little interaction between the two.
Like seventh-grade boys and girls attending their first dance, these two dominant forms of mass communication have stood awkwardly apart at opposite sides of the dance floor. Knowing they are supposed to mix, knowing they eventually will mix, they nonetheless aren't quite sure how to mix.
We will examine the technical factors that have caused this segregation and the new and evolved tools that will make true integration of television and the Internet possible. We will explore what that new world of integration will look like, point out the winners and losers, and provide a road map for success. We will also ponder the revolutionary impact this integration will have on society.
Take a look at the cable box sitting beneath your TV. What do you see? Bulky, heavy, dated, and perhaps emanating an annoying buzz from the spinning hard drive of a DVR. Its appearance and operation, with the exception of the DVR functionality, seem largely unchanged from its predecessors of the '80s (watch a 1980s episode of Solid Gold). Now take a look at your mobile phone. Light, sleek, futuristic, like something straight out of Star Trek.
Now imagine your mobile phone as your cable box. I'm not talking about grainy YouTube videos of someone's cat playing the piano; I mean really watching TV - NBC, ESPN, HBO. But instead of being limited to the 200 or so channels offered by your cable TV provider, you can connect your smartphone to your TV and watch your choice of hundreds of thousands, if not millions, of channels, all delivered over the Internet.
Over the last decade and a half, we've seen all forms of media, information, entertainment and commerce migrate to the Internet. Yet television has been the lone holdout, principally because the technology to deliver television in a scalable and reliable way just wasn't mature enough ... until now. We now stand at a point in time where a nexus of enabling technologies is available to deliver this vision. Will it happen? Recent history has shown that once the Internet becomes capable of carrying a particular medium, it inevitably becomes a dominant platform for that medium. It's not a question of if, but rather when.
The implications of this next generation of television - NextGenTV - go well beyond the absurdity of complaints that "there's nothing on TV." The impact of NextGenTV on societies across the globe will be seismic, as this revolution will be televised.
A nexus of enabling technologies
The recent availability of new and evolved technologies in consumer devices and networking capability makes the long-theorized NextGenTV a reality today. Let's examine each of the components that comprise this vision.
First, the availability of smartphones and other inexpensive Internet-connected consumer devices like Slingboxes has made it possible to deliver technical functionality that most had never dreamed of. The combination of low-cost hardware components, the ubiquity of Ethernet and Wi-Fi within most homes, and common, open software platforms like the Android operating system is accelerating the pace of innovation. The functionality found in a cable box is actually quite trivial compared to the computational flexibility that smartphones and other inexpensive consumer devices already deliver. Outputting an Internet-based video stream can be done today without these devices breaking a sweat.
The next barrier has always been the greatest - the last mile of Internet access. In the days of dialup and low-speed DSL access, video of any decent quality was a pipe dream. However, current DSL and cable modem speeds routinely deliver 10Mbps to 20Mbps for roughly $50 per month. Add to that the recent introduction of 4G wireless availability, and delivering one or two high-definition TV streams of reasonable quality is well within the realm of possibility for the typical consumer.
The lack of low-cost consumer video output devices and insufficient last-mile bandwidth masked another major deficiency that prevented the possibility of TV over the Internet: the Internet was not up to the task. The Internet, based on a delivery mechanism known as unicast, simply could not deliver a high-bandwidth video stream to millions of simultaneous viewers in an economical manner. However, recent protocol advancements, which simplify multicasting, have enabled the Internet infrastructure to support high-bandwidth content to arbitrarily large audiences at minimal cost to the content provider. To understand this critical component of NextGenTV, let's take a closer look at unicast, broadcast and multicast data delivery.
Unicast, broadcast, multicast
The majority of Internet traffic uses unicast data delivery. In unicast, a server transmits data directly to the client requesting it. Each client requesting data gets its own stream from the server. The cost of unicast delivery increases linearly with the audience size, as the source must be powerful enough to transmit a duplicate stream to every interested end device, and the links on the network must have enough bandwidth to handle all the duplicate streams.
By contrast, broadcast data delivery allows a server to send a single stream to the network, which will be received by all end users (whether they are interested in the data or not). For example, an old-fashioned over-the-air radio station broadcasts its signal to all radios within a given area. If a radio station were to use unicast delivery, it would transmit a separate signal to each interested listener. If there were 100 interested listeners, a "unicast" radio station would have to transmit 100 different signals. Instead, a radio station broadcasts a single signal that all radios receive.
The benefit of broadcasting traffic is most obvious for the owner of content - whether there is one listener or 1 million listeners, the cost to transmit remains the same. The disadvantage is that the traffic is sent to all users, whether they are interested or not. By comparison, unicast traffic is delivered only to users that explicitly request it. Unicast traffic will not be flooded to uninterested users.
Broadcast works well in limited geographic areas or on small networks. However, on the Internet, which connects millions of networks and billions of end devices, broadcasting traffic to all those end devices is simply not feasible. Hence, unicast is much better suited to the Internet, even if it is inefficient in delivering multi-destination traffic.
Between the two extremes of unicast and broadcast stands a third option: multicast. In multicast, the source transmits a single stream. However, the network intelligently determines where that content is desired and delivers the stream only to interested receivers. By delivering a single stream of data, a multicast source enjoys the same efficiency of broadcast in that the cost of transmission remains constant whether the audience consists of one person or a million people. And by delivering that content only to end users who are actually interested in the content, multicast enjoys a similar efficiency of unicast in that traffic is not flooded to end users who have no interest in the content. Multicast enjoys a "best of both worlds" by realizing the benefits of unicast and broadcast without suffering their deficiencies.
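In practice, the receiver side of multicast is remarkably simple: an end host joins a group address, the operating system emits an IGMP membership report, and the network takes care of delivering any traffic sent to that group. A minimal sketch in Python (the group address, port and interface here are arbitrary examples, not part of any real service):

```python
import socket
import struct

def join_group(group="239.1.2.3", port=5004, iface="127.0.0.1"):
    """Open a UDP socket and join `group`. The setsockopt call below
    causes the kernel to send an IGMP membership report - this is how
    the network learns that we are an interested receiver."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", port))
    # imr_multiaddr + imr_interface, packed for IP_ADD_MEMBERSHIP
    mreq = struct.pack("4s4s", socket.inet_aton(group), socket.inet_aton(iface))
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    return sock

receiver = join_group()
# data, sender = receiver.recvfrom(2048)  # would block until a stream arrives
receiver.close()
```

A source, by contrast, simply sends UDP datagrams to the group address; it never needs to know who, or how many, have joined.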
So what's the catch? Why isn't multicast deployed ubiquitously across the Internet? Unfortunately, multicast requires a rather complex set of protocols to determine where the interested receivers are and to replicate traffic only to those end users.
In the late 1990s, there was much hope and hype surrounding multicast over the Internet. It was predicted that every dorm room could become a TV or radio station and audio/video streams would be as numerous as Web sites. Multicast advocates focused on "encouraging" service providers to deploy the protocols necessary to support multicast on their networks. Several very large service providers, such as Sprint and Level 3, deployed multicast on their Internet backbones. And on research and education networks, like Internet2, multicast became a crucial service offering. However, for various reasons, multicast deployment on downstream networks was sporadic and in most cases non-existent.
The biggest problem was that multicast presented an "all or nothing" solution. Every link on the network, every router and firewall between source and receiver, required multicast protocols to be enabled. Additionally, the business model for multicast is abstract and is not an easy case to make. Multicast is an infrastructure capability that enables other services. From a business perspective, multicast resembles DNS and BGP, vital infrastructure protocols that are generally not billed directly. Consequently, those service providers who tried to bill for Internet multicast saw disappointing results. Content providers were not interested in paying extra to transmit multicast streams that couldn't be received by many end users, and networks with many end users were unwilling to pay extra to receive multicast content that didn't exist. The result was a chicken-and-egg problem between content and audience.
It should be noted that multicast has enjoyed success in certain places. On financial networks, multicast is a vital service, as applications such as stock quote feeds deliver data from one central market location to thousands of traders simultaneously. On some enterprise networks, multicast is equally indispensable, used for such purposes as transferring price lists from a central headquarters location to thousands of local retail stores.
Additionally, multicast is often used on corporate networks to deliver live video of important corporate events, such as when the CEO speaks to thousands of remote employees. For these and many other applications, multicast has enjoyed tremendous growth in recent years on IP VPN networks. However, on the Internet, multicast deployment has been an undeniable disappointment. While roughly 10% of the Internet is enabled for multicast, given the "all or nothing" nature of multicast, in most cases that 10% might as well be 0%.
A new hope: AMT
As the efforts toward deploying multicast on Internet networks stalled, a new solution emerged. Multicast advocates began to recognize and accept the reality that "encouraging" all networks to deploy the protocols necessary to support multicast was simply not practical. These advocates then noticed that IPv6 shared the same "all or nothing" properties of multicast.
Like multicast, isolated pockets of IPv6-enabled networks existed like islands within the ocean of the (IPv4-only) Internet. IPv6 architects had spent considerable effort developing transition mechanisms that would allow these IPv6 "islands" to connect to one another across the abyss of v6-disconnectedness. Multicast architects decided to leverage/steal one such idea and apply it to the multicast problem. This solution became known as Automatic IP Multicast Without Explicit Tunnels, or AMT.
AMT has enjoyed wide popularity since its inception and is seen by multicast advocates as the last, best hope for Internet multicast. AMT accepts the reality that unicast-only networks do exist, and simply allows end users to "hop" over those networks. To accomplish this, AMT uses tunnels to connect users on unicast-only networks to content on multicast-enabled networks.
To support this model, multicast-enabled providers deploy AMT Relays at the edge of their networks. These relays act as the tunnel endpoints that "translate" native (untunneled) multicast content to users on unicast-only networks. An AMT Relay can be a standalone server, or more commonly, can be functionality added to existing edge routers on a multicast network.
Users on the unicast-only network sit behind AMT Gateways, which use an anycast-based autodiscovery mechanism to locate the nearest relay and then initiate a multicast tunnel to that relay. The gateway then requests a multicast stream of interest through the tunnel.
The relay receives the request and joins the multicast content using standard multicast routing protocols. The multicast stream is forwarded through the multicast-enabled network to the relay, which forwards the stream over the tunnel to the gateway. The AMT gateway can be software running on the actual host PC, or it can run on a home router that acts as the gateway for all hosts at the home site (for example, if there are multiple computers in the home).
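The discovery-and-join sequence just described can be sketched as a simple ordered exchange. This is only an illustration of the control flow - the message names come from the AMT specification, the anycast address is the well-known relay discovery address assumed from that spec, and the code models the handshake rather than implementing the wire protocol:

```python
AMT_ANYCAST = "192.52.193.1"  # well-known AMT relay anycast address (assumed from the spec)

def amt_join_sequence(group):
    """Model the ordered control exchange between an AMT gateway (on a
    unicast-only network) and an AMT relay (on the multicast network)
    that precedes delivery of `group` over the tunnel."""
    return [
        ("gateway", AMT_ANYCAST, "Relay Discovery"),  # routed to the nearest relay
        ("relay", "gateway", "Relay Advertisement"),  # reveals the relay's unicast address
        ("gateway", "relay", "Request"),              # ask to receive a membership query
        ("relay", "gateway", "Membership Query"),     # carries a nonce/MAC for the gateway
        ("gateway", "relay", "Membership Update"),    # encapsulated IGMP/MLD join for `group`
        ("relay", "gateway", "Multicast Data"),       # the stream, tunneled over unicast UDP
    ]

steps = amt_join_sequence("232.0.1.1")  # hypothetical source-specific group
```

Once the final step is reached, the relay has joined the group natively on the multicast side and simply replicates each packet into the tunnel.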
The result is that users on unicast-only networks receive multicast content. No action is needed by the end user's provider - the end user simply hops right over that uncooperative network to join the party on the multicast island. As a side benefit, the unicast-only network provider will begin to notice more of these AMT streams being tunneled over their network. This will be motivation to add multicast support to the network, as it will eliminate duplicate streams of tunneled (unicast) data and utilize the network more efficiently.
In this way, AMT can be viewed as a vital interim solution in the transition from unicast-only to multicast-enabled Internet networks. Of course, those providers are free to remain obstinately unicast-only and carry duplicate traffic across their networks. Most importantly, the end users are able to receive content from the sources.
AMT is not a new solution. In fact, the AMT specification was originally drafted in 2001. What is new is that router vendors have recently added support for AMT in their large carrier routers, allowing service providers to offer AMT service in a scalable, manageable and profitable way.
The promise of Internet television was once a hallmark of the early Internet boom. But the multicast delivery mechanism required to make this vision possible ran into technical and economic real world obstacles. The recent availability of AMT helps hurdle those obstacles, restoring the possibility and promise of multicast. With multicast in place, the network is now ready to handle NextGenTV.
What NextGenTV would look like
If you wanted to start up a new television channel that was available on all the major cable and satellite systems and viewable by most Americans, a ballpark estimate of the cost would be in the hundreds of millions of dollars. Such high costs tend to keep out the riff raff. The number of television channels in existence is in the thousands, while the number available on most cable or satellite systems is typically in the hundreds. The number of American television viewers is roughly 300 million.
By contrast, there are more than 200 million Web sites on the Internet, which serves almost 2 billion users. Cost is the principal reason for this disparity. Web content that could potentially reach a third of the planet's population can be published for tens of dollars per month. But the same cost model doesn't currently apply to video.
A highly-rated television show may be seen by 10 million viewers. Imagine transmitting that content on the Internet using unicast. A high definition stream of reasonable quality can be transmitted at 10Mbps. Transmitting 10 million of those 10Mbps streams on the Internet would require 100Tbps of bandwidth and a video server capable of originating that many streams.
Transmitting this much content simply cannot be done economically on the Internet using unicast. Content delivery networks (CDN), which are used to distribute the content load across the network geographically, could not appreciably change this equation. CDNs would merely distribute the problem. On the unicast-only Internet, the cost of transmitting simultaneous streams increases linearly with the size of the audience.
However with multicast, a video server with only the capacity and bandwidth of a single 10Mbps stream could deliver that same show to all 10 million viewers. Armed only with the Internet connectivity commonly available to broadband subscribers, one could easily transmit a single high-definition video stream that could be simultaneously viewed by much of humanity. This is because the cost of transmitting remains constant regardless of the size of the audience. Whether there is one viewer, or 1 billion, the cost of transmission is the same for the source.
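The arithmetic above is easy to verify. A small sketch of the source-side cost model, using the same 10Mbps HD figure (the function name is ours, purely for illustration):

```python
def source_bandwidth_bps(stream_bps, viewers, delivery):
    """Bandwidth the source must transmit to reach `viewers`
    simultaneous receivers: linear in audience size for unicast,
    constant for multicast."""
    if delivery == "unicast":
        return stream_bps * viewers
    if delivery == "multicast":
        return stream_bps  # one stream, replicated by the network
    raise ValueError("unknown delivery mode: " + delivery)

HD_STREAM = 10_000_000  # 10Mbps, the figure used above
AUDIENCE = 10_000_000   # 10 million simultaneous viewers

unicast_cost = source_bandwidth_bps(HD_STREAM, AUDIENCE, "unicast")      # 100Tbps
multicast_cost = source_bandwidth_bps(HD_STREAM, AUDIENCE, "multicast")  # still 10Mbps
```

Ten million copies of a 10Mbps stream is 100Tbps at the source; the multicast source transmits 10Mbps no matter how large the audience grows.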
AMT does introduce some per-viewer costs to the network provider since it does replicate to the unicast world. However, this is a fraction of the cost of typical unicast delivery, even with CDNs, as the replication point (the AMT relay) can be built into the network infrastructure and placed at the edge of the multicast-enabled world. Also, as AMT succeeds, native multicast becomes an increasingly attractive option for network providers. AMT is really just a (necessary) interim step toward full (or near-full) multicast connectivity on the Internet.
Now it is fair to point out that our sample television show with 10 million viewers wouldn't necessarily have 10 million simultaneous viewers. TV shows are typically staggered across the different time zones. Also, DVRs have made it fairly common for viewers to record shows to watch at a later time. However, it should also be pointed out that there is much television content, like sporting events, that is typically consumed live. For example, 106 million viewers tuned in to watch the Saints defeat the Colts in Super Bowl XLIV. The vast majority watched this game on live TV.
So who will create this new NextGen TV content when it only costs a few tens of dollars a month to be a television channel? Toward the end of the previous century, it was envisioned that with multicast, every college student in their dorm room, every person with an idea and the passion to share it could become a TV channel, and the number of channels would be as numerous as the number of Web sites on the Internet. But YouTube drastically changed that equation.
Being a television channel typically means transmitting content 24x7. It turns out that's a lot of time to fill and most college students and people with ideas/passion don't have that much material. They have video to share, but maybe just a few minutes of it each day. YouTube fit that role perfectly. For this reason, the number of NextGenTV channels isn't likely to approach the number of Web sites on the Internet.
However, there still are a significant number of potential sources out there with enough programming to fill the day. High schools could show their sporting events and coverage of other extracurricular activities, which might have enough appeal to gather significant viewership from remote audiences (family, friends and other high school sports fans). When traffic and weather cams can deliver indispensable content to thousands at a fraction of the current cost, you will see far more traffic and weather cams. International programming could provide enormous opportunities. Today, there might be a single TV channel available on cable/satellite to cover a country or region. With NextGenTV, there could be dozens.
In nations where the state controls all media outlets, the thirst for independent content sources is enormous. Here, the opportunity for NextGenTV goes well beyond adding banal content to the existing channel lineup; it could spark revolutions and change regimes. It's one thing to have Web sites and short YouTube clips available that shine a light on oppressive governments. It's quite another to have television channels broadcasting content continuously to the masses. The political and societal impact of viewing these previously unseen images on television screens will change the course of history.
Winners and losers of NextGen TV
With a television lineup that included potentially millions of channels, the most obvious winner would be the viewer. Viewers would enjoy brand new content that previously wasn't possible - grandparents could watch their grandchildren compete in sporting events thousands of miles away, immigrants could reconnect with their homelands, etc. However, viewers of traditional television would also be big winners.
Since the advent of cable television, consumers have always griped about bundle pricing. Cable TV providers act as a middleman between TV stations and consumers. Stations like ESPN and CNN charge the cable provider for each viewer capable of receiving their content. Cable providers pass that charge along to the customer, with some markup along the way. Cable providers also bundle a number of stations into packages or tiers. For example, if you want to subscribe to ESPN, you typically have to subscribe to a basic cable package that includes numerous other channels, like Lifetime Television. So no matter how little interest an ESPN viewer has in watching abduction-themed movie marathons, he will still have to pay for Lifetime when he subscribes to ESPN.
TV stations have always wanted to cut out the middleman and charge subscribers directly. To free themselves from unwanted bundles and tiers, viewers have sought to subscribe only to the channels they want. TV stations would also find that distributing their content to viewers over the Internet is far cheaper than operating the video distribution networks typically used today to bring content to cable providers.
At first glance, cable providers would be the obvious big losers in a world where they can easily be bypassed. A new beneficiary would be content aggregators. Today, companies like Google, eBay and Apple, using different business models, have developed a core competency to connect individual buyers/sellers of content/information/services/products. Companies like these could find a great opportunity to efficiently connect subscribers to the potentially millions of TV channels that could be available in the NextGen TV marketplace.
Cable providers could certainly attempt to provide this marketplace role. However, they are likely to find that their corporate DNA lacks the qualities to make this role a successful fit. Large cable providers/phone companies tend not to be strong in the area of creating innovative services and content. To be fair, very few companies are successful at this. For every Google, there are a dozen HotBots; for every Hulu, there are a hundred Pixelons.
But failure in the NextGenTV space isn't inevitable for cable providers. They need not follow the path of obstructionism laid by the recording industry, where iTunes stepped in and dominated the online music distribution industry that was rightfully theirs. They need not fade from view slowly like the newspaper industry, unable to capitalize on the vast opportunities of online media. They need not stand athwart the train tracks of Internet history crying "halt!"
While cable providers and telephone companies tend to struggle at creating innovative content and services, they possess a strength and core competency in delivering content and services in a scalable, reliable and economical way to a large number of consumers. And by providing the Internet connectivity of NextGenTV consumers - by owning the audience - they are uniquely positioned for new and creative revenue opportunities like ad insertion and content metering.
For example, it is undeniable that Internet connectivity is less reliable than typical cable television service. There are many reasons for this disparity, not least of which being that Internet connectivity between two points is only as good as its weakest link. ISPs can only control the quality of the links within their network, but can do little to affect what occurs beyond their borders. Thus, an ISP with an abundance of end users (i.e., cable modem/DSL providers) can offer TV stations like ESPN peering connectivity, a fancy term for cheaper Internet access, to sit closer to the end users and ensure a better, more reliable viewing experience for subscribers. This provides revenue streams from content providers as well as a differentiator to attract more residential broadband users. Contrary to popular belief, bandwidth/connectivity is not a commodity. When choosing between the cable company or the phone company for residential broadband service, being able to provide and demonstrate a better connectivity experience will make an impact in that decision.
Additionally, some video content has especially stringent requirements for reliability. In the fourth quarter of a playoff Game 7, no viewer wants to see the dreaded "buffering" message. One need only look at telephone service to find an analogous situation. For decades, telephony engineers have preached the need for 99.999%, or "five nines," reliability when it comes to phone service. This equates to 5.26 minutes of downtime per year.
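The "five nines" figure is straightforward to check:

```python
MINUTES_PER_YEAR = 365.25 * 24 * 60  # 525,960 minutes in an average year

def annual_downtime_minutes(availability):
    """Minutes of downtime per year implied by an availability target."""
    return (1 - availability) * MINUTES_PER_YEAR

five_nines = annual_downtime_minutes(0.99999)  # about 5.26 minutes per year
```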
While this may have been true for traditional wireline phone service, mobile phones have conditioned users to accept far less reliability. At times it may seem that mobile phone service delivers availability closer to "nine fives" than "five nines." People have grown to accept lesser voice quality and reliability for the fantastic benefits a mobile phone affords. However, while a mobile phone may provide all the functionality of phone service, relatively few people have totally abandoned the traditional landline at home. And if a mobile phone and a landline are both within arm's reach and someone plans to remain in the house for the duration of the call, the landline will typically get used.
Cable providers can apply this lesson from the telephony world. Rather than fight the wave of NextGenTV, they can ride it with their well-equipped surfboards. Cable providers could offer a "hybrid" or "enhanced" cable box, which could support both traditional cable TV service along with Internet connectivity to deliver NextGenTV.
Viewers could get the best of both worlds - the quality, stability and reliability of traditional cable TV integrated with all the benefits and functionality of NextGenTV. This will require cable providers to adopt open and standardized interfaces for their equipment and cooperation, partnerships and interoperability with the leading content aggregators. And of course, cable providers will also need to cut prices and provide more flexible bundling options for traditional services in order to keep customers.
Again, telephony is instructive here, as telephone companies learned a decade ago that long-distance service needed to be cheaper in order to compete. Vonage looked much less attractive to consumers when domestic long-distance service rates dropped to near-zero. While this approach may sacrifice revenue streams for cable providers that have been safe and stable for decades, it does open the opportunity for newer and potentially more lucrative services.
Those companies who own the end user are well situated to observe, understand and capitalize on the end user experience. The company that provides the intelligent end device delivering NextGenTV to the consumer's television has the ability to see exactly what that consumer watches.
Today, television metering is dominated by the Nielsen ratings, which provide statistical sampling of users. A NextGenTV device would provide far more sophisticated usage data - what exactly is being watched, duration of viewing, demographic info of the viewer along with geographic location. This type of information is extremely valuable for the purpose of targeted advertisement. Rather than expensively blanket an entire medium with advertisements when most viewers are uninterested or possibly unable to purchase a particular product, it is far more effective to send ads only to those who are most likely to use a product.
Making the unforeseen possible
When Tim Berners-Lee invented the Web, and Marc Andreessen co-created the first modern browser to navigate it, users immediately recognized that this was a powerful development. While it was quite clear to early users that the Web and the browser were truly amazing tools of great promise, no one could have predicted eBay, Amazon, Google, Wikipedia, YouTube, and the other descendant technologies that are integrated today in the daily lives of billions across the planet.
This is a critical characteristic of truly revolutionary developments - not merely augmenting or improving existing functionality, but rather making possible a new array of functionality that was previously unfeasible and paving the way for the inconceivable. Where the Web and the browser were the enabling technologies for the multitude of innovations that we have since experienced with the Internet, multicast will be the critical enabler for an unforeseen world of new functions and services towards which NextGenTV will evolve.
Maybe you're still not convinced about multicast. With all the other components well in place - cheap, intelligent end devices, ample access bandwidth, and willing participants already stepping into the ring (see AppleTV, GoogleTV), it is fair to ask if multicast is really necessary, especially in the on-demand world in which we now live. After all, video seems to sort of work just fine in the unicast-only Internet of today.
While it's true that there is a plethora of video on the Internet today, and the promise of a new wave arriving imminently, the numbers simply do not lie. Unicast delivery of high bandwidth multi-destination content is expensive, while multicast delivery is cheap. What unicast solutions may be able to provide is a certain niche of functionality - content with an audience small enough that the source can afford the cost of duplicate transmission and the network can carry all the duplicate streams.
Such a niche world may indeed find modest success in adoption, but it will serve as only a diversion from what we think of today as television viewing. It could never fully overtake cable television and deliver all the revolutionary features and applications outlined here as NextGenTV. At best, a unicast-only solution could augment or improve existing functionality, but only multicast can deliver the unforeseen.
Where unicast may deliver "good enough, most of the time, for most existing content," it is critical to examine what happens during extraordinary times. After all, television has had its greatest impact during moments such as natural disasters, political speeches, military conflicts, assassinations and other times of great historical significance.
At no time was this more apparent than during the morning of Sept. 11, 2001. As the horrific events were unfolding, most news Web sites quickly became inaccessible due to the high load caused by so many users simultaneously trying to access content. Meanwhile, at Northwestern University, the CNN video feed was streamed over the Internet using multicast and quickly gathered an audience of over 2,000 viewers. At the time it was believed to be the largest audience ever for a single multicast stream. While still a relatively minuscule set of viewers, this demonstrated the power of multicast in its ability to deliver content to an arbitrarily large audience during times of extreme interest.
During such consequential events, a unicast-only television solution would fail miserably to handle the load. "Good enough, most of the time, for most existing content" would yield to "unreliable when it matters" and could provide a crushing blow to confidence in Internet delivery of television, potentially causing a huge setback for the medium.
In addition to the availability of carrier-grade AMT support, ISPs have another motivating opportunity to add multicast support to their networks: IPv6. With IPv4 address availability approaching total exhaustion - some countdown Web sites suggest this will occur in less than a year - IPv6 is finally becoming a reality.
Most ISPs and corporate networks, which held out on deploying IPv6 widely just as they held out on multicast, are finally beginning to take IPv6 deployment seriously. With engineers already "under the hood" adding IPv6 support to all network equipment, it is a great opportunity to add multicast support as well. The incremental effort to add multicast is actually much less than that of adding IPv6 support. Network architects can take this opportunity to add multicast support for IPv6 alone or for both IPv4 and IPv6. Either way, there is no better time to enable multicast, paving the way for the full benefits of NextGenTV.
Giuliano is an engineer in the telecom industry focused on IP Multicast technologies. He coauthored "Interdomain Multicast Routing: Practical Juniper Networks and Cisco Systems Solutions" (Addison-Wesley 2002) and is the co-chair of the Multicast Backbone Deployment (MBONED) Working Group at the Internet Engineering Task Force (IETF). He welcomes all comments to email@example.com.