Super Bowl 50: Inside the network built to handle record web traffic

Super Bowl 50 at Levi's Stadium in Santa Clara, California, is expected to drive a record amount of web traffic. Here's how the network will try to keep up.


Building a reliable Wi-Fi network is a challenge for businesses of all sizes. I'm sure all of us have experienced poor-quality Wi-Fi in places like hotels and wondered why it's so hard to put in a system that actually gives a good experience, especially when you have to pay for it. Some environments are tougher than others, particularly if there is a high concentration of people and devices. Last year, I authored this blog post describing the challenges the Orlando airport faced as it went fully wireless.

Without a doubt, one of the toughest venues to deploy wireless is in a stadium. There can be tens of thousands of people, all uploading pictures, videos, chatting, or streaming video in a highly concentrated area. Provisioning city-wide Wi-Fi can be difficult, but a stadium can have more people than many cities in an area that's a few hundred acres.

See also: 3D printing to make NFL debut at Super Bowl 50

To understand the challenges associated with stadium Wi-Fi, I interviewed Chuck Lukaszewski, VP of Wireless Strategy for Aruba (a Hewlett Packard Enterprise company) and the architect who helped design, build, and fine-tune the wireless network in Levi's Stadium, home of the San Francisco 49ers and Super Bowl 50 this Sunday (disclosure: Aruba Networks is a client of ZK Research). With such high expectations – Levi's Stadium is widely considered the most state-of-the-art, mobile-enabled stadium in the country – success this week is imperative. The stadium is located in the heart of Silicon Valley, so its attendees are among the most technically savvy users anywhere. The wireless experience must be fantastic.

Implementing a high-performance wireless network is more science than art, but it's important to know what metrics to measure and what the right levels should be. In this situation, Lukaszewski told me the company is focusing on two design metrics: take rate and concurrent load.

Take rate is the percentage of seated fans who will access the network at some point during the event. Concurrent load is the percentage of fans on the network at the same time. For stadiums, the averages for these metrics are 50% and 25%, respectively.

However, Levi's is no ordinary stadium, and the events held there are equally extraordinary. So for this stadium, the assumption was a 100% take rate and a 50% concurrent load – twice the average for this type of venue. While this might seem like overkill, the network is deliberately overprovisioned for peak traffic. It needs to support all of the typical applications one would expect at an entertainment venue, such as Facebook, Snapchat, Instagram, and general web browsing. However, the number one bandwidth-consuming application is the Levi's Stadium app itself, created by VenueNext.
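To make the design math concrete, here is a minimal sketch of the two metrics. The take-rate and concurrent-load percentages come from the article; the roughly 68,500-seat capacity is my assumption for illustration, not a figure from the interview.

```python
# Back-of-the-envelope sizing from take rate and concurrent load.
# Seat count (~68,500) is an assumed figure for illustration.
seats = 68_500

def design_load(seats, take_rate, concurrent_load):
    """Return (users who connect at some point, peak simultaneous users)."""
    users = int(seats * take_rate)             # take rate: share who ever connect
    concurrent = int(seats * concurrent_load)  # concurrent load: share online at once
    return users, concurrent

# Typical stadium assumptions: 50% take rate, 25% concurrent load.
typical = design_load(seats, 0.50, 0.25)
# Levi's Stadium design targets: 100% take rate, 50% concurrent load.
levis = design_load(seats, 1.00, 0.50)

print(typical)  # (34250, 17125)
print(levis)    # (68500, 34250)
```

Doubling both percentages roughly doubles the peak simultaneous user count the network must carry, which is the number that actually drives AP density and backhaul sizing.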

The Stadium App is designed to enhance the game-day experience with a number of features, including mobile ticketing, mobile ordering of food and drinks, wayfinding to navigate around the stadium (see picture), and something called “game center” for HD video replays. When connected to Wi-Fi, game center enables fans to watch near real-time replays no matter where they are in the stadium.

[Image: the Levi's Stadium app]

The magnitude of the network is far bigger than what most organizations have to deal with. There are 1,300 of Aruba's latest 802.11ac access points (APs) placed around Levi's, along with 1,200 beacons. This equates to about 1 AP for every 100 seats. Unlike a traditional carpeted office, where APs are mounted overhead, Levi's uses an under-seat design (see image): the ground clutter of seats and bodies acts as a natural barrier that minimizes interference between the densely placed APs.

[Image: under-seat APs at Levi's Stadium]

Another factor in designing the network is understanding the total network load during events. For some historical perspective, Super Bowl 47 in New Orleans in 2013 generated 1.1 TB of traffic. The following year, Super Bowl 48 at MetLife Stadium saw traffic rise to 3.2 TB. A year ago, Super Bowl 49 in Phoenix saw traffic almost double again to 6 TB. The rise in bandwidth over the past few years has been remarkable: an average regular-season 49ers game now generates about 2 TB of traffic, more than the Super Bowl itself generated only three years ago.

Also, the network needs to support more than just football games. The stadium hosts other events that generate as much or more traffic than the average NFL game. For example, WWE WrestleMania at Levi's Stadium in March generated 4.5 TB of traffic, and Taylor Swift fans produced a whopping 7.1 TB with half of the stadium closed off for the concert configuration.

For the upcoming Super Bowl 50, the assumption is that traffic will nearly double again from Super Bowl 49, with 10 TB being a reasonable benchmark. In fact, putting it in football betting vernacular, I asked Lukaszewski whether he would take the over or the under on 10 TB, and he took the over. So be prepared for a record amount of network traffic at Super Bowl 50.
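Lining up the Super Bowl traffic figures quoted above shows just how steep the growth curve is. This is simply the article's numbers restated; the 10 TB entry for Super Bowl 50 is the projection, not a measurement.

```python
# Super Bowl in-stadium Wi-Fi traffic, per the figures in the article.
# The Super Bowl 50 value is a projection, not a measured result.
traffic_tb = {
    "Super Bowl 47 (2013, New Orleans)":  1.1,
    "Super Bowl 48 (2014, MetLife)":      3.2,
    "Super Bowl 49 (2015, Phoenix)":      6.0,
    "Super Bowl 50 (2016, projected)":   10.0,
}

values = list(traffic_tb.values())
# Pair each year with the previous one to compute year-over-year growth.
for (name, tb), prev in zip(list(traffic_tb.items())[1:], values):
    print(f"{name}: {tb} TB ({tb / prev:.1f}x the prior year)")
```

Even at the low end, traffic has not grown less than about 1.7x per year over this span, which is why planning against last year's peak is never enough.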

The Big Game is less than a week away now, and if you're lucky enough to be attending, don't be afraid to take pictures or video of every big play and upload them to your favorite social media site. You may not know who will win, but you can at least take comfort in knowing the network can handle it.

Copyright © 2016 IDG Communications, Inc.
