With AT&T announcing its sponsored data initiative, a federal appeals court ruling that the FCC can no longer protect net neutrality, and Comcast announcing a $45 billion acquisition of Time Warner Cable, businesses and consumers alike need accurate information on broadband performance more than ever.
Data is the one tool that customers have to identify when ISPs might be impeding performance based on business disputes with content providers.
For example, an article published at Gigaom earlier this month cites data provided by Measurement Lab (M-Lab for short), a consortium formed in 2009 by Google, the Open Technology Institute, the PlanetLab Consortium, and academic researchers from across the country. M-Lab's data contributes to the FCC's annual Measuring Broadband America report. The data referenced in the Gigaom article shows a steep decline in throughput speeds for Comcast, Time Warner Cable, and AT&T from early 2013 through December. By comparison, three other ISPs – Cox Communications, Cablevision Systems, and Charter Communications – showed no decline over the same period.
However, M-Lab’s data doesn’t quite square with that from other broadband testing services. The Gigaom article claims a source confirmed that SamKnows, a company that provides consumers with routers that track broadband speeds and whose data also contributes to the FCC’s report, was not spotting the same trends. The same goes for Ookla, the company behind the popular Speedtest.net website.
Thomas Gideon, the technology director at the Open Technology Institute, addresses these kinds of discrepancies between different broadband tests by pointing to M-Lab’s broad testing method and its overall openness. M-Lab’s network diagnostic test utilizes the Web100 instrumentation package, and it generates roughly half a terabyte of data per day, Gideon says. This data is collected from 12 different experiments run on networks that were volunteered for testing. All the raw data is open for review so others in the industry can contribute.
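At bottom, every throughput test reduces to the same arithmetic: bits transferred divided by elapsed time. The sketch below (a hypothetical illustration, not M-Lab's actual NDT code) shows why the slice of the transfer you measure, and where you measure it from, can change the reported speed:

```python
def throughput_mbps(bytes_transferred: int, seconds: float) -> float:
    """Convert a transfer measurement to megabits per second."""
    if seconds <= 0:
        raise ValueError("elapsed time must be positive")
    return (bytes_transferred * 8) / (seconds * 1_000_000)

# The same connection can report very different speeds depending on
# which window of the transfer is sampled -- e.g. a short burst served
# from a nearby cache versus a long sustained stream across the wider
# network. (The byte counts and timings here are made-up examples.)
burst = throughput_mbps(5_000_000, 0.5)      # 5 MB in 0.5 s -> 80 Mbps
sustained = throughput_mbps(50_000_000, 20)  # 50 MB in 20 s -> 20 Mbps
print(f"burst: {burst:.0f} Mbps, sustained: {sustained:.0f} Mbps")
```

Two tests can both be arithmetically correct yet disagree, which is one reason the vantage point of the test server matters as much as the formula.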
“All of this openness is so that, like with a drug trial with any kind of exploratory research, somebody else can take all of those inputs and ask good, insightful questions about our results, can advance that working collaboration, can use that data to try to see what insights it might provide on other aspects of the network that are measured by the NDT tool,” Gideon says.
As for the differences between M-Lab’s data and that of other providers, Gideon says its tests capture a broad set of data from a point that reflects end users’ network experience.
“We’re looking at the total state of the network as we find it, and looking for those organic findings as well, so we have our servers situated where they are so they’re very similar to the source’s choice that a content delivery network makes in terms of setting up its caching infrastructure or a content provider might make,” Gideon says. “So it’s a little closer to the sorts of paths that are likely to be involved when you’re dealing with this question of customer experience.”
M-Lab’s open standards don’t necessarily make it more accurate than other tests, however.
“I don’t think it’s a question of accuracy, I think it’s a question of what insights specifically are you looking to glean,” Gideon says. “If you’re looking to ask very narrow questions then you’re going to get very narrow answers.”
What would make broadband performance data more accurate, and simultaneously resolve the discrepancies between different tests, would be openness among all organizations that perform these tests, Gideon says. The Open Technology Institute has worked with the FCC and others in the space to try to promote open standards for broadband testing. Opening up the data is the only way to figure out why two tests of the same network may show different results, Gideon says.
“You do it this way, you share your findings and your data and methodology, in the way that you do so that other people can reproduce it, they can validate your results,” he says. “And they can also, in a case where things don’t quite line up, where there may be subsequent questions, they can collaborate and cooperate with you to improve over time so that you’re getting better and better results, you’re getting keener insights, you’re just getting a better picture of what’s going on.”
Gideon says SamKnows is very cooperative with M-Lab’s open-data policies, which M-Lab requires as a condition of any collaboration with other organizations. As for other companies that test broadband, he says M-Lab is always open to discussing partnerships that could go into deeper detail on performance trends.
However, direct collaboration wouldn’t be as important if the FCC were to establish standards for broadband testing, which Gideon says has become increasingly possible over the past few years. Crediting the FCC for bringing together the public interest, the federal government, and the private sector to establish a “good comparative basis across technologies,” Gideon says standards for broadband testing may not be that far away.
“I think there are plenty of actors, including the FCC, working on standards efforts, there’s a lot of stuff in flight,” Gideon says. “I wouldn’t presume to say that it’s all the way there yet, but I would definitely like to see from our perspective more open standards around these sorts of things, because I think it invites more people into the field from all across the space: from private sector, from academia, from public interest. And it puts us on that same footing that we’d like to be on in terms of open and comparable methodologies, open and comparable tools.”
Colin Neagle covers emerging technologies and the startup scene for Network World. Follow him on Twitter @ntwrkwrldneagle and keep up with the Microsoft, Cisco and Open Source community blogs. Colin's email address is firstname.lastname@example.org.