The article in TIME is headlined “Google’s Flu Project Shows the Failings of Big Data.” But critics say the real failing here is not with big data but with Google.
The article takes issue with a project named Google Flu Trends (GFT), pioneered by the Internet search giant to produce real-time monitoring of flu cases around the world using search data the company collects. The idea was that analyzing how many people are searching for flu terms in an area can predict where there are cases of the flu.
The work was lauded in a book, “Big Data: A Revolution That Will Transform How We Live, Work and Think.” Google admitted at the time that not everyone who searches for flu terms would be sick, but it said it found “a close relationship” between search terms and flu cases.
The only problem is that it didn’t.
The journal Science released a report detailing flaws in GFT. Specifically, it found that GFT overestimated flu cases by 50% or more in some instances compared with figures produced by the federal Centers for Disease Control and Prevention (CDC).
“From August 2011 to September 2013, GFT over-predicted the prevalence of the flu in 100 out of 108 weeks,” TIME reported. “During the peak flu season last winter, GFT would have had us believe that 11% of the U.S. had influenza, nearly double the CDC numbers of 6%.”
TIME goes on, “just because companies like Google can amass an astounding amount of information about the world doesn’t mean they’re always capable of processing that information to produce an accurate picture of what’s going on—especially if it turns out they’re gathering the wrong information.”
So what do big data enthusiasts think of all this? In critiquing GFT, they point to the specifics of Google’s approach, not to big data in general. “What happened with Google wasn’t a failure of big data,” says Charles Caldwell, director of solutions engineering at Logi Analytics. “It is about believing that big data can be a replacement for everything else.”
It’s no surprise that a team of professional epidemiologists at the CDC will have better information about the flu than an Internet search company. Big data projects are about picking the right tools and having the right data. That didn’t happen with GFT, but Caldwell says that’s a failure of the project, not of big data in general. “Big data needs to support human expertise, not replace it.”
So is big data overhyped as a panacea? “Absolutely,” or at least the term is, says Clarke Patterson, senior director of product marketing at Cloudera, one of the leading companies delivering Hadoop, the big data platform, as a product. The fact of the matter is that businesses and researchers have access to a huge amount of new data. But it’s not just about having data; it’s about knowing what to do with it and getting true insights out of it.
“Unfortunately, this transformation is in its early stages and as a result projects are going to fail (like the Google GFT example) if we get over excited about the technology alone,” he says.
A few bad apples shouldn’t spoil the bushel, says Jim Ingle, SVP at NTT Data, which consults with companies to hone a data management strategy. Many companies, he concedes, don’t need a big data platform. But, he says, traditional data warehousing tools are not ideal either. New data platforms allow for faster and easier access to data. “It is difficult to effectively predetermine how an organization will want to access and analyze its data over time,” he says. “Flexibility and speed of data analysis is the future and big data technologies enable this regardless of whether you have massive amounts of data or not.”
Senior Writer Brandon Butler covers cloud computing for Network World and NetworkWorld.com. He can be reached at BButler@nww.com and found on Twitter at @BButlerNWW. Read his Cloud Chronicles here. http://www.networkworld.com/community/blog/26163