Given last year's revelations about the National Security Agency's (NSA) massive surveillance and data analytics conducted on Americans, along with continuing stories about local police scanning thousands of license plates per day, it might sound absurd to say that government lags behind the private sector in the use of Big Data analytics.
But those examples tend to be outliers among the nation's sprawling bureaucracies, especially at the state and local levels. In general, the private sector is well ahead of the public sector in the use of Big Data analytics, according to a recent report titled "Realizing the Promise of Big Data," sponsored by the IBM Center for the Business of Government.
While the report's author, Kevin Desouza, an associate dean for research at Arizona State University, cited multiple examples of Big Data analytics being used in government, he found that its overall promise remains largely unrealized in the public sector. He called it "a new frontier" for government at all levels.
There are multiple reasons for that, Desouza concluded after interviewing 22 federal, state and local chief information officers (CIOs). One of the most significant is that those outside of IT don't yet understand the concept, or fear that the public will perceive Big Data as Big Brother.
"Outside of IT no one gets the term," one CIO told Desouza. "The few (managers) that do have some views on it have been put off due to the privacy concerns because of the NSA's surveillance program."
Those concerns are legitimate, but lack some context. Kelly Stirman, director of product marketing at MongoDB, said what many other experts say: "The private sector knows much more about you than government ever will. But people have been willing to trade their information for convenience."
That is what Desouza found -- that the private sector has been able, through things like terms-of-service agreements for mobile apps, "to collect immense amounts of information on individuals with limited pushback."
With government, it is different. There were ferocious protests last year after revelations from former NSA contractor Edward Snowden about the extent of the agency's surveillance of U.S. citizens, including reports that on a single day the NSA had collected hundreds of thousands of email address books from vendors including Yahoo, Microsoft's Hotmail, Facebook, Google's Gmail service and others.
Beyond that is the reality that the private sector tends to be more nimble than government and is better able to attract top talent. Government is more likely to be constrained by both political pressure and bureaucracy.
Kimberley Williams, chief strategist, public sector, at Informatica, said none of this should be a surprise, since the roles and motives of the private and public sector are different.
While government has been using Big Data for years in areas like intelligence, the census, taxes and Social Security, "what is new to government is the realization that Big Data can and should be used to drive, reform and refine public policy, as well as defend and protect," she said.
A.J. Clark, president of Thermopylae, said that investments in counter-terrorism after 9/11 meant that "certain elements of the public sector led the Big Data movement. Statistics from agencies involved with Remote Sensing -- satellite imaging, full motion video collection, aerial imaging such as Light Detection and Ranging (LiDAR) -- show that there has been a watershed of big data growing in their environments," he said.
But, "many of the advances in big data did not transcend the agencies they originated in and the public sector as a whole did not benefit," he said.
Williams said another factor is that the private sector's use of Big Data is driven by the profit motive and competition, while, "government is not profit-driven and maintains a captive audience. So, the drivers of Big Data projects in government are inherently different and frankly not as urgent, except for intelligence and law enforcement."
She estimates that most of the public sector is about five years behind the private sector, "with no imperative to catch up and the reality that Big Data projects may remain an unfunded mandate or underfunded project/program."
Other experts agree, although there are mixed opinions on how far behind. Clark estimates it to be 18 months, while Chris Petersen, CTO and cofounder of LogRhythm, said he thinks it amounts to "years," but that the gap is less in the area of cyber security.
CIOs admit that they are behind. They told Desouza that they simply don't have the technical capability or staff with enough expertise to conduct Big Data analytics. Not one had made use of unstructured data.
"Isn't that a critical element of Big Data? If so, then we are not doing anything in the Big Data space, as we have not touched unstructured data. All of our data has some structure, and most of it is highly structured," one of them said.
But Big Data has exploded in the private sector. The Boston Globe reported more than two years ago that Massachusetts alone is home to more than 100 companies focused on Big Data.
Pam Dixon, executive director of the World Privacy Forum (WPF), estimated last fall that there were about 4,000 data brokers in the U.S., all of them collecting and selling data on people's activities, both online and off -- license plate scanning is not confined to law enforcement departments.
And there are hundreds of stories of private firms using Big Data to gain market share, improve their bottom line, curb fraud and theft and become more efficient.
Compass Group Canada, which operates more than 2,000 food service locations in that country, recently began using software from Boston-based Lavastorm Analytics to analyze data on money or merchandise misplaced or stolen by employees or customers. The company found the analysis much more efficient than reviewing thousands of hours of video footage.
Desouza also cited Merck's use of weather data in July 2012 to anticipate greater demand for its allergy pill Claritin in May 2013, and Google's early-detection model of flu outbreaks, Google Flu Trends, which in 2009 tracked trends in the H1N1 flu epidemic "days faster" than the federal Centers for Disease Control and Prevention.
That doesn't mean there is no progress in the public sector. Boston's former police commissioner, Edward Davis, recently joined the board of Mark43, a tech startup founded by three Harvard graduates that makes software to analyze data on crime and gangs and to improve management of police records on suspects.
The good news for the fledgling company -- but not so good for the reputation of government services -- is that the field is relatively wide open. The class project that led to the founding of Mark43 was to come up with a program that could use analytics to track gang relationships in Springfield, Mass.
"We didn't know much about law enforcement," said CEO and cofounder Scott Crouch. But, "the first thing we noticed was that their law enforcement software was awful."
Improving government's technology and its analysis of data clearly could make agencies more efficient, and offers the hope of curbing the classic bugaboos of "waste, fraud and abuse."
"Analytics now holds great promise for increasing the efficiency of operations, mitigating risks, and increasing citizen engagement and public value," Desouza wrote in his report.
But it will take some changes, experts say. "Human capital is clearly a major problem," Williams said. "Government has a serious lack of scientists, an aging workforce, ongoing budget and pension challenges and an inability to attract technology-savvy employees."
Government is also hamstrung by long-term contracts with vendors supplying proprietary technology, Stirman said. "Because government agencies are so large, they get into contracts worth hundreds of millions, and they can't easily untangle themselves even if there are problems."
The hope for change, he said, lies in the fact that, "most of the (Big Data) technologies are open-source, which means you don't have to get locked in with a vendor with proprietary technology. Anyone can learn it -- you can become an expert without going through a big certification process. It's really important that government take advantage of it," he said.
It will also take more cooperation, according to Clark. "Data sharing is critical for Big Data analytics, and I see that the private sector as a whole is able to share their data more easily with one another than the public sector has been able to over the last decade," he said.
And it will take investment. Petersen said he thinks the "overall impediment to use of Big Data in the public sector is cost. Big Data projects are expensive, very similar to home-grown application development." But, done right, they could, "likely enable the delivery of higher quality services at a lower overall cost," he said.
Clark said it is important for the public sector to make that investment to avoid a widening "technology gap. If the approximate 18-month gap grows to over 36 months, which is two technology cycles based on Moore's Law, it will be very difficult for the public sector to leverage the cost efficiencies of private commercial technology solutions," he said.
But Gary King, director of the Institute for Quantitative Social Science at Harvard University, said things can change quickly in both the private and public sector. "Large parts of the commercial sector are also far behind what could be done," he said. "This is a dynamic field with fast progress."
This story, "Big Data still 'a new frontier' for most of the public sector" was originally published by CSO.