The US government's overly complicated way of classifying and declassifying information needs to be dumped and reinvented with the help of a huge technology injection if it is to keep from being buried under its own weight.
That was one of the main conclusions of a government board tasked with making recommendations on exactly how the government should transform the current security classification system.
"Current page-by-page review processes are unsustainable in an era of gigabytes and yottabytes. New and existing technologies must be integrated into new processes that allow greater information storage, retrieval, and sharing. We must incorporate technology into an automated declassification process," the congressional Public Interest Declassification Board stated. "The current classification system is fraught with problems. In its mission to support national security, it keeps too many secrets, and keeps them too long; it is overly complex; it obstructs desirable information sharing inside of government and with the public. There are many explanations for over-classification: most classification occurs by rote; criteria and agency guidance have not kept pace with the information explosion; and despite the Presidential order to refrain from unwarranted classification, a culture persists that defaults to the avoidance of risk rather than its proper management."
The board added that the security classification problem is growing. Agencies are creating petabytes of classified information annually, quickly outpacing the total amount of information the government has declassified in the seventeen years since Executive Order 12958 established the policy of automatic declassification for 25-year-old records. Without dramatic improvement in the declassification process, the rate at which classified records are being created will drive exponential growth in the archival backlog of classified records awaiting declassification, and public access to the nation's history will deteriorate further, the report stated.
At the heart of the classification revamp should be a number of high-tech implementations. Available technologies, such as context accumulation, predictive analytics, and artificial intelligence, should be piloted to study their effectiveness in helping implement these recommendations and to engage users and garner their trust in a new system, the board wrote.
Promising new technologies should be tested through a series of pilot projects that, once proven, can be deployed at multiple agencies and then expanded to include pilot projects for classification. The ultimate goal of these pilots is to discover, develop, and deploy technology that will:
High tech is only part of the board's major recommendations. A complete overhaul of how information is classified is likely to be a stickier point.
From the report: "Classification should be simplified and rationalized by placing national security information in only two categories. This would align with the actual two-tiered practices existing throughout government, regarding security clearance investigations, physical safeguarding, and information systems domains. Top Secret would remain the Higher-Level category, retaining its current, high level of protection. All other classified information would be categorized at a Lower-Level, which would follow standards for a lower level of protection. Both categories would include compartmented and special access information, as they do today. Newly established criteria for classifying information in the two tiers would identify the needed levels of protection against disclosure of the information. Using identifiable risk as the basis for classification criteria should help in deciding if classification is warranted and, if so, at what level and duration."
In the end, the board made 14 recommendations to modernize the current system of security classification and declassification. The recommendations: