
Pentagon … a New Challenge to Government Secrecy
Caption: The U.S. Pentagon


The Washington Post

By Sanford J. Ungar

Nothing is more important to the health and sustainability of a modern democracy than its citizens’ awareness of, and confidence in, what their government is doing. Excessive government secrecy — inherent, instinctive, utterly unnecessary and often bureaucratically self-protective — is poison to the well-being of civil society.

It is useful to remember this simple precept Monday, the 45th anniversary of the 1971 publication by the New York Times of the Pentagon Papers, a classified government history of decades of U.S. involvement in Southeast Asia and the untruths the public was told about it.

For the 17 days that followed, the Nixon administration and the press, already at odds, duked it out in the federal courts while the Times, The Post and other media withheld the information under judicial orders. Although the Supreme Court ruled, 6 to 3, on June 30 that the administration had not justified its demand for prior restraint on further publication, the legacy of the case has been a subject of argument ever since.

Because the subsequent criminal charges against Daniel Ellsberg for leaking the documents ended in a mistrial, the right of current or former government officials to reveal foreign policy misconduct has never been convincingly established. In the digital age, the boundaries, if any, of press freedom in the United States are more difficult to define. Whether the more recent secret-document dumps by Chelsea Manning and Edward Snowden were in the public interest remains unclear.

But one thing is certain: Government secrecy, especially in matters of foreign policy and national security, is worse than ever, and the over-classification of information increases by the day.
As a member of the Public Interest Declassification Board (PIDB), based at the National Archives, I have come to appreciate the mind-boggling dimensions of this problem:

The volume of the federal government’s classified “digital information assets” is growing at an astonishing pace. The Clinton Presidential Library has about four terabytes’ worth to be processed, and the George W. Bush Library 10 times as much. There is no official estimate of the amount generated during the Obama administration.

This fantastical tally does not include the uncountable classified electronic records held outside the presidential libraries, or the hundreds of millions of paper records going back decades and still being created.

In a classic instance of good intentions leading to problematic consequences, reforms requiring intelligence agencies to share more information with each other have created a new genre of secret documents — in which distinct parts of the bureaucracy all have their own interests to protect. Hence, much of the material must be circulated for repeated examinations before it can be released. Frequently, different agencies redact different portions of documents, and a further adjudication process may ensue.

Two current concepts offer a glimmer of hope. One, pressed by the PIDB, is to allow the National Declassification Center at the Archives to replace what its director calls the “factory” approach to document review (which often results in the declassification of routine information of little public interest) with a system of prioritization.

This would involve developing a consensus among interested parties, including Congress, historians and journalists, on an annual list of big-ticket topics on which declassified documents (even some not ordinarily due for review for 25 years) might shed particularly useful light — policy deliberations leading to the wars in Afghanistan and Iraq, for example. Arriving at each year’s list of issues for priority attention, naturally, could be extremely difficult.

More radical is the push to introduce widespread use of electronic declassification of sensitive documents, an effort sponsored, to the surprise of many, by the CIA, along with the Archives.
Research conducted by the Center for Content Understanding (CCU) at the University of Texas at Austin has shown that sophisticated computers can be taught to analyze natural language and concepts much as humans do, in order to provide “decision support” technology for classification and declassification alike.
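
What such “decision support” might look like can be sketched in a few lines of Python. The example below is purely illustrative, not the CCU’s actual technology: the training snippets, the thresholds and the choice of a scikit-learn model are all invented for this sketch. The idea it demonstrates is simply that a statistical model scores each document’s likely sensitivity and refers borderline cases to a human reviewer rather than deciding on its own.

    # A toy sketch of declassification "decision support" (invented data and
    # thresholds; not the CCU's system). A simple classifier scores documents
    # and defers uncertain cases to a human reviewer.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression

    # Tiny invented training set: 1 = still sensitive, 0 = releasable.
    docs = [
        "troop movement coordinates for an upcoming operation",
        "identity and location of a covert source",
        "cafeteria menu for the week of June 13",
        "routine press release on embassy staffing",
    ]
    labels = [1, 1, 0, 0]

    # Turn each document into a weighted word-frequency vector.
    vectorizer = TfidfVectorizer()
    X = vectorizer.fit_transform(docs)

    # Fit a simple classifier that outputs a sensitivity probability.
    model = LogisticRegression()
    model.fit(X, labels)

    def recommend(text, low=0.2, high=0.8):
        """Recommend an action, deferring borderline scores to a human."""
        p_sensitive = model.predict_proba(vectorizer.transform([text]))[0, 1]
        if p_sensitive >= high:
            return "keep classified"
        if p_sensitive <= low:
            return "recommend release"
        return "refer to human reviewer"  # the decision-support step

    print(recommend("summary of embassy staffing changes"))

The point of the two thresholds is that the machine never makes the close calls alone; it only narrows the pile a human must read.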

Members of the intelligence community, who reflexively classify far too much information in the first place, worry about the grave risks that could arise from the 2 percent of cases in which the computers might make mistakes. You could see the worry on their faces.

But change, be it gradual or rapid, must come, and soon. The alternative is to continue relying on manual processes that can only be compared to bailing water out of a sinking ship that is about to be hit by a tsunami, and to expect periodic crises over leaks, orchestrated by people who believe drastic measures are justified to let the public in on more of its own vital business.