Sunday, July 25, 2010

Filter failure and other problems

A couple of years ago, Clay Shirky challenged the idea that the digital age is overloading us with information. He asserted that ever since the invention of the printing press, the sum total of available information has been far, far larger than any single human being could hope to take in over the course of his or her lifetime. We tend to forget this fact because, in the centuries following Gutenberg, human beings devised a variety of strategies for homing in on the paper-based information that was of greatest value to them. (And as the volume of government and other records grew exponentially during the 20th century, a variety of strategies for dealing with them emerged, among them formalized records management, the appraisal theories of Schellenberg and others, and "More Product, Less Process" processing.)

These strategies don't work as well in the digital age. Information technology has removed many of the economic constraints that limited the production, dissemination, and storage of paper-based information, and as a result we have unprecedented access to all kinds of information -- some produced by professionals, some by amateurs, some of it vetted and edited, some of it not. Shirky asserts that feeling overwhelmed by all of this stuff is simply evidence that our traditional mechanisms for finding information that we need and want and disregarding the rest aren't working. In other words, we're experiencing "filter failure." Until we devise new ways of separating the digital wheat from the chaff -- something that will no doubt happen eventually -- we're going to feel overloaded.

Shirky's onto something, and the concept of filter failure popped into my mind when I read Stuart Fox's recent TechNewsDaily piece concerning the challenges that future historians of the digital age will face. In essence, their work will shift "from the archeological recovery of rare texts and letters to the process of sifting through vast fields of digital information that weave through legal gray areas of corporate and private ownership." Fox notes that historians who can create or use new data analysis tools will likely be able to produce social histories of unprecedented breadth and depth, and I think he's right: scholars (one of my colleagues among them) are already starting to apply social network analysis and other tools to archival electronic records, and I'm certain that all kinds of new tools will be developed in the coming years.
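
To give a rough sense of what such analysis might look like, here is a minimal, purely illustrative sketch -- not drawn from Fox's piece or from any actual project -- of applying social network analysis to a body of archival e-mail. It builds a correspondence graph from hypothetical sender/recipient metadata (the addresses and messages below are invented for the example) using the freely available networkx library and ranks the most central correspondents. The interesting scholarly work, of course, lies in the scale of real collections and in interpreting the results, not in the graph-building itself.

  # Illustrative sketch: social network analysis over hypothetical e-mail metadata.
  # Requires the networkx library (pip install networkx).
  import networkx as nx

  # Invented sample metadata; a real project would extract sender/recipient pairs
  # from preserved e-mail headers in an archival collection.
  messages = [
      ("governor@example.gov", "chiefofstaff@example.gov"),
      ("chiefofstaff@example.gov", "budget.director@example.gov"),
      ("budget.director@example.gov", "governor@example.gov"),
      ("chiefofstaff@example.gov", "press.office@example.gov"),
      ("press.office@example.gov", "chiefofstaff@example.gov"),
  ]

  # Build a directed graph in which each edge records how often one person wrote to another.
  graph = nx.DiGraph()
  for sender, recipient in messages:
      if graph.has_edge(sender, recipient):
          graph[sender][recipient]["weight"] += 1
      else:
          graph.add_edge(sender, recipient, weight=1)

  # Degree centrality gives a crude measure of who sits at the center of the correspondence network.
  centrality = nx.degree_centrality(graph)
  for person, score in sorted(centrality.items(), key=lambda item: item[1], reverse=True):
      print(person, round(score, 2))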

However, as Shirky would be the first to admit, filter failure isn't the only problem. As Fox points out, corporate control of steadily increasing quantities of the data that will interest future generations of historians is likely to be an even bigger challenge. Although the Library of Congress is preserving every public Twitter posting, and there will no doubt be other initiatives designed to preserve social networking and other data documenting the lives of ordinary people, I suspect that a lot of this data will not survive because the corporations that control it have no incentive to preserve it or provide access to it. (I also suspect that Fox's sources -- one of whom asserts, incorrectly, that eBay keeps every transaction record in perpetuity -- view the challenges of digital preservation through rose-colored lenses, but that's another story.) Of course, every era has witnessed the loss of valuable records, and whether current and future losses of corporate-controlled data will be large enough to distort the historical record is a complete unknown; the best we archivists can do is try to ensure that a sufficiently large body of such data is preserved and to guide individuals seeking to preserve their own data.

The concept of filter failure and its limitations also came to mind as I read Dana Priest and William Arkin's Top Secret America project for the Washington Post, which highlights the growth of top-secret government work in the wake of the terrorist attacks of 11 September 2001. The project, which has a dedicated section on the Post's Web site, consists of, among other things, a trio of lengthy articles, a dedicated blog, a Twitter feed, a Facebook page, an excerpt from a related Frontline documentary that will air this fall, and a database of the government agencies and federal contractors engaged in top-secret work, the types of top-secret work they perform, and the places where that work is done.

Among Priest and Arkin's key findings:
  • Almost 1,300 federal government agencies and roughly 1,900 private firms perform intelligence, counterterrorism, and homeland security work in approximately 10,000 locations throughout the country.
  • Many security and intelligence agencies perform the same work. For example, 51 civilian and military organizations trace the flow of funds to and from terrorists. In most instances, organizations examining the same things do not share information.
  • Analysts who examine records unearthed and conversations recorded by spies are scattered throughout multiple agencies. Collectively, they produce approximately 50,000 daily, weekly, monthly, and yearly reports, many of which contain duplicative information. Agency heads and policymakers can't keep up with the constant flow of reports and tend to rely upon personal briefers who focus chiefly upon reports produced in-house.
  • Contractors do about 30 percent of the work that used to be performed by employees of the Central Intelligence Agency and other federal agencies. Most contracting firms are public corporations ultimately answerable to their shareholders, not the government -- an inherent conflict of interest. Moreover, they pay more than the government does and are actively recruiting senior analysts and other experienced government employees.
  • Diffusion and duplication of effort and lack of interagency communication have serious real-world consequences. In the months before the 2009 Fort Hood shootings, multiple agencies picked up snippets of worrying information about the eventual culprit, but these snippets never reached the organization responsible for internal Army counterintelligence investigations, which had opted to focus on examining civilian terrorist affiliations in the United States even though the U.S. Department of Homeland Security and 106 Federal Bureau of Investigation task forces were doing the same thing. Similarly, multiple agencies held tidbits of information about the 2009 Christmas Day bomber but failed to "connect the dots," not only because they were flooded with data but also because no single agency was clearly responsible for doing so.

Filter failure -- all those reports! all those overlooked data snippets! all the other records these agencies create! -- seems to be a real problem for the nation's intelligence agencies, but merely developing better filters won't fix it. Neither will hiring more people and making new information technology investments -- the preferred solution of Military-Industrial Complex 2.0. Priest and Arkin's work makes it plain that intelligence, homeland security, and counterterrorism information proliferates, and relevant information goes unidentified, because the agencies that produce it grew quickly and with minimal planning, have overlapping and nebulously defined missions, lack the incentive or desire to share information with one another, and, in some instances, use secrecy and national security as mechanisms for compartmentalizing information and justifying their own continued existence. In other words, the real problem is not one of information filtering but one of information creation. Whether we as a nation have the political will needed to fix this problem remains to be seen.
