Tuesday, June 29, 2010

2010 Best Practices Exchange: proposals due tomorrow

The Polly Rosenbaum Archives and History Building, the new home of the Arizona State Library, Archives and Public Records, 21 October 2008. If you attend this year's Best Practices Exchange, you'll probably get to see this superb facility.

The 2010 Best Practices Exchange (BPE) will take place in Phoenix, Arizona, Wednesday, 29 September through Friday, 1 October 2010. The deadline for submitting presentation proposals is tomorrow, 30 June 2010.

This year, each BPE session will focus on one of the following topics:
  • New ways of working: strategically rethinking digital curatorial strategy
  • New tools: practical, hands-on techniques, tips, and tools
  • New media: collecting and preserving social media and the rising use of audio and video formats
  • Policy and administration: sustainable programs and commitment to long-term preservation
  • Additional tracks that evolve out of proposals from the community

The BPE is not a conventional conference, and that's one of the reasons it's so wonderful. It attracts about 80-100 attendees, which means that people tend to get to know each other quickly. Presentations lean toward the informal and the collaborative, reporting on projects in progress is welcome, and audience members are strongly encouraged to ask questions and offer their own perspectives. Everyone is also urged to discuss failures and problems as well as successes, and I've always found this aspect of the BPE to be particularly valuable. It's deeply comforting to realize that other people have made similar mistakes, and it's really helpful to know who to call or e-mail when problems akin to those discussed at the BPE rear their ugly heads.

The BPE tends to attract a lot of state government electronic records archivists and digital librarians, but it's open to just about everyone interested in preserving digital materials:
  • Local, state, and federal government and university archivists and librarians
  • Educators and researchers in the fields of library science, information science, technology, archives, and records management
  • Product developers working to create systems for managing and preserving digital assets
Owing to the generous support of the Library of Congress's National Digital Information Infrastructure and Preservation Program, there's no registration fee for this year's BPE. This year's call for proposals also includes a handy-dandy proposal submission form, so you don't even have to go to the trouble of sending anyone an e-mail. If you have a project you would like to discuss, by all means take a few minutes to pull together a couple of paragraphs and complete that form. You won't regret doing so.

Thursday, June 24, 2010

NYAC/ARTNY: open source

Westbound on the Franklin Delano Roosevelt Mid-Hudson Bridge, late evening, 4 June 2010.

[I wrote this post a while ago, but I spent some time away from home and, for the most part, the Internet. Now that I'm back in cyberspace, the pace of posting is going to pick up a bit.]

While at the recent joint meeting of the New York Archives Conference and Archivists Roundtable of Metropolitan New York, I took part in Session 12, Using Open Source Software. I had a good time putting together and delivering my presentation, and both of my co-presenters were stellar. Although the three of us met only a few minutes before our session began, our presentations meshed well, in large part because all of us approached open source in the same pragmatic fashion; i.e., we largely avoided the open-source-versus-proprietary-software debate and encouraged people to use open source software when doing so met their business needs.

I discussed the Open Source Initiative’s definition of open source software, open source as a model of software development in which programmers work independently and collaboratively to write code and review each other’s work, and open source as a philosophical belief that sharing information and knowledge is good in and of itself and a spur to the development of more information and knowledge. I also detailed the practical advantages (e.g., no cost of acquisition) and disadvantages (e.g., support and technical documentation that may range from excellent to abysmal) of open source software, and I highlighted some open source applications of particular value to archivists working in smaller repositories.
I was really pleased that I got the chance to hear Seth Kauffmann, who spearheaded the development of CollectiveAccess, discuss its development, functionality, and support options. CollectiveAccess is a cataloging and Web presentation system that supports every descriptive standard (and local variant thereof) and that can be used to gather and manage information about people and places as well as digital surrogates of cultural heritage materials. After highlighting PhilaPlace and some other really cool projects powered by CollectiveAccess, he offered some sound words of advice to anyone involved in an archival software development project:
  • Planning is good. This point may seem painfully obvious, but it’s all too often overlooked. (I would hasten to add that “we’ll leave all the technical stuff up to the programmer/vendor because it’s his/her/its job” is not a plan.)
  • Don’t assume that all of your problems can be solved with software or technology.
  • Involve real users in the development process. Seth stressed that one of CollectiveAccess’s strengths is its “community of self-interest development model” -- archivists who have grants to “do something real” drove its creation and guide its evolution.
  • Be realistic about the quality and extent of your existing metadata and digitized resources.
  • Involve archivists at the start of the development process and keep them involved throughout the project. (No argument here.)
Rick Cobello, the Schenectady County Information Technology Director, then detailed how the county is deploying open source software at the enterprise level. Fiscal pressures are forcing governments to choose between cutting operating costs and laying off staff, and open source software will enable the county to reduce costs and provide better services.

At present, roughly half of Schenectady County’s IT budget is devoted to fixing and decontaminating desktop computers. The county is now centralizing almost all of its storage and applications, and most county employees will have only a monitor, keyboard, mouse, and Pano Logic client that will enable them to access the county’s central servers. The county will no longer have to install antivirus protection and update software on desktops, and desktop support staff will be able to focus on other projects. Estimated cost savings: at least 30 percent.

The county attorney’s office is now using OpenOffice.org, and Rick plans to move other county offices to OpenOffice.org after the county’s licensing agreements with Microsoft expire. Although the county is currently using a mixture of open source and proprietary software and will continue to use specialized proprietary software (e.g., geographic information system applications) well into the future, Rick’s ultimate goal is to stop paying licensing fees for any of the software needed to support routine office operations.

Rick emphasized that the county has support contracts for almost all of the open source software that it uses: he believes in supporting organizations that create open source software, and paying for support is less expensive than paying licensing fees. I find this approach both altruistic and smart: in addition to sustaining worthwhile projects, he's helping to ensure that the software he's using will be updated and enhanced.

Public-sector budgets always lag behind the economy. The coming years are going to be extremely tight, and I think that a lot of government IT directors are going to make many of the same decisions that Rick has made -- and look for other ways in which open source, among other things, can save money. It’s going to be really interesting to see just how things pan out.

Monday, June 14, 2010

NYAC/ARTNY: E-mail management and preservation

The Hudson River, as seen from Walkway Over the Hudson State Historic Park, Poughkeepsie, New York, 4 June 2010.

This post concerns something that happened 11 days ago, which I suppose makes me a full-fledged slow blogger . . . .

I noted a little while ago that I thought that Session 1, which focused on the Archivists’ Toolkit, was the highlight of the recent joint meeting of the New York Archives Conference/Archivists Roundtable of Metropolitan New York. However, Session 7, Management and Preservation of E-mail, was a very, very close second.

Nancy Adgent of the Rockefeller Archive Center (RAC) got things rolling by discussing one of my favorite preservation projects, the Collaborative Electronic Records Project (CERP) undertaken by the RAC and the Smithsonian Institution Archives (SIA).

Using e-mail drawn from both repositories' holdings, which included Microsoft Outlook .pst files as well as messages in a variety of other formats, the CERP team developed separate workflows that took into account their institutional differences. They also tested off-the-shelf conversion tools and developed a parser that converts e-mail messages and attachments to an XML-based preservation format; over 99 percent of the 89,000 testbed messages parsed successfully.
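
The CERP parser itself is far more sophisticated, and the project's XML schema far richer, but the basic idea of walking an account and serializing each message's headers, body, and attachment references into XML can be sketched in a few lines. The sketch below is a hedged illustration only: the mbox source, element names, and output file are my own assumptions, not the CERP parser or its schema.

    # Minimal sketch of converting an e-mail account to a simple XML preservation
    # format. This is NOT the CERP parser or its schema; the element names, file
    # paths, and mbox source are illustrative assumptions only.
    import mailbox
    import xml.etree.ElementTree as ET

    account = ET.Element("account", source="smith.mbox")

    for msg in mailbox.mbox("smith.mbox"):
        m = ET.SubElement(account, "message")
        # Capture a handful of core headers as child elements.
        for header in ("From", "To", "Date", "Subject", "Message-ID"):
            ET.SubElement(m, header.lower().replace("-", "")).text = msg.get(header, "")
        # Record the plain-text body, and note (rather than embed) any attachments.
        for part in msg.walk():
            if part.get_content_maintype() == "multipart":
                continue
            if part.get_filename():
                ET.SubElement(m, "attachment",
                              filename=part.get_filename(),
                              mimetype=part.get_content_type())
            elif part.get_content_type() == "text/plain":
                payload = part.get_payload(decode=True) or b""
                ET.SubElement(m, "body").text = payload.decode("utf-8", errors="replace")

    ET.ElementTree(account).write("smith-account.xml", encoding="utf-8", xml_declaration=True)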

Along the way, they learned a host of lessons:
  • Transfers from active systems go most smoothly when an archivist and an IT person work together.
  • Minor problems will arise. Some attachments that should have accompanied messages were missing, most likely because they were stored in a central server and the messages were copied from individual users’ desktops. The dates of some individual messages were replaced by the date they were bundled into a .pst file, and the name of the author was sometimes changed to that of the archivist who examined the file. Most strikingly, the installation of new GIS software on testbed equipment changed the display of some message fonts.
  • E-mail requires some processing before it’s converted to XML. The CERP team used a variety of off-the-shelf tools in order to do so.
  • Different anti-virus tools sometimes yield different results. The CERP team used Kaspersky and Symantec, each of which detected a few viruses that the other didn’t find.
  • Searching file names is not a foolproof means of identifying “sensitive” materials. Although both repositories conducted such searches and either removed sensitive information from access copies or documented its existence in finding aids and metadata, they realized that some sensitive information was still lurking in the messages.
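
The last lesson above is easy to demonstrate in miniature: a file-name search never touches message bodies, so some sort of body-level scan is needed to catch, say, Social Security numbers. The following sketch is an assumption-laden illustration; the pattern, mailbox path, and output are mine, not anything either repository actually ran.

    # Illustrative sketch: scan message bodies (not just file names) for one kind
    # of sensitive content. The SSN pattern and mbox path are assumptions.
    import mailbox
    import re

    SSN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")  # crude pattern; real screening needs more

    def flag_sensitive(mbox_path):
        """Yield (subject, match) pairs for messages whose bodies contain an SSN-like string."""
        for msg in mailbox.mbox(mbox_path):
            for part in msg.walk():
                if part.get_content_type() != "text/plain":
                    continue
                text = (part.get_payload(decode=True) or b"").decode("utf-8", errors="replace")
                for match in SSN.findall(text):
                    yield msg.get("Subject", "(no subject)"), match

    for subject, match in flag_sensitive("accession-2010-042.mbox"):
        print(f"Review needed: {subject!r} contains {match}")
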
Paul Szwedo of the New York State Office of Real Property Services (ORPS) then discussed how his 300-person agency began managing its e-mail. To the best of my knowledge, ORPS has made a lot more progress than any other New York State agency, and I hope that other agencies follow its example.

Prompted by recent changes in the Federal Rules of Civil Procedure, ORPS decided that staff had to take responsibility for their own information. It first reviewed, updated, and consolidated its e-mail, Internet, and IT equipment use policies so that they harmonized with the state’s information security policy, ethics law, and relevant executive orders. It then amended its e-mail retention policy, which now mandates that users who want to keep their e-mail longer than 120 days must move it into a centralized archive.

ORPS already owned EMC’s DiskXtender archiving product and purchased the EmailXtender (now SourceOne) component (which the Obama White House also uses) for e-mail archiving. Instead of keeping every message, the agency opted to create folders based on length of retention period and rely upon staff to file messages appropriately. Folder access is customized by unit, so units that create records with longer retention periods can manage them properly and those that don’t can’t keep records forever.
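
As a rough illustration of how such a scheme might be represented (the folder names, retention periods, and unit assignments below are hypothetical, not ORPS's actual schedule), the policy reduces to two mappings: folders to retention periods, and units to the folders they may use.

    # Hypothetical sketch of folder-based e-mail retention. Folder names, periods,
    # and unit permissions are illustrative only, not ORPS's actual configuration.

    RETENTION_DAYS = {
        "Transitory-120": 120,
        "Fiscal-3yr": 3 * 365,
        "Assessment-6yr": 6 * 365,
        "Policy-Permanent": None,   # None = retain permanently
    }

    FOLDERS_BY_UNIT = {
        "Customer Service": ["Transitory-120", "Fiscal-3yr"],
        "Legal": ["Transitory-120", "Fiscal-3yr", "Policy-Permanent"],
    }

    def can_file(unit, folder):
        """A unit may only file messages into folders tied to its retention needs."""
        return folder in FOLDERS_BY_UNIT.get(unit, [])

    assert can_file("Legal", "Policy-Permanent")
    assert not can_file("Customer Service", "Policy-Permanent")  # can't keep records forever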

After the system was installed, ORPS began providing training and guidance to staff. Staff had eight weeks to manage their existing e-mail, and Lotus Notes’ save-to-local-drive option was disabled. Project leaders sent out reminders to staff and stressed that managers were responsible for ensuring that unit records were managed properly and that unit staff knew how to do so.

Paul identified a number of key success factors:
  • Backing of senior management. (In a brief conversation we had after the session, he indicated that senior managers’ support was the single most important factor. If only securing that backing were as simple as recognizing its importance.)
  • Policy development preceded implementation. Business needs should drive IT investment, not the other way around.
  • Staff educated themselves via State Archives workshops and discussions with other agencies.
  • Availability of funding
  • Records management liaisons served as a test group, which facilitated identification of problems and prepared the liaisons to handle questions from other staff
  • All tutorials, videos, and communications relating to the project were placed online
The challenges were nonetheless substantial:
  • Turning a “save everything” organization into an organization that manages its information requires a lot of effort
  • Some people saw e-mail management as a distraction from their “real” work
  • Everyone wanted more time to review and sort messages
  • Upper managers retired, resulting in loss of momentum
  • Networking staff had other responsibilities thrust upon them
  • People save e-mail for easy reference, and don’t necessarily think of it as a record
Project staff also learned several lessons along the way:
  • Don’t assume people are paying attention. Despite repeated warnings and reminders, one person did not review and organize his/her messages and as a result lost all of them.
  • Elicit concerns, and do so upfront if at all possible (As Fynette Eaton has pointed out, this is a key principle of change management)
  • Weigh overhead against policy. On several occasions, ORPS had to tweak its policies because they were placing an undue burden on network personnel.
Christine Midwood of Iron Mountain Digital ended the session by highlighting how new products and services can help address the challenges associated with e-mail:
  • Legal risks and discovery. New products can provide consistent, rapid search across email archives, apply litigation holds by message (and apply multiple holds to a given message), and manage e-discovery cases so that teams can access only those messages responsive to the case they’re working on.
  • Expense. New products can apply retention schedules, streamline costs via outsourcing storage to a cloud environment (my take: the cloud isn’t ready for state and local government records), and consolidate archiving, business continuity, security, anti-virus, etc. functions into a single product.
  • Data loss. Technology can provide a tight, documented chain of custody, capture complete delivery information, and consolidate or eliminate message files stored on individual users’ hard drives.
  • Privacy. Software can now block or quarantine e-mail that contains prohibited or suspect content (e.g., Social Security Numbers) and provide role-based access to e-mail (whole message, metadata only, no access)
  • Productivity. New products offer “continuity” features that minimize e-mail outages, eliminate e-mail quotas, and automate application of retention policies.
In response to a question from an audience member, she made a really interesting point: Iron Mountain and other vendors work with companies that are keenly aware of the risks of keeping information too long, and as a result they get few inquiries about how to handle e-mail that has a permanent retention period. She noted that allowing end users to sort and classify their own messages might open the door to permanent retention, but I suspect that something more (e.g., migration/conversion, preservation metadata) is going to be needed.

Saturday, June 12, 2010

South Carolina Department of Archives and History needs your help!

Reading room of the South Carolina Department of Archives and History, early evening, 18 November 2009.

Earlier this week, Governor Mark Sanford issued a series of vetoes that threaten the state's records management and cultural heritage institutions. The vetoes include three cuts to the South Carolina Department of Archives and History (SCDAH) budget totaling $980,945; the State Legislature had approved a fiscal year 2010-2011 budget of $2,445,764 in state funds plus $200,000 in stimulus funds for the agency. Funding for the state's Confederate Relic Room and Military History Museum was eliminated entirely.

I know several archivists who work for SCDAH. All of them are whip-smart, devoted to their work, kind, and fun-loving, and I can't even begin to imagine how these cuts will impact their lives. However, the people of South Carolina will, in the long run, pay the greatest price. If allowed to stand, the governor's vetoes will make it all but impossible for SCDAH to fulfill its mission "to preserve and promote the documentary and cultural heritage of the state through archival care and preservation, records management, public access, historic preservation, and education." Anyone interested in ensuring government accountability, researching South Carolina ancestors, promoting meaningful and fact-based history education, or visiting historic sites is going to suffer as a result of these vetoes.

If you are a South Carolina resident and have used or plan to use SCDAH's programs or services, please contact your legislator and let him or her know how these cuts will affect you. Also, please contact Representative Dan Cooper, who chairs the House Ways and Means Committee, and Representative Chip Limehouse, who chairs the House subcommittee that deals with SCDAH. The Legislature will be in session on Tuesday, 15 June to consider Governor Sanford's vetoes, so time is of the essence.

If you are an out-of-state resident, consider contacting Cooper and Limehouse and letting them know that you have either used or plan to use SCDAH's resources and that cutting its services will eliminate your incentive to spend your lodging, restaurant, shopping, and transportation dollars in South Carolina.

Thursday, June 10, 2010

Louisiana governors' records bills defeated

Treme, a young green sea turtle found cold-stunned in December 2009, in her temporary home at the Audubon Aquarium of the Americas, New Orleans, Louisiana, 21 March 2010. Aquarium staff were planning to release Treme and another rehabilitated green sea turtle into the Gulf of Mexico sometime this summer. Owing to the Gulf oil spill, those plans are on hold and the aquarium has taken in 32 oil-slicked turtles, 3 of which have died.

Earlier today, proposed governors' records legislation sponsored by Representative Wayne Waddell (R-Shreveport) died in the Louisiana State House of Representatives. The legislation would have narrowed (but not eliminated) the gubernatorial exemptions in Louisiana's open records law. In addition, it would have compelled governors to transfer their records to the Louisiana State Archives upon leaving office and to open them to researchers 10 years after doing so. A similar bill sponsored by Senator Robert Adley (R-Benton) perished in the Louisiana State Senate last week.

When compared to the oil spill that is wreaking environmental and economic havoc along the Gulf Coast, public records legislation may seem like a trifling concern. However, it's not. Government records document the rights of citizens to vote, receive benefits for which they are eligible, and hold real and other property, and good management and proper disclosure of these records enables citizens to ensure that their government is acting honestly and responsibly. Gubernatorial records, in particular, document important policy and resource allocation decisions, and states such as Louisiana (and New York) do their citizens a real disservice by not insisting that these records be managed, preserved, and made accessible in a clearly defined and systematic manner.

Louisiana has suffered more than its fair share of disasters in recent years, and it seems all but certain that recovering from the oil spill will be slow, difficult, and painful. Ensuring that the records of the state's governors are managed properly and made accessible won't save the state's wetlands, wildlife, or economy, but it will enable the people of Louisiana to assess the words and deeds of their leaders, determine what worked and what didn't, and plan for the future.

Here's hoping that in 2011, Louisiana gets the gubernatorial records legislation its citizens deserve.

Wednesday, June 9, 2010

Happy International Archives Day!

International Archives Day 2010 poster courtesy of the Archivo Histórico Nacional, Spain.

Today is the third International Archives Day. This celebration hasn't gotten much traction in the United States, most likely because of the established nature of American Archives Month, but repositories and archives professional associations throughout the world are using this day to highlight the importance of archives to collective memory and governmental accountability and transparency.

Something very important to American archivists nonetheless happened today: the U.S. House of Representatives Subcommittee on Information Policy, Census and National Archives heard testimony from the Archivist of the United States and lots of other people concerning the importance of funding the National Historical Publications and Records Commission and passing legislation creating the Partnership for the American Historical Record (PAHR). I didn't get the chance to watch the hearings and have yet to learn how they went, but I have my fingers crossed. I really want to see PAHR pass this year! (Update, 10 June 2010: colleagues told me earlier today that the hearings, which continued today, focused solely on the NHPRC. PAHR will likely get hearings of its own.)

Here's a sampling of the repositories doing cool stuff for International Archives Day. Enjoy!

Monday, June 7, 2010

2010 Best Practices Exchange

It's Best Practices Exchange time again! It's Best Practices Exchange time again! Well, it will be in a few months.

The 2010 Best Practices Exchange (BPE) will take place in Phoenix, Arizona, from Wednesday, 29 September through Friday, 1 October. David Ferriero, the Archivist of the United States, and Laura Campbell, Associate Librarian for Strategic Initiatives at the Library of Congress and the leader of the National Digital Information Infrastructure and Preservation Program, will deliver keynote addresses. The program will also feature a one-day pre-conference workshop on digital preservation management.

BPE 2010 is open to practitioners in government and university archives and libraries; educators and researchers in the fields of library science, information science, technology, archives, and records management; and product developers working to create systems for managing and preserving digital assets. In my opinion -- and I've had the good fortune to attend every BPE -- it's the most stimulating, invigorating, inspiring, and fun archival professional conference series that I've ever attended.

The BPE is not a conventional conference. As in past years, the program will include exchange sessions with presentations by individuals working in the field, followed by facilitated discussion. These grass-roots sessions are informal and collaborative. Attendees are encouraged to ask many questions and offer their own perspectives. This year, the sessions will focus on:
  • Track 1: New ways of working: strategically rethinking digital curatorial strategy
  • Track 2: New tools: Practical, hands-on techniques, tips, and tools
  • Track 3: New media: Collecting and preserving social media, the rising use of audio and video formats
  • Track 4: Policy and administration: Sustainable programs and commitment to long-term preservation
  • Track 5: Additional tracks that evolve out of proposals from the community

In addition to submitted proposals, this year's program will feature presentations from the four States Initiatives projects funded by the Library of Congress's National Digital Information Infrastructure and Preservation Program (NDIIPP).

The deadline for submission of proposals is 30 June 2010. More information about submitting proposals, making travel arrangements, and registering for the conference (there is no fee for doing so!) is on the Best Practices Exchange 2010 Web site.

Sunday, June 6, 2010

NYAC/ARTNY: Archivists' Toolkit

The Hudson River, as seen from the grounds of the Vanderbilt Mansion National Historic Site, Hyde Park, New York, 4 June 2010.

Last week, I attended the joint meeting of the New York Archives Conference (NYAC) and the Archivists Roundtable of Metropolitan New York (ARTNY), which was held at Marist College in Poughkeepsie. Unfortunately, Mac-using attendees discovered upon arrival that, despite Marist’s promises to the contrary, they could not connect to Marist’s wireless network. Now that I’ve reconnected, I’ll put up a couple of posts about the highlights of this year’s conference.

In my view, the best session of the conference was Session 1, “Implementing, Modifying, and Teaching the Archivists' Toolkit.” The Archivists’ Toolkit (AT) is an increasingly popular open source tool that supports accessioning, location management, and description of archival materials, and the session itself attracted a capacity crowd.

Janet Bunde of New York University (NYU) discussed a recent effort to integrate the AT into NYU’s Advanced Archival Description course so that students, who typically lacked the funds needed to attend AT workshops sponsored by the Society of American Archivists, would become familiar with the tool and hone their descriptive skills. The students reviewed the AT user’s manual in advance, then devoted an entire class session to entering sample data into the AT. At the end of the class, students discussed where they entered specific data elements and the descriptive output that resulted. Although the discussion wasn’t as extensive as Bunde would have liked, it shed light on students’ descriptive choices and revealed that, despite the use of some odd terminology, the AT’s interface is relatively intuitive.

Bunde stressed that this exercise didn’t, in and of itself, teach archival description, but it made me think about how to do so. I created a handful of MARC records while working as a student assistant, but I really didn’t feel comfortable with description until I found myself responsible for reviewing MARC records created by archivists at other repositories. I soon acquired an intimate knowledge of MARC and the ability to differentiate between acceptable variations in local practice and out-of-bounds tag usage. I really like the idea of having students openly compare and defend their descriptive choices, and using the AT as a teaching tool has real promise, particularly if, as NYU plans to do this fall, it’s incorporated more fully into the course curriculum.

Deena Schwimmer of Yeshiva University discussed how her repository, which has only two professional staffers and few IT resources, used the AT to centralize, as quickly as possible, holdings and descriptive information about its manuscript collections. Working with a clerical assistant, Schwimmer first culled holdings information from donor files and the relatively small number of MARC records describing the collections and entered it into the AT. Then, working in tandem with an intern who created collection-level descriptions, she used the AT to create Encoded Archival Description (EAD) finding aids that contained only the most basic descriptive elements: Biographical/Historical Note, Scope and Content, Abstract, Conditions Governing Access, Conditions Governing Use, Language of Materials, and Title and Date information. She also used the AT to manage the project: she added fields that identified whether an EAD finding aid had been produced and enabled her and her intern to exchange notes about specific collections.
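
For readers who haven't seen a bare-bones finding aid, here is a hedged sketch of what those minimal elements look like in EAD 2002 terms, generated with Python's standard library. The collection details are invented, and the AT's actual EAD export is considerably fuller than this.

    # Sketch of the minimal EAD 2002 elements mentioned above, built with the
    # standard library. Collection details are invented; a valid finding aid also
    # needs an <eadheader>, and the Archivists' Toolkit emits much fuller EAD.
    import xml.etree.ElementTree as ET

    ead = ET.Element("ead")
    archdesc = ET.SubElement(ead, "archdesc", level="collection")

    did = ET.SubElement(archdesc, "did")
    ET.SubElement(did, "unittitle").text = "Example Family Papers"           # Title
    ET.SubElement(did, "unitdate", normal="1920/1975").text = "1920-1975"    # Date
    ET.SubElement(did, "abstract").text = "Correspondence and photographs."  # Abstract
    lang = ET.SubElement(did, "langmaterial")
    ET.SubElement(lang, "language", langcode="eng").text = "English"         # Language of Materials

    for tag, text in [
        ("bioghist", "Brief biographical or historical note."),   # Biographical/Historical Note
        ("scopecontent", "Brief scope and content note."),        # Scope and Content
        ("accessrestrict", "Open for research."),                 # Conditions Governing Access
        ("userestrict", "Copyright status undetermined."),        # Conditions Governing Use
    ]:
        ET.SubElement(ET.SubElement(archdesc, tag), "p").text = text

    print(ET.tostring(ead, encoding="unicode"))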

Schwimmer’s project exemplifies what a single results-minded archivist can do with a well-chosen tool and a little student and clerical help. Before Schwimmer’s project began, approximately a third of Yeshiva’s 2500 linear feet of manuscript holdings had been described, and when the project wrapped up roughly 18 months later, every collection had at least a basic finding aid. I think we’re going to see lots of similar AT success stories during the next few years, and, needless to say, I think that this is a very good thing.

Marisa Hudspeth of the Rockefeller Archive Center (RAC) then discussed how her repository is building a new AT reference module that will both meet its needs and enable it to, via release of the module’s source code and documentation, give back to the archival community. The RAC had been using a proprietary tool that supported patron registration and tracking of duplication services, but moved to the AT because of its robust collections management and descriptive modules. When it became apparent that the AT development team's energies were focused elsewhere, the RAC decided to hire several former team members and build a reference module itself.

When it’s completed, the reference module will perform the following functions:
  • Patron registration: will track research visits, publications, completion of necessary research paperwork, and research awards; and facilitate generation of statistics and reports.
  • Duplication services: will manage all types of requests; create standardized invoices in PDF; store fee schedules and shipping rates and automatically calculate totals; track service requests; generate statistics and reports; and securely manage payment information.
  • Retrievals, bar-coding, and use tracking: will track use of materials by patrons; generate statistics and reports; automate the charge-out procedure using barcoding; add barcoding functionality to the AT’s Accession module; support printing of barcodes and box labels; and enable both archivists and researchers to submit pull requests electronically via clicking on boxes in the RAC’s EAD finding aids.
  • Reference requests and reading room scheduling: will electronically distribute reference requests to staff; allow staff to respond to requests within the AT; store request histories, staff assignments, and responses; generate statistics and reports; and enable archives that have limited research facilities to manage scheduling of research appointments and factor in holiday closings, weather incidents, and other events.
  • Personalized user accounts: will enable patrons to update their contact information, submit reference requests, schedule and cancel research appointments, and sign up for waiting lists; receive notifications of closings and research room vacancies; sign up for newsletters and the like; view an orientation video and agree to the RAC’s terms of use; track the status of their duplication requests; review their own request histories; bookmark and comment on finding aids; submit funding paperwork; electronically sign forms; and, if they wish to do so, connect with other researchers.
At present, the RAC doesn’t know how this reference module will work with ArchivesSpace, which will, when completed, merge the AT and Archon, another open source archival data management system. However, the RAC will release the code and continue using it, even if the module can’t be incorporated into ArchivesSpace.

After this session ended, I was talking to a colleague about the RAC’s work, and we were both struck by the degree to which reference-related information systems remain paper-driven -- not only at our repository but also at many, many others. Our own repository is currently developing some of the functionality that will be included in the reference module (e.g., barcoding and use tracking), but we’re still terribly paper-centric. The RAC’s work ought to help propel the move away from paper, and it’s going to be really interesting to see how this exciting project pans out.

If you are an AT user and want to track reference requests, duplication services, etc., electronically, the RAC is looking for reference module beta testers. The module’s first component -- patron registration -- should be finished within a few weeks, and the entire module has a scheduled completion date of 31 December 2011, so things are moving right along. If you're interested in serving as a beta tester, contact Marisa Hudspeth at mhudspeth-at-rockarch.org.

Tuesday, June 1, 2010

CNN's bad e-redaction

Earlier today, Al and Tipper Gore announced that they were separating after 40 years of marriage. As if this weren't awful enough, CNN posted online a PDF copy of the e-mail announcement that they sent to their friends and supporters . . . but without properly redacting their private e-mail address. Someone at CNN apparently drew a black box over the e-mail address, but didn't remove the underlying metadata. As a result, CNN readers who clicked on the black box were able to view the e-mail address hidden underneath. The post has since been revised to exclude the e-mail address, but the original version of the post was up on CNN's site for a couple of hours. Ugh.
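
If you want to see why a drawn-on box fails, consider that any PDF text-extraction library will read the "hidden" text right back out, because the rectangle is just another object layered on top of an untouched text stream. A quick sketch follows; the file name is hypothetical, and pypdf is just one of several libraries that could do this.

    # Sketch: why a black box drawn over text is not redaction. The underlying
    # text stream is untouched, so any extractor can read it.
    from pypdf import PdfReader

    reader = PdfReader("gore-announcement.pdf")  # hypothetical file name
    for page in reader.pages:
        text = page.extract_text() or ""
        if "@" in text:  # the "covered" address comes right back out
            print("Still present:", [word for word in text.split() if "@" in word])

    # Real redaction removes the text (and any associated metadata) from the
    # content stream before release, rather than merely covering it up.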

Regular readers of this blog will recognize that proper redaction of PDF files is one of my pet causes. If you ever need to redact a PDF file, here are a few tips that should help ensure that you won't end up like CNN . . . or the U.S. Transportation Security Administration, the U.S. Department of Defense, the Washington Post, the New York Times, Google, Facebook, or any of the other organizations that have been stung by their own (or their lawyers') lack of technical knowledge.