Since the introduction of a US Open Data Policy in 2013, domestic agencies - particularly in the foreign assistance sector - have made great strides in opening up their financial, programmatic, and evaluation information. As recently highlighted by the Center for Global Development:
The [USAID Development Experience Clearinghouse (DEC)] claims to contain nearly 200,000 program documents spanning the last 50 years. In 2014 alone, USAID published just over 150 evaluations, totaling more than 18,000 pages. We applaud USAID for this tremendous transparency. Posting these evaluations in the public domain is the hallmark of what a transparent development actor should be.
Yet despite recent work to create a new DEC Application Programming Interface (API), USAID evaluation information ultimately lies within a warren of PDFs. As Christopher Ingraham asked Washington Post readers, what if the solutions to all our problems are buried in PDFs that nobody reads? While researchers at CGDev have reviewed DEC PDFs, the likelihood that others have done so (or will) is rather low. According to the World Bank, only 13 percent of its documents are downloaded more than 250 times; one third haven't been downloaded at all. Ever.
Downloadable PDFs are great, but we can do better. What if we could transform that locked-down information into easy-to-search, machine-readable formats? That capability is already being developed - at least for USAID.
To “liberate” DEC information, AidData is piloting a new dataset under its ongoing partnership with the USAID Global Development Lab / Higher Education Solutions Network (HESN); the dataset will be launched by USAID in the near future at usaid.gov/data. These machine-readable data will give researchers, data scientists, and practitioners new levels of access to mine, analyze, and create “new” knowledge from old lessons. In practice, this means that the next time CGD analyzes USAID evaluations, it will be able to run free-text queries across many evaluation PDFs, searching for the use of specific evaluation methods in an automated way. This could enable a meta-analysis of USAID evaluations examining how evaluation methods have changed over time, which methods or types of projects tend to receive favorable or unfavorable evaluations, or other research questions of interest.
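To make the idea of automated free-text queries concrete, here is a minimal sketch of what such an analysis might look like once evaluation text is machine-readable. The sample records and the keyword list are entirely hypothetical, invented for illustration; they are not actual DEC content or part of the AidData dataset.

```python
import re
from collections import Counter

# Hypothetical records: once evaluations are machine-readable, each one
# might pair an evaluation ID with its full text. These snippets are
# illustrative placeholders, not real DEC documents.
evaluations = {
    "eval-001": "This impact evaluation used a randomized controlled trial design.",
    "eval-002": "A quasi-experimental difference-in-differences approach was applied.",
    "eval-003": "The team conducted key informant interviews and focus groups.",
}

# An assumed, non-exhaustive list of evaluation-method keywords to query.
METHOD_KEYWORDS = [
    "randomized controlled trial",
    "difference-in-differences",
    "focus group",
]

def count_method_mentions(texts, keywords):
    """Count how many evaluations mention each method keyword at least once."""
    counts = Counter()
    for text in texts.values():
        lowered = text.lower()
        for kw in keywords:
            if re.search(re.escape(kw), lowered):
                counts[kw] += 1
    return counts

counts = count_method_mentions(evaluations, METHOD_KEYWORDS)
```

Scaled up across decades of evaluations, tallies like these are the raw material for the kind of meta-analysis described above, such as tracking how often experimental designs appear over time.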
Of course, simply unleashing this information doesn’t mean we’ve solved everything. We still need to learn how to effectively interpret data for better decision-making and creatively employ it to spark on-the-ground innovations. As CGD suggests, this would be much easier if DEC documentation were properly associated with the programmatic and financial data contained in the Foreign Assistance Dashboard. As a first step, USAID and AidData are collaborating to organize a Hackathon on April 30-May 1 at the OpenGov Hub, using the DEC, Foreign Assistance Dashboard, and other data sources in tandem to identify trends and stories in the impact and effectiveness of US Foreign Assistance. Stay tuned for more updates along the way.
This article was originally published on AidData's The First Tranche.