Conquering the Dusty Shelf Report: Data Visualization for Evaluation

This is a guest post by Johanna Morariu (@j_morariu) and Ann Emery (@annkemery) who work for Innovation Network, a nonprofit evaluation, research, and consulting firm in Washington, DC.

You convened stakeholders from all aspects of the evaluation: grantees, their funders, community members, even service recipients. You designed a theory of change that demonstrates how a national initiative will achieve its goals over the duration of a grant. You collected data through interviews, surveys, and other assessments. You spent dozens (hundreds?) of hours analyzing the data. You summarized key findings in a 30-page evaluation report—and even included 30 more pages of appendices with details about the analyses. After 12 months of blood, sweat, and tears, you’re finally ready to share the evaluation findings with your client.

And then, too often, that hard-won report lands on a shelf and gathers dust. We’re battling that Dusty Shelf Report daily: by assessing evaluation capacity, by building that capacity, and through participatory data analysis.

Our strongest weapon against the Dusty Shelf Report is data visualization. These three tactics have proven most powerful:

Tactic #1: Captivate readers with visuals. We’re also battling the Sea of Similar Reports. Our readers are not immune; like us, they’re caught in a constant downpour of reports, memos, and emails. To make matters worse, the typical evaluation report is about 80% text and 20% visuals—paragraphs of text with one small chart per page. In our State of Evaluation 2012 research, we used the opposite tactic: 80% visuals and 20% text. As we suspected, the visualizations started conversations, made the information more accessible, and helped the (dry) research results stay afloat in the Sea of Similar Reports.

[Image: page spread from the State of Evaluation 2012 report]

To grab the reader’s attention, we often deviate from the typical one-chart-per-page evaluation report format.

Tactic #2: Choose the design that’s right for the client, not the design that’s right for the data. Yep, we know. This is a major dataviz sin. We hear you. Last year, we developed a social network analysis (SNA) for a foundation. Our SNA followed by-the-book design principles, but there was a major problem: it didn’t match the client’s dataviz literacy. They were dataviz novices and preferred to ‘visualize’ their data in a table. The SNA was completely unusable, thrown onto the heap of Dusty Shelf Reports. Now we’re strategically moving them forward, bit by bit, to the point where they can benefit from SNA. Until then, we’re using tables. Lesson learned: have a contingency plan. What’s beautiful in theory doesn’t always work in real-world situations.

[Image: social network map by Innovation Network]

In evaluation, we use social network maps to visualize relationships between key players in a network.
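If you want to experiment with network maps yourself, here is a minimal sketch in Python using the networkx library (our own maps are made in Gephi, as noted in the comments below). The organizations and connections are hypothetical, included only to show the mechanics.

```python
import matplotlib.pyplot as plt
import networkx as nx

# Hypothetical network: which organizations report working together
edges = [
    ("Foundation", "Grantee A"),
    ("Foundation", "Grantee B"),
    ("Grantee A", "Grantee C"),
    ("Grantee B", "Grantee C"),
    ("Grantee C", "Community Group"),
]
G = nx.Graph(edges)

# Size each node by its number of connections so key players stand out
sizes = [300 + 300 * G.degree(n) for n in G.nodes()]

# Force-directed layout, similar in spirit to Gephi's default layouts
pos = nx.spring_layout(G, seed=42)
nx.draw_networkx(G, pos, node_size=sizes, node_color="#9ecae1",
                 edge_color="gray", font_size=8)
plt.axis("off")
plt.tight_layout()
plt.savefig("network_map.png", dpi=150)
```

Sizing nodes by their number of connections is one simple way to make the most connected players pop without any extra annotation.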

Tactic #3: Strengthen the dataviz literacy of your troops. Even though we’re visual thinkers, our clients might not be. Lead with visuals your readers are familiar with (like bar charts). Emphasize data they’re interested in (like contextual details). With a few adjustments, you can transform even the most lackluster bar chart. In this example, we combined a closed-ended survey question (the stacked bar chart) with a corresponding open-ended survey question (the quotes). Uniting quantitative and qualitative data provides the context they’re craving. Once they’re engaged in the conversation and comfortable with the basics, you can move on to more advanced visualizations.

[Image: stacked bar chart paired with open-ended survey quotes]

Throughout the evaluation field, surveys are one of the most common ways to collect data.
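We build charts like this with Excel and Word text boxes (details in the comments below), but if you prefer to prototype in code, here is a minimal sketch in Python with matplotlib: stacked bars on the left and a panel of paired quotes on the right. The survey percentages and quotes are hypothetical placeholders.

```python
import matplotlib.pyplot as plt
import numpy as np

# Hypothetical closed-ended survey results (percent of respondents per question)
questions = ["Q3", "Q2", "Q1"]   # listed bottom-to-top so Q1 plots at the top
agree     = np.array([70, 40, 55])
neutral   = np.array([20, 35, 25])
disagree  = np.array([10, 25, 20])

# Left panel holds the stacked bars; right panel holds the paired quotes
fig, (bars, quotes) = plt.subplots(
    1, 2, figsize=(9, 3), gridspec_kw={"width_ratios": [2, 1]})

bars.barh(questions, agree, color="#2b8cbe", label="Agree")
bars.barh(questions, neutral, left=agree, color="#cccccc", label="Neutral")
bars.barh(questions, disagree, left=agree + neutral, color="#fdae6b",
          label="Disagree")
bars.set_xlim(0, 100)
bars.set_xlabel("Percent of respondents")
bars.legend(ncol=3, frameon=False, fontsize=8)

# Hypothetical open-ended responses, one per question, shown beside the bars
quotes.axis("off")
quotes.text(0, 0.75, '"The training changed how we\nreport to our board."',
            style="italic", fontsize=8)
quotes.text(0, 0.45, '"We still struggle to find time\nfor evaluation."',
            style="italic", fontsize=8)
quotes.text(0, 0.15, '"Our funder asked for outcomes,\nnot just outputs."',
            style="italic", fontsize=8)

plt.tight_layout()
plt.savefig("stacked_bar_with_quotes.png", dpi=150)
```

Keeping the quotes in their own panel mirrors the text-box approach and keeps the bars readable as percentages.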

So, how do you combat the Dusty Shelf Report? Leave your comments below or get in touch via our Twitter accounts: Johanna Morariu (@j_morariu) and Ann Emery (@annkemery). Thanks!

5 Comments

Janet M. Murray | May 7th, 2013 at 3:29 pm

Great to see this level of visualization and high production quality.

Question: What graphics/desktop publishing program are you using?

George Ressler | May 8th, 2013 at 1:45 am

As a designer and researcher, I love tactic two because a key foundation of design is user-centric design: in other words, don’t make things for yourself; make them for the users. This is true with data viz too. I believe in taking research insights and spending a great deal of time understanding how they will live within an organization. Once you understand how the organization will use the insights, they can be designed to activate and empower that organization.

Like that old saying about a tree falling in a forest: if a research report is never read, does it have any value?

Ann Emery | May 9th, 2013 at 3:49 pm

Janet,

Thanks for the kind words about the visualizations! And good question re: how we made them.

The first image is a screenshot from our fuller State of Evaluation report. We drew, doodled, and sketched the designs on paper, and then made the majority of the charts using Excel and pasted them into Word. When our analyses and visualizations were complete, we passed our report to the graphic designer, who made sure the fonts and color schemes were just right. I think she used an Adobe product.

The second image is a social network analysis that Johanna made in Gephi. For more info, you can read a related blog post by Johanna here: http://aea365.org/blog/?p=5689

The third image is simply some Excel elbow grease. We make the bar charts in Excel, paste them into Word, and use the Insert Shapes option to add text boxes and the gray connecting lines.

You can get pretty creative with free and nearly free dataviz software options these days! Hope this is helpful.
Ann Emery

Data Viz News [6] | Visual Loop | May 11th, 2013 at 1:59 pm

[...] Conquering the Dusty Shelf Report: Data Visualization for Evaluation | Visualising Data [...]

The Week That Was – 2013-05-13 | May 14th, 2013 at 2:15 pm

[...] Report: Data Visualization for Evaluation,” Visualizing Data, 07-May-2013. [Online]. Available: http://www.visualisingdata.com/index.php/2013/05/conquering-the-dusty-shelf-report-data-visualizatio…. [Accessed: [...]