Moving from “Nice” to “Necessary”
In a previous blog post about the evolving role of outreach for historical organizations, one of the points I introduced was the need to better evaluate and document programs. Too often we are seen as a “nice” place where audiences can enjoy our museums and properties on an occasional basis, or where elementary school students come for a brief annual field trip.
At a time when sources of funding for cultural and humanities organizations are under threat, we have to be in a position to share data that frames our work as a “necessary” component of the social/educational fabric of our communities and states. The History Relevance Campaign is making this argument at a high level, but how are individual organizations building evaluation capacity to make this case to their stakeholders?
About fifteen years ago one of the major corporate foundations in Minnesota hosted a summit meeting for all of the nonprofit organizations that had received funding through its grants program. The key message was that the foundation would no longer support proposals where the metrics of success were the number of people served and anecdotal reports that they “enjoyed it.” There was a lot of consternation in the room as it became clear that a whole new set of skills related to evaluation was going to be required for future funding. Building logic models, defining outcomes, and developing survey instruments aligned to those outcomes were to become the norm.
I think that many of the participants that day were willing to embrace the new challenge, but it was clearly unsettling that their organizations were no longer considered to have “inherent value” to the community. Even if they embraced a new evaluation culture, what could they effectively measure?
Museums with experience in visitor studies had a slight head start in this new world, but assessing the impact of exhibits and programs on audiences was a different challenge than the processes that had informed the creation of those experiences.
Conny Graft has taken a number of SHA cohorts through the transformative work she organized at Colonial Williamsburg. My key takeaways from her presentation were that you had to approach evaluation with an open mind, embrace findings that challenged your assumptions, and be willing to act upon the data. Too often I have seen evaluation framed as a form of institutional affirmation. Surveys of visitors tried to ascertain whether they “liked us,” “really liked us,” or “really really liked us.” Add all of those percentages together and your “like index” is likely to be really high…but questions of impact, the attitudes of non-visitors, and new ways to better engage audiences can go undiscovered.
The evaluation mantra that Conny shared about challenging yourself to think about how exhibits and programs have an impact on what visitors “Know, Feel, and Do” was also profound as this way of thinking informed outreach evaluation. Working outside the bounds of a historic site or museum makes evaluation more complex as you don’t always have a controlled environment. Schools and public program venues create fluid relationships where, in many cases, the presenter is the guest.
It is that fluidity that also creates unique opportunities for meaningful, and measurable, engagement. Freed from the fixed assets of museum environments, it is in some ways easier to act upon evaluation findings and to ask questions that reveal high degrees of impact and relevance.
Here are some examples of both “actionable” and profoundly “reportable” data I have measured over the past ten years:
- A large group of K-12 social studies teachers was asked whether “providing resources that would amplify the traditional narratives of U.S. history with state and local examples would increase student engagement.” 92% reported YES.
- 78% of History Day parents reported that their child’s experience with the program increased college readiness.
- 88% of History Day students in Minnesota reported that their projects made them more confident about their future success in school.
- Overall, the academic and social-emotional learning metrics for History Day students of color were higher than those of white students.
With this data in hand we were more confident in our plans to provide supplemental U.S. history resources, and to make a funding case to donors and governmental stakeholders that the Minnesota Historical Society could be seen as an effective agent of change in breaking down achievement gaps.
One of Conny’s final presentation points was that, “Evaluation is not a process…it is a way of thinking about everything you do.” If something is worth doing, it is worth measuring. Because in the end, the essence of evaluation is storytelling. Thoughtfully collected data combined with compelling images and supporting narratives can dramatically change the way people feel about your organization. In turn, it is easier to make the case that historical organizations are profoundly necessary to your communities.
Evaluation will be one of the key topics at the SHA Wednesday Workshop during the 2017 AASLH Annual Meeting in Austin. I hope to see many of you there…and don’t forget to fill out those evaluations.
Tim Hoogland is Director of Education Outreach Programs at the Minnesota Historical Society and Affiliated Instructor of History for the University of Minnesota. He is a member of the SHA Class of 2008.