October Educational Luncheon:
From soup to nuts, inclusion, evaluation, and impact
Don't collect data just to collect data.
What do you really need to know?
Every fundraiser knows it is vital to share the impact of our organization’s programs and mission effectively with funders and other stakeholders. But where do we start? How do we determine which method is best for our specific organization?
Approximately 60 members and guests learned why and how to gather and evaluate impact data at the October 24 AFP educational luncheon, “Impact Data: From soup to nuts, inclusion, evaluation and impact.” Jennifer Charpentier, PhD, CFRE, Executive Director of the Gateway Technical College Foundation, moderated a panel discussion with the following experts:
- Emily Connors, M.S., Research and Planning Analyst for ProHealth Care
- Lorna Dilley, Impact Manager for Milwaukee Succeeds Initiative at the Greater Milwaukee Foundation
- Kenneth Evans Johnson, ABD, co-founder of HJ Impact
Panelists recommended starting the evaluation process by determining which type of evaluation is needed. Generally, there are four types: needs assessment, process evaluation, outcome evaluation, and impact evaluation.
To determine which type of evaluation best suits your needs, panelists recommended starting with a conversation with your funders. Find out what data they require, what questions they need answered, and what framework drives those questions. If you have multiple funders with different requirements, establish your own process and communicate it to your funders.
Next, assemble an evaluation team. Take diversity into consideration (ethnicity, age, LGBTQ status, disability, etc.) both when forming the team and throughout the evaluation process itself. Different groups are affected differently by your programs, and learning these differences will help your organization adapt and modify programs accordingly.
While data collection will vary based on the type of evaluation you conduct, one panelist noted that certain types of data are always collected, captured by the acronym BACKS: behavior, attitude, circumstance, knowledge, and skills.
Proper funding is key to maintaining a quality evaluation process; evaluation should even have its own budget line to keep it a formal part of the program. Evaluation should also begin as soon as possible. As one panelist noted, “Data is like bread, not wine. It’s better when it’s fresh.”
All these tips circle back to the “why” of evaluating data in the first place. Beyond helping programs achieve maximum effectiveness, impact data is critical to our jobs as fundraisers: it provides the basis for the stories we tell about our mission and offers direct evidence of how funders can transform their communities through philanthropy.
For those looking for more information, panelists recommended the evaluation resources of the Centers for Disease Control and Prevention (https://www.cdc.gov/eval/tools/index.htm) and the American Evaluation Association (https://www.eval.org/).