Program Evaluation & Benchmarking in Steps
Assessing program performance relative to expectations and external quality standards
The word program often describes specific actions for which specific resources have been allocated to achieve specific outcomes. Common synonyms for program include initiative, project, plan, and priority. Each embodies an underlying intentionality, purposefulness, or promise.
At Performance Fact, we encourage clients to undertake program evaluations to assess how well their actual experiences match expectations. We also support benchmarking – determining how well the program stacks up against best-in-class peers or how it compares with widely accepted standards.
A program evaluation may address the following areas and questions:
- Effectiveness: To what extent did the program achieve its intended outcomes?
- Efficiency: Do the program’s practices reflect wise use of resources (people, time, and money)?
- Implementation: Was the program put in place as originally designed? If not, why not?
- Linkage: Can the outcomes produced be attributed to the program rather than to other, simultaneous factors?
- Value-add: Do the benefits and outcomes derived from the program justify the cost of producing them?
These same program evaluation questions can be adapted to program benchmarking, which adds a comparison program or a set of recognized excellence standards. The comparison could be a program in the same organization or sector, or in an entirely different one.
Both evaluation and benchmarking offer insights that contribute to improving the program. Each serves as a mirror, showing us how we can get even better if we choose to learn from our findings.
▶ Recommended time frame for the Program Evaluation and Benchmarking process: Middle or end of year
STEP BY STEP
Design a focused evaluation process.
- Reaffirm original program intent and objectives
- Engage stakeholders regarding proposed evaluation design
Gather credible evidence and analyze the data.
- Collect soft and hard data, consistent with the program’s aims
- Analyze the data from multiple perspectives, including contrarian voices
- Delineate evaluation design questions that were problematic, inconclusive, or not addressed
Summarize key findings and justify your conclusions.
- Highlight major conclusions and recommendations, with supporting evidence
- Agree on path forward, including dissemination and follow-up, as appropriate