Design Thinking and Rapid Impact Evaluation for Public Sector Innovation

Combining Design Thinking and Rapid Impact Evaluation methodologies allows for a user-centered counterfactual to measure the impact of the current program design, while simultaneously identifying programmatic design flaws and generating solutions for improvement. The participative process for co-developing the counterfactual triangulates multiple user perspectives and data sets to provide new lines of evidence to pursue innovative solutions and offer a more iterative approach to evaluation.

Innovation Summary

Innovation Overview

To our knowledge, the ESDC Innovation Lab has pioneered an approach that mixes Design Thinking and Rapid Impact Evaluation (RIE) to improve the responsiveness of a Gs&Cs program to user needs and to provide new lines of evidence for evaluation. Dr. Andy Rowe developed the RIE methodology with a team of evaluators in 2004 and 2005. Rowe intended RIEs to generate evidence that might fill gaps in traditional summative evaluation processes, and to do so in a shorter timeframe. By balancing numerous and divergent stakeholder perspectives, his approach gauges the impact of a program, forecasts future impacts in terms of probability and magnitude, and weighs benefits.

The Lab diverged from past RIEs used in the Government of Canada, which typically used the absence of the program as a counterfactual to measure impact. Instead, the Lab developed a human-centered counterfactual, engaging diverse stakeholder perspectives through multiple workshops and co-creating and stress-testing solutions to current challenges with participants. Early in the pilot, the Lab met end-users in the field, conducting a semi-structured group interview that used emotional triggers to elicit personal experiences. This preliminary fieldwork, wide in scope, was intended to capture the challenges experienced by end-users. The information gathered identified emerging themes, which were then substantiated against current program efforts and translated into a visual graphic recording - effectively summarizing and communicating complex user stories.

The Lab then held two workshops involving 15 and 80 participants respectively. Representation included government stakeholders from across departmental branches, beneficiaries of program funding, unsuccessful applicants for funding, front-line providers, end-users, businesses, and not-for-profit organizations serving end-users. Participants in the design thinking process came from diverse communities: women's groups, groups representing ethnic minorities (Indian, Sri Lankan, Somali, Jewish, Chinese, Indigenous, etc.), the LGBTQ community, and persons with different abilities, and spanned ages 50 to 80 - the target age range for the program.

For each workshop, the Lab developed original, tailored collaborative activities engineered to cover gaps revealed in the fieldwork and in in-house knowledge-sharing about the program. Assumptions were repeatedly checked to align Dr. Andy Rowe's five RIE stages with the five design thinking stages. The two methodologies mixed well. The Lab workshops included a reconsideration of the Theory of Change through problem-framing divergence and empathy; considered the direct impacts of potential newly designed interventions within the program design (e.g. emerging ideas for counterfactual scenarios); and identified who would be most affected by these proposed interventions, how users would be affected, and on what scale.

The Lab facilitated discussions to explore solutions to current programmatic tensions through gamification and pushed the boundaries of polarity thinking through back-casting. User empathy was cultivated through meta-ethnographic approaches, including empathy mapping, persona development based on qualitative fieldwork research, and direct interactions with frontline personnel serving end-users in diverse communities. The Lab analyzed narrative threads, triangulating and reporting on these insights for use in the RIE.

A final multi-component counterfactual materialized through discussions of the findings among the Evaluation Directorate, IBA consultants, and the ESDC Innovation Lab. Three alternate components to the current program design were proposed: 1) greater internal coordination between program components, 2) an open portal, and 3) greater cross-sector collaboration for new partnerships. From a design thinking perspective, these alternate program scenarios were seeds for prototyping to refine, flesh out, and test further. For the RIE methodology, a final workshop hosted program owners and operations staff (technical advisors), academic researchers (subject-matter experts), and frontline organizations serving seniors that had been successful and unsuccessful applicants. At this workshop, participants measured the three-pronged counterfactual against a matrix of current program indicators and additional indicators drawn from user stories gathered during the Lab workshops, and assessed its impact in terms of legality, feasibility, and desirability against the current program design.

In the end, two components were retained by the program. The co-development process behind this three-pronged counterfactual opened a window to new evidence for further investigation in a summative evaluation and/or to inform immediate programmatic changes. This approach leveraged divergent and convergent perspectives to challenge the fundamentals of a program and carved out a space for exploring both incremental and more radical innovation.

Year: 2019
Level of government: National/Federal government


  • Diffusing Lessons - using what was learnt to inform other projects and understanding how the innovation can be applied in other ways
