
This website was created by the OECD Observatory of Public Sector Innovation (OPSI), part of the OECD Public Governance Directorate (GOV).

How to validate authenticity

Validation that this is an official OECD website can be found on the Innovative Government page of the corporate OECD website.

Design Thinking and Rapid Impact Evaluation for Public Sector Innovation

Combining Design Thinking and Rapid Impact Evaluation methodologies allows for a user-centered counterfactual to measure the impact of the current program design, while simultaneously identifying programmatic design flaws and generating solutions for improvement. The participative process for co-developing the counterfactual triangulates multiple user perspectives and data sets to provide new lines of evidence to pursue innovative solutions and offer a more iterative approach to evaluation.

Innovation Summary

Innovation Overview

To our knowledge, the ESDC Innovation Lab has pioneered an approach mixing Design Thinking and Rapid Impact Evaluation (RIE) to improve the responsiveness of a grants and contributions (Gs&Cs) program to user needs and to provide lines of evidence for evaluation. Dr. Andy Rowe developed the RIE methodology with a team of evaluators in 2004 and 2005. Rowe intended RIEs to generate evidence that might fill gaps in traditional summative evaluation processes, and to do so in a shorter time than traditional approaches allow. By balancing numerous and divergent stakeholder perspectives, his approach gauges the impact of a program, forecasts future impacts in terms of probability and magnitude, and weighs benefits.

The Lab diverged from past RIEs used in the Government of Canada, which typically used the absence of the program as a counterfactual to measure impact. Instead, the Lab developed a human-centered counterfactual, engaging diverse stakeholder perspectives through multiple workshops and co-creating and stress-testing solutions to current challenges with participants. Early in the pilot, the Lab met end-users in the field through semi-structured group interviews that used emotional triggers to draw out personal experience. This wide-ranging preliminary fieldwork was intended to capture the challenges experienced by end-users. The information surfaced emerging themes that were then substantiated against current program efforts and translated into a visual graphic recording - effectively summarizing and communicating complex user stories.

The Lab then held two workshops involving 15 and 80 participants respectively. Representation included government stakeholders from across departmental branches, beneficiaries of program funding, unsuccessful applicants for funding, front-line providers, end-users, businesses, and not-for-profit organizations serving end-users. Participants in the design thinking process came from diverse communities: women's groups, groups representing ethnic minorities (Indian, Sri Lankan, Somali, Jewish, Chinese, Indigenous, etc.), the LGBTQ community, and people of differing abilities, and ranged in age from 50 to 80 years old - the target age range for the program.

For each workshop, the Lab developed original, tailored collaborative activities engineered to cover gaps revealed in the fieldwork and in in-house knowledge-sharing about the program. Assumptions were repeatedly checked against Dr. Andy Rowe's five stages of RIE and the five design thinking stages; the two methodologies meshed well. The Lab workshops included a reconsideration of the Theory of Change through problem-framing divergence and empathy; considered the direct impacts of potential newly designed interventions as part of the program design (e.g. emerging ideas for counterfactual scenarios); and identified who would be most affected by these proposed interventions, how users would be affected, and on what scale.

The Lab facilitated discussions to explore solutions to current programmatic tensions through gamification and pushed the boundaries of polarity thinking through back-casting. User empathy was built through meta-ethnographic approaches, including empathy mapping, persona development based on qualitative fieldwork research, and direct interactions with frontline personnel serving end-users in diverse communities. The Lab analyzed narrative threads, triangulating and reporting on these insights for use in the RIE.

A final multi-component counterfactual emerged from discussions among the Evaluation Directorate, IBA consultants, and the ESDC Innovation Lab. Three alternate components to the current program design were proposed: 1) greater internal coordination between program components, 2) an open portal, and 3) greater cross-sector collaboration for new partnerships. From a design thinking perspective, these alternate program scenarios were seeds for prototyping to refine, flesh out, and test further. For the RIE methodology, a final workshop hosted program owners and operations staff (technical advisors), academic researchers (subject-matter experts), and frontline organizations serving seniors that had been both successful and unsuccessful applicants. At this workshop, participants measured the three-pronged counterfactual against a matrix of current program indicators and additional indicators drawn from user stories garnered during the Lab workshops, and assessed its legality, feasibility, and desirability against the current program design.

In the end, two components were retained by the program. The co-development process for this three-pronged counterfactual opened a window to new evidence to investigate further in a summative evaluation and/or to inform immediate programmatic changes. This approach leveraged divergent and convergent perspectives to challenge the fundamentals of a program and carved out a space for exploring both incremental and more radical innovation.

Innovation Description

What Makes Your Project Innovative?

RIEs have been used a handful of times by the Government of Canada. The Government of Canada Guide to Rapid Impact Evaluation lists pilots at three departments: the Public Health Agency of Canada, Public Safety Canada, and Natural Resources Canada.

In preliminary research for this RIE, the Lab consulted reports and interviewed individuals involved in past trials. These RIEs used the absence of the program as the counterfactual. The Lab sought to diverge from past approaches by developing a human-centered alternate scenario, co-developed with beneficiaries/end-users, frontline organizations, internal stakeholders, technical program experts, businesses, and subject-matter experts. This approach opens up possibilities unconstrained by current programmatic practices and legacy and, as such, allows the balanced experiences of end-users to create a robust counterfactual, challenge potential institutional bias, and push for greater innovation.

What is the current status of your innovation?

The findings of the RIE pilot have been shared internally with program owners and with the evaluation team who will be conducting a summative evaluation of the program this year. The ESDC Innovation Lab, the Evaluation Directorate, and IBA consulting have met regularly to share lessons learned. This fall, the Lab began delivering presentations across the Government of Canada's innovation ecosystem and internationally to share lessons learned from this RIE pilot.

Throughout the project, the Lab systematically examined the dimensions of this new approach to RIE as it evolved. This was done as a reflective study through post-mortem discussions with the Evaluation Directorate and IBA consultants. The Lab conducted post-mortems following every workshop and fieldwork activity to advance its design practices, and conducted key informant interviews with participants to garner feedback and improve its approach.

Innovation Development

Collaborations & Partnerships

-Citizens shared irritants/feedback/participated in workshops

-Program owner/technical experts/SMEs stress-tested the counterfactual for legality and feasibility

-Lab provided a direct link between government, citizens and external organizations. Lab interventions yielded user insights that framed a counterfactual.

-Evaluation Directorate provided a conduit between the Lab and government officials

-IBA consultants triangulated views and assessed counterfactual

-Ottawa Council on Aging hosted one event

Users, Stakeholders & Beneficiaries

-Citizens: Beneficiaries/users identified design flaws + potential solutions to irritants/needs + program improvements
-Program owner: counterfactual + user insights vs. their current design
-Other Gs&Cs programs: user insights
-Evaluation Directorate: learned a new methodology with design thinking for other projects
-Lab: Learned about RIE, with design thinking + org development, a great combo for policy innovation, program/service design. Looking for new project to test and scale

Innovation Reflections

Results, Outcomes & Impacts

The RIE provided government officials with up-to-date information on program outcomes and client experience and assessed the utility of RIE as an innovative approach to evaluation. The human-centered approach to RIE generated a three-pronged counterfactual that responded to the challenges and experience of end-users and addressed institutional bias by including cross-sector and front-line perspectives throughout the design process - from the early discovery fieldwork to the final testing of program impact against the counterfactual. The three alternative scenarios informing the counterfactual were feasible and realistic. Results and findings of this pilot project have been shared internally and repurposed to inform future iterations of calls for proposals for this program. The pilot demonstrated that bringing design thinking to the RIE methodology has innovation potential and should be repurposed for other programs, policies, and service delivery areas.

Challenges and Failures

1) Our approach would benefit from greater investigation into the appropriate balance between qualitative and quantitative data frameworks and methods of analysis. The Lab noticed that evaluators were more steeped in quantitative study and that government officials preferred metrics by which to understand qualitative findings. This, however, created a tension, as translating qualitative insights into quantifiable metrics threatened to eliminate crucial nuances.

2) It is more comfortable to explore counterfactuals within existing programmatic boundaries. RIE with design thinking, however, demonstrated the potential to challenge program fundamentals, paving the way to counterfactuals unattached to existing program boundaries and legacy. A counterfactual is an alternative reality to the existing design that responds to unmet needs and can yield promising evidence worth investigating.

3) The matrix to assess counterfactuals should be co-developed with users - not just SMEs.

Conditions for Success

The success of this mixed approach lies in having leadership support for the exploration of counterfactuals that may be very different from the existing program design. This divergence allows us to garner new evidence that can challenge potential conscious and unconscious bias and bring innovative breakthroughs.

The approach requires time. This pilot was not so rapid after all - it took 7-8 months to complete because we had to analyze and curate a large amount of qualitative data, identify participants, and meet routinely among partners to craft our approach moving forward. It would likely be 1-2 months faster if we were to do it again, but we would not sacrifice the development of a human-centered counterfactual for the sake of time.

This RIE requires technical expertise in design thinking and organizational development to facilitate difficult conversations and to design interventions tailored to a problem space.
A counterfactual may also be applied to a single component of a program rather than the whole.

Replication

Design Thinking paired with RIE simultaneously evaluates the mistakes of the past and develops insights for the future. Design Thinking provides data rapidly to measure impact while making implementation more iterative, flexible and responsive. This combined approach offers a channel to course-correct design failures and operational flaws more quickly. We believe the approach we created for RIE can apply to all phases of policy development, from policy options to program design, program operations, and service delivery. It is a cost-effective way to do rapid-prototyping with inclusive design in mind and to collect new evidence for policy, program iterations, and service design.

Lessons Learned

1) For future RIEs, we believe it is important to focus on balancing the fundamentals of both Design Thinking and RIE in order to preserve the value inherent in each. More specifically, the empathetic process defined by Design Thinking offers a unique opportunity to evaluate the components of greatest impact to a user. Design Thinking also exposed factors that impeded or facilitated organizational readiness for change in responding to user needs. There could be a push to expedite the gathering of user insights to fit the "rapid" requirements and expectations; make sure you spend enough time collecting, curating, and segmenting your qualitative information. A proxy approach (e.g. using internal staff instead of end-users) does not yield the same outcomes.

2) The assessment of the counterfactual should bring together participants who have been involved throughout the sensemaking stages of the Lab workshops. For this pilot, participants in the final workshop had not been involved in the previous Lab workshops, and it was challenging for them to appreciate the nuances of the proposed components of the counterfactual and to accept the counterfactual assessment matrix. In hindsight, a different assessment process may be more valuable, perhaps using more than one tool to assess impact. For example, a ritual dissent process with scorecards could test and enhance counterfactuals by subjecting them to ritualized dissent (challenge) or assent (positive alternatives) rather than a single matrix assessed individually.

3) Unpolished, extreme ideas from the Lab workshops were left on the table to satisfy the government's technical interpretation of feasibility or legality for RIE requirements. Having more than one counterfactual to measure against the existing design could allow a few ideas to stretch innovation a bit further.

4) This RIE had no design specifications or boundaries, which led to some scope creep. Counterfactuals could benefit from a narrower problem statement.

Anything Else?

The ESDC Innovation Lab is an in-house human-centered design lab that operates at the nexus of policy development, program operations, and service delivery. The Lab applies and mixes behavioral science methods such as behavioral insights, organizational development, and design thinking to improve government programs and services housed in ESDC, developing innovative solutions to complex social problems with Canadians, for Canada. Jordana Globerman and I, Catherine Charbonneau, had the vision and resolve to adapt Dr. Andy Rowe's RIE methodology to human-centered innovation and experimentation in the public sector. We approached this project with an open mind and believed our approach would have enormous potential to scale and to be repurposed across sectors. It provides both promising hard data and soft qualitative evidence to inform decisions, and it identifies and mitigates risks early by working in an alternative reality before launching new programs, policies, and services.

Year: 2019
Level of Government: National/Federal government

Status:

  • Diffusing Lessons - using what was learnt to inform other projects and understanding how the innovation can be applied in other ways

Date Published:

18 November 2019
