Experimentation Works (EW)

Experimentation Works (EW) is a Government-of-Canada initiative to build public servants’ capacity in experimentation skills and practice through a learning-by-doing model that supports and showcases 5 small-scale experiments in the open. EW seeks to generate practical examples of experiments and ensure open access to learning materials, progress updates and results for broad impact. It works by connecting project teams with each other, and with experts, in an open-by-default “cohort model”.

Innovation Summary

Innovation Overview

The Government of Canada has an ambitious agenda for experimenting with new approaches and measuring what works, to support evidence-based decision-making and instil a culture of rigorous experimentation in government. Despite a range of separate efforts, the government’s vision for rigorous experimentation exceeds both the level of readiness across the public service (see challenges below) and the number and type of resources dedicated to supporting and enabling departments and agencies to advance this work. As a result, there has been a risk that the Government of Canada’s commitment to build capacity and connect experimentation with evidence-based decision-making will not be met. Specific challenges include a lack of understanding (e.g. of the difference between innovation and experimentation), a lack of known examples of experimentation, a lack of capacity and access to expertise, a lack of training, a lack of open sharing and reporting, a lack of horizontal networks, and a lack of supporting resources. Amidst these challenges, there are inspiring examples and lessons to learn from: how the Canadian federal policy innovation community has evolved and matured over the past five years, how leading-edge countries are embracing experimentation, and how the Canadian public service is embracing a collaborative, open, agile and action-oriented approach to learning and doing.

This is why the learning-by-doing experimentation model called Experimentation Works (EW) was born. The initiative combines the creation and broad dissemination of a series of modules and other supportive tools and resources with a unique “experimenting in the open” approach. EW builds public servants’ capacity in experimentation skills and practice through a unique learning-by-doing model designed to support and showcase 5 small-scale experiments run by and for public servants. By showcasing and supporting department-led experiments from start to finish, EW seeks to build capacity and practical understanding of the value and process of experimentation, while generating new examples of federal experiments and ensuring open access to related learning modules, progress updates and results for broad impact.

EW has four distinct phases, briefly described below:
(1) SETUP phase is where partnerships (i.e. participating departments) and any relevant contracting/contribution agreements are validated and formalized. This is also where we create and curate training modules and resources with experts internal to government. Finally, the experiment selection process and the creation of EW teams happen during this phase.

(2) EXPERIMENT phase covers the onboarding of EW teams and the customization of their training, followed by the execution of department-run experiments (define, design, run and evaluate) with support from the core EW team and the EW experts.

(3) RESULTS phase is all about plain-language results blogging on individual experiments, which also happens throughout the experiment phase. This phase is also where the high-level reporting on the EW process as a whole takes place.

(4) IMPACT phase is where departments conduct a six-month post-mortem (e.g. a blog post) on their EW experiment(s) and publicly share what they learned, what changes they may be making based on the results (e.g. a follow-up experiment, investing in building more internal capacity) and the impacts, if any, on decision-making.

What does success look like?

● Showcase and support concrete experiments to illustrate what experimentation is, what it takes to run an experiment, the value of experimentation, and Canada’s commitment to an experimental and evidence-based government.

● Provide hands-on training to a specific cohort of public servants through a process that supports taking action, problem-driven and rigorous experimentation, learning by doing, partnerships, and open government.

● Provide open-access training to all public servants through the development of learning modules on the experimental process, available to everyone.

● Build networks of capacity across the federal government by developing a cohort of public servants who gain practical experimentation experience, taking inspiration from other cohort development models.

Innovation Description

What Makes Your Project Innovative?

The Experimentation Works model is an innovation in the experimentation field because:

1) Experimentation Works (EW) is about building Government-of-Canada capacity in experimentation mindset and practice through learning by doing. We see EW as a new and concrete way to help build experimentation capacity, an approach that has not been tried in any government.

2) EW showcases small-scale experiments in the open. By showcasing and supporting department-led experiments from start to finish, EW hopes to demonstrate the value and process of experimentation, while generating new examples of federal experiments in the open. We also want to share the process, outcomes and lessons learned as broadly as possible.

3) EW is developed as a cohort. Cohorts learn better, because learning is all about relationships. Participants grow together and build knowledge of what it takes to go from the beginning to the end of an experiment.

What is the current status of your innovation?

As of the date of submission in 2018, Experimentation Works (EW) is in its first cohort. We are now at the experimentation stage, which means that the project teams from each department are running their experiments (i.e. defining, designing, running, and evaluating them). As this happens, the EW cohort is receiving customized support and training to run their experiments.

Further, the teams are beginning to share that information within government and with the public (design decisions, how the experiments are going, challenges, etc.) through our Medium blog.

Innovation Development

Collaborations & Partnerships

Experimentation Works (EW) is about building Government-of-Canada capacity. We partner with departments and agencies across the Government of Canada both to run the experiments (project teams) and to support them (EW experts). We currently have 4 departments offering experts and 4 departments running 5 experiments.

The initiative is governed by an interdepartmental Assistant Deputy Minister Committee on Experimentation which consists of senior management across 16 different departments and agencies.

Users, Stakeholders & Beneficiaries

Participation in EW provides participating departmental project teams with a number of benefits, including:

- Access to a range of learning materials and expert advice
- Access to a cohort of other public servants going through the same experimentation process
- Exposure to senior management and to the public through blogging about experiences

As part of our commitment to open by default, materials, events and lessons learned are made widely available for public benefit (e.g. to other governments).

Innovation Reflections

Results, Outcomes & Impacts

We are in the process of launching an internal review of the initiative. This review will focus on three main questions:

1. To what extent did this initiative enhance experimentation capacity in participating departments?
2. To what extent did the EW core team contribute to the successful implementation of the experiments?
3. How appropriate is the governance structure for the EW initiative to support experiments?

Notably, the subsequent phases of Experimentation Works are focused on results, outcomes and impacts.

Phase III: Results
This phase focuses on the results of each individual experiment, shared through different fora (e.g. the blog). It also includes the review led by the Internal Audit and Evaluation Bureau.

Phase IV: Impact
A post-mortem from each participating department will be the main focus of this phase. The results, the contributions made, the lessons learned and the key challenges will be shared on one or more of the public platforms.

Challenges and Failures

One of our initial assumptions was that teams from departments across the government would be fully willing and able to fill out our proposal template. However, we found that many teams, even those very interested in running an experiment, did not have enough capacity to answer the questions we laid out in our template in a way that would allow us to properly assess their proposals. While participant teams are clearly subject-matter experts, they had less grounding in experimentation than we anticipated, and we had to provide a lot of early-stage support.

Another major challenge was that one of the project teams dropped out. Their agile approach to the project was at odds with the EW framework of running a thoughtful and rigorous experiment. This was a key lesson and a critical insight about the relationship between agile and experimentation. We believe this was an important failure and lesson to capture in the open, and we have an upcoming blog post about the experience.

Conditions for Success

Strong support from leadership was an enabler for all the organizations involved. This was a helpful foundation, although we relied on it less heavily as the projects progressed. Formalizing the partnerships with the different organizations involved strengthened the interdepartmental approach and provided the necessary clarification of roles and expectations. The cohort model also helped create a de facto community of practice in experimentation.

Replication

The Experimentation Works model is ripe for replication. Factors that make this possible include:

- The belief that rigorous evidence should inform decision-making
- A need to build experimentation capacity in government
- The ability to leverage internal resources and experts
- Interest, both domestically and abroad, in learning from and applying this model

To confirm this, a review was undertaken at the request of the Experimentation Works (EW) initiative to report on the effectiveness and replicability of the initiative by December 2018.

The Internal Audit and Evaluation Bureau will review the relevance and effectiveness of the EW initiative, its value to departments and its replicability. This research will examine causation (i.e. how the initiative causes change) and attribution (i.e. whether observed changes can be attributed to the initiative or were caused by other factors).

Lessons Learned

As we started planning for EW we had a number of ideas and assumptions about how it would work. And, to be frank, we had a number of fears as well (which, really, are just negative assumptions). This part of the EW process has caused us to reflect on and question many of the ideas that underpin the project, and to develop some next steps if and when we run a future EW or similar projects.

Lessons for the future

1) In the near term, it is important to bring the experts in very early, forgo or delay the complete application process, and instead work closely with teams to translate their subject-matter expertise into ‘experimentable’ questions.

2) An implication of the lesson above is that the role of experts becomes more important than previously thought (and we already thought they were pretty important!).

3) We seem to need our experts more, and earlier, than we had imagined. In the medium-to-long term, it might be possible to return to the ‘purer’ up-front application system if departments’ experimentation capacity increases across the board and we again fear that our expert-to-department ratio would be imperiled.

4) Finally, build flexibility into the model so that projects that do not end up being experimental by design, or that cannot meet the timelines, can still participate in the initiative. Their products, outcomes and lessons learned should be as much a part of the process as those of the projects that succeed in producing experiments.

Anything Else?

The Experiments

Health Canada Experiment: Improving Consumer Incident Reporting is an experiment on the incident reporting website to better understand whether changes to the page can increase the number of Canadians who fill out the report form (a simplified analysis sketch for this kind of experiment follows the list below).

Canadian Heritage Experiment: Paul Yuzyk Award for Multiculturalism allows young Canadians to apply for micro-grants to support projects that advance diversity and inclusion. This experiment seeks to understand the impact of the grant and the potential to scale.

Natural Resources Canada Experiment (1): EnerGuide Label for Homes is an experiment to better understand if the EnerGuide label effectively conveys energy efficiency information to homeowners.

Natural Resources Canada Experiment (2): EnerGuide Label is an experiment to better understand if the EnerGuide label effectively conveys energy efficiency information to homeowners.

Canadian Digital Service Experiment: Rescheduling Citizenship Exams is a project aimed at developing a new and agile approach.
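To make concrete what evaluating one of these experiments can involve, here is a minimal, hypothetical sketch (in Python) of a two-proportion z-test, the kind of comparison a team could use to check whether a redesigned page yields a higher form-completion rate than the original, as in the Health Canada experiment. This is not the teams’ actual methodology; the helper function and all counts are invented for illustration.

# Hypothetical sketch only (not the EW teams' actual analysis): a two-proportion
# z-test comparing form-completion rates on an original vs. a redesigned page.
from math import sqrt, erf

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Return the z statistic and two-sided p-value for H0: p_a == p_b."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    p_pool = (successes_a + successes_b) / (n_a + n_b)       # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))   # pooled standard error
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))    # two-sided normal tail
    return z, p_value

# Invented visitor counts: original page vs. redesigned page.
z, p = two_proportion_z(successes_a=120, n_a=4000,    # original: 3.0% completion
                        successes_b=168, n_b=4000)    # redesign: 4.2% completion
print(f"z = {z:.2f}, p = {p:.4f}")

With these invented numbers the redesign’s higher completion rate (4.2% vs. 3.0%) would be statistically significant at conventional thresholds; a real experiment would also need to decide the sample size in advance and randomize which visitors see which page.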

Year: 2018
Level of Government: National/Federal government

Status:

  • Implementation - making the innovation happen

Innovation provided by:

Date Published:

28 January 2018
