What is impact measurement? Four essential steps


Impact measurement is an evidence-based research process used by organisations to understand their long-term, positive and sustainable impact.

It’s increasingly recognised and applied by senior leadership teams to support growth, raise funds, improve decision-making, client outcomes and program development, and align stakeholders – including teams, partners and customers.

Importantly, impact measurement is applied in a range of settings where purpose-driven organisations are seeking to make a difference. C-suite leaders, boards, investors and senior executives may seek to measure their impact on individuals, clients, customers and communities.

For example, Mission Australia says it uses impact measurement to understand how its services improve the lives of clients and the communities where the organisation works. It allows them to “maximise client outcomes and improve our programs for existing and future clients.”

So how do you measure impact? There are many models and frameworks which emerged from two predominant fields.

Firstly, the field of monitoring and evaluation, or M&E, has a long history and strong adoption in many sectors, particularly international development.

More recently, social impact measurement approaches have developed that sit closer to the market-oriented needs and practices of social enterprises and impact investing organisations.

In this article, we’ll look at four essential steps on your impact measurement journey.

1. Ask the right questions

As with traditional research approaches, a well-formulated research question is essential.

At ImpactInstitute we use the PICO framework, which is widely used by evidence-based practitioners [Schardt et al 2007] and can be adapted for different sectors:

P – Patient, problem or population

I – Intervention (e.g. a program)

C – Comparison, control or comparator

O – Outcome(s)

Begin with this approach by asking yourself about the specific problem to be addressed, what intervention is needed or being evaluated, the relevant points of comparison and the anticipated or required outcomes.

Each of these elements helps to develop and refine the impact measurement approach.
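To make the framework concrete, here is a minimal sketch (in Python, with purely hypothetical program and population values) of one way a PICO question might be captured as a structured record before measurement begins:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class PicoQuestion:
    """Illustrative container for framing an impact measurement question using PICO."""
    population: str        # P – patient, problem or population
    intervention: str      # I – intervention (e.g. program)
    comparison: str        # C – comparison, control or comparator
    outcomes: List[str]    # O – outcome(s)

    def summary(self) -> str:
        return (f"For {self.population}, does {self.intervention} "
                f"compared with {self.comparison} improve "
                f"{', '.join(self.outcomes)}?")

# Hypothetical example: framing the evaluation of an employment support program.
question = PicoQuestion(
    population="long-term unemployed young people in regional areas",
    intervention="a 12-week mentoring and job-readiness program",
    comparison="standard employment services",
    outcomes=["employment within six months", "self-reported wellbeing"],
)
print(question.summary())
```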

2. Choose the timeframe for measurement

Outcomes can be measured over the short, medium or long term.

For the purposes of an evidence-based program, most organisations measure short-term or possibly medium-term outcomes, as these are the ones most likely to be directly related to the program.

Long-term outcomes, otherwise termed ‘impacts’, are often viewed as aspirational and are generally more difficult to measure.

However, governments, investors and other funders increasingly want to know that their investment is creating long-term, sustained change.

3. Consider traditional versus practical approaches to impact measurement

So, how do we approach developing a strong evidence base to inform impact framework development and impact measurement implementation? Do we have to sacrifice quality?

The relevance of research findings depends on their quality. The National Health and Medical Research Council (NHMRC) hierarchy of ‘levels of evidence’ rates systematic reviews and randomised controlled trials (RCTs) at the highest levels (see reference below).

We make good use of the findings of these traditional research studies to build the evidence base for the work we do.

However, conducting these types of studies is a time-consuming, resource-intensive process requiring specialised expertise, with a long wait for actionable results. It is impractical for most small to medium NGOs to use traditional research designs in impact measurement.

Mixed methods research uses both quantitative (numerical) and qualitative (non-numerical) research designs. A variety of other methods – quasi-experimental [World Bank 2010], rapid [RREAL] and lean data [Acumen] approaches – are increasingly being used to evaluate outcomes and impact.
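As a simple illustration of the quasi-experimental family, the sketch below works through a basic difference-in-differences calculation. All figures are invented; a real evaluation would involve matched groups, more data points and statistical testing:

```python
# Minimal difference-in-differences sketch; all figures are invented.
# Average outcome scores before and after a program, for participants and
# for a comparison group that did not receive the program.
scores = {
    "participants": {"before": 42.0, "after": 55.0},
    "comparison":   {"before": 41.0, "after": 47.0},
}

change_participants = scores["participants"]["after"] - scores["participants"]["before"]
change_comparison = scores["comparison"]["after"] - scores["comparison"]["before"]

# The estimated program effect is the difference between the two changes,
# netting out change that would likely have occurred anyway.
estimated_effect = change_participants - change_comparison

print(f"Change for participants: {change_participants}")  # 13.0
print(f"Change for comparison:   {change_comparison}")    # 6.0
print(f"Estimated effect:        {estimated_effect}")     # 7.0
```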

4. Collect and utilise data to drive learning and improvement

Although we may be good at collecting data, we have not been nearly as good at creating meaning from data collected in the course of social service delivery. There is an abundance of data held by governments and external providers that is never used by anyone [Tomkinson 2016].

Research must be robust and academically sound, whilst also being practical and agile. It must be designed with a continuous learning cycle in mind, where data is generated, analysed and reported in a time-relevant way to feed into the strategy, operations and storytelling of organisations.

This requires careful design so that data collection is embedded in the routine activities of organisations. Regular reporting back to staff and stakeholders, including clients, reinforces the usefulness of their efforts in collecting and providing the data.
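As one illustration of what embedding data collection in routine activities can look like, the hypothetical sketch below aggregates routinely logged client records into a simple per-program summary that could be reported back to staff and stakeholders each quarter (the programs and scores are invented):

```python
import statistics
from collections import defaultdict

# Hypothetical routinely collected records: each service interaction logs the
# program and a before/after outcome score for the client.
records = [
    {"program": "Housing Support", "score_before": 3, "score_after": 6},
    {"program": "Housing Support", "score_before": 4, "score_after": 5},
    {"program": "Job Readiness",   "score_before": 2, "score_after": 5},
]

def program_summary(records):
    """Aggregate routine records into a simple per-program average change."""
    changes = defaultdict(list)
    for record in records:
        changes[record["program"]].append(record["score_after"] - record["score_before"])
    return {program: statistics.mean(values) for program, values in changes.items()}

# Reporting a summary like this back regularly helps close the learning loop
# described above.
for program, average_change in program_summary(records).items():
    print(f"{program}: average outcome change {average_change:+.1f}")
```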

Ethics, data privacy and governance are critical, and are perhaps not as well governed in the social sector as in medical and health research settings.

Our approach

At ImpactInstitute, we use a combination of research methodologies, including primary and secondary research approaches to inform our work with clients:

  • Qualitative, quantitative and mixed methods research designs
  • Secondary research including cost-benefit and cost-effectiveness analyses (a minimal worked example follows this list), scholarly literature reviews and careful searches for existing data sources
  • Best practice stakeholder engagement including design thinking workshops, co-design, focus groups and one-on-one interviews with stakeholders.
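To illustrate the arithmetic behind a cost-benefit analysis, here is a minimal sketch with invented figures; a real analysis would monetise benefits carefully, discount future values and test its assumptions:

```python
# Minimal cost-benefit sketch with invented figures.
program_cost = 250_000.0        # total cost of delivering the program
monetised_benefits = 380_000.0  # e.g. avoided service costs plus participant income gains

net_benefit = monetised_benefits - program_cost
benefit_cost_ratio = monetised_benefits / program_cost

print(f"Net benefit:        {net_benefit:,.0f}")
print(f"Benefit-cost ratio: {benefit_cost_ratio:.2f}")  # a ratio above 1 suggests benefits exceed costs
```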

Our research is conducted with practical issues front and centre to ensure sustainability for the organisations we work with. This includes translating research findings into a clear, practical path forward, and considering ethical, privacy and data governance regulations and principles.

What’s your approach – and how can we help? Explore our impact services here.

 

References:

Gertler, Paul J., et al., Impact Evaluation in Practice, World Bank, Washington, D.C., 2010, pp. 81–116. See http://siteresources.worldbank.org/EXTHDOFFICE/Resources/5485726-1295455628620/Impact_Evaluation_in_Practice.pdf/

Lean Data, Acumen Academy. Accessed 5/10/2020 from https://acumen.org/lean-data/

NHMRC. Levels of Evidence and Grades for Recommendations for Developers of Guidelines, December 2009. Accessed 30/9/2021 from https://www.nhmrc.gov.au/sites/default/files/images/NHMRC%20Levels%20and%20Grades%20(2009).pdf.

Rapid Research Evaluation and Appraisal Lab (RREAL), University College London. Accessed 5/10/2020 from https://www.rapidresearchandevaluation.com/

Schardt et al. Utilization of the PICO framework to improve searching PubMed for clinical questions. BMC Medical Informatics & Decision Making. 2007; 7: 16.

Tomkinson E. Does Outcomes-Based Reporting Contribute to or Contradict the Realisation of Social Outcomes? In: The Three Sector Solution. 2016: 185–214.


Written by: Michelle Jack.

For more information visit: Impact Advisory and Research & Data.