How to track impact in meaningful (and manageable) ways

This blog post has been written by Marta Arranz, an independent consultant and Evaluation, Learning and Strategy facilitator, and Iva Petkovic, the Senior Communications and Learning Manager at the Alliance for Gender Equality in Europe.

In May and June, the Alliance for Gender Equality in Europe hosted two interactive webinars on Monitoring, Evaluation and Learning (MEL) for our grantee partners. While the webinars were tailored to the needs of our grantees, the reflections that surfaced are relevant far beyond our community of partners. Whether you are a grassroots organisation or a funder rethinking how to better support social change organisations, the main lessons are widely applicable.

Before the webinars, we asked our partners about their current practices. Seventy-seven percent of those who responded told us they already had some form of MEL system in place, but most were looking for more guidance on the nuances: how to track systemic change without getting lost in spreadsheets, how to capture shifts in attitudes and behaviours that are subtle and gradual, and how to make MEL something that genuinely serves the team instead of becoming a burdensome requirement.

Start with the ‘Why’

We began our sessions with a simple but often overlooked question: why do we track impact at all? In many cases, organisations default to what is easiest to count, such as the number of workshop participants or the volume of policy papers published. While these outputs may contribute to change, presenting them as indicators of impact rarely reflects the depth and complexity of the change organisations are trying to achieve. They also fail to capture insights that can help a team learn from experience and act more strategically.

One of the reasons tracking impact feels difficult for many small grassroots organisations is that donor-driven frameworks often feel disconnected from the realities of working in fast-paced, under-resourced environments. When monitoring and evaluation becomes a once-a-year scramble to fill out a long report, it pulls focus away from the actual work, whether that work is supporting survivors of violence, advocating for reproductive rights, or organising classes for refugees to gain IT skills. And when you consider that most organisations report not to one but to multiple funders, each with their own reporting timelines and templates, indicators and logframes, the heavy strain most frontline groups are under becomes clear.

Yet, when MEL is reimagined as a tool for strategic learning and adaptation, it can empower teams to make better decisions about where an organisation should invest its time, energy, and budget to most effectively transform lives.

Define what success looks like

Learning starts when we ask the right questions. One of the most meaningful is: what does success look like? The answer will vary depending on the context and on whose perspectives are included, and it will likely evolve over time. For one organisation in the UK, for example, success means that women in the country live free from domestic violence and feel safer in their daily lives.

From there, an organisation can consider how it plans to achieve its vision. For the organisation mentioned above, that involves a range of strategies and tactics, from advocating for legal reform, to providing legal aid for victims of abuse, to running peer counselling sessions.

Alongside this, organisations should articulate a theory of change, i.e. a clear explanation of how change is expected to happen. For instance, the UK nonprofit might put it as: if women have economic independence and strong peer support networks, then they are more likely to escape violent relationships and stay safe. Putting this logic into words, ideally with input from the people most affected, helps teams choose the right tactics and stay on track.

Monitoring and evaluation can then support teams in understanding what is working, what needs to improve, and how programmes can be adapted to better serve communities.

Decide what you want to know

A common mistake organisations make is jumping straight from strategy and desired results into a long list of indicators, bypassing the most important step: identifying their learning needs. Organisations can start by asking themselves: what do we really need to understand in order to make better decisions?

The next step in an organisation’s impact measurement and learning journey is to identify its most important stakeholders and what they need to know. Different stakeholders are responsible for different decisions and have different learning needs. A board member might want to know whether the organisation has the right skills and resources to deliver on its mission. A funder might care more about whether a specific intervention is effective. A community member might be most interested in the quality of the service they receive and how their feedback will be used.

An important part of designing a MEL system that is both relevant and right-sized is surfacing those different needs and prioritising the data required to address them.

Keep it manageable

When people hear “MEL system,” they often imagine complex spreadsheets and sophisticated statistical programs. But a simple plan is enough: one that outlines what will be tracked, how information will be collected, how often, and by whom.

Another common pitfall is starting too big and then abandoning a system that proves too difficult to sustain. Collect only the data that your organisation has the capacity to use to answer the learning questions above. If it won’t inform future decisions, it may not be worth collecting at all.

In most cases, combining qualitative and quantitative methods is a good way to track impact: for example, measuring the percentage of participants whose income increased after the programme, as well as capturing testimonies about how confident and optimistic they feel about the future. There is a long-standing belief that qualitative methods are less rigorous. In fact, there is ample evidence that qualitative data is sometimes more suitable, precisely because it can capture progress that is subtle, gradual, and therefore difficult to see.

The way research methods are applied is also crucial. Depending on how it is done, data gathering can be extractive, or it can be positive and empowering. If done well, evaluations can allow participants to feel heard, reflect on their experiences, and even find healing in the process. There are multiple design choices that can be made to ensure the experience is not extractive, including making the intention and purpose visible, the framing of questions, the role participants play in the evaluation and the choices they are given, how interviews or group discussions are facilitated, how the data is treated, and how the process benefits participants.

A vital question remains: where can data come from? Even the most thoughtfully designed MEL plan won’t make a difference unless it is integrated into the daily life and internal ways of working of an organisation. That means carving out time and space for collection and reflection, in both formal and informal ways. It could be a team debrief after a workshop, a journal entry, a conversation over coffee, a survey with only a few questions, or a structured annual strategy review. Asking powerful questions, such as what trends are emerging and how we can explain them, will also reveal context and lived experiences – and it takes critical thinking to make sense of it all.

Impact measurement and learning should not require you to hire a consultant to do a big external evaluation – an expense that can be difficult to cover for small grassroots organisations. The important thing is to embed a culture of enquiry and learning and build habits that support it.

Another common challenge is working in “islands”, where different components of the MEL system (data, strategy, lived experiences of team members, etc.) are disconnected. A well-integrated MEL system should help weave these threads together into a coherent picture of what is working, and what can still be improved.

All this creates space to notice the oftentimes messy and unexpected outcomes. Complex social change is rarely predictable or linear. For the UK organisation, the most significant outcome was not the legal reform they originally set out to achieve, but rather a shift in how hopeful survivors felt about their futures. A person rediscovering their agency is progress, and it does matter. Yet such progress will often go undetected by rigid systems focused only on outputs and numbers.

Communicate your impact

Finally, a core function of MEL is to help organisations tell the story of their work, not just to funders, but to themselves, their communities, and potential allies. The continuous process of measuring impact can help a team beautifully combine facts and stories, percentages and personal testimonies, statistics and lived experiences.

Of course, donors and boards still need to understand how progress is unfolding. The challenge for many organisations, then, lies in expanding their understanding of what meaningful change looks like, helping others see that valid evidence can take many forms, and collectively celebrating subtle and incremental shifts.