Practitioners at the Center: A Decade of Learning About What Durable Evidence Infrastructure Really Requires
There is a version of evidence building that starts with the tool. You walk in with a logic model template, a survey instrument, or a data platform, and you begin fitting the organization to the framework. It is efficient. It is legible. And it often doesn’t last — because it does not start with the needs of the practitioner, the person using the evidence. To build durable infrastructure, the practitioner has to come first.
A Commitment from the Start
At Project Evident, we are unapologetically practitioner-centric. Practitioners are the leaders and program staff at nonprofits, school districts, and public agencies who deliver services and drive outcomes every day. They sit at the intersection of program and participant, and they have an irreplaceable view of what is working, for whom, and why. Practitioner-centric evidence building means that practitioners set the learning agenda and drive their own evidence building. It means that evidence is built to serve their decision-making first, and reporting requirements second. And it means that the communities they serve are engaged throughout the process, not just measured at the end.
This principle is so important to us that it is central to our mission:
“We believe that by empowering practitioners to drive their own evidence building while also strengthening the surrounding ecosystem, we can increase the number of effective solutions in the social and education sectors and scale them faster — ultimately producing stronger, more meaningful, and more equitable outcomes for communities.”
Strengthening Practitioner-Centric Evidence Building Means Building Durable Infrastructure
As we discussed in our recent piece on what evidence infrastructure actually requires, it is not a report, a dashboard, or a theory-of-change diagram. It is the integrated system of people, processes, tools, and workflows — spanning governance, data architecture, analytic capability, decision routines, and capital alignment — that is designed to generate, prioritize, and use evidence consistently over time, in ways that are relevant to the specific context of the organization using it. Think of it as the R&D engine for impact: the underlying architecture that makes improvement sustainable, and that keeps an organization learning even as everything around it changes.
And today, everything around practitioners is changing. The past several years have brought enormous disruption to the social and education sectors — funding uncertainty, leadership transitions, stretched teams, and the relentless pressure to do more with less. When talent turns over, organizational knowledge walks out the door with it. When teams are stretched thin, curiosity can be the first casualty. The careful, ongoing work of asking what is working and for whom risks getting crowded out by the urgent work of just keeping programs running. This is precisely why durability matters so much to us. Durable infrastructure is right-sized, strategic, and sustainable. Durable infrastructure means that when a program director leaves after five years, the evidence system they built keeps running. It means that learning is embedded in the organization’s routines and tools, not locked in any one person’s memory or motivation. People move on. Good infrastructure stays.
Building Durable Infrastructure Means Being Practitioner-Centric
Finding out what infrastructure is most useful starts with listening. I mean listening carefully, patiently, and with genuine curiosity. Listening to understand why an organization is doing what it’s doing, what it is trying to learn, and what it would actually do with that learning if it had it. The distinction between building evidence infrastructure for organizations and building it with them is at the heart of what we mean by practitioner-centric evidence. And, we have learned, it is also at the heart of what makes evidence infrastructure durable.
When we begin a new engagement, we ask many questions. What outcomes are you trying to achieve? What do your funders require? What does your data infrastructure look like today? These are useful questions. But the more important ones are harder to ask and harder to answer. Questions like: Why do you want to know whether this program works? What would you do if the answer surprised you? What is at stake? What might change for you, organizationally, financially, politically, depending on what the evidence shows?

Deeply understanding the why of evidence building — not just the what — is what makes it more strategic, more robust, and more sustainable.
The Ecosystem Around the Practitioner Matters Too
Understanding what a practitioner needs is necessary, but it is not sufficient. No practitioner operates in isolation. They exist inside an ecosystem of funders, evaluators, researchers, partner organizations, and most importantly, the communities they serve. Each of these parties brings its own interests and needs to the evidence table.
Funders want accountability and proof of impact. Evaluators want to run rigorous projects that generate publishable, actionable findings. Communities want their voices and lived experiences reflected in what gets measured and how it gets used. These interests overlap with those of practitioners, but they are not the same. And when they go unacknowledged, they pull in different directions in ways that can quietly undermine even the best-designed evidence system.
This is the problem of multiple audiences. Evidence systems built for one audience often fail to serve the others. They generate reporting that travels upward and never comes back. They answer questions that funders ask rather than questions practitioners need answered. Sometimes they simply measure outputs that are easy to count rather than outcomes that are hard to reach.
The answer is not to abandon the interests of any one party. It is to surface all of them and to design an evidence system that genuinely seeks to align a practitioner’s need to learn, the funder’s need for accountability, and the community’s need to be centered, so that they can all be met by the same body of evidence. This alignment is not a one-time exercise at project launch. It is an ongoing practice. And it requires a trusted partner who can hold all of these interests at once and help navigate the tensions between them.
Bringing It All Together
In a piece I wrote a few years ago, I reflected on what past lives in hospitality, retail, and mediation have taught my colleagues and me about doing this work well. The takeaways across all of them were the same: care about how people feel, listen more than you talk (I’m still working on that), understand the interests beneath the stated positions, and be willing to take a step back and reground when something isn’t working.
These are not soft skills bolted onto technical work. They are the conditions under which technical work becomes useful. An evidence framework that nobody trusts will not be used. A data system that practitioners feel was imposed on them rather than built with them will be worked around. A funder-facing evaluation without a feedback loop to the people delivering the program will produce findings that sit in a drawer.
We have seen all of these failures. But we have also seen the alternative: evidence systems that practitioners genuinely own, that capture what matters, that generate learning that comes back to the front line in forms people can actually use. The difference, in almost every case, comes back to the quality of the listening at the start — and the quality of the relationships built along the way.

Looking Ahead
This piece is part of a series that starts to articulate what close to a decade of this work has taught us. Our CEO, Kelly Fitzsimmons, wrote recently that evidence has never mattered more than it does right now — which is precisely why getting the infrastructure right cannot wait. As we move forward, we will continue to learn and adapt. We will always be practitioner-centric, but the infrastructure we help practitioners build will shift over time. Today's pressures call for leveraging efficiencies, and we are excited about all of the new tools that are available to the social and education sectors to support evidence building — tools that we share through our Data Hub and OutcomesAI work. But durable evidence infrastructure is not just about better tools and stronger systems (though we have much to look forward to on both counts!). Better, cheaper, faster, and more equitable outcomes are only possible when the human side and the systems side are built together. Building evidence infrastructure is about systems AND people. Getting the human side right is what makes the systems worth building.