PRESS RELEASE
Project Evident Publishes the Funding the Future: Grantmakers Strategies in AI Investment Report

The report from Project Evident examines grantmakers’ AI funding strategies and practices and provides insights for both funders and practitioners
Today, Project Evident published Funding the Future: Grantmakers Strategies in AI Investment, a new report that looks at how philanthropic funders are approaching requests to fund the use of AI.
Made possible by Schmidt Sciences, Google.org, and Charles and Lynn Schusterman Family Philanthropies, the report draws on surveys and interviews with funders of many types. It found varying levels of funding, engagement, and confidence in assessing the technical feasibility and ethical impacts of AI initiatives. At the same time, it found common recognition of AI’s importance, along with tension between the need to learn more and the need to act quickly to keep pace with the innovation, adoption, and use of AI tools.
This research builds on a February 2024 working paper from Project Evident and the Stanford Institute for Human-Centered Artificial Intelligence, Inspiring Action: Identifying the Social Sector AI Opportunity Gap. That paper found that the share of practitioners who said their organization used AI exceeded the share of funders who said the same by over a third.
“From our earlier research, as well as in conversations with funders and nonprofits, it’s clear there’s a mismatch in the understanding and desire for AI tools and the funding of AI tools,” said Sarah Di Troia, Managing Director of Project Evident’s OutcomesAI practice and author of the report. “Grantmakers have an opportunity to quickly upskill their understanding – to help nonprofits improve their efficiency and impact, of course, but especially to shape the role of AI in civil society.”
The report offers a number of recommendations to the philanthropic sector. For example, funders and practitioners should ensure that community voice is included in the implementation of new AI initiatives to build trust and help reduce bias. Grantmakers should consider funding that allows for flexibility and innovation so that the social and education sectors can experiment with approaches. Most importantly, funders should increase their capacity and confidence in assessing AI implementation requests along both technical and ethical criteria.
“In my experience, when funders explore new domains or new practices, they move much faster when they learn together and frequently share what works and what doesn’t. I think this is true in AI, as well,” said Michael Belinsky, Director at the AI Institute at Schmidt Sciences.
The report identifies areas for collaboration that would help funders accelerate AI grantmaking. There is an opportunity to create shared networks for learning, as well as common frameworks, practices, and tools that help funders build their capacity to assess requests to support AI implementation. The report also includes a sample tool, an investment rubric with potential questions and assessment considerations for evaluating an AI implementation request.
An Executive Summary and the AI Granting Rubric are available in Resources.