Becoming a learning organization is a journey, not a destination. The Walton Family Foundation and its Strategy, Learning and Evaluation Department (SLED) have been on that learning journey for several years now, striving constantly to become better at what we do.
When the foundation’s grantmaking helps our partners solve problems, we want to know how and why they succeeded. We also want to understand when we fall short. Lessons learned from failure are just as important.
The foundation is taking another big step in its learning journey this year by updating the evaluation tools we use in our grantmaking process to establish and then track progress our grantees make toward meeting key goals. This redesign of our approach to grant evaluation has two key parts:
- When establishing metrics at the outset, we are focusing more on the outcomes expected by the end of a grant and less on outputs expected to be completed along the way, letting the underlying theory of change drive metric selection.
- When assessing the extent to which metric targets were met or missed, we are focusing more on the lessons and insights gained from the data, as well as the strategic adaptations that have been made or are under consideration moving forward as a result.
These adjustments are the driving force behind our new evaluation approach. We seek to advance learning that leads to strategic adaptations in service of continuous improvement in the foundation’s social and environmental impact.
What does this mean in practice?
When the foundation assesses the performance of a grant, we’ll be asking less about traditional deliverables and activities – the actions and tasks completed during the timeframe of a grant. Instead, we’ll be more deliberate and collaborative in how we define what success will look like and how outcomes advance both foundation goals and our grantee’s theory of change.
Simply put, we’re seeking to make a meaningful pivot from an accountability-based approach to a learning-based approach to grant evaluation.
To that end, the foundation has created a new toolkit to capture what was accomplished by our grantees. It will also help us determine lessons learned (the ‘so what’), and how those insights are being applied to both foundation strategy and grantee practice moving forward (the ‘now what’).
Most of our grantees will discuss progress, lessons, and strategic micro-shifts using our new Assessment of Past Performance and Learning (APPLE) report. At its core, the APPLE is a dialogue-based approach where foundation program officers hold structured conversations with grantees to reflect on what went well during the course of a grant, and what can be improved.
These conversations are meant to dig into the work. We want to create a shared understanding of successes and challenges, including those not anticipated in the formal metrics. We also want to make sure we’re incorporating what we learned into our work (potentially including the focus of grant renewals).
The second new tool we’ve developed is a SLED Evaluation and Learning (SEAL) report. We typically use this to assess and capture learnings for larger grants as well as those that present unique learning opportunities that can inform the foundation’s broader strategies.
As we developed these new tools, we followed a set of guiding principles. We want our evaluation tools to:
• Be Operationally Useful. The structured learning conversations between foundation staff and grantees should capture key insights that help us make tactical decisions about how to carry out future work.
• Use Meaningful Metrics. Assessments will focus more on metrics that help us understand how grants align with and advance a grantee’s theory of change. They are structured around key goals and hypotheses, giving us greater insight into how we can improve.
• Help Build Capacity. Evaluation tools present an opportunity to build capacity for grantees, helping to strengthen their ability to turn data into insights, and those insights into action.
• Create Accountability to Learn. Accountability is focused on learning and adapting in service of achieving greater impact, rather than on meeting targets.
• Be Collaborative. Grant metric development and post-grant assessment are collaborative processes that help us learn together and emphasize the foundation’s appreciation for sharing information.
• Improve Relationships. By focusing on learning and collaboration, our approach seeks to further build trust and understanding between the foundation and our grantees.
As the foundation continues its journey to becoming a learning organization – and one that embeds equitable practices more deeply into our work – we must change some fundamental tools and practices. These updates to the evaluation tools we use are an important step. But we have further to go.
And it’s not just change for the sake of change. We’re taking these actions to help students get the education they need to achieve their highest dreams and ambitions. To ensure we have enough clean water and food to sustain healthy communities. And to build communities that are inclusive and vibrant – for all.
Ultimately, our learning journey continues to be in service of fulfilling the foundation’s mission to create opportunity for the people and communities we serve.