I hate the word ‘learning’

[Word cloud from http://voulagkatzidou.files.wordpress.com/2011/02/wordle.jpg]

I didn’t say it, but at a meeting of several foundations convened by GEO this week, I did say Amen when another foundation “learning” officer confessed it. I also have the word learning in my new title (along with knowledge and evaluation), and we are not hypocrites or non-believers, but it has been very frustrating to see foundations embrace and promote “learning” as the alternative to evaluation. That is partly understandable when, in the past, evaluations and evaluators have produced data and reports without clear analysis or actionable knowledge. But I do not see how anyone can learn without data and evaluation, and we should not be measuring and evaluating without clear goals for decisionmaking and action.

Evaluators have struggled to put a friendlier face on evaluation by emphasizing its contributions to learning and sometimes by de-emphasizing the measuring, monitoring, compliance, and judgement aspects of evaluation. However, I feel ever more strongly that evaluation must be about both learning and accountability. We must be accountable not only for the results we intend and promise to communities, but we must also learn in an accountable way. Learning in and by foundations can be very selfish and self-serving if it results only in mildly more knowledgeable program officers who do not change or adapt their ideas and strategies. Learning that is not based on data and analyzed experience is what? Intuition? Hunch? Fond memories of an interesting grantmaking experience paid for with the public’s money held in trust? And learning that does not contribute to actionable knowledge, decisionmaking, and improvement is a waste of data collection and analysis efforts.

As I have tried to help foundations and foundation staff focus and define intentional learning goals, it has also become clear that individual or even group learning goals can be as self-serving as the learning experience itself. What we are interested in. What we would like to know. What would be interesting to find out. As helpful as these questions might be for future strategy development, they still do not offer a clear path to making knowledge actionable and, more importantly, to helping the group agree on a shared path forward.

It has been both focusing and freeing to identify and name the decisions that foundation staff need to make and to concentrate the data and learning agenda on providing the information necessary to make those decisions. The learning agenda needs to support the decisions that lead to actions: continue this work, adapt what we are doing, change the strategy entirely, or even end funding. And learning can be focused on what we need to know and the information we need in order to influence the decisions of others. But all of this requires being explicit and intentional, early on, about the decisions we need to make and the target audiences we intend to influence before planning and embarking on an evaluation and learning agenda.

Otherwise all our learning efforts will result only in interesting trivia used in cocktail party chatter.

Related Resources

Marilyn Darling and Fourth Quadrant Partners’ Emergent Learning framework is one tool I have used to help groups get to the action that needs to result from the learning. And at the recent American Evaluation Association meeting, it was refreshing to see multiple presenters apply the experiential learning trio of questions to data and analysis: What? So what? Now what?
