Don’t loan me a book. At least, don’t loan me a book unless you’re willing to get it back with pencil scribbles all over it. Just ask my husband.
In the last post, I talked about the often-fumbling search for more meaningful solutions that I think a realization of the need for a Wise Economy forces upon us. Abstract ideas about communities as ecologies and warnings to beware of magic bullets are all fine and good, but what do you do with that? How do you make change happen in the places where you live and work?
That last post talked about some baby steps that we can take to start to shift toward a Wise Economy, and it talked a lot about the assumptions we make about the Way Things Work and how those Cannot Be Changed, Ever. In another recent post I referenced Thomas Kuhn’s idea of paradigm-breaking: how a breakthrough requires that one somehow learn to see where the false walls stand and what opportunities might lie beyond them.
Within a Wise Economy context, the rubber-meets-road moment is when we make plans for the future of our communities. Comprehensive plans, strategic plans, action plans, organization plans…whatever we do to set the direction of the organization that we are counting on to make the community’s future happen, that’s where a Wise Economy either begins to take root or falls on the stone and withers. And after many years of making these kinds of plans, it’s clear to me that when our plans fail us, it’s often because our blind spots, our limited assumptions and our overlooked misinterpretations equipped us with a wrong or faulty plan. We often set ourselves up for that failure because we did not know, and could not see, all the things we were missing.
One of the books that has been most influential on my thinking over the past few years is a 20-year-old volume with the catchy title The Logic of Failure: Recognizing and Avoiding Error in Complex Situations by Dietrich Dorner. I’m going to assume that it sounds more appealing in the original German. Recommended to my husband by a very wise boss, this book details the results of a series of studies examining how people made decisions in complex and ambiguous environments. Complex and ambiguous… sounds nothing like the communities we work with, right? Add to that the fact that the participants were typically dealing with economic development and public policy scenarios, and it starts to hit uneasily close to home. So Dave bought it, but I read it… and found it so insightful that I marked passages on nearly every page. He doesn’t share books with me much anymore.
In some respects, it’s a depressing read. Participants in Dorner’s studies make far more mistakes than correct decisions, and much of the time they fail miserably. But by studying the participants’ choices and assumptions closely, and doing that a mind-numbing number of times, Dorner develops a pretty reliable differentiation between those who made consistently good decisions and those who set themselves up for disaster.
Dorner illustrates a large number of differences in how successful and unsuccessful participants approach and manage the tasks, and I’ll continue to write about those. Here is one that particularly stood out for me:
Both the good and the bad participants proposed with the same frequency hypotheses on what effect higher taxes, say, or an advertising campaign to promote tourism in Greenvale would have. The good participants differed from the bad ones, however, in how often they tested their hypotheses. The bad participants failed to do this. For them, to propose a hypothesis was to understand reality; testing that hypothesis was unnecessary. Instead of generating hypotheses, they generated ‘truths.’ (p. 24)
How often do we test our hypotheses? How often do we assume that a project will have a certain impact without taking a hard look at whether those assumptions are sound? How often do we go back and re-examine the basic assumptions that we built our last plan on? How often over the last 60 years have we as planners and economic developers and administrators and communities generated our own “truth,” expended enormous resources on that truth, and then acted surprised when something hits us that we didn’t see coming?
Admitting that we might not have the truth takes bravery. Taking apart and examining the foundations of the structures we have built rightly feels dicey. But the termites work silently until the structure falls down. There is a kind of vigilance that we have to maintain: anticipating and expecting that the world will change, that we might have gotten something wrong back when we made that plan, that we cannot just finish the plan, sigh with relief and move on to getting it done. We have to regularly test our hypotheses – and be ready to change what we are doing when those tests show us that we are setting ourselves up for trouble.
That is generating wisdom. I’ll take that over “truths” any day.