(Part One of my conference notes is here.)

A new handbook on measuring progress will be launched at the Korea conference. They are also developing a framework on a Taxonomy for Progress, identifying quality frameworks for indicator sets, and collecting and sharing lessons about successful sets of indicators.

Why companies aren't measuring business impact: a survey of studies on the business case shows general support for the belief that CI adds value to the firm, but as the presenters noted: “It has been very difficult to establish a clear causal relationship between CI and business benefits, and it is challenging to set a dollar value for the intangible assets that CI often creates.” For more information, see: www.bcccc.net
The conference has lived up to my expectations. If you weren't here, you're missing something important.
Friday began with a presentation by Jon Hall, who leads the Global Project on Measuring the Progress of Societies at OECD. I found this presentation (PDF) online, which is quite similar to the one he gave, so until I can get the actual presentation, this should give you a flavor of what we talked about.
NEW: I have now added in my own comments from the last panel session.
Their philosophy is straightforward: How to measure is a technical issue, and we are developing best practices. What to measure is a political issue. We can advise on how to set up a process that is legitimate and reflects the shared values of a society. But the choice belongs to the society.
Four questions for the 3rd World Forum in Korea:
Several themes emerged as they developed the Global Project Framework:
The first breakout session of the morning that I attended was on The Power of Neighborhood Indicator Systems. Kathy Pettit and Tom Kingsley from the National Neighborhood Indicators Partnership and Charlotte Kahn from the Boston Foundation were the presenters.
Kathy began by describing NNIP's work. She promised that her presentation would be up on the NNIP website shortly -- I'll link to it as soon as it is.
NNIP has 33 partners. Each partner has its own network of local partners – so each partner is really a contact point that links its local network with the others. They are also working with LISC and MacArthur on the Sustainable Communities Initiative.
Charlotte spoke about the idea of a shared indicator system. It's been a dream since NNIP began in 1995, but at the time it wasn't feasible. Each of the 32 partners has its own community statistical system – which is different from an indicator system. This is a new opportunity to think harder about these same sets of measures, have deeper conversations among ourselves and across cities, understand patterns, and draw conclusions that will be of great interest to policy makers, especially in the context of preparing for the 2010 Census. They have a working topical framework – 10 areas and four cross-cutting topics.
Tom asked: How do we operationalize the system? Chris Walker from Sustainable Communities is helping drive this partnership. They have a draft plan, need to raise funds, and aim to combine the effort with the 2010 Census. The ACS at the neighborhood level provides only 5-year averages, which won't help us get the information we need to understand how distressed neighborhoods are reacting to the current crisis.
Charlotte pointed out that there are some language/definition barriers between community statistical systems and community indicator systems: indicators are not the same as data variables, but are closely related. Data variables provide information for indicators. For example, the percent of the population receiving TANF assistance is a variable, but it is also data that supports an indicator of family poverty/economic hardship.
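To make the variable/indicator distinction concrete, here is a minimal sketch in Python, with entirely invented numbers: the raw counts are the data variables, and the derived percentage is the indicator built from them.

```python
# Hypothetical illustration (all numbers invented): raw data variables
# (TANF recipient counts, population) rolled up into a hardship indicator.

tanf_recipients = {"2007": 4210, "2008": 4580, "2009": 5340}   # data variable
population = {"2007": 88000, "2008": 89500, "2009": 90200}     # data variable

# The indicator is derived from the variables:
# percent of the population receiving TANF assistance.
hardship_indicator = {
    year: 100 * tanf_recipients[year] / population[year]
    for year in tanf_recipients
}

for year in sorted(hardship_indicator):
    print(f"{year}: {hardship_indicator[year]:.1f}% of residents receiving TANF")
```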
Building a shared/common neighborhood-level indicator set across metro areas raises questions. How do you define "neighborhood"? The partnership will likely use census tracts as a common framework, but each partner would then define its own neighborhoods. A key component is to have no rankings and no labels for neighborhoods, but instead to use the data to tell good stories about distressed neighborhoods. The challenge is that administrative data is often deficit-oriented, when sometimes what we need are indicators of neighborhood assets. The Urban Institute has also been sponsoring an effort to create a framework for developing art and culture indicators -- very different kinds of indicators from property-level data systems.
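One plausible way to implement the tracts-as-common-framework idea is a crosswalk table that each partner maintains. The sketch below uses hypothetical tract IDs, neighborhood names, and counts to show tract-level data being aggregated up to locally defined neighborhoods:

```python
# Hypothetical sketch: census tracts as the shared unit of analysis,
# with neighborhoods defined locally by each partner via a crosswalk.
from collections import defaultdict

crosswalk = {  # census tract -> locally defined neighborhood (invented)
    "12031000100": "Springfield",
    "12031000200": "Springfield",
    "12031000300": "Riverside",
}

tract_counts = {  # tract-level data, e.g. vacant parcels (invented)
    "12031000100": 120,
    "12031000200": 85,
    "12031000300": 40,
}

# Aggregate the shared tract-level data up to each partner's neighborhoods.
neighborhood_counts = defaultdict(int)
for tract, count in tract_counts.items():
    neighborhood_counts[crosswalk[tract]] += count

print(dict(neighborhood_counts))  # {'Springfield': 205, 'Riverside': 40}
```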
A challenge is to develop a robust set of national data policies that address the timeliness, accessibility, and comparability of data from very local through national levels. The Annie E. Casey Foundation has developed a white paper on data availability for legislative/administrative advocacy – for example, great strides in data would be possible if we could double the sample size of the American Community Survey. We need a joint NNIP/CIC federal advocacy group, addressing data immediacy, availability, and standardization (for example, high school graduation rates, and data availability by student residence instead of school attended).
I then attended an odd session on measuring the business value of corporate community initiatives. Jane Coen, from Underwriters Laboratories; Maureen Hart, from Sustainable Measures; and Vesela Veleva, from the Boston College Center for Corporate Citizenship, explained a new project of the Center for Corporate Citizenship. The project is complete and will be launched in March or April 2010. The Center has been around for 25 years and works with companies (350 right now, including 50% of the Fortune 100), providing corporate seminars, executive education training, and whatnot for them.
They wanted to develop a practical framework, guidelines, and tools to assist companies in measuring the business value of Community Involvement initiatives. The purpose is to be able to show the business impact of involvement (why it pays to support community causes through employee volunteer time or corporate sponsorship), even in tough times.
Right now companies are measuring inputs and outputs, and not the impacts. Many companies measure “employee involvement” in CI programs but not actual “employee engagement” (there is confusion about what that means).
Why participants want to measure business impact:
reputation/standing within the community
Draft framework:
www.bcccc.net/index.cfm?pageId=2025#impact
Questions from the audience included: GRI has 79 performance measures, but this framework will only look at the business impact of CI initiatives. At this point they are not looking at the environment, even though it's easier to measure the business impact of environmental initiatives.
How are you recommending that businesses identify their CI initiatives? Are you building into the framework guidance on how to select programs that are strategic to corporate mission? (No, companies will choose their initiatives on their own.)
What is the real connection between community indicator projects in trying to improve the community and what the other actors (in this case, the companies operating within the communities) are trying to do in getting involved in the community? (For this project, not a whole lot, except to show that Measuring Things is Hard. The framework they're developing will not result in increased data for community indicators projects or increased corporate support for community indicators projects, but it will help companies see how much money they make off of community involvement activities as stealth marketing and reputation-enhancement exercises. I told you the session felt a little odd. Somebody else who was there help me out with the conclusions I was drawing -- what did I miss?)
Andrea Whitsett, from Arizona State University, moderated a session on measuring the impact of indicator systems. On the panel were Jon Hall, from OECD; Viki Sonntag, from Sustainable Seattle; Dan Duncan, from the United Way of Tucson and Southern Arizona; and Ben Warner (that's me!), from the Jacksonville Community Council Inc. (JCCI). Because these are my notes, and I couldn't speak and write at the same time, my own comments are missing. I will fill them in shortly when I think of witty things I should have said. ETA: I've now added my notes of what I meant to say. I didn't add the stories in -- that will have to wait for a different opportunity.
Andrea began by explaining her work at the Arizona Indicators Project, which began in 2007, from the office of the president of the university. Because the project grew so quickly and so organically, there weren't a lot of things in place to measure the impact of the indicators project.
The panelists identified themselves, then Andrea asked us all to define how we measured impact.
Dan: The national organization set national goals around education, income, and health, and put out the call to local organizations to align with these goals and to see how we could work together collectively. We began a process to re-engineer our impacts using the results-based accountability model and focus on turning the curves. We brought together a large committee, trained them in results-based accountability and asset-based community development, and charged them to be purposeful in identifying what citizens could do, what citizens could do with help, and what only institutions could do. We identified key trends we wanted to work on, including the high school graduation rate and early learning as measured by third-grade learning scores. Around income, we're looking at having more families with incomes above 200% of the poverty level. Around health, we have a lot of seniors, and we want to keep them healthy and independent, and are measuring nursing home population growth compared to the growth in the overall population.
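The seniors measure Dan describes boils down to comparing two growth rates. A toy calculation, with figures I've invented purely for illustration, might look like this:

```python
# Toy calculation (invented figures): nursing home population growth
# compared to growth in the overall population.

nursing_home = {"2008": 6100, "2009": 6250}
overall = {"2008": 980000, "2009": 1000000}

def growth_rate(series, start, end):
    """Fractional change from the start year to the end year."""
    return (series[end] - series[start]) / series[start]

nh = growth_rate(nursing_home, "2008", "2009")
ov = growth_rate(overall, "2008", "2009")
print(f"nursing home growth: {nh:.1%}, overall growth: {ov:.1%}")
# If nursing home growth lags overall growth, more seniors are staying
# healthy and independent -- at least on this measure.
```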
Viki started from a historical context and the expected impacts they wanted to see. Sustainable Seattle was one of the early citizen groups that wanted to develop community indicators. Our intent was to engage a cross-section of the community in determining the indicators. A result we expected was increased citizen involvement in understanding and using the indicators, and increased responsibility among institutions and policy makers in addressing the indicators we had identified. The assumptions of what indicators could do led to a change in the organization – indicators themselves don't drive change, so we instead began doing neighborhood-level work. Our measures of success were how engaged the community was in that process and what actions they took as a result of being involved. We are also interested in data democracy, which means we may be looking at capacity building in obtaining and using information. Our focus is very grass-roots, very community-based social change efforts. How we arrive at that is probably in a discussion with the community to discover if what we provide is serving the community – it's more of an informal process right now.
Jon explained that the global project began because there was a great deal of work occurring but no one to take the lead and support the effort. We're looking at working with leadership and policy makers, and at growing a network of networks. We're trying to change how democracies work and how decisions are made, so that people talk about the evidence.
Ben: We measure impact of our indicators by first identifying what we intend their impact to be, based on our model of community change.
1. Indicators serve to measure how well the community is progressing in relation to a shared/articulated community vision, and provide the knowledge necessary to forge shared community priorities, inform planning processes and strategic actions, and create data-driven decision-making. In order to measure their impact, we need to measure their explicit use in institutionalized priority-setting and decision-making, their implicit use in the same arenas, their function in shifting community conversations and setting a community agenda (through both media mentions and public reference to the indicators in community settings), their use in activities (actions taken or legislation passed) to implement those decisions, and the building of cross-sectoral collaborative partnerships to address the priorities identified in the indicators. In short, we measure (through an annual self-evaluation report) the effectiveness of the indicators in identifying needs and how often they are used in the community in determining how to address those needs. This also includes a survey of our identified key decision-makers on the usefulness of the indicators and an opportunity for them to share with us how they used the information and their suggestions for improving how we report the indicators to better meet their needs.
2. The same indicators are also used to measure the effectiveness of the decisions made and actions taken. The indicators are a key assessment tool to see if conditions have changed, whether the trend line is bending, whether the programs/projects/legislation/activities/allocation processes have addressed the underlying concerns that drove the community priority-setting in the first place. In other words, if your indicators project is effective in your community, the trend lines should show improvement.
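As a rough sketch of what "the trend line is bending" could mean quantitatively: fit a simple least-squares slope to the indicator before and after the community's actions and compare the two. The numbers below are invented for illustration.

```python
# Rough sketch (invented numbers): compare an indicator's trend before and
# after community action to see whether the curve is turning.

def slope(points):
    """Ordinary least-squares slope of (year, value) points."""
    n = len(points)
    mean_x = sum(x for x, _ in points) / n
    mean_y = sum(y for _, y in points) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in points)
    den = sum((x - mean_x) ** 2 for x, _ in points)
    return num / den

# Hypothetical high school graduation rate, before and after an initiative.
before = [(2003, 68.0), (2004, 67.5), (2005, 67.1), (2006, 66.8)]
after = [(2006, 66.8), (2007, 67.4), (2008, 68.3), (2009, 69.1)]

print(f"slope before: {slope(before):+.2f} points/year")
print(f"slope after:  {slope(after):+.2f} points/year")
# A slope moving from negative to positive suggests the trend line is
# bending in the right direction.
```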
Andrea then asked: once you identify the impact you want to have, how do you measure (quantitatively and qualitatively) the impacts of your indicators?
Dan: For the United Way, the five indicators drive all of our activities. Those are large issues, but they are also the indicators that can bring the community together. That's why we are developing performance measures around the strategies we use to affect the community indicators. They allow us to ask, “Are we doing things right, and are we doing the right things?”
We have an online system to track those performance measures on a quarterly basis. This helps us bring people around the table to track these issues and refocus our efforts where necessary.
Viki: Our impacts have mostly been indirect. The performance-measure efforts were an outgrowth of the community indicators process. Because we're relatively new, we're working on developing the quality of our systems internally. On the participatory side, one measure of our impact is to look at whether our participation is representative of the larger community. This way we can measure our participatory processes and their impacts.
Jon: Evidence-based policy doesn't exist; at best it's evidence-influenced policy. It's hard to quantify even the influence of GDP on policy. The process and conversation may be more important than the indicators themselves. The hardest distance is the last two inches – from people's eyes to their brain. In Australia, when we first put these progress measures out, I asked the policy makers why they wanted the information. Having these sets of measures is helpful to focus on the things that matter. They can help shape the debate and influence things in that way.
Andrea: To whom are you accountable?
Ben: First and foremost, we're accountable to ourselves. Our continued existence and role in the community is entirely dependent on our integrity and honesty -- integrity can only be sold once. We have a rigorous Credo we test ourselves against to ensure all of our actions are ethical. We are accountable to the community we serve. They expect us to be a neutral, trusted community convener, and we need to honor that expectation by providing quality information that responds to their values, visions, and needs for the community. We need to be able to serve as the conduit for the voices and stories of people who don't have other ways to be heard. We are accountable to our funders to ensure that we follow through with what we promise to deliver. By maintaining the community's trust, we can ensure our relevance and effectiveness in helping create community change.
Dan: We used to be just accountable to our donors every year. Now with the initiatives we're doing, we're not just the data arm but the organizing table around these issues, and we're accountable to the people that we're trying to help. The key is to keep the focus on the broader issue and create the civic and political will to make the hard choices for the greater good.
Viki: Accountability to the community at large. But what they are holding us accountable to is whether the information has meaning to them and whether it has relevance. This was demonstrated to us when we were working at the neighborhood level and the information we were providing was valid and reliable but wasn't meaningful to them, and so they disengaged from us. The good news is that the neighborhoods have connected together to develop new partnerships. The challenge we face is to reconnect with them.
Jon: All must be accountable to the community, and must be nonpartisan, honest, and clear. If you lose the perception of objectivity and appear to be biased, they won't look at the data any more. In Newfoundland, before we put the data out, we used to sit and argue over the figures; now we can argue over the issues.
Andrea: How do you demonstrate community impact in order to make a compelling case for funding?
Ben: We use our measurements of effectiveness -- how many people use our work, how often they use the work, in what ways they use the work, etc. In addition, we've compiled a Highlights of Change document that shows over time how critical our work has been in making the community better. For some of our funders, we point to how they use the work to justify its continued funding -- institutionalizing the use of the indicators in their decision-making processes helps support long-term financial support. We point to how others use the indicators to assist them in addressing the same priorities. For others who sponsor our work, we point to how the major community decision-makers use the work, and offer an opportunity for them to put their name on the document in support of that work for the entire community to see.
Dan: We're responsible for the work and the fundraising together. In today's world, donors don't like the middle person, so we have to talk about the accountability of the United Way system and what we are achieving. To a large extent, we're selling aspirations, and we're finding donors and grantors who care about the things we care about. We look for what they care about and try to match their passion and their intent – what the grantor wants and what their expectations are. We now raise more from grants than we do from donors. This comes from owning the issue and owning the results and being very clear about that.
Viki: In our latest incarnation, we are a data commons. Our primary interest is in making sure information is linked to action. The assumption from some of our partners is that not much action is taking place. We need to help our partners understand that we are bringing the information to inform the action in the community. We see that those people who have a sustainability agenda are interested in sharing their accountability and responsibility and showing how they are contributing to the solutions and successes of the indicators. We're linking that to the indicators themselves on the website.
Jon: We haven't been very successful in raising money because people think we have enough money. Ultimately, this should be funded through government, because this is a public good. We need to point out that if we didn't do this, somebody else would have to, but probably not as well.
Andrea: Tell me about your failures.
Ben: We had early difficulties with our public school system, who saw our indicators as one more assault in a long history of attacks on public education. Naturally, they responded defensively. Data is too often perceived as a weapon until you build joint relationships of trust. Through relationship-building, we helped them see that our intent was not to attack them, but to shift educational issues from being "their problem" to a community problem, which allows for the greater community to take part in the solutions (and to take responsibility for creating success.) This was a much more effective approach.
Jon: We had an initiative around family violence. We saw that the indicators weren't strong enough to make the changes and build the civic/community will around the issue at this time.
Viki: When I think about the trajectory of Sustainable Seattle and the capacity we had developed, we probably didn't have the capacity to sustain a bigger strategic vision. For a while, we lost direction and were in search of that strategic vision.
Jon: Projects tend to fail when they haven't engaged the community from the beginning. You need to build that ownership from the beginning, even if it takes time. Perceptions of bias can be extremely damaging. Because we had built the shared ownership, others could defend us.
You don't want to sensationalize these things, but if you let statisticians write them, they end up really boring. Don't underestimate the difficulty of presenting this information to people.
Andrea: We have had plenty of struggles in getting the Arizona Indicators project off the ground. We tried so hard to be nonpartisan, not to interfere, and to let the numbers speak for themselves. However, the data by themselves were not compelling. People wanted to know both the what and the why, and they needed the why in order to hear a call to action.
The closing plenary was facilitated by Adam Luecking. They divided the room into three groups, and asked: What is your vision for CIC in the next 2 years?
Lots of great discussion ensued. I'll let the CIC Board formalize that conversation, but the energy (one person said "vibe") in the room was palpable and positive, which is a great way to end a conference.
Community Indicators for Your Community
Real, lasting community change is built around knowing where you are, where you want to be, and whether your efforts are making a difference. Indicators are a necessary ingredient for sustainable change. And the process of selecting community indicators -- who chooses, how they choose, what they choose -- is as important as the data you select.
This is an archive of thoughts I had about indicators and the community indicators movement. Some of the thinking is outdated, and many of the links may have broken over time.
Saturday, October 3, 2009
CIC Conference 2009, Part Two
Posted by Ben Warner at 6:00 AM
Labels: community indicators, Community Indicators Consortium, conferences, Jacksonville Community Council Inc.
Ben: I stumbled across your blog thanks to Google Alerts and am really glad I did. There's so much here - I almost feel like I attended the conference. Thanks. It will take me time to go through - plan to share with colleagues who will certainly benefit as well.