Community Indicators for Your Community

Real, lasting community change is built around knowing where you are, where you want to be, and whether your efforts are making a difference. Indicators are a necessary ingredient for sustainable change. And the process of selecting community indicators -- who chooses, how they choose, what they choose -- is as important as the data you select.

This is an archive of thoughts I had about indicators and the community indicators movement. Some of the thinking is outdated, and many of the links may have broken over time.

Monday, September 29, 2008

Sustainable City Rankings

There's a new set of sustainability rankings out from SustainLane.com. The rankings are based on a set of 16 indicators of sustainability, from water supply and land use to traffic congestion and green building practices. The complete methodology, including data sources, can be found here.

I'm not charmed by ranking systems in general, and a weighted index doesn't really get my heart pumping. Too much subjectivity posing as objectivity. But I really, really liked the site's openness in providing not only the methodology (including the indicators and the weights, with some rationale for the reasons for the weighting), but also the data sources used (and additional reading/resources where applicable).

This particular ranking was brought to my attention by the reactions of people in my community (Jacksonville, Florida) to their placement in the rankings. Local environmentalists, on seeing Jacksonville ranked 23rd out of 50, were "not ecstatic" (see article in The Florida Times-Union). Apparently the ranking gave credit for good intentions -- policies enacted -- before the results of a cleaner environment were measurable (or before much had been done). While we've been pretty pleased to see the policies put in place (and have had a hand in doing so -- see the following studies: 2007: Air Quality: Energy, Environment and Economy; 2006: River Dance: Putting the River in River City; and 2002: Making Jacksonville a Clean City), we want to see the positive outcomes as well.

Take a look at the rankings, and see if your city is there. Then let me know -- did they get it right? And did you get any good ideas for better indicators/data sources in the process?

Read more ...

Friday, September 26, 2008

Update from Yampa Valley

A few years ago I had the opportunity to work with Audrey Danner of Yampa Valley Partners on a Civic Indicators project for the National Civic League, and I think she's one of those really good people that you get to meet every now and then if you're both really good and pretty lucky.

That being said by way of introduction, I was a little disappointed in the Routt County Commission's decision not to remain as a financial partner in the work of Yampa Valley Partners. However, the discussion about the funding decisions and the desirability of government buy-in to local community indicators work is spot on, and I thought I'd draw your attention to this news article describing the situation.

We all face funding issues from time to time, especially as local governments struggle with multiple priorities and considerable needs. (Ironically, it's often our community indicators work that points out where those needs are and how significant they are to the community, which makes it bittersweet when those needs are prioritized ahead of the research/planning functions we provide -- if only we were less competent at placing these issues on the community agenda!)

But this response by Audrey Danner is right on message:

Audrey Danner, Yampa Valley Partners executive director, said her board plans to continue its services despite its funding setback. The most discouraging issue, she added, is losing Routt County as a partner, not its money.

“Our local government funding is very important, and it’s discouraging we will not have Routt County funding us next year,” Danner said. “It is not only about funding, though. It is about partners coming together for their communities … and being effective as a regional group when we make decisions.”


If you're in the Yampa Valley region in Colorado, please consider supporting Yampa Valley Partners. (Audrey doesn't know I'm saying this.) Even if you're not, pay attention to what they're doing -- they're a shining light showing how a rural area can come together around shared issues for the community good.

Read more ...

Thursday, September 25, 2008

Data Integrity

We've discussed metadata and data integrity issues before -- see especially this blog post for the drunks+lamppost analogy -- and have been advocating for better metadata standards generally.

Now Swivel has launched a discussion of the authenticity of data that reminds us why this topic is so important. On their site, less than a third of the data comes from an "official" source -- the other 69 percent is entered by various site users. Hopefully each user has entered a data source along with the data, and in any case I would hope no one would use Swivel as a primary data source (any more than they would cite Wikipedia as a primary source). Swivel is a great place to find interesting data and then chase it down to its source, but no site that aggregates other people's information should be your primary resource.

The article on Swivel's blog points to Technorati Authority numbers as a way of looking at data reliability -- if lots of people cite it, it must be good, right? While I appreciate the attempt to point out an emerging rating standard on the internet (this blog's authority number as of today is 9, I was surprised to discover), the strength-of-numbers argument is remarkably uncompelling.

So how do we establish a better standard for pointing to the integrity of data and the authenticity of the information presented? Any ideas?
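One direction worth exploring, sketched below in Python, is attaching a minimal provenance record to every indicator series, so that "where did this number come from?" always has an answer. The field names here are my own invention for illustration, not any published metadata standard:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ProvenanceRecord:
    """Minimal provenance metadata for one indicator series.
    Field names are illustrative, not a published standard."""
    indicator: str              # what is being measured
    original_source: str        # the agency or survey that produced the data
    source_url: str             # where the original figures can be verified
    retrieved_on: date          # when this copy was pulled
    is_primary: bool            # True if this record points at the producer itself
    methodology_note: str = ""  # how the value was collected or derived

def needs_followup(rec: ProvenanceRecord) -> bool:
    """Flag records that should be chased back to their source before
    being cited -- e.g., user-entered data on an aggregator site."""
    return not rec.is_primary or not rec.source_url

# A user-entered series on an aggregator gets flagged for follow-up.
rec = ProvenanceRecord(
    indicator="Unemployment rate",
    original_source="entered by site user 'jdoe'",
    source_url="",
    retrieved_on=date(2008, 9, 25),
    is_primary=False,
)
print(needs_followup(rec))  # True -- chase it to the original source
```

Even something this simple would let an aggregator like Swivel separate "verified at the source" data from user contributions at a glance.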

Read more ...

Tuesday, September 23, 2008

Quote of the Day

"Statistics are human beings with the tears wiped off."

- Yank Coble, physician and director of the Center for Global Health and Medical Diplomacy at the University of North Florida.

Read more ...

Monday, September 22, 2008

Tuolumne County Community Indicators

Check this out -- a news article about a community indicators project that is part one of a 26-part, nine-week series all on community indicators.

Pretty cool, huh?

The report is the Sonora Area Foundation's 2008 Tuolumne County Profile Community Indicators Project. Follow the Community Indicators Project link for a summary of the entire 2008 Tuolumne County Profile.

What kind of relationship do you have with your media? How well do they cover your community indicators project? Any tips to pass on?

Read more ...

Sunday, September 21, 2008

Sustainable Seattle

We've talked about the good work of Sustainable Seattle before, but I thought you might be interested in seeing this person's reaction to their model of sustainability.

And it's a good time to mark your calendars for the upcoming launch of B-Sustainable, coming in October 2008.

Read more ...

Sustainable Peterborough

I ran across this interesting primer on sustainability and sustainability indicators, and thought I would share.

It's a nice introduction to why sustainability matters, what sustainability means, and why sustainability indicators are important. Take a look and let me know what you think.

Read more ...

Saturday, September 20, 2008

Comparables, Context, and Community Indicators

As I write this, I'm at cruising altitude looking down as the morning sun highlights land-use development patterns remarkably similar to the ones I saw yesterday from another airplane heading toward a completely different part of the country. (I'm spending much too much of my time in airplanes these days.) So when I look down from above the clouds and can't tell whether I'm above Tallahassee or Thomasville, my mind naturally turns to ... community indicators.

More specifically, I start thinking about some comments made yesterday about the comparability and scalability of indicators, especially as they relate to the development of the State of the USA project. The Community Indicators Consortium was hosting a conversation between SUSA and community indicators practitioners, held at the Urban Institute offices in D.C. (special thanks to Tom Kingsley and Kathy Pettit of the National Neighborhood Indicators Partnership for providing meeting space and refreshments). The discussion flowed easily, as Charlotte Kahn of the Boston Indicators Project facilitated an exploration of civic engagement strategies, technology and tools, and indicator development. I deeply appreciate the opportunity to participate in such a thoughtful discussion.

One challenge communities face as they design their indicator efforts is that of comparability. If the local unemployment rate is 5.5 percent, as someone in one community asked me, is that good or bad? That's not an easy question to answer. I think we've talked about this before, but I'll recap the issues around comparability and context.

One option for communities to answer "is that good or bad?" is to develop a peer comparison set: a group of roughly similar communities, at least in demographics or other key points, against which to compare one's own indicators. This poses several challenges, including selecting which communities could act as reasonable peer comparisons, finding indicators that are measured the same way throughout the peer set, and avoiding significant confounding factors that make the comparisons irrelevant or problematic. For example, if a peer community just faced the closing or transfer of a large employer, that spike in unemployment may not be a fair comparison. At the end of the day, however, using peer comparisons doesn't answer "is it good or bad?" It can only answer "is it better or worse than someone else?" The danger is the TGFM effect (Thank Goodness for Mississippi): there's always somebody doing worse, so your indicator may be met with complacency, depending on the choice of comparables. In fact, things might be getting worse in your community, but as long as things are getting relatively more awful somewhere else, you can feel pretty good about doing poorly. Not an ideal recipe for change.
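To make the "better or worse than someone else" limitation concrete, here's a toy sketch in Python with invented unemployment figures. Notice that nothing in it tells you whether 5.5 percent is actually good:

```python
# Hypothetical peer-set comparison; all figures are invented.
peer_rates = {
    "Our community": 5.5,
    "Peer A": 4.8,
    "Peer B": 6.1,
    "Peer C": 7.3,
    "Peer D": 5.0,
}

ours = peer_rates["Our community"]
others = [v for k, v in peer_rates.items() if k != "Our community"]

# Share of peers doing worse (higher unemployment) than we are.
worse = sum(1 for v in others if v > ours) / len(others)
print(f"{worse:.0%} of peers have higher unemployment")  # 50% of peers ...
# The TGFM trap in one line: this says nothing about whether 5.5 is
# good, or whether our own trend is improving -- only where we sit.
```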

The other side is also true: real progress in the local community can be overshadowed by greater progress somewhere else, turning celebrations into discouragement. The benefit of comparables, however, is avoiding local complacency about incremental change when the rest of the world is making rapid progress on an issue. Ken Jones, of the Green Mountain Institute, points out that celebrating small steps toward reducing lead poisoning in children would be a travesty when most communities have made huge leaps in reducing children's exposure to lead. Only by examining the context in which these numbers are changing can you get an accurate read on what your progress means.

A second option for communities to answer "is this good or bad?" is to compare themselves against themselves -- and against a community vision. This is the decision Jacksonville made with its indicators project 23 years ago. So the question of the unemployment rate becomes "is it better than it was last year?" and "how far away are we from our goal of full employment?" The advantage of this approach is that communities look inward, measuring themselves against their own standards of where they want to be, and can galvanize action or celebrate success accordingly. The disadvantage is that this approach can miss significant national confounding factors outside the control of the local community, or the community might not reach far enough or fast enough for improvement when the rest of the country is moving rapidly toward success.
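In code, this self-comparison is almost trivially simple, which is part of its appeal. The numbers and the goal below are invented for illustration:

```python
# Option two in miniature: compare against ourselves and a community
# vision rather than against peers. The goal is whatever the
# community's visioning process set; all figures are invented.
last_year, this_year, goal = 5.9, 5.5, 4.0  # unemployment rate, percent

print(f"Change since last year: {this_year - last_year:+.1f} points")
print(f"Distance from our goal: {this_year - goal:.1f} points to go")
# Both answers are about us: better than we were, not yet where we said
# we want to be. No peer city appears anywhere in the question.
```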

A third option, and one that Jacksonville is moving toward, is using peer communities and national measures as context, not comparison. This means not asking "are we doing better than Nashville?" but instead "what does our trend line look like in the context of the shared movement of peer communities?" The question is not "where do we rank among these cities?" but rather "what is happening across the country on the indicator being measured, and how can we better understand local efforts and progress?"

This approach allows us to consider the national decrease in teen birth rates or violent crime and see whether our local efforts at teen pregnancy prevention are making progress beyond what would otherwise have been expected if we were simply following national patterns. It allows us to see the local murder rate in context -- although it is lower than it was a decade ago, something is happening in our community that makes it more violent than it should be if all else were equal. These are much harder questions to ask, and they require more work to put together the information that provides useful context. But it may be getting easier, if a few new national projects provide the tools for selecting peer comparables as I hope they will.
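Here's a rough sketch of what that context calculation can look like, using invented teen birth rate figures. The idea is to project the local rate forward under the national trend and compare the projection with what actually happened:

```python
# "Context, not comparison": would we have done better or worse if we
# had simply tracked the national trend? All numbers are invented.
national = {2003: 43.0, 2008: 40.2}  # e.g., teen births per 1,000
local    = {2003: 48.0, 2008: 47.1}

national_change = national[2008] / national[2003]  # ~0.935
expected_local = local[2003] * national_change     # ~44.9

gap = local[2008] - expected_local
print(f"Expected if we tracked the nation: {expected_local:.1f}")
print(f"Actual: {local[2008]:.1f} (gap: {gap:+.1f})")
# A positive gap means the local rate fell, but by less than the
# national context suggests it "should" have -- the local effort
# hasn't yet outpaced the pattern everyone shares.
```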

Imagine, if you will, being able to select your own comparable communities from an extensive national dataset, based on the sociodemographic variables you select. That would be useful, yes? Now imagine being able to select comparable communities based on a wider set of indicators than just socioeconomic status or demographics. Imagine putting together different peer sets depending on which variables or indicators are under consideration. Now imagine adding a time dimension to the equation, so that you could look at communities that five years ago were where your community is today on an issue, and then see which communities improved significantly. How much more useful would it be to see who has actually solved the problem your community faces today? How exciting would it be to then examine their public policy approaches, their human service programs, or their community initiatives to discover what they did that made a difference, and then to replicate that success in your community?
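For fun, here's a toy version of that imagined tool in Python. The dataset, the variables, and the plain Euclidean distance are all placeholders; a real tool would standardize each variable and draw on a genuine national dataset:

```python
import math

# Toy peer-finder: pick communities that five years ago looked like
# ours on chosen variables, then see which improved most since.
# Every community and number here is hypothetical.
# (median_income_k, pct_poverty, indicator_2003, indicator_2008)
communities = {
    "Us":     (42, 14.0, 9.1, 9.0),
    "City A": (44, 13.5, 9.3, 6.2),  # similar profile then, improved a lot
    "City B": (41, 15.0, 8.8, 8.9),  # similar profile then, flat
    "City C": (70,  6.0, 4.0, 3.8),  # very different profile -- not a peer
}

def distance(a, b):
    """Euclidean distance on the 2003 profile (income, poverty,
    indicator). A real tool would standardize each variable first."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a[:3], b[:3])))

us = communities["Us"]
peers = sorted(
    (name for name in communities if name != "Us"),
    key=lambda n: distance(communities[n], us),
)[:2]  # the two nearest peers, as of five years ago

# Among those peers, who actually moved the needle since 2003?
for name in sorted(peers, key=lambda n: communities[n][3] - communities[n][2]):
    then, now = communities[name][2], communities[name][3]
    print(f"{name}: {then} -> {now} ({now - then:+.1f})")
# City A's drop is the interesting story: whatever it did after 2003
# is worth studying -- and possibly replicating.
```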

This is, I think, one of the fascinating new possibilities that scalable, searchable, comparable national indicator data sets will be providing in the next few years. I think it will greatly enhance the capacity of local communities to understand their own indicators in a larger context and to identify successful practices for community change. And it is for community improvement that we measure all this stuff, isn't it? We want things to get better, and we know that they get better faster if we make decisions based on accurate information.

You may think my head's in the clouds. Right now, as I look out the airplane window, you would be right. But that won't keep these tools from being developed, and soon. More to come later ... right now, my tray table needs to be placed in an upright and locked position.

Signing off somewhere above a patchwork America.

Read more ...

Thursday, September 18, 2008

Queensland Happiness Index?

This month has required (and will continue to require) a great deal of travel, and my laptop blew up on Tuesday -- I got it fixed, but it slowed down the frequency of postings to this blog. The good news is that we're still committed to bringing you all the news and updates that cross our desk about what's happening in the world of community indicators.

We've talked about Bhutan's Gross National Happiness Index several times. Now Geoff Woolcock, at Griffith University in Australia, is leading an effort to create a Queensland Happiness Index.

After a quick headline-grabbing reference to a "smile-ometer," The Courier-Mail quotes Professor Woolcock as saying,

"An effective community indicators system that monitors the sense of community and belonging, quality of collective life and, more broadly, human rights and equity is more likely to produce policies that reflect what matters to most people."

In conversations with the good folks at the New Economics Foundation, I've heard how they're trying to take their communities from a conversation about happiness to one about sustainability. Real community happiness, they suggest, will only come about when we protect the environment and treat sustainability as a key factor in our quality of life. Our community, on the other hand, began with quality of life (specifically and intentionally focusing on external environments rather than on happiness), and is moving from sustainability conversations to those of happiness.

I'd be interested in hearing about your experiences in your community. Do you talk much about happiness as part of a community indicators project? Do measures of happiness fall within your assessment/indicators framework? If not, are you getting closer to those discussions?

Drop me a line and let me know. It would make me happy ....

Read more ...

Thursday, September 11, 2008

AGA's Fourth Annual Performance Management Conference

News from Perspectives on Performance (www.agacgfm.org/performance/newsletter/0908.aspx)

There's Still Time to Register for AGA's Fourth Annual Performance Management Conference

Join your colleagues at AGA's Fourth Annual PMC, Promoting Government Accountability through Performance Management, set for Oct. 27-28 at the Renaissance Seattle Hotel. If you have not made plans to attend this year's PMC, worth 14 CPE hours, you will miss:

  • Robert Attmore, CGFM, Chairman, Governmental Accounting Standards Board, Harry Hatry, Urban Institute, and Paul Posner, George Mason University on "We've Come a Long Way, Baby--Forty Years of Performance Management and Reporting"
  • Robert Shea, Associate Director for Administration and Government Performance, U.S. Office of Management and Budget
  • Ron Sims, County Executive, King County, Washington
  • Brian Sonntag, State Auditor, State of Washington

This year's conference includes the traditional state and local track of concurrent sessions, and the technical committee has added a federal track of sessions.

Concurrent Sessions:

  • Lessons Learned for the Next Administration
  • Activity-Based Costing and Performance Management
  • Basic Ingredients for a Successful Performance Management System
  • Transforming the Budgeting Process to Deliver Citizen Value
  • Performance Management in Reporting Technology
  • Employee Engagement in Improving Organization Performance
  • So You Want to Get Started--Practical Steps from Preparers
  • Citizen-Centric Reporting--You Can Do It!
  • What's New with Performance Reporting Programs?
  • What Are Community Indicators and Are They Necessary?
  • Integrating Financial and Performance Information

Early registration through Sept. 29, 2008 is $395 for AGA members and $450 for nonmembers. Visit our website for more information.

Lodging--Rooms at the Renaissance Seattle Hotel, 515 Madison St., are available at the government rate of $152 plus tax if you register by Oct. 3. Call 800.546.9184 and mention the "AGA PMC" to receive the government rate.

Two Sponsorship Opportunities Available--Interested in participating as a sponsor at the PMC? Contact Evie Barry.

Training Session Offered After Performance Management Conference

If you are attending the PMC in Seattle in October, you might be interested in participating in a training session on Wednesday, Oct. 29: A New Service Model: Auditor Roles in Government Performance Measurement. Richard Tracy, former Director of Audit Services, Portland, OR, will lead the training on using an organized framework to determine which specific practices provide the best opportunities for adding new value. The training, worth 4 CPE hours, is designed to show how government auditors can increase their value by taking on new roles and practices aimed at improving performance measurement and management. Cost is $50. Seating is limited. Contact Evie Barry for more information.

Read more ...

Wednesday, September 10, 2008

GAAP Changes and Community Indicators

You may have heard the news already -- the U.S. Securities and Exchange Commission is moving toward adoption of IFRS accounting standards in lieu of GAAP. I bet it was a topic of water-cooler conversations and cocktail parties across the world.

Well, maybe not. Unless you're an accountant, the movement of the U.S. to an international set of accounting standards likely won't affect you in the least. I only noticed because of a comment someone made about the announcement, one that seems to have some bearing on the work underway now to help grow community indicators around the world.

In Closing the Information GAAP, Gordon Crovitz writes:

As technology has shown in other areas of life, agreed-upon standards and accepted operating systems drive usage and efficiency. Common measures add value to information. If even the belt-and-suspenders accounting profession is willing to take on the risks of switching its basic system for assessing businesses, we're truly in an era when anything that adds to understanding belongs in the asset column, while anything that undermines transparency is a liability.

Now you see why I thought this was of interest to community indicators practitioners. As we encourage open sharing of information, strong metadata standards, and reliable reporting of indicator trendlines, we're part of a larger trend that sees information as a vital community resource. The better we get at developing shared standards for quality data and reporting, the more we can advance the use of information in community decision-making.

And in doing so, we can encourage the democratization of data, so that information is in the hands of the full community, and not just a privileged few.

What are your thoughts?

Read more ...

Creating Community Indicators Reports

The last couple of weeks, I've been working with several communities in developing their first community indicators reports. After these reports move from draft stage to publication, I'll provide links so you can see what these communities are doing. I'm quite impressed with the depth of thinking and creativity that has gone into these projects, which clearly reflect the priorities and issues specific to their people and geography.

Seeing multiple projects come together at the same time was a welcome reminder of the importance of local input and local variability in community indicators. I applaud, as I'm sure you do, the work of the State of the USA project, which will be of tremendous benefit to local community indicators work, especially if the indicators reported are scalable from the local level through the national. The project will accelerate the use of data in decision-making and highlight the need for community indicators, as well as provide welcome data resources, spotlight the importance of tracking data for issues where data gaps exist, and decrease the upfront costs of developing local indicator sets.

The concern is, of course, that a national set of indicators may be seen as creating a homogeneous template against which all communities can be compared, ranked, and tracked. The local projects I've been working with have shown once again the incredible variability and strength that come from local communities that define for themselves what's important and how they want to measure it.

In talking with one community yesterday after reviewing their draft report, I was asked for suggestions on how to release the report to the community. I'll share with you my response, but I really would like your input as well. After a community develops a set of indicators, creates a framework for sharing them, and researches the trend lines and plugs in the data, what's an effective way to take that set of numbers and charts and graphs and present them in a way that informs, excites, and inspires a community to action?

My reply, in part, was: You have data. You have trends. You have a constellation of [a certain number of] indicators telling you something about [your community.] So what’s the story? What picture does this draw for your community? What has been the reaction of the steering committee to the numbers?

If a news reporter showed up with a camera and a microphone and asked [the community volunteer who chaired the indicator effort] what the report said about the region, what’s the 15-second sound bite?

(I see some really interesting stuff in the report, but the key story really should be drawn from the steering committee’s reaction to the data … and then from that reaction, the key phrase, quote, talking points, picture, and story can be drawn to make the data resonate in the community. [some examples here] There are several stories that jump out – what did the committee see as most important?)

[C]apturing the “a-ha” moments from the committee will be important in describing how the report is framed/translated for the community at large. The key questions to consider often are:
1. What does the data tell us?
2. Why should we care?
3. What should we do with that information -- or, how does it affect decision-making in the region?


So what did I not tell them that they need to know? What's your advice to a community presenting their report for the first time -- how can they engage the community around the data in ways that make a difference?

Read more ...