For an interesting look at key indicators of economic well-being, see this article from TIME Magazine's Brad Tuttle. In it, he refers to this Washington Post graphic/article on an Underwear Index, showing that as times get tough, men wear underwear longer and the sales of this clothing necessity dip.
Tuttle also (in another blog article) highlights the things we buy more of during a recession.
It's another creative indicator set ... but I don't think we'll be using this indicator in our community reports any time soon.
Please keep sending me the interesting metrics you find out there!
(Hat tip goes to Elaine Pace)
Community Indicators for Your Community
Real, lasting community change is built around knowing where you are, where you want to be, and whether your efforts are making a difference. Indicators are a necessary ingredient for sustainable change. And the process of selecting community indicators -- who chooses, how they choose, what they choose -- is as important as the data you select.
This is an archive of thoughts I had about indicators and the community indicators movement. Some of the thinking is outdated, and many of the links may have broken over time.
Monday, August 31, 2009
Wednesday, August 26, 2009
Thanks to Abel Harding and his JaxPoliticsOnline blog for this video of yesterday's press conference at City Hall, where local nonprofits used data to make their case that city funding for nonprofits not be slashed during these difficult financial times.
I thought you might enjoy the different ways the numbers are presented and displayed -- comparisons to familiar terms, straight-up statistics, visual representations of the numbers, and more.
At Indicators Session Report-Out, you can see the working notes of a community indicators project being developed in Buffalo, New York. The actual project site is at creativelysustainable.com.
I enjoy seeing these projects in their development stages, before the shiny graphics and glossy report covers get put on. This is community organizing to determine what's important, finding ways to measure it, and then using those measures to make sustainable change for the future -- how wonderful!
The discussion they had highlighted a number of reasons why they felt community indicators were important:
- Increased community awareness
- Can’t improve what you can’t measure
- Can’t learn from the past if things are not recorded
- Can’t unite people if they don’t have similar understanding
- Helps to dispel myths and misperceptions
- Gives people a way to know when to celebrate progress
- Improves the quality of decision-making
Greensboro also developed a set of reasons why it needed a community indicators system:
The most important and compelling reason Greensboro should implement a Community Indicators System is that the Connections 2025 Comprehensive Plan specifically directs the City to "create an ongoing housing and neighborhood condition monitoring strategy" (Policy 6A.3). Beyond this mandate, however, a Community Indicators System is a valuable tool for:
• Measuring quality of life;
• Monitoring provision of public services;
• Identifying disparities of opportunity; and
• Promoting responsive, accountable, effective, efficient, and equitable government.
Whatever the reasons, I like seeing more community projects grow. If you're in either the Buffalo or Greensboro area, why don't you see if you can lend a hand to these efforts? If you're not, take a look at what they're doing and lend them your moral support -- this is important stuff we're doing!
Monday, August 24, 2009
I don't know if you've seen the Government Assessment Portal (GAP) Newsletter yet, but I'd like to draw it to your attention because of the creative ways people are measuring government performance.
Articles in the newsletter include:
- Bhutan: Planning for happiness
- Angola: On its way to better local governance
- Chile: Taking stock of democratic progress
- Workshop in Cairo: Assessment Methods and Application of Governance Evidence
- Workshop in Namibia: Call for participants
- Work in the Pipeline
Saturday, August 22, 2009
The Council of Community Services, in partnership with the United Way of Roanoke Valley, just released their second annual Roanoke Regional Community Indicators Report (PDF).
From their website:
The report updates key household economic indicators, presentations of valuable data that show change over time, for eight jurisdictions in the Roanoke/Alleghany Region. The report tracks the original 39 indicators and adds two key education indicators for a total of 41 indicators. Job related, education, income and asset related, housing, public assistance, low income and other categories of indicators are included in this updated, data rich, 2009 Edition.
The Council of Community Services is now tracking a limited number of economic indicators on a monthly basis for the Roanoke/Alleghany Region. This report, 2009 Economic Indicators: Monthly Trends (PDF) provides valuable indicator data for the eight jurisdictions in the Roanoke Region. The data covers the period from January to June 2009.
They've done a nice job of selecting indicators at a regional level and reporting them both for the region as a whole and for each of the eight jurisdictions it covers. For policy-makers and planners, this is an important tool for their region. They also do a nice job of taking advantage of the new ACS 3-year estimates for smaller-population jurisdictions. If you're not using this tool yet, take a look at how Roanoke captures important information that simply wasn't available before outside of the decennial census.
Take a look!
(Read more about the report at The Roanoke Times.)
Friday, August 21, 2009
... and yet it's funny. Check it out at Wow Factor Added To Corporate Presentation:
Now that they've added the final touches of wow factor to Thursday's presentation, employees seem confident their pitch is a can't-miss.
"To think about what our presentation was before it had that sort of, you know, that wow factor, is kind of embarrassing," account supervisor Scott Weston said. "It was mostly just slides of straight statistics and comprehensive charts that explained very plainly that in order to more successfully get the word out about our clients, we should utilize all types of media."
Added Weston: "It would have been over in 30 seconds."
A nice send-up of our occasional tendency to overdo the data visualization in ways that obscure the message.
(Hat tip: The Extreme Presentation(tm) Method)
In Aloha Analytics, Brad Parsons summarizes the 2008 Kaua'i Community Indicators Report (PDF) released last month.
The report, put together by the Kaua`i Planning & Action Alliance, has 57 indicators in 7 sections, and was intentionally created to measure progress toward the community vision developed in the Kaua'i General Plan 2000. I like how they begin the document with this quote:
The future does not just happen to us. We, ourselves, create it by what we do and what we fail to do. It is we who are making tomorrow what tomorrow will be. For that reason, futurists think not so much in terms of predicting the future, as in terms of trying to decide, wisely, what we want the future to be.
– Edward Cornish, Editor, The Futurist magazine
The Foreword says:
This 2008 report, Measuring What Matters for Kaua`i, is the second study of Kaua`i community indicators. It tracks 57 indicators that explore the quality of life on the island, the strength of Kaua`i’s economy and the health of its environment. The report was created to provide qualitative and quantitative information on important facets of Kaua`i for those who make decisions about policies and the allocation of resources that affect the lives of residents and the `aina.
(I looked up the term 'aina and it refers to the earth or the land.)
Each indicator has a short statement describing the trend, a narrative section on Description and Relevance, another narrative section describing How is Kaua'i Doing?, a table, an Indicator Chart (or two), and sometimes a Status Chart. The last two were of most interest to me -- sometimes they compare themselves to themselves over time with a trend line chart, and then add a second chart (the Status Chart or a second Indicator Chart) to explore the same indicator from a slightly different angle.
Sources for the data are included, usually in the table section, and often include a web link for more information.
The report is strong on analysis and depth of data. My only regret is that it could be prettier -- more visually inviting to the reader, with something of the grace and natural beauty found on the island. Instead, this is a 112-page manual for the serious wonk, and it's a strong addition to the community indicators field. Take a look.
Thursday, August 20, 2009
I just took a look at The 2009 Report Card on Child Well-Being for Austin/Travis County, Texas (PDF) and had to pass it along. It's three pages long, and it divides 13 indicators of child well-being into 3 categories: Healthy Indicators, Happy Indicators, and Smart Indicators.
What I found most darling about the indicators was their method of marking progress. Instead of a stoplight scale (red-yellow-green) or up-or-down arrows or right-to-left gauges, they provided the following:
I think this challenges us all to find the right way to communicate our message to our intended audience(s). Congratulations to the Capital Area United Way for a job well done!
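For comparison, the conventional stoplight approach mentioned above can be sketched in a few lines. This is only an illustration, not anything from the Austin report; the 2% threshold and the function name are my own arbitrary choices:

```python
def stoplight_status(change, threshold=0.02):
    """Map a fractional year-over-year change in an indicator to a
    conventional stoplight rating: green for meaningful improvement,
    red for meaningful decline, yellow for little movement.
    The 2% threshold is an illustrative choice, not a standard."""
    if change >= threshold:
        return "green"
    if change <= -threshold:
        return "red"
    return "yellow"

# For an indicator where higher values are worse (e.g., a crime rate),
# flip the sign of the change before classifying.
print(stoplight_status(0.05))   # a 5% gain rates green
print(stoplight_status(-0.01))  # a 1% dip rates yellow
```

The point of the Austin example is that even this simple classification step is a communication choice -- the scale you pick shapes how readers receive the news.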
Wednesday, August 19, 2009
I was asked recently about how community indicators are actually used in a community to make things better. The context of the question was this: Someone new stepped into an organization that had an existing (and very good) community indicators program. She interviewed key stakeholders in her community and found two general responses: first, that people knew of and appreciated the community indicators project; second, that no one was quite sure what to do with it.
I suspect this is a question most of us face in our communities. What do you do with these indicators reports? Let me share with you an edited/expanded version of my answer, and get your feedback on anything I left out or need to clarify.
I firmly feel that community indicators need to serve as an integral part of a community change model. I've talked about that before on this blog, and I'll likely be talking about it again. We don't do indicators because they're pretty, or cool, or nice to have (though they are all of the above). We do indicators because we want to help make the community better.
This community change model, the way we use it in Jacksonville, begins with a shared vision for the future. You have to have at least a broad idea of where you want to go.
The indicators then measure where we are in relation to that vision – are we getting closer to the desired future, or farther away? The indicators are not just about the specific aspects they measure – high school graduation rates aren’t just about high school seniors, but serve as a proxy measure of the effectiveness of the entire school system in achieving the desired results. The indicators point to areas of progress and areas that need greater community attention.
But the indicators themselves don’t identify solutions. They are descriptive, not prescriptive. They flag items of community concern, not just because we care about (in this example) education, but because this is an area we need to work on to improve our overall community and get closer to our shared vision. They set a general community agenda for action.
When the indicators identify priorities for community action, they need to trigger some sort of planning process. The indicator says something’s not working right in our current systems – what do we do about it? This planning process needs to identify strategies for action. Who will do what to address the issues identified by the indicators?
These plans need to be implemented. Someone needs to make sure the community follows through. This is where the indicators inform the decision-making processes of community institutions, from government to nonprofits to business entities and faith institutions and anyone else with a vested interest in the community. This is why the “shared vision” part is critical – to achieve the shared vision will usually take shared action, and people/institutions stepping out of traditional roles to address the shared community priorities identified by the indicators. This is one of the great strengths of the community indicators process: moving a community issue from “their problem” to “our problem.”
The actions will have results, and those outcomes need to be identified. The real success of the efforts to improve the community should be evaluated/assessed through the indicators themselves – did our strategies work? Did we make a difference? Based on what the indicators tell us, we may need to revisit our vision, rethink what we are measuring, or go back to the drawing board and develop new strategies for action.
That’s both why indicators are so important, and how to use them effectively:
1. To identify shared priorities for action;
2. To inform strategies and community planning efforts;
3. To hold the community accountable for action;
4. To assess results and evaluate success; and
5. To measure progress toward a shared community vision.
That’s the general overview. There isn't one right way to make them work, or one right set of actors to use the indicators to make change. In Jacksonville, the community indicators are used:
1. To inform philanthropic grant-making efforts
2. To develop Chamber of Commerce focus actions
3. To create the annual curriculum for our Leadership Jacksonville programs
4. To identify topics for public television/radio programming
5. To support grant-writing efforts
6. To influence legislation and policy-making at the local, state, and national level
7. To ask for community investment and increased corporate responsibility from local corporations
8. To bring together collaborative partnerships around an issue and break down silos
9. To highlight areas for further study, both in our own organization and in academic and other research areas
And more. As the indicators become institutionalized in the community, their influence broadens, but the role of the community indicators practitioner, besides sharing the data, is to help ensure that they are being actively used to galvanize action.
How we do that is the subject of another blog post.
The folks at Swivel's blog, Tasty Data Goodies, have put together an interesting list of their favorite blogs.
It's a good list of data and data visualization groups. Many of the sites they list are on the blogroll in the right-hand column of this site, and I may add a couple more based on their recommendations.
Some of the blogs I follow that aren't on the list are:
GraphJam: It's a place for user-created graphs that tend to be on the silly side, but a lot of fun anyway. From the lolcatz folks.
Indexed: Sometimes humorous, sometimes thoughtful, Jessica Hagy draws graphical relationships on index cards. She's now published a book of these.
StrangeMaps: Not always data, but some interesting ways of looking at the world and representing it graphically. Worth a look.
The AGA's blog often has quite a bit to say about government performance measures. The guest authors are usually quite thoughtful in their presentations.
I also link to the blogs of PolicyMap and InstantAtlas as two of the more interesting tools out there for reporting/presenting community indicators.
There's more -- take a look at the blogroll -- but these I especially like. I'm interested in the blogs you follow that have something to do with community indicators -- or the blogs you write. Care to share? What have I left out? Which of these blogs are your favorites?
ETA: Don't forget this great list of blogs from Nathan Yau last May -- there's a wealth of resources available out there for us!
Tuesday, August 18, 2009
This may be of interest to those, like me, who are in the process of updating their community indicators.
Release Schedule for the 2008 ACS and 2006-2008 ACS Data
On September 22, 2009, the Census Bureau will release the 2008 ACS 1-year estimates. Similar to last year's release, the 2008 ACS will include 1-year estimates available for the nation, 50 states and the District of Columbia, Puerto Rico, every congressional district and all counties, places and metropolitan areas with populations of 65,000 or more. Included are the ACS income, earnings, and poverty data as well as all other ACS estimates on social, economic, demographic, and housing characteristics.
On October 27, 2009, the Census Bureau will release the 2006-2008 ACS 3-year estimates, based on data collected from 2006-2008, for all geographic areas with populations of 20,000 or more. Included are the estimates on social, economic, demographic and housing characteristics.
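For those of us deciding which estimates to use for each jurisdiction, the population thresholds above are easy to encode. A minimal sketch (the function name is mine; the thresholds come from the release schedule quoted above, which predates the 5-year estimates):

```python
def acs_estimates_available(population):
    """Return which ACS estimate products cover an area of the given
    population, per the 2009 release schedule: 1-year estimates for
    areas of 65,000 or more, 3-year estimates for 20,000 or more."""
    products = []
    if population >= 65000:
        products.append("1-year")
    if population >= 20000:
        products.append("3-year")
    return products

print(acs_estimates_available(50000))   # ['3-year']
print(acs_estimates_available(100000))  # ['1-year', '3-year']
```

Smaller areas fall through both thresholds and, in 2009, still had to wait for decennial census data.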
Additional ACS Tools and Information - Coming Soon
Later this summer and into the fall the Census Bureau will provide tools to aid ACS data users. These tools include guidance on making comparisons between the 2007 and 2008 ACS data, ACS table shells, and a new e-tutorial to assist novice ACS data users.
If you have questions or comments about the American Community Survey, please call (800) 923-8282.
Here's our good friend Dilbert, on benchmarking:
That's an ... interesting definition. But it's a starting point to talk about the problems with indicators and comparisons.
It's amazing how much history, personality, past decisions, key individual leaders, social structures, civic involvement (among people and different types of community institutions), legacies, traditions, and a thousand other factors influence the success or failure of community improvement efforts. Pick a topic -- teen pregnancy, high school graduation rates, economic self-sufficiency, crime prevention -- we've had a thousand communities wrestle with these problems and have their combined experiences to draw upon.
So we know something about what has worked (and what hasn't) in some places and for some issues. But implementing the same program/process/policy in a different place often leads to less-than-expected outcomes.
I suspect the art of community improvement has to be informed by the data, the research, the combined experiences of those who have had success and those who have not. But the factors that lead to success often remain left out of our narratives -- the trust factors and relationships that made it work, the key driving personality(ies), the community cultural backdrop against which the drama played out.
Over the years, I've been gaining a greater appreciation for our story in Jacksonville, Florida. My organization is about to turn 35 years old; our indicators project, 25. But the story of our success begins, in part, in our community's response to a smallpox epidemic in 1883. And the structures in place that created the community conditions that allowed a different kind of response to the disease predated that effort by generations.
Does that mean our successes in Jacksonville can't be replicated? Of course not -- hundreds of communities have shown they can do what we have done, and more. But those communities brought their own stories and histories and people and institutions together and developed their own organizations and their own indicators in ways that never cease to delight and teach me.
As we develop our own success stories and case histories, may I suggest thinking about both the principles/processes/projects that led to success, and also the context in which those principles/processes were put into action. Both, I think, are critically important to understand, as we increase the knowledge base necessary to make communities better.
Thursday, August 13, 2009
Call for Nominations:
Community Indicators & Performance Measures Integration Awards
The Community Indicators Consortium is inviting nominations of projects (US or international) that best demonstrate how integrating community indicators with organizational performance measures can advance sustainable change in their communities, and the power of that integration to drive change.
The purpose of the Awards Program is to
- recognize the collaborative efforts among community stakeholders to integrate community indicators and organizational performance measures projects, and the people who made it happen; and
- add to the public body of knowledge about promising practices in community indicator-performance measurement integration.
Allen Lomax and Cheryle Broom
CI-PM Steering Committee Co-Chairs
Community Indicators Consortium