Community Indicators for Your Community

Real, lasting community change is built around knowing where you are, where you want to be, and whether your efforts are making a difference. Indicators are a necessary ingredient for sustainable change. And the process of selecting community indicators -- who chooses, how they choose, what they choose -- is as important as the data you select.

This is an archive of thoughts I had about indicators and the community indicators movement. Some of the thinking is outdated, and many of the links may have broken over time.

Friday, June 26, 2009

Are your Community Indicators making a difference?

Yesterday I led a lunchtime conversation via webinar on the question, Are your community indicators making a difference? The webinar was sponsored by the Community Indicators Consortium and was a members-only event, and I know several dozen of you were disappointed in not being able to attend. I thought I'd summarize my notes for both the attendees and those who missed the event, and continue the conversation. (Plus you ought to join CIC to not miss out on their next webinar!)

For the webinar, I spoke from the experience of an organization that is currently working on its 25th annual community indicators report. We've seen a generation of community leaders who have stepped into leadership roles that have always had our indicator reports there to guide them. Along the way, we've learned a little bit (through many trials and lots of errors!) about how to tell if your indicators are being effective.

I tried to organize my remarks this way:

One topic: Measuring the effectiveness of your indicators project
Two key questions: Who is your intended audience and what are your intended results?
Three meta issues: Design, Timing, and Source
Five areas to measure results

(I know there's no four. Feel free to chime in with what I missed.)

Let's jump to the two key questions: intended audience and intended results. Defining your audience is not easy work, but it is critical for the rest of the discussion. Are you producing your indicators for elected officials? For public officials (the non-elected ones behave differently than those who need to campaign for their positions)? For community activists? For statisticians and data professionals? For chambers of commerce and business groups? For United Ways, community foundations, or other funders of non-profits? For grantwriters? For non-profit organizations and service providers? For everyday citizens? For students? For the media?

You may want everyone to use your indicators. I know the world would be a better place if everyone read and internalized every report I produce. But who is/are your primary audience?

And what do you want them to do with the indicators? Possible intended results include:
  • Inform / educate / raise awareness
  • Build shared priorities
  • Shape decision-making
  • Influence budget allocations
  • Define public policy
  • Inspire action
  • Demand accountability
  • Measure performance/outcomes
And there are more possibilities. Before we can deal with the big question -- are your indicators making a difference? -- we have to be able to answer these two key questions: who do you want to do what with your data?

In my organization in Jacksonville, the question of indicator effectiveness is driven by a Model of Community Improvement. It's our "theory of change" that explains why we do indicators and what we hope to accomplish with them. I'll describe the model below:

Briefly, we suggest that change begins when we identify what change we want -- we create a vision for the future, based on our shared values in a community. (I know we like to think data are objective, but every indicator we include in our reports is a value judgement, as is every indicator we don't include. Every desired direction in a trend line is a value judgement. Go ahead and begin by articulating the values, instead of assuming implicit agreement on them.)

In order to know where we are as a community in relationship to that vision, we develop indicators. These community indicators then help us determine where we are falling short, what our priorities for action are, and inform the research, planning, and strategizing processes. The indicators themselves don't tell us what to do -- they are descriptive, not prescriptive. They do tell us where we need to do something, and we suggest that indicators be accompanied by planning processes to determine what to do about the indicators that fall short of our desired expectations.

Plans require action, which is the next step in the model. If we can act ourselves, we do so; if we need to convince others to act, then advocacy is required to get the desired actions.

Actions have consequences; the outcomes or results of those actions then need to be assessed to see if they achieved the desired results. Here is where our indicators come into play again -- are we closer to where we want to be? Based on the indicators, we can determine if we need to reshape our vision, adjust what we're measuring, or go back to the drawing board and develop new plans.

Indicators play two critical roles in our model for community change -- they identify priorities for action, and they assess the results of that action. In order to measure the effectiveness of our indicators, then, we measure how well they serve both of those functions.

This isn't the only possible theory of change, of course. Yours might be quite different. But determining indicator effectiveness has to include some thinking about the model you're using in applying those indicators. Why are you measuring indicators? What difference do you want your community indicators to make?

That moves us from our two key questions to our three meta issues: design, timing, and source.

By design, I mean simply presenting the information so that your intended audience can use it to achieve the intended results. We don't think about design that way, I'm afraid. We look at what looks cool, what our peers are accomplishing, and what we like to see. We want to present our data in the most impactful way possible -- but many times, we're thinking about what is most impactful to us. And we tend to be different from our targeted audiences.

Elected officials, for example, tend to want the information presented clearly on one printed page in their hand when they need it. Researchers want more detail. Grantwriters need different kinds of data break-outs. Regular citizens need something that's not so intimidating and doesn't make them feel like they're back in math class. Your design has to meet the needs of your audience in a way that allows and encourages them to use the information to achieve the desired outcomes. (On the webinar, I shared a quick succession of a series of indicator reports, both print and web-based, to show the wide variety out there. If you've been reading this blog, you've seen the examples and many more. Not every report needs to look alike -- but to work, they have to meet the intended audience where they are!)

By timing, I mean three things: time of year, update frequency, and data relevancy. The report needs to coincide with the decision cycles it hopes to influence, and the information in it needs to be current enough to influence action. For example, one of our intended audiences is our local United Way's resource allocation team. They need the information in our report to inform their decisions in allocating money to different programs. The report needs to be available before they meet, but not too far before they meet because the information in the report needs to be as current as possible. They make decisions on an annual basis, so to institutionalize the indicators in the decision-making cycle the indicators need to be updated annually. If your indicators are out of sync with your intended audience, they won't be used to achieve your intended results -- they become an interesting curiosity, not a decision necessity.

By source, I'm talking about who you are as an organization. When you publish your indicator report, is it seen as trusted information from a trusted place? Take a moment for some painful introspection. In general, data from advocacy organizations are not trusted by people without a shared belief in the cause. If your mission is to tell people to put children first, and you issue a report with indicators in it that say children should come first, your organization's values will cloud the usefulness of that data. Your indicators will not be used by people who don't already believe children should come first.

How open and transparent is your indicator selection process? Who determines which indicators are chosen? Does the community know why you're measuring what you do? How open and transparent is your data review process?

Sometimes we have to choose our role in the community. It is remarkably difficult to be the trusted neutral source for information AND the community advocate for a single position. It almost never works to try to be both.

Once we have dealt with these issues, we can look at how we measure ourselves and the effectiveness of our indicators. There are at least five different areas in which we can look at effectiveness:
  • Explicit use of indicators in information sharing. By this I mean the number of times your indicators are used by other people (media, public officials, other organizations, your intended audience) in talking about the issue. For example, we have been able to track not just the media coverage of our report releases, but the way the indicators have been used over the course of a year to talk about issues, to justify positions, or to advocate for a cause. If the intended result is to raise awareness, you can track how the indicators are being used for that purpose and how often your report is cited, linked to, or quoted.
  • Explicit use of indicators in decision-making. We find our indicators cited in whereas clauses and in public debates when key decisions are made. Sometimes we are asked to present the data to a decision-making body. Sometimes the indicators are cited in justifying decisions. Sometimes people will come to us and thank us for having the indicators available which helped them prevail in a political decision or in receiving a grant. If your intended result is to influence decision-making, track these. We also survey our intended audience and ask them about how they have used the indicators in their decision-making.
  • Institutionalization of indicators in decision-making. This is where decision-making processes are built with the data report in mind. This is an important outcome we work towards. This can include policy and budget decision-making, but it can embrace many other things. Our local Leadership Jacksonville program builds its curricula for its four leadership programs with our indicators in mind -- all participants receive a copy of the report, and they are encouraged to use the indicators to better understand the community. Think about who you want to use the indicators, and in what fashion, and then help them design their processes with the indicators as a fundamental/necessary piece of that process. Remember the issue of timing!
  • Cross-disciplinary/cross-institutional priority-setting and collaboration around identified issues. Your indicators can help set the community agenda. What priorities have you identified? Who has embraced those priorities? More importantly, who has stepped out of their silo or comfort zone to step up to a shared community priority identified by your indicators? In our case, we pay attention when the indicators are used by our Chamber or Mayor to tackle an issue that's not traditionally their focus or responsibility, or when multiple groups join together in a common cause identified by the indicators. That's a desired result, and we note that activity.
  • Improvements in the indicators themselves. You measure indicators that you want to improve. They're important, or else you wouldn't measure them. Our model for community improvement demands that we pay attention to what the indicators are telling us -- are we moving closer to the desired goals? If none of your indicators are getting better over an extended period of time, then your report isn't being effective in motivating change.
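That last area invites a simple automated check. As a minimal sketch (the indicator names, numbers, and desired directions below are hypothetical illustrations, not JCCI's actual measures), you could compare each indicator's recent trend against the direction you want it to move:

```python
# Minimal sketch: flag whether each indicator is moving in its
# desired direction. All names and numbers are hypothetical.

def trend_status(values, desired="up"):
    """Compare the latest value in a time series to the earliest one."""
    change = values[-1] - values[0]
    if change == 0:
        return "flat"
    improving = (change > 0) == (desired == "up")
    return "improving" if improving else "worsening"

indicators = {
    # name: (time series, desired direction)
    "high school graduation rate": ([68.0, 70.5, 72.1], "up"),
    "violent crimes per 100,000": ([812, 798, 845], "down"),
}

for name, (series, direction) in indicators.items():
    print(f"{name}: {trend_status(series, direction)}")
```

A real report would be more careful than this -- smoothing out year-to-year noise, for instance -- but even a crude check like this surfaces which indicators are moving the wrong way over an extended period.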
That's a summary of what we talked about in the webinar. I'm interested in your comments and suggestions to continue the conversation.

Read more ...

Tuesday, June 23, 2009

Job Opening: Research Analyst/Web Communications Specialist

MAPC’s Data Services group has an opening for a Research Analyst/Web Communications Specialist. The Data Services Group seeks to utilize information, technology, and tools to inform public policy and drive social change.

Research Analyst / Web Communications Specialist

The Boston Metropolitan Area Planning Council seeks a Research Analyst/Web Communications Specialist for the Data Services Group. The Research Analyst will collect and organize data, conduct analysis, and prepare data for presentation in print and electronic formats, as well as help maintain a public data website. Applicants should have excellent analytic and communications skills and knowledge of emerging Web 2.0 technologies.

This is an opportunity to work in a dynamic, interdisciplinary environment focused on using data and analysis to support regional planning and policymaking.

Primary Responsibilities

  • Data collection, management, and analysis: Research and obtain updates to MAPC’s existing data sets as they are released from public agencies, the private sector, and other MAPC units. Clean and format data sets and prepare summary reports. Research and identify new data sources and integrate them into MAPC’s data warehouse. Build relationships with other agencies and allied organizations to support ongoing data-sharing.
  • Manage MetroBoston DataCommon website: Maintain HTML component of MAPC’s online data viewer and web mapper, prepare monthly newsletter, and conduct user trainings.
  • Lead innovative applications of technology to planning: Keep abreast of best practices for the use of technology in planning and make recommendation for implementation by MAPC. Technologies include, but are not limited to, Internet mapping, data visualization and analysis, public participation technology, and social media.
  • Other tasks: Respond to email and telephone data inquiries from municipalities and allied organizations. Create databases, forms, and reports as needed by MAPC staff. Assist Data Center staff in presentation of reports and other information, both oral and written.
  • Perform other duties as necessary

The position requires experience in conducting independent research and in the creation and maintenance of various databases. The following are required qualifications for the position:

  • A bachelor’s degree in planning, public health, economics, computer sciences, or a related field or at least 3 years of experience in a related field.
  • Excellent written and oral communications skills, especially for emails, websites, briefing papers, and technical documentation.
  • Demonstrated strategic and analytical capabilities, capacity for innovation, self-motivation, and goal-orientation.
  • Experience with management of databases.
  • A high proficiency with Microsoft Access, ESRI ArcMap, and HTML.
  • Experience working with diverse data sets from federal, state, and local agencies.
  • Knowledge of emerging Web 2.0 technologies, including social networking and geospatial websites.

The following are additional desired qualifications for the position:

  • Master’s degree in planning, public health, economics, computer sciences, or related field.
  • Experience with open source programming and open standards.
  • Experience with scenarios modeling, visualization, and 3-D modeling.
  • Experience with the creation of population and employment projections.


The MAPC Data Services Group is responsible for data management, data analysis, and policy research. The Data Center has a four-part mission:

  • Provide data analysis, mapping, and policy analysis support to all other departments within the agency.
  • Provide basic analytical services and data products “on demand” to municipal clientele and allied organizations and agencies.
  • Undertake customized research projects or studies for external clients.
  • Conduct independent research on emerging planning issues of regional significance to help educate public and private stakeholders about regional trends and policy alternatives.

Over the past few years the Data Center has:


· The Metropolitan Area Planning Council is a regional planning agency serving the people who live and work in Metropolitan Boston. Our mission is to promote smart growth and regional cooperation, which includes protecting the environment, supporting economic development, encouraging sustainable land use, improving transportation, bolstering affordable housing, ensuring public safety, advancing equity and opportunity among people of all backgrounds, and fostering collaboration among municipalities.

· Our work is guided by our regional vision, "MetroFuture: Making a Greater Boston Region." Our Council members and staff work to advance this plan through technical assistance to cities and towns, data analysis and mapping, research, collective purchasing, public engagement, and advocacy for public policies that advance our mission.

· We welcome to our staff team intelligent, thoughtful and entrepreneurial professionals who are committed to improving the quality of life in Metro Boston.

Salary range: mid- to high-$40’s. Excellent state employee benefits package.
Position open until filled. Review of applications will begin on July 6.
Interested candidates should submit a cover letter, resume and three references to
Email responses strongly preferred. Only applications with cover letters will be reviewed.
MAPC is an EOE/ AA employer.

Read more ...

Wednesday, June 17, 2009

Free Access to Child Indicators Journal

Free Access to Complete Child Indicators Research Journal

Dear Researcher,

Editors-in-Chief Asher Ben-Arieh & Bong Joo Lee have selected the following articles recently published in Child Indicators Research to keep you up to date with important developments on how child indicators can be used to improve the development and well-being of children.

You can read, download and save these articles as if you were a subscriber. The complete Journal is available online for free until July 31, 2009.

Editors' Choice Articles

Read the complete Journal: Click here

Enjoy reading.

Regards, Jasper de Vaal
Product Manager Human Sciences

Read more ...

Monday, June 15, 2009

Job Opening: Project Director

I'm relaying this from the NNIP listserve. For full details and to apply, visit Transtria's web site at

Transtria is a certified, woman-owned, small public health research and consulting company with a vision of uniting people, places and policies to revolutionize public health. Transtria specializes in providing leadership and technical assistance to support the development, implementation and evaluation of research and practice-based projects. Transtria works with clients and community partners to understand their needs, create a collaborative process for engagement, identify strategies for change, monitor progress, document outcomes and share findings with others.

We are seeking a well-rounded leader to help guide the direction of our new Healthy Kids, Healthy Communities project and to play a key role in the ongoing growth and development of our organization. The following capacities will serve as the basis for review of the candidates:

· Lifestyles, behaviors and health: knowledgeable about nutrition, physical activity and obesity and associated ecological factors contributing to these outcomes

· Health inequities: commitment to prioritizing communities marginalized by social, economic or environmental determinants of health

· System, policy and environment change: understanding of multidisciplinary approaches to create community and organizational changes to impact health behaviors and health outcomes; insight into the processes, assets and challenges associated with planning and implementing these approaches

· Research and evaluation: facility with methods and measures to assess and evaluate systems, policies, environments, practices, promotional and programmatic activities, behaviors and health; qualitative and quantitative data collection, management and analysis

· Collaboration and partnership: experience with community engagement and relationship building; competence in directing or facilitating communications with a diverse array of community partners (e.g., policy-makers and elected officials, community leaders, community-based organizations, coalitions, advocacy groups, or representatives from local government agencies – health, planning, transportation, parks and recreation, community and economic development)

Please see the attached position description for a list of essential functions and minimum requirements of the position. Applications are available on our website:

Read more ...

"Greenest Cities" Indicator Sets

Here's a nice round-up of measures of sustainable or "green" cities from a number of sources. It's from Elizabeth Barrette on Gaiatribe, which is a blog "support[ing] the premise that humanity is the part of the biosphere that can think, which gives us the responsibility to care for life and the Earth. Here you'll find discussions of renewable energy, sustainable development, intentional community, nature religions, environmental awareness, and related topics."

The list provided is a quick overview of a number of resources and indicator sets to look at urban sustainability. Take a look, and let me know -- which other resources should have been added to that list?

Read more ...

Greater Louisville Project

The Greater Louisville Project has released their 2009 Competitive City Report (PDF), and I commend it to you for several reasons.

First, I want you to look at the amount of information the report conveys in 7 pages. They have historical, current, and projected data over a number of indicators that show Louisville's place among its peer cities. (I was particularly interested because Jacksonville is one of those peer cities selected -- the more research others do about Jacksonville, the easier my job becomes!)

Second, the report is focused on a single goal -- "to move Louisville into the top tier among its peer cities by the end of this decade." To do that, they identified three "deep drivers" -- Education (specifically raising educational attainment to develop a more highly skilled workforce), Jobs (21st century industry and jobs), and Quality of Place (which deals with the urban vitality required to attract talent and 21st century industry). Each indicator is then linked to one of the drivers and tied directly to the goal.

Third, the mix of graphics and use of color makes the report engaging. You get the picture quickly.

Fourth, the website is engaging. You get the overview, and then can click through any of the three deep drivers, and then go to any chart, and then get an Excel data chart for each indicator -- I like the way they meet the information needs of multiple audiences.

The last thing I'll mention is that I liked how Jacksonville was improving -- our rates of progress, specifically in educational attainment and household income, have been a concentrated local concern, and I'm glad to see other people taking notice of our growth in these areas. Plus check out Jacksonville's parkland indicator! Hooray for preserving land as a recreational asset, environmental protection, and growth management tool!

Anyway, check out the report, and keep the new report release information coming.

Read more ...

Saturday, June 13, 2009

Indicators Report: Whatcom Counts

Elizabeth Jennings, Executive Director of the Whatcom Coalition for Healthy Communities in Whatcom County, Washington (Bellingham area) sent me a note to look at their indicators website, Whatcom Counts.

The focus of the website is on community health indicators. They do some nice things with the site that I think community indicators practitioners should pay attention to.

First, they use dashboard images well to identify where the indicators are against a scale of where they ought to be. The visualizations appear as mini-dashboards, gauges marked off in red, yellow, and green, so that the message is easily recognizable, even without any numbers attached. Clicking through the image gives you the current value, the values for red/green/yellow, and a mouseover gives you a legend that explains if the comparisons are made over time, by average, or by region.
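The gauge logic behind a display like that is simple to sketch. As a hedged illustration (the function name and threshold values here are hypothetical, not Whatcom Counts' actual cut-points), the core is just mapping a value against two thresholds:

```python
# Minimal sketch of red/yellow/green gauge logic. The thresholds are
# hypothetical; a real site would set them per indicator, whether the
# comparison is against a target, an average, or a regional value.

def gauge_color(value, green_at, red_at):
    """Map an indicator value to a status color.

    Works whether higher is better (green_at > red_at) or
    lower is better (green_at < red_at).
    """
    higher_is_better = green_at > red_at
    if (value >= green_at) if higher_is_better else (value <= green_at):
        return "green"
    if (value <= red_at) if higher_is_better else (value >= red_at):
        return "red"
    return "yellow"

# e.g. an immunization rate: green at >= 90%, red at <= 75%
print(gauge_color(82.0, green_at=90, red_at=75))  # prints "yellow"
```

The same function handles both kinds of indicators -- ones where higher values are better (a graduation rate) and ones where lower values are better (a crime rate) -- which matters because community dashboards mix both.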

Second, clicking through the indicator gives you more detailed information, including what the indicator measures, why it is important, technical notes, source information, the URL of the source, the URL of the data, and a graph of time-series data. I appreciate this attention to the metadata needs of the end user, and applaud how they've taken a simplified image and combined it with more detailed data to meet needs of multiple user types.

Third, I like the grouping of the indicators by topic centers (with a broad vision of what constitutes community health), as well as providing the option to search for individual indicators. The interface for finding indicators appears intuitive and easy to use.

Fourth, I like how the site brings together multiple information formats -- indicators/statistics, reports/data, and current news. The inclusion of a "feature story" on the main page provides ways to engage with the information for those who are not comfortable extracting meaning by themselves from the data, and provides a context for understanding how the indicators interact with key community issues.

Last, I like the way the user is given options to Click -- Learn -- Act. The Act section is interesting; they've provided the following links:

ACT - Get involved. Apply information and ideas to an issue that is important to you.
Promising Practices - Find solutions
Local Resources - Get involved locally
Contribute Content - Submit content

I think this is an exciting presentation of information. Elizabeth writes:

"Since launching in 2006, we’ve been piloting how to use the site as a tool in our community convening work, not just a static source of information. Our traffic has grown from about 1,500 visits per month at the beginning of 2008 to over 9,000 in January 2009. "

I think you'll agree that they're doing some remarkable work. Take a look and let me know what you think.

And keep sharing with me your indicators projects! We all benefit when we can learn from each other.

Read more ...

If It Matters, Measure It!

I wanted to pass along this blog post from Scott Burns highlighting the importance of government performance measurement and pointing to the CDC's metrics page as a best practice example.

Along the same line, I stumbled across this blog called Measures Matter that is focused on government performance measurement.

Of course, I still rely on the resources at the Public Performance Management Reporting Network and the Association of Government Accountants (AGA) Blog.

What other resources do you find helpful for discussing government performance measures?

Read more ...

Friday, June 12, 2009

Webinar: Are your community indicators making a difference?

I'm going to be putting on a Webinar for the Community Indicators Consortium on Thursday, June 25, 2009, from 12:00 PM - 1:00 PM EDT. This one is just for the members of CIC, so if you'd like to attend but are not a member, please consider joining.

The title of the webinar is: "Are your community indicators making a difference?"

This will be a conversation about how to measure the results of your community indicators project (CIP) and the key design elements that increase a CIP's effectiveness and impact. Here's what they're saying about the webinar:

Ben Warner, Deputy Director of JCCI, will lead the dialogue, beginning with his insights learned from JCCI's 25 years experience in annual community indicator reports. Bring your questions and your experience to the table June 25th from 12-1pm EDT.

After you register you will receive a confirmation email with information about signing into the meeting.

Note that CIC Lunchinars are currently only open to CIC members. If you are interested in becoming a member, see the Join CIC section of our web site at

I hope you can join us.

Read more ...

CIC Extends Time for Proposals

Community Indicators Consortium
2009 Conference Call for Proposals
Deadline Extended to June 30th

The deadline for submitting proposals for CIC's 2009 Conference, to be held October 1-2, 2009 at the Meydenbauer Center in Bellevue, Washington, USA, has been extended to June 30th.

The theme of this year's conference is:
"Community Indicators as Tools for Social Change:
Tracking Progress and Increasing Accountability."

We are looking for proposals for presentations, panels and speakers for the conference dates of October 1-2. We are also looking for proposals for optional training workshops available, for an additional fee, on September 30th.

Interested presenters can see the pdf of the complete call for proposals and submit proposals on the CIC web site. See our Seventh Annual International Conference page for more information and to register. For sponsorship and exhibition opportunities, see our Sponsorship page.

Also, please note that June 15th is the end of the special discount registration rates -- $250 for members and $300 for nonmembers -- for the conference. After that registration for the conference will be $300 for members and $350 for nonmembers. See our on-line registration page for more details.

We are very excited about how the conference is shaping up and look forward to seeing you in the fall.


Maureen Hart
Community Indicators Consortium

Read more ...

Thursday, June 11, 2009

What Saving $100M in a Federal Budget Looks Like

We've talked about telling stories with big numbers. We've even talked about using art to project the emotions around big numbers.

Now here's a video that tries to take some really big numbers in the Federal budget and make them relevant.

Any questions?

More visualizations available at 10000 Pennies on YouTube. Note: the presenter has a political viewpoint. That's not the point. The purpose of sharing the video is to show how effective visualizations can convey the message of big numbers in a way that advances the message, whatever that message might be.

Read more ...

Oregon Benchmarks: A Wake-up Call

A well-established, internationally-recognized, model community measurement project could lose funding and go away because the people who should be using the information have been ignoring the data. Today's headline is brought to you from the Statesman Journal reporting on the Oregon Benchmarks project, but the message resonates loudly with other communities who have been through or are potentially facing the same short-sighted decision-making under pressure of tight budget demands.

From the Statesman Journal:

Oregon has been a global leader in measuring its successes and failures. Unfortunately, too few people have been paying attention, especially in state government and the Legislature. Now the state is on the verge of ending its innovative self-grading system, known as the Oregon Benchmarks. Fortunately, there's a proposal to keep the benchmarks alive ...

Take a look at the Oregon Benchmarks here. It's a good report. It ought to be used more. It deserves better.

But if something that's been in place for 20 years is under such budgetary pressure, what level of concern are we facing in our own local communities? How are we insulating ourselves from these kinds of financial threats? What are we doing to establish the ROI of our projects? How are we diversifying our funding streams?

Read more ...

Wednesday, June 10, 2009

JCCI Releases Report on City Government Finances

The Jacksonville Community Council Inc. (JCCI) has released a new study called Our Money, Our City: Financing Jacksonville's Future. The report is a result of eight months of work of a volunteer citizen task force exploring local government finances.

I share this with you because one of the conclusions of the report speaks to our ongoing conversations about government performance measures. From the report:

Jacksonville lacks the kind of transparent performance management, measurement, and benchmarking systems necessary to demonstrate efficiency and effectiveness in government, and to build confidence in government's stewardship of taxpayer money. Therefore, many residents of Jacksonville do not trust government to spend their tax money wisely.

The report recommends that:

Based on a clearly stated set of roles and priorities for City government, the Mayor and City Council should establish and make publicly available explicit financial and operating goals together with appropriate performance standards (i.e. “benchmarks”) for use in evaluating actual performance. This level of accountability is critical for providing citizens with a tool to help determine the effectiveness of local government as well as the efficient use of taxpayer dollars. As part of developing this performance measurement system, the City of Jacksonville should participate in state and national organizations, such as the Florida Benchmarking Consortium, which allow cities to compare performance in key areas with other cities known for delivering high levels of service with high efficiency.

At the study release press conference, attended by over 150 community leaders and interested citizens, study chair J.F. Bryan IV said, "We don’t have the kind of transparent performance management, measurement and benchmarking systems needed to adequately demonstrate efficiency and effectiveness. You can’t manage what you don’t measure."

The early press reports of the study focus on the current financial crisis being experienced by the City and the hard choices between raising revenues and/or cutting services that Jacksonville must make. But central to making those decisions is the need to build public trust, and the development of transparent performance measures is seen as a critical component to developing that trust.

I'll keep you informed as the conversation continues ... this promises to be an interesting opportunity to transform the relationship between the people and government. I thought this group might want to think about the connections between community indicator systems and government performance measures that are designed for public trust, not just internal management efficiencies.

Read more ...

New Indicators Report: Rochester, NY

ACT Rochester, a partnership of the Rochester Area Community Foundation and the United Way of Greater Rochester, has launched a new community indicators website with "a wealth of information for everyone from policy makers to parents," according to Jennifer Leonard, Executive Director of the Community Foundation.

From ACT Rochester's website:

The mission of ACT Rochester is to stimulate community solutions to our most critical challenges by changing the culture of public discussion and debate. This will be achieved through focused, independent and objective measurement of key community indicators, through diverse and timely dialogue and by promoting results-oriented actions.

We hope that the comprehensive data and other information contained here will serve as a focal point for formal and informal community forums and inspire you and others in the community to share comments and participate in polls, which will be added to the Website in the coming months. In addition, we will be scheduling a variety of community discussions and activities beginning in the fall of 2009.

The name ACT Rochester urges action, specifically in response to the issues highlighted by the data. The name also stands for Achieving Community Targets, which signals the potential to establish specific targets or goals for improvement. These targets will be the result of the community involvement process, and will inform the future development of ACT Rochester.

ACT Rochester currently covers a seven-county region: Genesee, Livingston, Monroe, Ontario, Orleans, Wayne and Wyoming. This Website contains indicators, analysis of trends, summaries of community efforts to address issues and numerous listings of community resources.

It's a nice website -- the structure is easy to navigate, the indicators are clear and easily readable, the explanatory notes (What does this measure? Why is this important? How is our region performing?) are well-written, and the options to customize the chart or download the data as a *.csv file are greatly appreciated.
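A CSV download makes it easy to pull an indicator series into your own analysis. Here's a minimal sketch of what that workflow might look like in Python; the column headers and figures below are hypothetical stand-ins, so substitute whatever the actual export contains.

```python
import csv
import io

# Hypothetical export -- in practice you'd open the *.csv file
# downloaded from the indicator page instead of this inline sample.
sample_csv = """year,median_household_income
2005,43210
2006,44150
2007,45980
2008,46500
"""

rows = list(csv.DictReader(io.StringIO(sample_csv)))

# Convert the series to numbers and compute a simple trend:
# the percent change from the first to the last year in the file.
years = [int(r["year"]) for r in rows]
values = [float(r["median_household_income"]) for r in rows]
pct_change = 100.0 * (values[-1] - values[0]) / values[0]

print(f"{years[0]}-{years[-1]}: {pct_change:+.1f}% change")
```

Even a small script like this turns a static indicator page into something you can trend, compare, and re-chart on your own terms.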

The site has a wealth of information, and it's worth taking a look. Nice job!

Read more ...

Tuesday, June 9, 2009

Audio Conference on Government Performance Measures

The Association of Government Accountants (AGA) is providing an audio conference called Using Performance Measures to Manage Government Services on July 15, 2009. It's a little pricey, but I thought I'd pass it along. From their website:

AGA is pleased to announce its latest audio conference on using performance measures to improve operations.

Accountability. Transparency. Measurement. These are the buzzwords of the day, but the expectations are real that citizens are expecting governments to be efficient and effective. Collecting good data does not improve government operations, however. What improves operations and service delivery systems are the discussions, analyses of the information, and point/counterpoint debates that focus directly on making changes. A new research project, Using Performance Measures to Improve State and Local Governments’ Service Delivery, will determine the extent to which performance measures are being used in government to improve services, regardless of the level of government. The project will follow up with successful practices in applying these techniques to achieve performance goals and provide quantifiable value.

Join Harold (Hal) I. Steinberg, CGFM, CPA, lead researcher and former first Deputy Controller/Acting Controller, Office of Federal Financial Management, Office of Management and Budget, and two government experts who will share their failed attempts and successful experiences.

Please join us for two hours of lively discussion about this important and timely topic. In addition to the speakers’ commentary, 20 minutes will be set aside so participants can ask the speakers questions and share their own experiences.

Date: Wednesday, July 15, 2009

Time: 2 – 3:50 p.m. Eastern Daylight Time

Learning Objectives: To learn how the use of performance measures can improve the efficiency and effectiveness of government

Prerequisite: Basic knowledge of performance measures

Advance Prep: None required

CPE: Two credits

Field of Study: Management Advisory Services

Cost: $249 per site (UNLIMITED ATTENDANCE) if you register on or before Friday, July 10, 2009 and $299 thereafter. SPECIAL PROMOTION: Government agencies and CPAG members who register five or more offices will receive a 20 percent discount ($200 per site)

Read more ...

Monday, June 8, 2009

CSIN Conference: Accountability through Measurement

The Canadian Sustainability Indicators Network (CSIN) is hosting the 2nd National CSIN Conference in Toronto from March 2 to 5, 2010 (mark your calendars!). In conjunction with that conference, they have extended a Call for Papers -- submit a short abstract by July 15, 2009. More information is available at CSIN.

The conference looks pretty exciting. Featured guest speakers include Enrico Giovannini, who is fun to listen to and will be able to report on what happened at the OECD's Charting Progress, Building Visions, Improving Life World Forum in South Korea in October 2009, and Hazel Henderson, who is absolutely fabulous and one of the really amazing thinkers about future sustainability and the key role indicators can play in redefining what's important. (And I double-checked -- you don't have to be Canadian to attend!)

Here's what Christa Rust, CSIN Coordinator, has to say about the conference:

It will showcase an exciting, thought-provoking array of plenary and parallel sessions on vital topics all geared towards strengthening accountability through measurement. The Conference will give delegates the chance to connect with special guest speakers, CSIN members, fellow measurement and reporting practitioners, and to actively participate in a wide range of solutions-based, practical learning workshops. You’ll discuss everything from Accountability through Measurement and the growing role of community indicators to the positive effects that indicators systems and evidence-based decisionmaking are achieving in the world, today.

As many of you are aware, CSIN is housed in the Measurement and Assessment program area at the International Institute for Sustainable Development (IISD) Head Office in Winnipeg. We connect the Institute with international, national, provincial, regional and local communities. Since 2003, our Network has played a key role in the progress toward sustainable development by advancing best practices in measurement and indicator systems across Canada and beyond.

Now as our membership nears 1,000, next March is an ideal time for us to meet and talk, share ideas and insights and look to the future. Accountability through Measurement is our theme. Why? Because the question of how one accurately gauges progress towards social, economic and environmental sustainability is one of the most profound governance challenges of this century. The focus of our 2010 conference: Measurement Solutions at Work. Our workshops and plenary discussions are geared for professionals eager to learn practical techniques that demonstrate how to develop and evaluate indicator systems and put them to use wherever they work.

2010 Conference themes are: Strengthening Sustainable Development Governance through Measurement, Getting the Message Across – Innovation in the Visualization of Indicators, From Information to Influence–Incorporating Indicators into Decision-Making, Experiences and Best Practices, and The Business of Sustainability Indicators.

Designed to pique your interest, spark lively discussions and directly benefit you in your workplace, workshops will be recorded and available after the event – for a nominal fee – to all CSIN members.

Don’t miss our Special Guest Speakers: Enrico Giovannini of the Organization for Economic Cooperation and Development, and from Ethical Markets – author, futurist and syndicated columnist – Hazel Henderson.

Read more ...

Wednesday, June 3, 2009

CIC Newsletter Available

The Community Indicators Consortium has sent out a new newsletter that you will want to read.

You can go to CIC's newsletter page to read past newsletters (and this one as soon as it is available in PDF), or you can do what I do and subscribe to the newsletter -- send an e-mail to ED [at] to get on the mailing list.

Read more ...

Videoconference: Education and Health Disparities

Here's a press release of some interest. The indicators we use often deal with this intersection of educational and health disparities. This videoconference should be really good. 

15th Annual Summer Public Health Research
Institute and Videoconference on Minority Health


Please join 1,000 other participants in 45 states and 6 countries for the 15th Annual Summer Public Health Research Videoconference on Minority Health.

When? Tuesday, June 9, 1:30-4:00pm EDT
Where? Webcast, C-band satellite, and Tate-Turner-Kuralt building auditorium (at UNC Chapel Hill)

Topic: "Breaking the Cycle: Investigating the Intersection of Educational Inequities and Health Disparities"


Reginald Weaver, Vice President, Education International; Past President, National Education Association

Dina Castro, M.P.H., Ph.D., Scientist, UNC FPG Child Development Institute, University of North Carolina at Chapel Hill

Nicholas Freudenberg, Dr.P.H., Distinguished Professor, Program in Urban Public Health, Hunter College School of Health Sciences/City University of New York

Lillian A. Sparks, J.D., Executive Director, National Indian Education Association

Moderator: Howard Lee, M.S.W., Executive Director, North Carolina Education Cabinet

This interactive session will be broadcast with a live audience in the Tate-Turner-Kuralt auditorium at the UNC School of Social Work and can be viewed over the Internet (webcast) and c-band satellite. Questions will be taken from studio and broadcast participants by email and toll-free telephone.

For more information:

To view on your personal computer:

To view with a group:

To register a viewing site:

To register for the studio audience at the TTK auditorium:

(Note: If you have registered for the Videoconference you will receive an email by 6/4 reconfirming your registration and giving specific information for receiving or attending the broadcast.)

To read the abstracts, agenda, and speaker biographies:

To download materials (publicity, slides when they become available, attendance sheet for group sites):

Answers to frequently asked questions:

This year's Videoconference is presented by UNC Diversity and Multicultural Affairs and the UNC Gillings School of Global Public Health Minority Health Project (ECHO). Funding comes from the Dean's Office of the UNC Gillings School of Global Public Health, UNC Diversity and Multicultural Affairs, UNC FPG Child Development Institute, UNC Center for Development and Learning, Counseling and Wellness Services (UNC Campus Health Services), NC Health Careers Access Program, Sheps Center for Health Services Research, UNC American Indian Center, UNC Center for Health Promotion and Disease Prevention, UNC College of Arts & Sciences, and a growing list of cosponsors. This activity is supported by an educational donation provided by Amgen. Please thank them - and consider joining them or providing an endorsement.

Vic Schoenbach
Director, Minority Health Project
[Please support the Minority Health Project!]
UNC Gillings School of Global Public Health

Cookie Newsom
Director of Diversity Education and Research
UNC Office of Diversity and Multicultural Affairs

* 20+ health disparities-related broadcasts and seminars are available as on-demand webcasts at

Read more ...

Tuesday, June 2, 2009

New Indicators Report Underway in Issaquah, WA

The City of Issaquah (located just east of Seattle) has launched a new Issaquah Sustainable City Indicators project. The City convened a panel of community leaders they called the "Sustainability Sounding Board" with the charge to "help weigh the interconnected issues of sustainability and provide recommendations on ways to measure progress toward sustainability in our community."

The Board has issued its report, available in PDF format here. They call on the City to measure a set of sustainability indicators to assist in planning and policy-making and to educate the community. The ChamberPost blog says that:

The City's Resource Conservation Office (RCO), along with staff from Planning and Economic Development, will now set tangible goals for each of these indicators, as well as collect the data needed to actually measure Issaquah's progress. The City's first "report card" is scheduled to be released a year from now.

The report is of considerable interest, as it explains the process and the reasoning the group used in deciding which indicators to measure. I like what the report says:

The Sounding Board recognizes that our work is a starting point to a more ambitious goal of taking Issaquah’s sustainability leadership to a higher level. The intent of this project was to establish a set of measures to track what matters to Issaquah – to measure progress towards a sustainable vision. However, the important next step is to use the tool to communicate and motivate success. Putting our recommendations into action requires additional work with community partners to further refine the indicators and collect data, establish benchmarks and goals, communicate progress, and inspire action across the Issaquah community. Specifically, taking our recommendations forward involves:

Refining some indicators; beginning data collection. Several of the recommended indicators will need further refinement with the help of community partners to design the most meaningful metrics, as well as to acquire the necessary data. In many cases, data are already available. In others, new work to collect data will be needed in order to effectively track progress.

Reporting and communicating sustainability progress regularly. Communicating sustainability progress across city departments, with the City Council, and to the broader community on a regular basis is critical to spreading awareness, creating community collaboration, and driving positive change.

Take a look! I'm really looking forward to their first indicators report sometime next year.

Read more ...

Profile Piece on "Data Pro" Kurt Metzger

We like it when community indicators practitioners get the recognition they deserve from their communities. Kurt Metzger, the Detroit-Area Community Indicators System director, was just profiled in his local press for encouraging better decision-making through data.

(You probably remember Kurt from this January profile as well.)

The media can be our most useful allies in sharing information with the community. It's nice when they treat us well. Congrats again, Kurt.

Read more ...

Monday, June 1, 2009

Report from Bhutan: GNH v. GNP, by Junko Edahiro

In a continuing effort to bring you conference report updates from conferences we didn't (couldn't) attend, here's Junko Edahiro's report from the Fourth International Conference on Gross National Happiness in Bhutan last November. The report is coming to you straight from the folks at Japan for Sustainability, an organization I highly recommended to you back in 2007 and whose indicators report sets a new standard for engaging the community.

Report from Bhutan: 
Gross National Happiness (GNH) Versus Gross National Product (GNP)

Following the reports on China in our November and December issues in 2008 as part of our evolution toward "Asia for Sustainability" (AFS), JFS co-founder Junko Edahiro reports on a meeting she attended in Bhutan, the Fourth International Conference on Gross National Happiness (GNH).  It was held in Thimphu, the national capital, from November 24 to 26, 2008.

Gross National Happiness (GNH) is an index used to measure national strength and progress based on people's happiness, rather than on levels of production as measured by gross national product (GNP) and gross domestic product (GDP). This concept was described in 1976 by Bhutan's fourth king, Jigme Singye Wangchuck, as "more important than GNP," and was chosen as the nation's primary development philosophy and its ultimate goal of development.

See also: GPI, GNH, GCH: True Indicators of Progress

The first International Conference on GNH to promote the concept and creation of a GNH index was held in Bhutan in 2004, the second in Canada in 2005, and the third in Thailand in 2007.

The theme of the fourth International GNH Conference was "Practice and Measurement," indicating a step forward into a new phase focusing more on how to reflect GNH in policies, and how to grasp the current situation and measure progress, rather than considering GNH as simply a principle or philosophy.

A total of 90 people from 25 countries attended the conference, with about 10 from Japan, the second-highest number of attendees after Bhutan. On the morning of the first day, the organizer gave opening remarks after a ceremony conducted by Bhutanese monks, then H. E. Jigmi Y. Thinley, the first prime minister of Bhutan following the country's shift to a democratic parliamentary system, gave the keynote address.

The prime minister touched upon the words of King Jigme Khesar Namgyel Wangchuck, Bhutan's fifth king, who, in his coronation address the week before the conference, clearly said that the "promotion of GNH was his responsibility and priority." The prime minister in his speech also repeatedly said that GNH lies at the foundation of Bhutan's national policies. He also noted that, while most believe economic growth is necessary in order to alleviate poverty, "to believe this is to believe in killing the patient in order to cure the disease. Even the justification for economic growth for poverty alleviation seems very shaky, unless we radically improve redistribution."

After the opening ceremony, the general meeting began, and one of the highlights was the announcement of the GNH Index by the Centre for Bhutan Studies. The idea of GNH is well-known, but how can it be measured? This is what the world wants to know today.

Bhutan has made four pillars of GNH the basis of its major governance principles: economic self-reliance, a pristine environment, the preservation and promotion of Bhutan's culture, and good governance in the form of a democracy. Nine dimensions support the four pillars: living standards, health, psychological well-being, education, ecology, community vitality, time use, culture, and good governance.

This time, in order to gauge progress on the four pillars of GNH, 72 variables were selected to correspond to the nine dimensions, and a national survey was carried out. At the conference, a researcher from the Centre for Bhutan Studies presented the types of variables selected and an overview of the survey findings. Participants from other countries also gave papers on their studies and practices to measure happiness, which led to some lively discussions. (See also the Gross National Happiness website, operated by the Centre for Bhutan Studies, for the GNH Index and the survey results.)
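To make the mechanics concrete, here is a deliberately simplified sketch of how survey variables might roll up into dimension scores and a single composite index. This is illustrative only: the Centre for Bhutan Studies' actual aggregation method, weights, and variable definitions differ, and every number below is invented.

```python
# Invented, normalized variable scores in [0, 1], grouped by the
# nine GNH dimensions. The real GNH Index draws on 72 variables
# and uses the Centre's own aggregation method, not this one.
dimensions = {
    "living standards": [0.7, 0.6],
    "health": [0.8, 0.75],
    "psychological well-being": [0.65],
    "education": [0.5, 0.55],
    "ecology": [0.9],
    "community vitality": [0.7],
    "time use": [0.6],
    "culture": [0.85],
    "good governance": [0.6, 0.7],
}

# Dimension score = mean of its variables; index = mean of the
# dimension scores (equal weights -- an assumption made here purely
# for illustration, not the GNH rule).
dim_scores = {d: sum(v) / len(v) for d, v in dimensions.items()}
index = sum(dim_scores.values()) / len(dim_scores)

print(f"Illustrative composite index: {index:.3f}")
```

The interesting design questions for any such index -- how to normalize disparate variables, whether dimensions deserve equal weight, where to set sufficiency thresholds -- are exactly what the conference papers wrestled with.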

While the Bhutanese government actively promotes GNH, this does not mean that it guarantees the people's happiness. It simply promises, as the nation and/or the government, that it will work to create the conditions under which individuals can seek GNH.

During the conference, one Bhutanese participant said, "Bhutan should build its own GNH-based economy, politics, and culture. Considering GNH, it is clear that even democracy is not an end. Democracy is a means of good governance necessary for GNH." From such comments, I could sense a move to start considering GNH as a foundation of nation building, not just as a concept or an index, as many people think.

Obviously, Bhutan is not a utopia just because it advocates GNH. For example, in many areas of the country, infrastructure such as adequate water supply has yet to be developed well enough. The country also has many other problems related to modernization, particularly growing concern about an increase of juvenile crime and other social problems since the introduction of television.

In addition, even the term "GNH" is not specifically mentioned in Bhutan's current tenth development plan. Later on, at the wrap-up session of the conference, a Bhutanese participant said that GNH should not be used to solve world issues but to solve national problems in Bhutan first.

The three-day conference was concluded with a strong message that putting GNH into the mainstream of Bhutanese politics will be a driving force in creating a more holistic society in the country. The next conference will be held in Brazil.


The fact that the prime minister gave the conference's opening speech, with many ministers and cabinet officials in attendance, is an indication of the importance the government places on GNH. On the third day of the conference, I sensed the essence of GNH at a luncheon hosted by the king, to which I was invited together with all the participants from outside Bhutan.

Prior to the lunch meeting, the newly coronated 28-year-old king stood in front of the entrance of the palace, shook hands with guests one by one, exchanged words for a while, and welcomed them all politely. I myself shook hands and talked with him for a while. He was in no hurry at all as he met individually with the several dozen guests. He focused his entire attention on the here and now, serene like a calm lake. I sincerely felt that he cherished the time with me, and I was deeply impressed by this.

In regard to the character of the king, a person I interviewed who knows Bhutan well said, "At the coronation ceremony last week, citizens gathered from villages across the country to see him, even for just a glimpse, with many having made an overnight trip to get there. Tens of thousands of people stood in line. When people became impatient and were about to rush to him, the King took the microphone and said, 'I promise to shake hands with the person at the end of the line. Please wait.'

When the time was almost running out, the King started walking and shook hands with everyone up to the last person and exchanged words, instead of standing in place and waiting to greet the people there."

Also, as an example of the character of the former king, who established the basis of GNH: he administered state affairs while living in a modest house, and when the Great Hanshin-Awaji Earthquake hit Japan, he prayed for three days without eating.

When I talked with Bhutanese people, I felt that they sincerely admire and respect the fourth king. And also I could feel that the fifth king, just as his predecessor, wants to cherish his people in earnest.

After greeting the conference participants, the king entered the luncheon hall alone, and lined up for food just like the rest of us. When he had his food, he seated himself at one of the tables and started talking with people around him while eating. Watching him, I truly realized that the king embodies the essence of GNH, treasuring his people one by one, and I could sense the hearts of those who respect him.

Setting the GNH Index itself is only a start. Creating an index and measuring progress is one thing, while the holistic idea that "there is something important in those unmeasurables" is another. The question is: How do these ideas get incorporated into their principal goal of making the Index useful for Bhutan and the rest of the world?

This is a very important process unfolding. I would like to watch its progress and promote the idea with like-minded people and groups around the world who think there is something more important in life than GNP and GDP. If you too are trying to measure or visualize something along these lines and want to change society by communicating it, JFS would really like to hear from you.

Written by Junko Edahiro

Keep these conference reports coming!

Read more ...

Exploring Wolfram Alpha

Wolfram Alpha is a new search engine that's a little different, and (if it survives the Yahoo/Google/Bing fight) may be of surprising use to those of us who play with information for a living.

Google dominates the search engine market right now, and its public data tool is a welcome addition. Microsoft's attempt to get into the game with Bing may shake things up a bit. But Wolfram Alpha is an intriguing source -- it bills itself as a "computational knowledge engine" and aims "to make all systematic knowledge immediately computable and accessible to everyone."

From their website:

We aim to collect and curate all objective data; implement every known model, method, and algorithm; and make it possible to compute whatever can be computed about anything. Our goal is to build on the achievements of science and other systematizations of knowledge to provide a single source that can be relied on by everyone for definitive answers to factual queries.

Wolfram|Alpha aims to bring expert-level knowledge and capabilities to the broadest possible range of people—spanning all professions and education levels. Our goal is to accept completely free-form input, and to serve as a knowledge engine that generates powerful results and presents them with maximum clarity.

They continue:

As of now, Wolfram|Alpha contains 10+ trillion pieces of data, 50,000+ types of algorithms and models, and linguistic capabilities for 1000+ domains. Built with Mathematica—which is itself the result of more than 20 years of development at Wolfram Research—Wolfram|Alpha's core code base now exceeds 5 million lines of symbolic Mathematica code. Running on supercomputer-class compute clusters, Wolfram|Alpha makes extensive use of the latest generation of web and parallel computing technologies, including webMathematica and gridMathematica.

What does this mean? I plugged in "unemployment rate Duval County Florida" into Google, Yahoo, Bing, and Wolfram Alpha. Here's what I got (top result only):

Yahoo: Duval County, Florida Zip Code, Radio Stations, Crime Rate, Weather ...
Duval County, Florida (FL) Zip Code, Radio Stations, Crime Rate, ... Unemployment rate in 2004: 5.2% Average household size: Duval County: 2.5 people. Florida: ...  - Cached

Bing: Unemployment Rate: Duval County, FL, Florida; Percent; NSA
The best economic data site with over 100,000 series. Users have the ability to make their own custom charts, XY plots, regressions, and get data in excel files, or in copy ...​laupa12060003 · cached page

(which turned out to be exactly the same as the top result in Google -- still looking for a reason to use Bing ...)

Google: Unemployment Rate: Duval County, FL, Florida; Percent; NSA
The best economic data site with over 100000 series. Users have the ability to make their own custom charts, XY plots, regressions, and get data in excel ... - 30k -

(I forgot -- had to try a couple of times to get the formatting right for Google's public data site)  "unemployment rate duval" returned:

(some of the graph cut off when I grabbed the image -- that's my fault, not Google's.)

Wolfram Alpha, on the other hand, gave me this:

Then I got a little creative. I asked for "unemployment rate Duval County Florida/unemployment rate united states" and got this:

Now that's pretty cool. They'll grab data as data and do computations with it -- this is a real step forward toward a semantic web. So bookmark the site and see what other goodies it has in store. This is really neat stuff.
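That "data as data" trick -- dividing one indicator series by another -- is easy to reproduce locally once you have the numbers in hand. A quick sketch of the same local-over-national computation; the unemployment figures below are made up for illustration, not actual BLS data:

```python
# Made-up monthly unemployment rates (percent), for illustration only.
duval = {"2009-01": 8.6, "2009-02": 9.0, "2009-03": 9.4}
usa   = {"2009-01": 7.6, "2009-02": 8.1, "2009-03": 8.5}

# Align the two series on shared months and divide, mirroring the
# "unemployment rate duval / unemployment rate united states" query.
ratio = {m: duval[m] / usa[m] for m in sorted(duval) if m in usa}

for month, r in ratio.items():
    print(f"{month}: local rate is {r:.2f}x the national rate")
```

A ratio above 1.0 says the county is running hotter than the national average -- the kind of derived indicator that's often more telling than either raw series alone.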

(Hat tip: L. Gordon Crovitz)

Read more ...