Community Indicators for Your Community

Real, lasting community change is built around knowing where you are, where you want to be, and whether your efforts are making a difference. Indicators are a necessary ingredient for sustainable change. And the process of selecting community indicators -- who chooses, how they choose, what they choose -- is as important as the data you select.

This is an archive of thoughts I had about indicators and the community indicators movement. Some of the thinking is outdated, and many of the links may have broken over time.

Friday, October 2, 2009

CIC Conference 2009, Part One

We're at the halfway point of the Community Indicators Consortium conference. The discussions we've been having and the quality of the presentations have been pretty good -- they've offered me quite a bit to think about. Let me share some high points, with the caveat that I couldn't be in all of the sessions and I've already heard I missed some great ones.

What follows are my notes from Wednesday morning through Thursday evening.


The conference really began on Wednesday with a selection of pre-conference workshops. I'd like to thank those who braved a three-hour session with me Wednesday morning to talk about Making a Difference with Indicators: What You Need to Know. The conversation was an expanded follow-up of the June 25 webinar, and I'm going to be collecting the material and the comments into a more formal article soon. The point I want to emphasize most is the importance of intentionality -- what results are you trying to accomplish, who is your audience, what are the expected actions you want that audience to take, how does your design cohere with your explicit theory of change -- paired with an openness to serendipity. In other words, target your efforts, measure your outcomes, and be prepared for, and design for, expanded uses beyond your core. We'll talk more later.

Wednesday afternoon was a meeting of the Community Indicators-Performance Measures Integration Working Group, which is a diverse group of really smart folks trying to figure out best practices and effective techniques for bringing together two similar ways of using data to measure progress. We'll be making a couple of announcements about the progress of this group at today's sessions of the conference, so I'll wait to report on that. At the conference, a Maturity Model of integration is posted for public comment, and a revised version of that model (incorporating the feedback) should be available shortly after the conference, at which point I'll have something more to share.

Thursday morning's opening plenary was from Richard Conlin, of the Seattle City Council and co-founder of Sustainable Seattle. Take a look at what they're doing -- it's not the same organization that once pushed sustainability onto the national agenda, and the organizational transformation would be a fascinating case study. The cutely-named B-Sustainable initiative is going through a re-launch/marketing push. There's energy and good work happening -- the efforts to reduce waste, not just increase the percentage recycled, are a great example -- and Seattle provides a number of lessons learned for anyone working on environmental sustainability. (Side note: they're also an interesting example of how difficult it is to manage the tension between being the community's trusted data source and being effective advocates for a cause. The trade-offs are real and are worth thinking about in your own organization's strategic planning processes -- where will you shine?)

Then I joined Meriko Kubota and Lidia Kemeny of Vancouver Foundation's Vital Signs in a session entitled Partnering for Progress. I shared JCCI's model for community change.


I also shared how this works in practice, following one indicator through the visioning/measuring/prioritizing/planning/advocacy/evaluation processes to show how the teen birth rate was slashed, going from nearly double the national rate to (based on 2008 preliminary figures) below the national rate. Indicators serve a dual function -- measuring progress and driving action -- in a powerful method for creating sustainable community improvement.

Vancouver Foundation's Vital Signs is worth a closer look. Canada's community foundations are doing really interesting work with indicators, and Vancouver's is no exception. They're the largest community foundation in Canada. They first produced a Vital Signs report in 2006, and followed that with reports in 2007 and 2008. They're not releasing a 2009 update, but are preparing a 2010 report in conjunction with the Olympics.

There are 12 key areas measured in the report, and that framework is worth a look. I especially like the categories of "Belonging and Leadership" and "Getting Started". In 2008, the report focused on the differences between community perceptions and reality, illustrating the sections with fuzzy pictures (perceptions) and then with the pictures in focus (reality). The report tried to distinguish between what was happening and what people thought was happening in Vancouver.

These three survey questions are fascinating. Vital Signs asked:

  1. What is the single most important issue you would like to see addressed to improve the overall quality of life in metro Vancouver?
  2. Can you give an example of a specific event, action, or other thing that has improved the quality of life in metro Vancouver over the past 12 months?
  3. Over the last 12 months, what actions, if any, have you taken in your own life to make a positive difference in your community?

For question 2, 55 percent of respondents couldn't think of anything. For question 3, 26 percent of respondents couldn't name something they had done. Full results are here.

The report was provided as an insert in the local newspaper, and was translated into Chinese and inserted into three Chinese newspapers.

The report has changed over time. They use an online poll for “citizen graders” – and had 1070 responses in 2008, three times that of 2006. These "graders" assign letter grades to each section, which creates some controversy and pushback within the community (political leaders like to point out that Vancouver is consistently ranked as one of the top cities to live in the world, and yet the graders give it a C+.) They also conduct a more scientific sampling of the population through a telephone poll – with 854 responses in 2008.

The other major change was the geographical shift in 2007 from the city of Vancouver to Metro Vancouver, giving the report a more regional focus.

Take a look also at Youth Vital Signs, a community indicators report designed and developed by youth to reflect their perspectives and ideas about what's important and what's happening in metro Vancouver. That report also built on new uses of technology -- one outreach text message brought back 3000 responses in a single day.

The next session I attended was How Creative Partnerships Improve Indicators, with Sandra McBrayer and Paula Ingrum from The Children's Initiative in San Diego.

Sandra McBrayer began by discussing the origins of the Children's Initiative, which was born because foundations needed to figure out why their money didn't matter. They provided resources, but the underlying problems didn't seem to be improving. So they decided to bring all the stakeholders together to be part of the process of improving the lives of children, helping them all gain ownership of the problems and encouraging them to meet together often. The county had been doing a San Diego report card for years (since 1999) but it was a data document only – when you opened it up, you saw numbers and graphs but you didn't know what the report was telling you.

So they looked at report cards across the country, trying to figure out which were the best examples and why. If the top agencies in a community, when called, didn't know what their report card was, then that report card wasn't a model to follow. Quality report cards shared several characteristics:

  • They raise community awareness
  • They rely on community partnerships for sustainability
  • They draw on multiple funding streams
  • They link what is learned to a process for change

They transferred the responsibility for the report card from the county health department to the Children's Initiative. Building partnerships is critical -- the focus is shared responsibility, not blame. They created Leadership and Scientific Committees; calling one "Leadership" made it special. The Scientific Committee consists of epidemiologists and biostatisticians. They used the Results-Based Accountability model to select the indicators.

So look at the 2007 San Diego County Report Card on Children and Families (PDF). Each indicator includes the following: why it is important, what the national best practices are, whether San Diego has them, and if not, who the partners are who share the responsibility to make it happen. They rethought the indicators by asking:

What is this data telling us? Do we understand it and does it make us want to do anything?

What is the area of real concern? What do we really want to know?

Is the data we want available? Where do we get it or how do we make it?

Then they put on the dashboard things they didn't yet have data for but wanted to develop. Some of the keys they shared for their success:

  • The personal touch matters in building partnerships. They provided food at their meetings, hand-written thank-you notes for providing data, and $5 Starbucks gift cards.
  • Focus on making change. "If you look at an indicator and can't tell what you're supposed to do about it, it's not a good indicator."
  • Get the right data to focus on prevention. They looked at youth involved in alcohol-related traffic crashes. They mapped the crashes -- and busted the community myth that it was simply a border/Tijuana issue. They built partnerships and relationships to be able to use DMV data to find out where the kids lived, and could then focus in on who was drinking and driving. It wasn't the military -- another flawed perception. They were able to show that it was mostly white, middle-class kids in certain rural areas. They developed partnerships with insurance companies, driver's ed instructors, schools, law enforcement, etc., and could then target prevention activities to the kids most at risk.
  • If the data you need doesn't exist, work to create that data. They brought a focus group together around domestic violence and asked: What do you want to know? It wasn't the rate of domestic violence reports filed; they needed to know which children were exposed to domestic violence. They created a new DV supplemental form, shared as a common form among all police agencies. They missed a step, though -- they brought together police chiefs, but not the data analysts, as part of the partnerships. So forms were collected, but not always inputted. Now they know to build partnerships with the data side as well. If you don't forge the relationship with the people who do the work with the data, your stuff becomes the bottom of the pile -- and no one ever gets to the bottom of the pile.
  • Use research to determine the most relevant data. School districts were touting Average Daily Attendance as a measure of student attendance -- and by this measure, they were doing well, with ADAs around 95%. However, studies showed that missing 10% of school in secondary school (18 days!) and 5% in primary (9 days) has serious academic consequences. Now they track attendance data by school, grade, and district. (They have 42 school districts in their county!) The data showed serious problems -- as well as some places where they were doing it right and could learn from those efforts. Then Hedy Chang and Mariajosé Romero of the National Center for Children in Poverty released their report Present, Engaged, and Accounted For, which promotes using this threshold to measure attendance. (Hedy Chang was sitting near me -- it was cool to cite an author's work with the author in attendance.)
  • Never use data to sensationalize an issue. The school-specific level of data is internally owned; if media sensationalize an issue, it causes partners to walk away from the data. They put no scores on the report cards -- scores would turn partners off. The goal is to get better, not to place blame, so they intentionally avoided them.
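The attendance point above rewards a quick calculation: because Average Daily Attendance averages across all students, a school can post a healthy ADA while a large share of its students are chronically absent. A minimal sketch, using invented enrollment figures for illustration:

```python
# Hypothetical 100-student school with a 180-day school year
# (all figures invented for illustration).
DAYS_IN_YEAR = 180

# 70 students attend every day; 30 students each miss 30 days.
attendance_days = [180] * 70 + [150] * 30

# Average Daily Attendance: total days attended / total possible days.
ada = sum(attendance_days) / (len(attendance_days) * DAYS_IN_YEAR)

# Chronic absence threshold: missing 10% or more of the year
# (18+ days in secondary school, per the research cited above).
chronic_threshold = 0.10 * DAYS_IN_YEAR  # 18 days
chronically_absent = sum(
    1 for days in attendance_days if (DAYS_IN_YEAR - days) >= chronic_threshold
)

print(f"ADA: {ada:.1%}")  # 95.0% -- looks fine
print(f"Chronically absent: {chronically_absent} of {len(attendance_days)}")  # 30 of 100
```

The school reports a 95% ADA, yet 30% of its students are chronically absent -- which is exactly why tracking attendance student by student, rather than by the average, matters.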

They closed by adding these thoughts: Right now, all of our work is based on data. How many kids does this affect, and who should be leading this effort? We have a priority list -- criteria which all new projects must meet. Our role is a critical one in the community -- we don't do direct service, but we change the lives of kids.



That gets us to lunchtime. Lunchtime on Thursday at the Community Indicators Consortium conference had a keynote speaker: Stephen Bezruchka, from the Departments of Health Services and Global Health, School of Public Health, at the University of Washington. He spoke about the kinds of indicators we should be measuring to reshape U.S. policy towards health, especially in terms of improving American life expectancy. I found this interview and transcript online where he covers many of the same points he did at lunch, so this should compensate for my inability to eat and take notes simultaneously.

After lunch, I heard Tad Long from NewCity Morehead speak on measuring civic engagement (I already pointed to his presentation online, so I won't add much here.) Following the same theme of Community Engagement and Mobilization, Sandra Noel and Mary-Louise Vanderlee from Niagara spoke of how they used the children's rights framework to engage their community around measurement and action. They will be releasing their work to coincide with National Child Day on November 20, so I'll hold off on describing more about the project until I can share the results.

The last session on Thursday that I attended focused on New Tools for Data Visualization. Scott Gilkeson, from State of the USA, presented on Data Visualization Fast and Cheap. His goal is to show ways to put information and data up on the web. Many Eyes, from IBM, is a few years old. You can upload data to the web and then visualize it in one of 18 visualization types. Then you can copy the code to your webpage and show the visualization there. Let's say you have downloaded data from the Center for Medical Statistics website on national health expenditures. The data has to be configured in an Excel spreadsheet to match Many Eyes conventions. Then you can log in to Many Eyes (registration is required, but it's free) and paste in the data. The site will show you that it understands the data you have entered, and prompts you to enter the title, source, tags, description, and other information about the data. Select the visualization type and the graph appears. You can then grab the embed code and put it on your own site or blog. All of the interactivity is available on your site.

Google has also made data visualization tools available. Upload your information into Google Docs, arrange the spreadsheet according to Google's conventions, and you can insert a Gadget. This allows you to choose a chart type. After formatting the chart the way you want it, you can publish the gadget, and it will give you code that you can publish online. Again, registration is required, and it's free. This process is limited because the metadata doesn't automatically move with the graph.

However, you can use Google's API to create your own customized code. Scott said that they've just finished an open source product based on Google gadgets that would allow you to create a full metachart. He'll be putting the information together in a clearer fashion and making the code available shortly for all Google docs/gadgets users. (It should be at the Google Code projects page shortly, under "metacharts" -- I'll link directly when it's available.)
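To give a concrete sense of what "arranging the spreadsheet according to Google conventions" means at the API level: the Google Visualization API consumes tabular data as a DataTable, a JSON structure of typed columns and cell rows. Below is a rough sketch of assembling that structure by hand; the indicator names and values are invented, and the exact field names should be checked against the API documentation before relying on them.

```python
import json

# Hypothetical indicator data (names and values invented for illustration).
rows = [("Teen birth rate", 42.5), ("High school graduation", 78.1)]

# The Visualization API's DataTable JSON: typed columns, then rows of cells,
# where each cell is an object with a "v" (value) field.
datatable = {
    "cols": [
        {"label": "Indicator", "type": "string"},
        {"label": "Value", "type": "number"},
    ],
    "rows": [{"c": [{"v": name}, {"v": value}]} for name, value in rows],
}

print(json.dumps(datatable, indent=2))
```

The resulting JSON can then be handed to a `google.visualization.DataTable` constructor on the page that embeds the chart.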

Alex Baumann from University of Massachusetts-Lowell spoke next about An Open Source Resource for Data and Indicators. Seven founding members -- complex entities -- came together in the Open Indicators Consortium, with the goal of making high-performance, large-dataset visualization tools available for people to use.

This is an agile development process -- updates are released regularly to members, and many members will be releasing their results next week. The goal is a good, robust open source product, free for nonprofits.

In the second year they'll add personalization, collaborative visual tools, integrated voice chat, flexible configuration, controlled/secure data access, and ontology/middleware to allow comparisons between OIC member and National Data Commons sites.

What followed was a series of demos, which were fun but hard to describe, so I won't try. It was good stuff. It's a work in progress -- the tool will come in three levels (novice, intermediate, advanced); right now they're working on the advanced level and will create the novice level later with fewer features. The product features multiple layers, different shapefiles, and animated probing. You can click on data outliers and see them on Google Maps, Wikipedia, etc. Mouse over a point and you get the data and name. Right-click and you can search for the data in Google. Data is downloaded on demand -- they want to be able to scale to very large data sets and detailed geography. The tools can currently be embedded in a website, and they're working on being able to embed a specific exploration onto a page.

Question: Will you build APIs to connect to the data sources? Answer: right now data is stored in databases so you can do complex queries and scale the data -- you have to load data in and tag it with metadata, and you have to be able to link your data to a geography. It uses compressed shapefiles, streaming in detail as you zoom in.

John Bartholomew, from GeoWise, presented on InstantAtlas: Interactive Indicator Presentation in Maps, Charts, and Tables for the Web. He began by showing some of the kinds of ways people are using interactive mapping as part of their data display/sharing efforts, and then went to demos.

Business case: more powerful open source and commercial graphic and mapping tools can help engage commitment to priority community issues. Mapping and data can sensitize policy makers to priority needs and empower local communities over local issues.

Challenge: scarce skilled resources in the public sector. Data in government is presented primarily in static formats. Restrictive IT policies present hurdles to adopting new reporting media.

He showed samples of interactive mapping -- the intent was to provoke discussion, not deliver a sales pitch. Sometimes a single platform is best; sometimes a combination is better.

HealthMap -- an implementation on a Google Maps background. Easy to see, but hard to quantify the data.
Rhiza Labs -- H1N1 tracking -- user-contributed data, then mapped.

Heat maps -- pioneered in Scandinavia -- but it's hard to allocate resources on blurred contours. Make sure the visualization serves its intended purpose.

How do you show statistical relationships between indicators? Circles on colored backgrounds are the way it's traditionally been done -- but is that always the best?

With Microsoft come powerful Flex API implementations, but they require skilled developers.

WHO -- using multiple tools and platforms to get information to countries without sophisticated technical resources.

Good practices for mapping include:

  • Ease for audience to grasp
  • Intuitive interactivity
  • Audience-appropriate
  • Design focused on promoting valid, evidence-based conclusions

(Those four points are critical. My two cents: DON'T FORGET WHO YOUR AUDIENCE IS! We go overboard sometimes creating stuff that intimidates and confuses rather than invites and informs.)

He went to demos, which are available on their website. One demo I would like you to check out is this one: JCCI (click on Community Snapshot to see more). That's the site I just launched last week using InstantAtlas technology.

After that session was a social/reception, then we wandered off for sushi and more conversation. All in all, a really good beginning to the conference.



