Community Indicators for Your Community

Real, lasting community change is built around knowing where you are, where you want to be, and whether your efforts are making a difference. Indicators are a necessary ingredient for sustainable change. And the process of selecting community indicators -- who chooses, how they choose, what they choose -- is as important as the data you select.

The Jacksonville Community Council (JCCI) understands indicators and community change, with more than 25 years of producing the annual Quality of Life Progress Report for Jacksonville and the Northeast Florida region, and two decades of helping other communities develop their own sustainable indicators projects. JCCI consultants give you the information you need to measure progress, identify priorities for action, and assess results.

I'd like to talk with you personally about how we can help. E-mail me at
ben@jcci.org, call (904) 396-3052, or visit CommunityWorks for more information. From San Antonio to Siberia, we're ready and willing to assist.


Tuesday, October 27, 2009

Follow OECD World Forum LIVE

The 3rd OECD World Forum on Statistics, Knowledge and Policy will address some crucial questions that today have become more important than ever. This OECD World Forum focuses on Charting Progress, Building Visions, Improving Life, and will attract very high-level participants: politicians and policy makers, opinion leaders, Nobel laureates, statisticians, academics, journalists, and representatives of civil society from all over the world. Please see the Agenda.
Webcast
If you wish to follow this exciting Forum, view the webcast and chat online at the 3rd OECD World Forum Webcast, live from Busan from the 27th to the 30th of October 2009.

World Forum Twitter Account: we will be using Twitter during this event, so please follow us there with your comments and use the hashtag #OECDWF when you write about the 3rd OECD World Forum.
    


The 3rd OECD World Forum is organised by the OECD and the Government of Korea (Statistics Korea KOSTAT) in co-operation with the United Nations, the Organisation of the Islamic Conference, the European Commission and the World Bank, as well as other sponsors and partners.




Read more ...

Monday, October 19, 2009

New U.S. Gross National Happiness Index Implemented!

We've talked about Bhutan's Gross National Happiness Index before. Now we have a Gross National Happiness Index for the United States, updated on a daily basis, brought to us free ... by Facebook.

Here's how it works:

Every day, millions of people share how they feel with the people who matter the most in their lives through status updates on Facebook. These updates are tiny windows into how people are doing. They're brief, to the point and descriptive of what's going on this week, today or right now. Grouped together, these updates are indicative of how we are collectively feeling. Measuring how well-off, happy or satisfied with life the citizens of a nation are is part of the Gross National Happiness movement. When people in their status updates use more positive words--or fewer negative words--then that day as a whole is counted as happier than usual. (To protect your privacy, no one at Facebook actually reads the status updates in the process of doing this research; instead, our computers do the word counting after all personally identifiable information has been removed.)

The New York Times quotes Adam D. I. Kramer, the creator of the index, as saying: “When people in their status updates use more positive words — or fewer negative words — then that day as a whole is counted as happier than usual.”

Adam explains the methodology for the index in this Facebook blog post. Check it out and see what you think.
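To make the word-counting idea concrete, here is a rough sketch in Python. The word lists and scoring are purely illustrative; Facebook's actual lexicon, weighting, and anonymization pipeline are not public in this post, so treat this as a toy model of the approach described above.

```python
# Illustrative sketch only: tiny hand-picked word lists, not
# Facebook's actual lexicon or scoring method.
POSITIVE = {"happy", "great", "awesome", "love", "excited"}
NEGATIVE = {"sad", "awful", "terrible", "hate", "worried"}

def daily_happiness(status_updates):
    """Score one day's anonymized updates: positive word count minus
    negative word count, normalized by the total number of words."""
    pos = neg = total = 0
    for update in status_updates:
        for word in update.lower().split():
            total += 1
            if word in POSITIVE:
                pos += 1
            elif word in NEGATIVE:
                neg += 1
    return (pos - neg) / total if total else 0.0

updates = ["Feeling great today!", "So worried about tomorrow"]
score = daily_happiness(updates)
```

A day whose score runs above its historical baseline would then be counted as "happier than usual," per Kramer's description.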

(Hat tip: ISQOLS)

Read more ...

Thursday, October 15, 2009

Free PPMRN/GASB Webinar

Share your comments on the Proposed GASB SEA Guidelines

Tues., Oct. 20, 12:30-2:30pm (ET). The Public Performance Measurement and Reporting Network (PPMRN) will host a FREE online webinar / audio-conference featuring members of the Governmental Accounting Standards Board (GASB) team, who will answer questions about the Proposed Voluntary Service Efforts and Accomplishments Reporting Guidelines.

PPMRN hopes to encourage wide participation and to provide constructive feedback to GASB on the content of this proposal. Please pass this information along -- participants do not have to be PPMRN members. We ask that participants read and be familiar with the entire document prior to the webinar. For more information about this webinar, including a link to the document and instructions on how to register for this free event, please visit the PPMRN website at: http://www.ppmrn.net/resources/articles/5749.

Read more ...

Wednesday, October 14, 2009

Call for Papers: Housing Data

From: American Housing Survey (AHS) ListServ <ahs@huduser.org>:

Cityscape is a scholarly journal published three times per year by the U.S. Department of Housing & Urban Development's Office of Policy Development and Research (PD&R). You can read more about it and access past issues at http://www.huduser.org/periodicals/cityscape.html. I am the editor of the Data Shop department, which publishes short (3,000-word) articles on the use of data in housing and urban research. Data Shop articles are aimed at researchers in these fields and intended to alert them to new data, novel applications of existing data, and the operational difficulties of data use. The official description of the department runs:

"Data Shop, a department of Cityscape, presents short articles or notes on the uses of data in housing and urban research. Through this department, PD&R introduces readers to new and overlooked data sources and to improved techniques in using well-known data. The emphasis is on sources and methods that analysts can use in their own work. Researchers often run into knotty data problems involving data interpretation or manipulation that must be solved before a project can proceed, but they seldom get to focus in detail on the solutions to such problems."

If you are interested in contributing such a note, please send me an abstract by November 13 in order to be considered for the July 2010 issue. The timeline: I would notify you of selection by December 1 and would want a draft by February 1, with a final version by February 19. If you are interested in making a contribution but cannot meet these deadlines, please send me an abstract for possible publication in later issues.

Dav Vandenbroucke
Senior Economist
U.S. Dept. HUD
david.a.vandenbroucke@hud.gov
202-402-5890

(Hat tip: Glenn Brown)

Read more ...

Portraits of Peel

I first met Srimanta Mohanty some years ago at a CIC Conference and was impressed by the work he'd been doing. He was analyzing a set of quality-of-life factors by population group within Peel, Ontario, Canada, and was demonstrating the inequities and resilience of a range of immigrant populations. His thinking helped me in the growth and development of our own Race Relations Progress Report.

His work has continued. He just sent out this note on his new, updated website:

The Portraits of Peel website provides three types of information:

Please forward this information to your networks as appropriate.

Thank you.

Sincerely,

Srimanta Mohanty, Ph.D.
Director of Research & Administration
The Social Planning Council of Peel

Take a moment and check it out!

Read more ...

Monday, October 12, 2009

State Health Data Scorecard

Here's an interesting site to play with -- it maps out a series of close to 40 health indicators by state in four areas: Access, Prevention & Treatment, Avoidable Hospital Use & Costs, and Healthy Lives. It then adds one more category, Equity, and measures indicators of equity across income, insurance coverage, and race & ethnicity.

So take a look at the Commonwealth Fund's State Scorecard 2009 for some interesting indicators, trends, and state rankings. The indicator set is an intriguing one to consider as we think about health indicators on a local level, and it should make us think a little about the federal and state context within which we measure local health indicators.

(Hat tip: kuri)

Read more ...

Call for Chapters: Best Practices in Community QOL Indicators

Call for Chapters
Special Volume on
Community Quality-of-Life Indicators: Best Practices V
Published by Springer in the new
Best Practices in Quality-of-Life Research Book Series

Volume Focus: This volume will publish best practices of community quality-of-life indicators projects. The first volume was published by Kluwer Academic Publishers in 2004 (edited by M. Joseph Sirgy, Don Rahtz, and Dong-Jin Lee). The second volume was published by Springer in 2006 (edited by M. Joseph Sirgy, Don Rahtz, and David Swain). The third and fourth volumes were published by Springer (visit www.springer.com and type “Community Quality-of-Life Indicators” in the Search window) and the International Society for Quality-of-Life Studies (both edited by M. Joseph Sirgy, Rhonda Phillips, and Don Rahtz).

For the fifth volume, we are seeking excellent case studies that can be used by community planners, policy makers and others as good examples or “prototypes” of community quality-of-life indicator projects. Papers dealing with theoretical issues in planning, developing, and using community quality-of-life indicators are not suitable for this volume. Instead, they should be sent for review and possible publication in Social Indicators Research (SIR) or Applied Research in Quality-of-Life (ARQOL). The fifth volume will be published in the new book series, Best Practices in Quality-of-Life Research. The book series editor is Dr. Dave Webb of the University of Western Australia.

Volume Editors: M. Joseph Sirgy (Virginia Polytechnic Institute & State University, USA), Rhonda Phillips (Arizona State University, USA), and Don Rahtz (College of William and Mary, USA)

Submission Deadline: December 31st, 2009

Submit to: M. Joseph Sirgy, Department of Marketing, Pamplin College of Business, Virginia Tech, Blacksburg, Virginia 24061-0236, USA. Tel: 540.231.5110. Fax: 540.231.3076. E-mail: sirgy@vt.edu

Submission Guidelines:
  • The paper should be typed in either Arial or Times Roman, font size 10-12 with a margin of 1 inch on all sides.
  • The paper should be typed either 1½ or double-spaced.
  • Paper length should not exceed 30 pages in total including references, tables, and figures.
  • Reference style: American Psychological Association (APA) style is preferred.
  • E-mail attachment is the preferred mode of submission. Submit paper electronically to sirgy@vt.edu.
  • All submissions should be original and not previously published. The submitted paper should not be submitted simultaneously to other publication outlets.

Guidelines for Paper Selection and Final Manuscript Preparation:

  • Each paper will be subjected to a review by 2-3 referees who are experts in the field.
  • The editors in consultation with the referees will make the final decision concerning acceptance or rejection.
  • Notification of acceptance or rejection will be sent out by the end of January 2010.
  • It is very likely that the editors will request changes to the accepted papers based on the reviewers’ suggestions. We will forward a production schedule once all papers are reviewed.

Read more ...

Thursday, October 8, 2009

CIC Conference, Part Three: Another Perspective

Here's a real treat for you. Glenn Brown, of the Children's Board of Hillsborough County, took notes at the Community Indicators Consortium 2009 Conference in Seattle, “Community Indicators as Tools for Social Change: Tracking Change and Increasing Accountability.” He went to different sessions than I did, with some overlap, so his notes provide a new perspective and greater detail on the conference. Thank you, Glenn, for allowing me to share these notes with this blog audience. (If anyone else has notes to share, please let me know!)

Part One of my conference notes is here.
Part Two of my conference notes is here.

Click the link below to read Glenn's conference session notes.

DAY 1 10/1/2009
Opening Plenary
Richard Conlin, City Councilman and President and Co-Founder of Sustainable Seattle

With the growth and development of Seattle rose a consciousness that this progress should be planned, and so rose the organization Sustainable Seattle; in so doing, there was a realization that the process requires measures. Early on there was a feeling of tension between what could easily be measured and what they actually wanted to accomplish. After numerous public meetings with thousands of Seattle residents, it was decided that the indicator to be used as a major measure of their overall accomplishment was (and is) “wild salmon returning to spawn.” It is a measure that encompasses many elements touching on the four key values of economics, environment, social justice and community, and it carries a great deal of symbolic value for the residents of Seattle. It is also as challenging a measure as it is important.

This has led to a lot of learning and thinking about the context of measures. He spoke of some examples they have been working on in Seattle. He presented a story about solid waste disposal in Seattle and how it led to efforts toward recycling. He went on to report that the dialogue around this grew and it was concluded that recycling is not an end unto itself, but a means to reducing waste. In the case of affordable housing in Seattle, he reported that they don’t know if policies are actually accomplishing the goal of affordable housing (something considered to be an end) because the policies don’t have actual measures tied to them. And finally he spoke of Seattle’s efforts to address climate change on the local level. The City adopted the Kyoto protocol (even though the Federal Government has not) and has been able to set policies and measures in place which indicate that it will meet the Kyoto standards sometime in 2010.

There was a question regarding community engagement; he reported that the city charged local communities with developing their own plans and gave them the money to hire local people to carry out the process. All in all, 35 communities were involved and over 20,000 residents participated.

First session:
Integrating Community Indicators and performance measures.
The presenters in this session were Allan Lomax (consultant, formerly with the GAO); Cheryle Broom (Auditor for King County, WA); and Karen Hruby (from Truckee Meadows Tomorrow, NV). They presented a matrix in development, put together in partnership with the Sloan Foundation, which maps out the development of projects in regard to their integration of performance measures and community indicators. The Sloan Foundation with CIC will be publishing “Four Real Stories” annually, covering various efforts around the country. Much of the presentation focused on the development of the matrix, including discussion of altering it into more of a lattice-type structure and the possibility that a third dimension is needed to better describe these sorts of projects.

Second Session:
Also Integrating Community Indicators and Performance Measures – presenters: Julia Joh Elligers (Senior Analyst, National Association of County & City Health Officials); Erica van Roosmalen (sociologist for the Halton Catholic District School Board); and Rhonda Phillips (professor, Arizona State University).

Julia presented first, on the Mobilizing for Action through Planning and Partnerships (MAPP) process used by health departments across the country in an effort to improve public health. The method is viewed as a community-wide strategic planning tool and a way for communities to prioritize public health issues as well as identify resources for addressing and acting upon them. There are six MAPP phases: 1) organizing and developing partnerships; 2) visioning; 3) four assessments: a) community themes and strengths (community indicators), b) community health status (community indicators), c) local public health system (performance standards and measures), d) forces of change; 4) identifying strategic issues; 5) formulating goals and strategies; 6) the Action Cycle. The six phases are iterative, and the entire process tends to move from qualitative to quantitative data.

Erica presented on the Halton Our Kids Network project (Canada). The project originally began with an assessment of community building blocks, ranging from parent and caregiver skills and supports, to economic security, to health and safety, and included an array of partners in the public, private, and government sectors. The evolution of the network led to the inclusion of the Search Institute’s 40 developmental assets, with agreement that the information would be collected from all 21 neighborhoods of Halton. They grounded these assets in Bronfenbrenner’s ecological model and agreed that Results Based Accountability would be the method of choice to show how service integration is critical for collective action. They chose seven population results and four performance results and began to work toward turning the curve. Ultimately the intent is to show how much will be done, how well they did it, and whether or not anyone is better off.

Rhonda has been looking at economic indicators and stressed that traditional economic indicators are not measures of quality, as they tend to focus on consumption: purchases of local goods by individuals, by government, by businesses, and by others outside the local area. This is why the current crisis was not anticipated. What is needed are more comprehensive indicators; we need to think outside of growth and focus on better, not bigger, development. Growth as we have known it may never return, and we need to look beyond economics to understand quality of life. Looking at consumption, we can look at revitalizing downtowns and neighborhoods (with measures such as retail vs. green space; Burlington, VT has done this in its Legacy Project). For investment, we can look at businesses buying local goods and services as opposed to those from outside (also a sustainability issue, and again Burlington, VT is an example). For government, this too includes examining purchases of local vs. outside goods and services, and can include looking at organizations combining purchasing power (another sustainability issue). Finally, for exports, indicators include the type and volume of products sold outside the home market area; the market area should be a venue for sustainability (for a good quality of life), wherein companies that practice sustainable business should be targeted and supported. Indicators from Burlington include the number of full-time workers earning above the livable wage (looking not just at job creation but at the quality of the jobs), a housing-to-jobs ratio as a way to check the sustainability of the workforce, and civic-investment measures such as the economic diversity of arts- and culture-based businesses and organizations.

During lunch there was a presentation by Dr. Stephen Bezruchka (from the Departments of Health Services and Global Health, School of Public Health, University of Washington), “You Get What You Measure.” He began by speaking about the notion that people focus attention on things they feel are important, and that we as a nation focus on the Gross Domestic Product (GDP) – up to the second, using the stock market – and raised the point that the GDP, a measure of consumption, is not a measure of well-being. If it were, then because continuous growth is desirable, cancer would be considered a good thing. He made the provocative statement that in the big picture, individual behaviors are relatively unimportant (he gave the example of the average life span of a Japanese citizen compared to a US citizen, noting that the Japanese live longer even though there is far more smoking among Japanese people than among US citizens). He stated that it is from trends that we infer causality, and the trends imply that inequity kills: more egalitarian societies have better health. He showed numerous slides comparing various countries. He concluded that changing the way wealth is shared has the largest impact on health. He also reminded us all that healthcare is different from actual health, and made the political statement that government needs to work for people rather than banks.

Session 4 was a session on community engagement and mobilization, with presentations from Tad Long (New Cities Institute) and Mary-Louise Vanderlee (Brock University, Canada) with Sandra Noel (Niagara Region Public Health). Tad presented an evaluation of the NewCity Project in Kentucky (presentation on the website, http://www.newcities.org/). They wanted very specific indicators to measure civic engagement. To engage the community, they set up local initiative committees and charged these groups with collecting the data and being accountable. What mattered was the community “moving the needle,” not the facilitators. This required less traditional data collection, ranging from going to local soccer games and speaking with parents on the sidelines to going to farmers’ markets and BBQs. To get the community engaged, they had to engage the community. He also shared the Morehead State University website as a location for many tools: http://www.kysi.org/.

Sandra and Mary-Louise presented a project done in Niagara, Canada, using service as a catalyst for community mobilization. The process has taken a year longer than anticipated, as working groups for the report have ebbed and flowed. They based their engagement model on the work of Peter Block, finding his book Community Conversations most helpful. The strategy entailed identifying a continuum with a core group and a target group, then scaling up to the entire community. They used Results Based Accountability to generate their report and look forward to bringing it back to participants and stakeholders.

The last session of the day was on data visualization. Presentations were given by Scott Gilkeson (from The State of the USA, http://www.stateoftheusa.org/ourwork/website.asp ), Alex Bourden (Graduate Assistant at University of Massachusetts), and John Bartholomew (GeoWise Limited – Instant Atlas, Scotland).

Scott presented on two very useful, free applications for data visualization that can be found on the web: Many Eyes (http://manyeyes.alphaworks.ibm.com/manyeyes/ ) and Google Docs (http://docs.google.com/templates?hl=en ). In both cases, these applications allow not only for the upload and visualization of data, but also allow the user to copy the code for the visualization thereby enabling the user to transfer the visualization into one’s own web page. Of the two, Many Eyes is more dynamic.

Alex presented on an open source application that is under development by the Open Indicators Consortium (http://weblab.cs.uml.edu/ivpr/micoviz/) and is funded by the Boston Foundation. It combines tables and graphs with a mapping application. The map also links to Google Maps and Wikipedia for more information on any specific area. This software is free for individuals, non-profits, and government agencies.

John presented the commercial application, InstantAtlas ( www.instantatlas.com/cis) . It also creates a data dashboard to go with a map. It comes with ready to use templates that can be customized for use and there is a minimal amount of technical knowledge needed to use the application. Reports are stand-alone web pages so they don’t need to be installed on a server to be used.

DAY 2 10/2/2009
Morning Plenary: If You Can’t Measure It, You Can’t Manage It: People, Progress and Persuasion.
Jon Hall, OECD, Head of the Global Project on Measuring the Progress of Societies (http://www.stiglitz-sen-fitoussi.fr/en/index.htm). He began by speaking about his work on the Stiglitz Report, presented in September this year in France. He stated that what we measure reflects and determines our values. Unfortunately, GDP has been the primary measure used – a measure of growth; we should be measuring our welfare. He stated that the world is presently in a “mid-life” crisis – we have stuff, and yet we are not happy. We have to redefine development and progress, and the Stiglitz report set out to establish this new definition with the creation of new sets of indicators. This ended up being more complicated than expected, as it quickly became clear that defining what progress means is culturally relative. This leads to questions: what does measuring quality of life mean for policy? How do we turn evidence into change? And how do we do this meaningfully, since the goal is progress, not simple outputs? Frameworks take on great importance. He cited a study on the happiness of nuns, which found that unhappy nuns live an average of 10 years less than happy nuns. www.oecd.org/progress

The next session I attended was titled: The Power of Neighborhood Indicators. Presentations were done by Kathy Pettit (The Urban Institute), Tom Kingsley (The Urban Institute), and Charlotte Kahn (Boston Foundation).

Kathy began the presentation/discussion by speaking about the National Neighborhood Indicators Partnership, founded in 1995 to assemble common indicators from local partners with the intent of allowing analysis across sites to draw lessons for national policy. At that time, this was not a feasible thing to do. Since then there have been advances in the nation’s data, an increase in local partners, and expansions of neighborhood-level data holdings.

Charlotte elaborated upon this, speaking of what she felt was once a divide between neighborhood-level and community-level indicators: the difference in scale and the issue of causality often used to be an impediment to an integrated framework. More recently, work has been done on a topical framework of community indicators and cross-cuts, so that the integration of neighborhood and community indicators now seems more logical.

Tom pulled this together, reporting that the Census Bureau has become a partner with NNIP as it moves toward using the American Community Survey and phases out the decennial census long form. The Census is working to integrate neighborhood indicators. The challenge of looking at indicators across cities requires some structure, as definitions require continuity, and metropolitan context has an influence on how these indicators are defined. All NNIP members can map indicators. This allows for a visualization of trends that permits policy nuance among areas: teen pregnancy or sub-prime lending can have very different specifics at the neighborhood level that are missed when examined on a broader scale.

The second session of the day was also in the Neighborhood Partners Track, with Linn Gould (from a non-profit called Just Health Action) and Cynthia Updegrave (University of Washington). They presented a service learning project done jointly among Just Health Action, the University of Washington, and Ritsumeikan University in Japan, examining the connection between racism and the environment. The course comes from an evolving discipline called Environmental Justice and has its foundations in the work of Robert Bullard. The project took students along an industrialized river basin to a community that abuts it, looking at the environmental issues residents there deal with on a daily basis. They used a version of the E.P.A. toolkit for this process, which moves from qualitative data collection to quantitative data use.

The final session was a round table on the use of social media in indicators projects, led by Karen Hruby (Truckee Meadows Tomorrow) and Allen Lomax (consultant, formerly from the GAO). The discussion was based on two questions: how to use new media in indicators projects, and whether community indicators can improve the use of social media for better communities. Conversations spanned a broad range of issues and ideas: real-time qualitative data; social use vs. professional use; how to integrate performance measures and indicators; making and bridging connections; greater sharing of stories. Rutgers was recommended as a resource with its Public Performance Measurement and Reporting Network.

The Conference ended with a discussion of future directions of the CIC and a session on building regional and affiliation networks.


Read more ...

Wednesday, October 7, 2009

Vehicle CO2 Emissions in Japan

This is interesting -- it's a map of vehicle CO2 emissions by municipality in Japan.


Here's how they calculate the data:

The CO2 vehicle emissions shown on the map use estimated emissions figures by municipalities nationwide, calculated using an estimation method by NIES based on the Origin-Destination Survey data of 1999 and 2005 road traffic censuses. Clicking on the map for "emissions per person" reveals figures for calculated annual emission volumes from passenger cars, freight vehicles, and the total number of vehicles. The same information is shown for "total emissions."

More information can be found here:

http://www-gis5.nies.go.jp/carco2/co2_main.php (Available in Japanese only)

The National Institute for Environmental Studies (NIES) official website
http://www.nies.go.jp/index.html


Read more ...

Tuesday, October 6, 2009

Economic Stress Index

Take a moment to play with the Economic Stress Index put together by the AP. It measures unemployment rates, foreclosures, and bankruptcy filings and combines them into an index using this formula:

[1 - [(1 - unemployment rate) x (1 - foreclosure rate) x (1 - bankruptcy rate)]] x 100
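In code, the formula reads as below. The function name and the assumption that the three rates arrive as decimal fractions (e.g. 0.10 for 10%) are mine; the formula itself is the AP's as quoted above.

```python
def economic_stress_index(unemployment, foreclosure, bankruptcy):
    """AP Economic Stress Index: combines three county-level rates
    (given as decimal fractions) into a single 0-100 score.

    The product of the three "not stressed" probabilities is
    subtracted from 1, so the index rises with any of the rates.
    """
    return (1 - (1 - unemployment) * (1 - foreclosure) * (1 - bankruptcy)) * 100

# e.g. 10% unemployment, 2% foreclosure, 1% bankruptcy:
score = economic_stress_index(0.10, 0.02, 0.01)  # ~12.68
```

Note the design choice: because the rates are multiplied rather than added, the index treats the three stresses as roughly independent risks rather than simply summing them.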

The data are available for every county in the United States with a nice mouseover function on a color-coded map.

Take a look, and think about (1) how the data might be useful in your local community indicators project and (2) how the visualization might be a nice kind of tool for you to use.

Read more ...

Monday, October 5, 2009

Tom Paper at CIC Conference: Battling For Our Minds

Here is Tom Paper's presentation to the Community Indicators Consortium conference.


For more information about Tom, see his project website at Data 360, which he describes as "a Wiki for data."

Read more ...

Saturday, October 3, 2009

CIC Conference 2009, Part Two

(Part One of my conference notes is here.)

The conference has lived up to my expectations. If you weren't here, you're missing something important.

Friday began with a presentation by Jon Hall, who leads the Global Project on Measuring the Progress of Societies at OECD. I found this presentation (PDF) online, which is quite similar to the one he gave, so until I can get the actual presentation, this should give you a flavor of what we talked about.

Read more by clicking the link below. NEW: I have now added in my own comments from the last panel session.

Their philosophy is straightforward: how to measure is a technical issue, and we are developing best practices. What to measure is a political issue. We can advise on how to set up a process that is legitimate and reflects the shared values of a society. But the choice belongs to the society.

Four questions for the 3rd World Forum in Korea:

  • How is the world progressing?
  • What do these new paradigms for progress mean for policy makers?
  • How can evidence promote social change?
  • What institutions does the world need to take this forward?

A new handbook on measuring progress will be launched at the Korea conference. They are also developing a framework on a Taxonomy for Progress, identifying quality frameworks for indicator sets, and collecting and sharing lessons about successful sets of indicators.

Several themes emerged as they developed the Global Project Framework:

  • Be clear about your objectives and how you expect to achieve them
  • Be realistic about what an indicator set can achieve
  • Never underestimate the importance of the process
  • Think long-term; be persistent and flexible



The first breakout session of the morning that I attended was on The Power of Neighborhood Indicator Systems. Kathy Pettit and Tom Kingsley from the National Neighborhood Indicators Partnership and Charlotte Kahn from the Boston Foundation were the presenters.

Kathy began by describing NNIP's work. She promised that her presentation would be up on the NNIP website shortly -- I'll link to it as soon as it is.

NNIP has 33 partners. Each partner has its own network of local partners – so each one is really a contact point linking local networks together. They are also working with LISC and MacArthur on the Sustainable Communities Initiative.

Charlotte spoke about the idea of a shared indicator system. It's been a dream since NNIP began in 1995, but at the time it wasn't feasible. Each of the 32 partners has its own community statistical system – which is different from indicators. This is a new opportunity to think harder about these same sets of measures, have deeper conversations among ourselves and across cities, and understand patterns and draw conclusions that will be of great interest to policy makers, especially in the context of preparing for the 2010 Census. They have a working topical framework – 10 areas and four cross-cutting topics.

Tom asked, How do we operationalize the system? Chris Walker from Sustainable Communities is helping drive this partnership. They have a draft plan, need to raise funds, and aim to combine the effort with the 2010 Census. The ACS at the neighborhood level offers only 5-year averages and won't help us get the information we need to understand how distressed neighborhoods are reacting to the current crisis.

Charlotte pointed out that there are some language/definition barriers between community statistical systems and community indicator systems: indicators are not the same as data variables, but they are closely related. Data variables provide the information for indicators. For example, the percent of the population receiving TANF assistance is a variable, but it is also data that supports an indicator of family poverty/economic hardship.
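As a minimal sketch of that variable-to-indicator relationship (the counts below are invented for illustration, not real TANF figures):

```python
# Hypothetical example: turning raw data variables into an indicator.
# The numbers are illustrative only.

tanf_recipients = 4_820    # data variable: count receiving TANF assistance
total_population = 61_500  # data variable: total population of the area

# Indicator: percent of the population receiving TANF assistance,
# used here as one measure of family economic hardship.
pct_tanf = 100 * tanf_recipients / total_population

print(f"TANF assistance rate: {pct_tanf:.1f}%")
```

The point is the same one Charlotte made: the two counts are variables; only the computed, interpreted measure is the indicator.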

Building a shared/common neighborhood-level indicator set across metro areas raises questions. How do you define "neighborhood"? The partnership will likely use census tracts as a common framework, but each partner would then define its own neighborhoods. A key component is to have no rankings and no labels for neighborhoods, but instead to use the data to tell good stories about distressed neighborhoods. The challenge is that administrative data is often deficit-oriented, when sometimes what we need are indicators of neighborhood assets. The Urban Institute has also been sponsoring an effort to create a framework for developing art and culture indicators -- very different kinds of indicators from property-level data systems.

A challenge is to develop a robust set of national data policies that address the timeliness, accessibility, and comparability of data from very local through national levels. The Annie E. Casey Foundation has developed a white paper on data availability for legislative/administrative advocacy – for example, great strides in data could be made if we could double the sample size of the American Community Survey. We need a joint NNIP/CIC federal advocacy group addressing data immediacy, availability, and standardization (for example, high school graduation rates, and data availability by student residence instead of school attended).

I then attended an odd session on Measuring the business value of corporate community initiatives. Jane Coen, from Underwriters Laboratory; Maureen Hart, from Sustainable Measures; and Vesela Veleva, from the Boston College Center for Corporate Citizenship, explained a new project of the Center for Corporate Citizenship. The project is complete and will be launched in March or April 2010. The Center has been around for 25 years and works with companies (350 right now, including 50% of the Fortune 100), providing corporate seminars, executive education training, and the like.

They wanted to develop a practical framework, guidelines, and tools to assist companies in measuring the business value of Community Involvement initiatives. The purpose is to be able to show the business impact of involvement (why it pays to support community causes through employee volunteer time or corporate sponsorship), even in tough times.

Right now companies are measuring inputs and outputs, and not the impacts. Many companies measure “employee involvement” in CI programs but not actual “employee engagement” (there is confusion about what that means).

Why participants want to measure business impact:

  • facilitates benchmarking
  • enhances CI decision-making
  • demonstrates CI's ROI
  • helps gain support and funding for CI initiatives
  • helps integrate CI strategy with core business strategy
  • shows CI's contribution to strategic business objective achievement
  • increases power of CI communications, internally and externally
  • shows correlation between well-aligned CI investments and
    reputation/standing within the community
  • name and brand recognition
  • employee engagement (recruitment, retention, satisfaction, productivity)
  • customer loyalty

Why companies aren't measuring business impact:

  • lack of valid and credible tools
  • measuring outputs not impact
  • need guidance on measuring impact

A survey of studies on the business case shows general support for the belief that CI adds value to the firm by enhancing:

  • the corporate license or freedom to operate
  • customer relations and attraction/marketing
  • human resources
  • innovation in market and product development
  • reputational capital
  • financial performance
  • social investments

“It has been very difficult to establish a clear causal relationship between CI and business benefits, and it is challenging to set a dollar value for the intangible assets that CI often creates.”

Draft framework:

  1. Growth/Return on capital: new markets, innovation/new products/services, new consumers/customers
  2. External relations/reputations: corporate reputation, differentiation/customer loyalty, license to operate, investor interest/confidence/stock performance
  3. Workforce capacity: productivity, morale, job satisfaction, retention, recruitment, teamwork, managerial/leadership skills/employee development

For more information, see:

www.bcccc.net/index.cfm?pageId=2025#impact

www.bcccc.net

Questions from the audience included: GRI has 79 performance measures, but this framework will only look at the business impact of CI initiatives. At this point they are not looking at the environment, even though it is easier to measure the business impact of environmental initiatives.

How are you recommending that businesses identify their CI initiatives? Are you building into the framework guidance on how to select programs that are strategic to corporate mission? (No, companies will choose their initiatives on their own.)

What is the real connection between community indicator projects in trying to improve the community and what the other actors (in this case, the companies operating within the communities) are trying to do in getting involved in the community? (For this project, not a whole lot, except to show that Measuring Things is Hard. The framework they're developing will not result in increased data for community indicators projects or increased corporate support for community indicators projects, but it will help companies see how much money they make off of community involvement activities as stealth marketing and reputation-enhancement exercises. I told you the session felt a little odd. Somebody else who was there help me out with the conclusions I was drawing -- what did I miss?)

Andrea Whitsett, from Arizona State University, moderated a session on measuring the impact of indicator systems. On the panel were Jon Hall, from OECD; Viki Sonntag, from Sustainable Seattle; Dan Duncan, from United Way in Tucson and Southern Arizona; and Ben Warner (that's me!), from the Jacksonville Community Council Inc. (JCCI). Because these are my notes, I didn't speak and write at the same time, so my comments are missing. I will fill them in shortly when I think of witty things I should have said. ETA: I've now added my notes of what I meant to say. I didn't add the stories in -- that will have to wait for a different opportunity.

Andrea began by explaining her work at the Arizona Indicators Project, which began in 2007, from the office of the president of the university. Because the project grew so quickly and so organically, there weren't a lot of things in place to measure the impact of the indicators project.

The panelists identified themselves, then Andrea asked us all to define how we measured impact.

Dan: The national organization set national goals around education, income, and health, and put out the call to local organizations to align with these goals and to see how we could work together collectively. We began a process to re-engineer our impacts using the results-based accountability model and focus on turning the curves. We brought together a large committee, trained them in results-based accountability and asset-based community development, and charged them to be purposeful in identifying what citizens could do, what citizens could do with help, and what only institutions could do. We identified key trends we wanted to work on, including the high school graduation rate and early learning as measured by third-grade learning scores. Around income, we're looking at having more families above 200% of the poverty level. Around health, we have a lot of seniors, and we want to keep them healthy and independent, and are measuring nursing home population growth compared to the growth in the overall population.

Viki started from a historical context and the expected impacts they wanted to see. Sustainable Seattle was one of the early citizen groups that wanted to develop community indicators. Our intent was to engage a cross-section of the community in determining the indicators. Results we expected were increased citizen involvement in understanding and using the indicators, and increased responsibility among institutions and policy makers in addressing the indicators we had identified. The assumptions of what indicators could do led to a change in the organization – indicators themselves don't drive change, so we instead began doing neighborhood-level work. Measures of success were how engaged the community was in that process and what actions they took as a result of being involved. We are also interested in data democracy, which means we may be looking at capacity building in obtaining and using information. Our focus is very grass-roots, very community-based social change efforts. How we arrive at that is probably in a discussion with the community to discover if what we provide is serving the community – it's more of an informal process right now.

Jon explained that the global project began because there was a great deal of work occurring but no one to take the lead and support the effort. We're looking at working with leadership and policy makers, and at growing a network of networks. We're trying to change how democracies work and how decisions are made so that people talk about the evidence.

Ben: We measure impact of our indicators by first identifying what we intend their impact to be, based on our model of community change.

1. Indicators serve to measure how well the community is progressing in relation to a shared/articulated community vision, and provide the knowledge necessary to forge shared community priorities, inform planning processes and strategic actions, and create data-driven decision-making. In order to measure their impact, we need to measure their explicit use in institutionalized priority-setting and decision-making, their implicit use in the same arenas, their function in shifting community conversations and setting a community agenda (through both media mentions and public reference to the indicators in community settings), their use in activities (actions taken or legislation passed) to implement those decisions, and the building of cross-sectoral collaborative partnerships to address the priorities identified in the indicators. In short, we measure (through an annual self-evaluation report) the effectiveness of the indicators in identifying needs and how often they are used in the community in determining how to address those needs. This also includes a survey of our identified key decision-makers on the usefulness of the indicators and an opportunity for them to share with us how they used the information and their suggestions for improving how we report the indicators to better meet their needs.

2. The same indicators are also used to measure the effectiveness of the decisions made and actions taken. The indicators are a key assessment tool to see if conditions have changed, whether the trend line is bending, whether the programs/projects/legislation/activities/allocation processes have addressed the underlying concerns that drove the community priority-setting in the first place. In other words, if your indicators project is effective in your community, the trend lines should show improvement.
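A minimal sketch of how "is the trend line bending" could be checked quantitatively. The values and the lower-is-better assumption are invented for illustration; a real assessment would of course look at more than a least-squares slope:

```python
# Sketch: checking whether an indicator's trend line shows improvement.
# Values are invented; assume lower is better (e.g., a teen birth rate).

def trend_slope(values):
    """Least-squares slope of values against evenly spaced time points."""
    n = len(values)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(values) / n
    num = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, values))
    den = sum((x - x_mean) ** 2 for x in xs)
    return num / den

rate_per_1000 = [62.1, 58.4, 55.0, 49.7, 44.2]  # five annual readings
slope = trend_slope(rate_per_1000)
print("improving" if slope < 0 else "not improving")
```

The sign of the slope is the crude version of the question the indicators are meant to answer: has the curve actually turned?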

Andrea then asked, once you identify the impact you want to have, how do you measure (quantitatively and qualitatively) the impacts of your indicators.

Dan: For the United Way, the five indicators drive all of our activities. Those are large issues, but they are also the indicators that can bring the community together. That's why we are developing performance measures around the strategies we use to affect the community indicators. They allow us to ask, “Are we doing things right, and are we doing the right things?”

We have an online system to track those performance measures, and we track them on a quarterly basis. This helps us bring people around the table to track these issues and refocus our efforts where necessary.

Viki: Our impacts have pretty much been indirect effects and indirect impacts. The performance measure efforts were an uptake and an outgrowth of the community indicators process. Because we're relatively new, we're working on developing the quality of our systems internally. On the participatory side, one measure of our impact is to look at whether our participation is representative of the larger community. This way we can measure our participatory processes and their impacts.

Jon: Evidence-based policy doesn't exist; at best it's evidence-influenced. It's hard to quantify even the influence of GDP on policy. The process and conversation may be more important than the indicators themselves. The hardest distance is the last two inches from people's eyes to their brains. In Australia, when we first put these progress measures out, I asked the policy makers why they wanted the information. Having these sets of measures is helpful to focus on the things that matter. They can help shape the debate and influence things in that way.

Andrea: To whom are you accountable?

Ben: First and foremost, we're accountable to ourselves. Our continued existence and role in the community is entirely dependent on our integrity and honesty -- integrity can only be sold once. We have a rigorous Credo we test ourselves against to ensure all of our actions are ethical. We are accountable to the community we serve. They expect us to be a neutral, trusted community convener and we need to honor that expectation by providing quality information that responds to their values and visions for the community and needs. We need to be able to serve as the conduit for the voices and stories of people who don't have other ways to be heard. We are accountable to our funders to ensure that we follow through with what we promise to deliver. By maintaining the community's trust, we can ensure our relevance and effectiveness in helping create community change.

Dan: We used to be just accountable to our donors every year. Now with the initiatives we're doing, we're not just the data arm but the organizing table around these issues, and we're accountable to the people that we're trying to help. The key is to keep the focus on the broader issue and create the civic and political will to make the hard choices for the greater good.

Viki: Accountability to the community at large. But what they are holding us accountable to is whether the information has meaning to them and whether it has relevance. This was demonstrated to us when we were working at the neighborhood level and the information we were providing was valid and reliable but it wasn't meaningful to them, and so they disengaged with us. The good news is that the neighborhoods have connected together to develop new partnerships. The challenge we face is to reconnect with them.

Jon: All must be accountable to the community, must be nonpartisan and honest and clear. If you lose the perception of objectivity and appear to be biased, they won't look at the data any more. Newfoundland: before you put the data out, we used to sit and argue over the figures; now we can argue over the issues.

Andrea: How do you demonstrate community impact in order to make a compelling case for funding?

Ben: We use our measurements of effectiveness -- how many people use our work, how often they use the work, in what ways they use the work, etc. In addition, we've compiled a Highlights of Change document that shows over time how critical our work has been in making the community better. For some of our funders, we point to how they use the work to justify its continued funding -- institutionalizing the use of the indicators in their decision-making processes helps support long-term financial support. We point to how others use the indicators to assist them in addressing the same priorities. For others who sponsor our work, we point to how the major community decision-makers use the work, and offer an opportunity for them to put their name on the document in support of that work for the entire community to see.

Dan: We're responsible for the work and the fundraising together. In today's world, donors don't like the middle person, so we have to talk about the accountability of the United Way system and what we are achieving. To a large part, we're selling aspirations, and we're finding donors and grantors who care about the things we care about. We look for what they care about and try to match their passion and their intent – what the grantor wants and what their expectations are. We now raise more from grants than we do from donors. This comes from owning the issue and owning the results and being very clear about that.

Viki: In our latest incarnation, we are a data commons. Our primary interest is making sure information is linked to action. The assumption from some of our partners is that not much action is taking place. We need to interest our partners in understanding that we are bringing the information to inform the action in the community. We see that those people who have a sustainability agenda are interested in sharing their accountability and responsibility and showing how they are contributing to the solutions and successes of the indicators. We're linking that to the indicators themselves on the website.

Jon: We haven't been very successful in raising money because people think we have enough money. Ultimately, this should be funded through government, because this is a public good. We need to point out that if we didn't do this, somebody else would have to, but probably not as well.

Andrea: Tell me about your failures.

Ben: We had early difficulties with our public school system, who saw our indicators as one more assault in a long history of attacks on public education. Naturally, they responded defensively. Data is too often perceived as a weapon until you build joint relationships of trust. Through relationship-building, we helped them see that our intent was not to attack them, but to shift educational issues from being "their problem" to a community problem, which allows for the greater community to take part in the solutions (and to take responsibility for creating success.) This was a much more effective approach.

Jon: We had an initiative around family violence. We saw that the indicators weren't strong enough to make the changes and build the civic/community will around the issue at this time.

Viki: When I think about the trajectory of Sustainable Seattle and the capacity we had developed, we probably didn't have the capacity to sustain a bigger strategic vision. For a while, we lost direction and were in search of that strategic vision.

Jon: Projects tend to fail when they haven't engaged the community from the beginning. You need to build that ownership from the beginning, even if it takes time. Perceptions of bias can be extremely damaging. Because we had built the shared ownership, others could defend us.

You don't want to sensationalize these things, but if you let statisticians write these things they end up really boring. Don't underestimate the difficulty in presenting this information to people.

Andrea: We have had plenty of struggles in getting the Arizona Indicators project off the ground. We tried so hard to be nonpartisan and not interfere and tried to let the numbers speak for themselves. However, the data by themselves were not compelling. People wanted to know both the what and the why, and they need the why in order to hear a call to action.

The closing plenary was facilitated by Adam Luecking. They divided the room into three groups, and asked: What is your vision for CIC in the next 2 years?

Lots of great discussion ensued. I'll let the CIC Board formalize that conversation, but the energy (one person said "vibe") in the room was palpable and positive, which is a great way to end a conference.

Read more ...

Friday, October 2, 2009

CIC Conference 2009, Part One

We're at the halfway point of the Community Indicators Consortium conference. The discussions we've been having and the quality of the presentations have been pretty good -- they've offered me quite a bit to think about. Let me share some high points, with the caveat that I couldn't be in all of the sessions and I've already heard I missed some great ones.

Click the link to read my notes from Wednesday morning through Thursday evening.


The conference really began on Wednesday with a selection of pre-conference workshops. I'd like to thank those who braved a three-hour session with me Wednesday morning to talk about Making a Difference with Indicators: What You Need to Know. The conversation was an expanded follow-up to the June 25 webinar, and I'm going to be collecting the material and the comments into a more formal article soon. The only point I want to emphasize is the importance of intentionality -- what results are you trying to accomplish, who is your audience, what are the expected actions you want that audience to take, how does your design cohere with your explicit theory of change -- and the importance of openness to serendipity. In other words, target your efforts, measure your outcomes, and be prepared/design for expanded uses beyond your core. We'll talk more later.

Wednesday afternoon was a meeting of the Community Indicators-Performance Measures Integration Working Group, which is a diverse group of really smart folks trying to figure out best practices and effective techniques for bringing together two similar ways of using data to measure progress. We'll be making a couple of announcements about the progress of this group at today's sessions of the conference, so I'll wait to report on that. At the conference, a Maturity Model of integration is posted for public comment, and a revised version of that model (incorporating the feedback) should be available shortly after the conference, at which point I'll have something more to share and that we can talk about.

Thursday morning's opening plenary was from Richard Conlin, of the Seattle City Council and co-founder of Sustainable Seattle. Take a look at what they're doing -- it's not the same organization that once pushed sustainability onto the national agenda, and the organizational transformation would make a fascinating case study. The cutely-named B-Sustainable initiative is going through a re-launch/marketing push. There's energy and good work happening -- the effort to reduce waste, not just increase the percentage recycled, is a great example -- and Seattle provides a number of lessons learned for anyone working on environmental sustainability. (Side note: they're also an interesting example of how difficult it is to manage the tension between being the community's trusted data source and being effective advocates for a cause. The trade-offs are real and worth thinking about in your own organization's strategic planning processes -- where will you shine?)

Then I joined Meriko Kubota and Lidia Kemeny of Vancouver Foundation's Vital Signs in a session entitled Partnering for Progress. I shared JCCI's model for community change, as seen below (click on picture to enlarge):


I also shared how this works in practice, following one indicator through the visioning/measuring/prioritizing/planning/advocacy/evaluation processes to show how the teen birth rate was slashed, going from nearly double the national rate to (based on 2008 preliminary figures) below the national rate. Indicators serve dual functions in a powerful method for creating sustainable community improvement.

Vancouver Foundation's Vital Signs is worth a closer look. Canada's community foundations are doing really interesting work with indicators, and Vancouver's is no exception -- it's the largest community foundation in Canada. They first produced a Vital Signs report in 2006, and followed with reports in 2007 and 2008. They're not releasing a 2009 update, but are preparing a 2010 report in conjunction with the Olympics.

There are 12 key areas measured in the report, and that framework is worth a look. I especially like the categories of "Belonging and Leadership" and "Getting Started". In 2008, the report focused on the differences between community perceptions and reality, illustrating the sections with fuzzy pictures (perceptions) and then with the pictures in focus (reality). The report tried to distinguish between what was happening and what people thought was happening in Vancouver.

These three survey questions are fascinating. Vital Signs asked:

  1. What is the single most important issue you would like to see addressed to improve the overall quality of life in metro Vancouver?
  2. Give an example of a specific event, action, or other thing that has improved the quality of life in metro Vancouver over the past 12 months?
  3. Over the last 12 months, what actions, if any, have you taken in your own life to make a positive difference in your community?

For question 2, 55 percent of respondents couldn't think of anything. For question 3, 26 percent of respondents couldn't name something they had done. Full results are here.

The report was provided as an insert in the local newspaper, and was translated into Chinese and inserted into three Chinese newspapers.

The report has changed over time. They use an online poll for “citizen graders” – with 1,070 responses in 2008, three times the 2006 number. These "graders" assign letter grades to each section, which creates some controversy and pushback within the community (political leaders like to point out that Vancouver is consistently ranked as one of the best cities in the world to live in, and yet the graders give it a C+). They also conduct a more scientific sampling of the population through a telephone poll – with 854 responses in 2008.

The other major change was the geographical shift in 2007 from the city of Vancouver to Metro Vancouver, giving the report a more regional focus.

Take a look also at Youth Vital Signs, a community indicators report designed and developed by youth to reflect their perspectives and ideas about what's important and what's happening in metro Vancouver. That report also built on new uses of technology -- one input came through text messages, where a message was sent out and returned 3000 responses in one day.

The next session I attended was How Creative Partnerships Improve Indicators, with Sandra McBrayer and Paula Ingrum from The Children's Initiative in San Diego.

Sandra McBrayer began by discussing the origins of the Children's Initiative, which was born because foundations needed to figure out why their money wasn't making a difference. They provided resources, but the underlying problems didn't seem to be improving. So they decided to bring all the stakeholders together to be part of the process of improving the lives of children, helping them all gain ownership of the problems and encouraging them to meet together often. The county had been doing a San Diego report card for years (since 1999), but it was a data document only – when you opened it up, you saw numbers and graphs, but you didn't know what the report was telling you.

So they looked at report cards across the country, trying to figure out which were the best examples and why. If we called the top agencies in a community and asked them about the report card, and they didn't know what it was, then that report card wasn't a model we wanted to follow. Quality report cards shared several characteristics:

  • They raise community awareness
  • Community partnerships are key to sustainability
  • Multiple funding streams are necessary
  • They link what is learned to a process for change

They transferred the responsibility for the report card from the county health department to the Children's Initiative. Building partnerships is critical – it's not about blame, but shared responsibility. They created a Leadership Committee and a Scientific Committee -- calling it “Leadership” made it special. The Scientific Committee consists of epidemiologists and biostatisticians. They used the Results-Based Accountability model to select the indicators.

So look at the 2007 San Diego County Report Card on Children and Families (PDF). Each indicator includes the following – why it is important, what the national best practices are, whether we have that in San Diego, and if not, who the partners are who share the responsibility to make it happen. They rethought the indicators by asking:

What is this data telling us? Do we understand it and does it make us want to do anything?

What is the area of real concern? What do we really want to know?

Is the data we want available? Where do we get it or how do we make it?

Then they put on the dashboard things they didn't have but wanted to develop the data. Some of the keys they shared for their success:

  • The personal touch matters in building partnerships. They provided food at their meetings, hand-written thank-you notes for providing data, and $5 gift cards to Starbucks.
  • Focus on making change. "If you look at an indicator and can't tell what you're supposed to do about it, it's not a good indicator."
  • Get the right data to focus on prevention. They looked at youth involved in alcohol-related traffic crashes. They mapped the crashes – and busted the community myth that it was simply a border/Tijuana issue. They built partnerships and relationships to be able to use DMV data to find out where the kids lived, and could then focus in on who was drinking and driving. It wasn't the military -- another flawed perception. They were able to show that it was mostly white, middle-class kids in certain rural areas. They developed partnerships, including insurance companies, driver's ed instructors, schools, law enforcement, etc. They could now target prevention activities to the kids who were most at risk.
  • If the data you need doesn't exist, work to create that data. They brought a focus group together around domestic violence and asked: What do you want to know? It wasn't the rate of domestic violence reports filed; they needed to know which children were exposed to domestic violence. They created a new DV supplemental form, shared as a common form among all police agencies. They missed a step, though: they brought together the police chiefs, but not the data analysts, as part of the partnerships. So forms were collected, but not always entered. Now they know to build partnerships on the data side as well. If you don't forge the relationship with the people who do the work with the data, your stuff becomes the bottom of the pile, and no one ever gets to the bottom of the pile.
  • Use research to determine the most relevant data. School districts were touting Average Daily Attendance as a measure of student attendance, and by this measure they were doing well, with ADAs around 95%. However, studies showed that missing 10% of school in secondary school (18 days!) and 5% in primary (9 days) has serious academic consequences. Now they track that data by school, grade, and district. (They have 42 school districts in their county!) The data showed serious problems, as well as some places where schools were doing it right and others could learn from those efforts. Then Hedy Chang and Mariajosé Romero at the National Center for Children in Poverty released their report Present, Engaged, and Accounted For. (Hedy Chang was sitting near me; it was cool to have an author's work cited with her in attendance.) The report promotes using this threshold to measure attendance.
  • Never use data to sensationalize an issue. The school-specific level of data is internally owned; if the media sensationalize an issue, it causes partners to walk away from the data. There are no scores or grades on the report cards, either, since that would turn partners off. The goal is to get better, not to place blame, so they intentionally left scores out.
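The attendance-threshold arithmetic in that list is simple enough to sketch in code. A hypothetical illustration (the 180-day school year and the function are my own assumptions, not San Diego's actual methodology):

```python
# Hypothetical sketch of the chronic-absence thresholds, not San Diego's method.
SCHOOL_YEAR_DAYS = 180  # assumed typical U.S. school year

def chronically_absent(days_missed, level):
    """Return True if absences cross the research threshold:
    10% of the year for secondary students, 5% for primary."""
    threshold = 0.10 if level == "secondary" else 0.05
    return days_missed >= threshold * SCHOOL_YEAR_DAYS

print(chronically_absent(18, "secondary"))  # 18 days = 10% of 180 -> True
print(chronically_absent(9, "primary"))     # 9 days = 5% of 180 -> True
print(chronically_absent(8, "secondary"))   # well under the line -> False
```

With a 180-day year, the 10% and 5% cutoffs work out to exactly the 18 and 9 days the report cites.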

They closed by adding these thoughts: Right now, all of their work is based on data. How many kids does this affect, and who should be leading this effort? They have a priority list of criteria that all new projects must meet. Their role is a critical one in the community: they don't do direct service, but they change the lives of kids.



That gets us to lunchtime. Lunchtime on Thursday at the Community Indicators Consortium conference featured a keynote speaker, Stephen Bezruchka, from the Departments of Health Services and Global Health in the School of Public Health at the University of Washington. He spoke about the kinds of indicators we should be measuring to reshape U.S. health policy, especially in terms of improving American life expectancy. I found this interview and transcript online where he covers many of the same points he made at lunch, so that should compensate for my inability to eat and take notes simultaneously.

After lunch, I heard Tad Long from NewCity Morehead speak on measuring civic engagement (I already pointed to his presentation online, so I won't add much here.) Following the same theme of Community Engagement and Mobilization, Sandra Noel and Mary-Louise Vanderlee from Niagara spoke of how they used the children's rights framework to engage their community around measurement and action. They will be releasing their work to coincide with National Child Day on November 20, so I'll hold off on describing more about the project until I can share the results.

The last session on Thursday that I attended focused on New Tools for Data Visualization. Scott Gilkeson, from State of the USA, presented on Data Visualization Fast and Cheap. His goal was to show ways to put information and data up on the web. Many Eyes, from IBM, is a few years old. You can upload data to the web and then visualize it using one of 18 visualization types, then copy the code to your webpage and show the visualization there. Let's say you have downloaded data from the Center for Medical Statistics website on national health expenditures. The data has to be configured in an Excel spreadsheet to match Many Eyes conventions. Then you can log in to Many Eyes (registration is required, but it's free) and paste in the data. The site will show you that it understands the data you have entered, and prompts you to enter the title, source, tags, description, and other information about the data. Select the visualization type and the graph appears. You can then grab the embed code and put it on your own site or blog, and all of the interactivity is available on your site.
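The data-preparation step is the only fiddly part. As a hedged sketch of what "matching the conventions" means, Many Eyes basically wants a plain pasteable table: a header row plus data rows. The expenditure figures below are invented for the example:

```python
import csv, io

# Hypothetical sketch: convert a CSV export into a plain tab-delimited
# table (header row + data rows) suitable for pasting into a tool like
# Many Eyes. The expenditure figures are made up for illustration.
raw_csv = """Year,National Health Expenditures ($B)
2006,2163
2007,2298
2008,2391
"""

rows = list(csv.reader(io.StringIO(raw_csv)))
pasteable = "\n".join("\t".join(row) for row in rows)
print(pasteable)
```

The same reshaping works for any source: one column per variable, one row per observation, headers on top.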

Google has also made data visualization tools available. Upload your information into Google Docs, again arranging the spreadsheet according to Google's conventions, and you can insert a Gadget, which lets you choose a chart type. After formatting the chart the way you want it, you can publish the gadget, and it will give you code that you can post online. Again, registration is required, and it's free. This process is limited because the metadata doesn't automatically travel with the graph.

However, you can use Google's API to create your own customized code. Scott said that they've just finished an open source product based on Google gadgets that allows you to create a full metachart. He'll be putting the information together in a clearer fashion and making the code available shortly for all Google Docs/Gadgets users. (It should be at the Google Code projects page shortly, under "metacharts"; I'll link directly when it's available.)
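That metacharts product isn't out yet, but for very simple cases Google's static Chart API already works as chart-by-URL: you encode the data in query parameters and get back a chart image, with no gadget or registration involved. A sketch (the expenditure shares are invented for illustration):

```python
from urllib.parse import urlencode

# Illustration only: Google's static Chart API renders a chart image
# straight from URL parameters. The data values here are made up.
params = {
    "cht": "p3",                         # chart type: 3D pie
    "chs": "400x200",                    # image size in pixels
    "chd": "t:60,25,15",                 # data series, text encoding
    "chl": "Hospital|Physician|Drugs",   # slice labels
}
chart_url = "http://chart.apis.google.com/chart?" + urlencode(params)
print(chart_url)
```

Dropping the printed URL into an `img` tag on any page displays the chart, though unlike the gadget approach the result is a static image with no interactivity.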

Alex Baumann from the University of Massachusetts-Lowell spoke next about An Open Source Resource for Data and Indicators. Seven founding members, all complex entities, came together as the Open Indicators Consortium. The goal was to make high-performance, large-dataset visualization tools available for people to use.

They follow an agile development process, with updates released regularly to members, and many members will be releasing their results next week. The aim is a good, robust open source product that is free for nonprofits.

In the second year they'll add personalization, collaborative visual tools, integrated voice chat, flexible configuration, controlled/secure data access, and an ontology/middleware layer to allow comparisons between OIC member and National Data Commons sites.

What followed was a series of demos, which were fun but hard to describe, so I won't try; it was good stuff. It's a work in progress and will come in three levels: novice, intermediate, and advanced. They are working on the advanced level now and will create the novice level, with fewer features, later. The product features multiple layers, different shapefiles, and animated probing. You can click on data outliers and see them on Google Maps, Wikipedia, etc. Mouse over a point and you get the data and the name; right-click and you can search for the data in Google. Data is downloaded on demand, because they want to be able to scale to very large data sets and detailed geography. The tools can already be embedded in a website, and they are working on being able to embed a specific exploration onto a page.

Question: Will you build APIs to connect to the data sources? Right now the data is stored in databases so you can do complex queries and scale; you have to load the data in, tag it with metadata, and link it to a geography. It uses compressed shapefiles and streams in detail as you zoom in.

John Bartholomew, from GeoWise, presented on InstantAtlas: Interactive Indicator Presentation in Maps, Charts, and Tables for the Web. He began by showing some of the kinds of ways people are using interactive mapping as part of their data display/sharing efforts, and then went to demos.

The business case: more powerful open source and commercial graphic and mapping tools can help build commitment to priority community issues. Mapping and data can sensitize policy makers to priority needs and empower local communities on local issues.

The challenge: skilled resources are scarce in the public sector, data in government is presented primarily in static formats, and restrictive IT policies present hurdles to adopting new reporting media.

He then showed samples of interactive mapping, aiming to provoke discussion rather than give a sales pitch. Sometimes a single platform is best; sometimes a combination is better.

HealthMap – an implementation on a Google Maps background. Easy to see, but hard to quantify the data.
Rhiza Labs – H1N1 tracking – user-contributed data, then mapped.

Heat maps – pioneered in Scandinavia – make it hard to allocate resources based on blurred contours. Make sure the visualization serves its intended purpose.

Statistical relationships between indicators – how do you show them? Circles on colored backgrounds is the way we used to do it, but is it always the best?

With Microsoft come powerful Flex API implementations, but they require skilled developers.

WHO – using multiple tools/platforms to get information to countries without sophisticated technical resources.

Good practices for mapping include:

  • Ease for audience to grasp
  • Intuitive interactivity
  • Audience-appropriate
  • Design focused on promoting valid, evidence-based conclusions

(Those four points are critical. My two cents: DON'T FORGET WHO YOUR AUDIENCE IS! We sometimes go overboard creating stuff that intimidates and confuses rather than invites and informs.)

He went to demos, which are available on their website. One demo I would like you to check out is this one: JCCI (click on Community Snapshot to see more). That's the site I just launched last week using InstantAtlas technology.

After that session was a social/reception, then we wandered off for sushi and more conversation. All in all, a really good beginning to the conference.





Thursday, October 1, 2009

Mad Props to Tad Long!

I'm at the 2009 Community Indicators Consortium Conference in Bellevue, Washington. I'll be sharing my notes on the sessions shortly.

But I had to applaud Tad Long from NewCity Morehead for concluding his presentation by going to his website and showing how his presentation, report, and template were online and waiting for us, collected on a page for conference attendees.

Two quick thoughts from his presentation:

  • measuring results has to go beyond usual ways we measure
  • engaging citizens needs to go beyond the traditional ways we have brought people together

www.newcities.org

I'm going to be spending more time comparing their community engagement scorecard with the Indicators of Civic Health project we did with the National Civic League back in 2002. I like how many of the same themes have been developed further.

Way to represent!
