Community Indicators for Your Community

Real, lasting community change is built around knowing where you are, where you want to be, and whether your efforts are making a difference. Indicators are a necessary ingredient for sustainable change. And the process of selecting community indicators -- who chooses, how they choose, what they choose -- is as important as the data you select.

The Jacksonville Community Council Inc. (JCCI) understands indicators and community change, with more than 25 years of producing the annual Quality of Life Progress Report for Jacksonville and the Northeast Florida region, and two decades of helping other communities develop their own sustainable indicators projects. JCCI consultants give you the information you need to measure progress, identify priorities for action, and assess results.

I'd like to talk with you personally about how we can help. E-mail me at
ben@jcci.org, call (904) 396-3052, or visit CommunityWorks for more information. From San Antonio to Siberia, we're ready and willing to assist.


Monday, June 30, 2008

Corporate Sustainability Indicator Reports: Case Study

Here's an interesting analysis in Ethical Corporation newsmagazine that takes Nestlé's "Creating Shared Value" report and examines it according to sustainability practices and indicators measured. Along the way, the author, Aleksandra Dobkowski-Joy, shows us the complexity of asking large-scale corporations to engage in sustainability indicator reporting, as well as why it's important.

In Report Reviews: Nestlé’s 2007 Creating Shared Value report – Still lacking the winning formula, Dobkowski-Joy speaks positively of Nestlé's efforts, but challenges the organization to do more. She writes:

Nestlé’s list of material issues, for example, is broad enough to encompass
almost anything the company does – including manufacturing and environmental
footprint, people, agriculture and rural development, nutrition, and marketing
and communication. The company reports precious few quantitative goals and
targets. Nestlé flip-flops between its intention to develop clear financial,
environmental and social goals, and contradictory positions, saying it
“generally considers historical performance trends to be more revealing and
useful for future planning than setting individual forward targets”.

One of the interesting ways in which she praises the use of data is the way the company's efforts are put into a global context.

In discussing water extraction for bottled water, Nestlé accomplishes what few
reporters even attempt. Namely, the company quantifies the global ecosystem
impact of water use for production of bottled water at 0.0009 per cent of total
human water withdrawal. Though some may see such quantification as a defensive
ploy to deflect criticism, Nestlé in this instance has begun to get to the heart
of what sustainability reporting should really be: a discussion of how companies
can operate within the absolute limits of ecosystems and societies. Nestlé
should concentrate on this type of transparency and context in future reports.

Do businesses in your community publish CSV reports? Do they engage the community to help identify what should be measured? What experiences do you have in coordinating your community indicators efforts with business reporting?




Sunday, June 29, 2008

Community Health Profiles and Indicator Guides in England

I thought you might be interested in seeing the work done by the Association of Public Health Observatories in creating Health Profiles for every local council in England. One of the highlights of the site is their work on a Good Indicators Guide. I'll talk about the profiles first, and then the guide.

From the site:

Health Profiles provide a snapshot of health for each local council in England using key health indicators, which enables comparison locally, regionally and over time. They are designed to help local councils and the NHS decide where to target resources to tackle health inequalities in their local area.

To view an interactive version of the Health Profiles for England, click through to the interactive map, which allows you to select indicators and areas for comparison through a map or list.

The maps are provided at a regional, county, or district level. They also provide PDF versions of the profiles, which they point out may differ from the interactive maps.

The maps are straightforward and simple to use, and the data provide a decent sampling of information. Not being from England, I would have liked more contextual data to understand some of the demographics better. But tossing in some of the social determinant measures and other extra bits of information was interesting -- carbon emissions in a health profile? Neat.

The Good Indicators Guide has a great Foreword.

As leaders, we have a responsibility to know the essential data and information better than anyone else. We need our teams and organisations to be able to capture, interpret and communicate the essence of any situation in order to make the right decisions at the right time. The indicators we use and choose therefore need to be carefully designed to be practical, complete, valid and robust so we can concentrate on those areas that need further investigation. In short, we need to sort the wheat from the chaff in the information overload world we now live in. This short guide focuses on the key principles behind developing, understanding and using indicators. It is designed to be an essential and readable guide to those in senior positions who may not always feel entirely comfortable with this important area in healthcare leadership.

That same clarity and succinctness is found in sentences like "[W]e all love indicators when they seem to summarise and bring summary/simplicity, but not when they judge us, or something dear to us." The guide dives into "the anatomy of an indicator" and "four things to know about indicators." It's a must-have for anyone working with indicators -- and it's free to download!

Check it out!


Saturday, June 28, 2008

Conference Announcement: Data Users Conference 2008

Data Users Conference 2008: Linking the Health Information Chain
September 21-23, 2008 Fairmont Chateau Laurier, Ottawa, Ontario
Online registration is now available - Don’t miss our early-bird rate - Register today!
Conference information available here. This looks interesting for those working on health-related indicators. From their website:

Conference workshops
On Sunday, September 21, 2008, we will be offering the following pre-conference workshops:
  • The STC/CIHI Health Indicator Framework – how data are used to produce indicators, and finding what you need on the web
  • CIHI Portal: Empowering Health Care Organizations Across Canada
  • HSMR: Linking Hospital Standardized Mortality Ratio Data Analysis to Quality Improvement
  • Overview of Health Data at Statistics Canada
  • Data Mining for Data Quality—How to Assess the Measurement Error of Your Data?
  • An Introduction to CMG+: Grouping Methodology and Related Resource Indicators
  • Understanding ICD-10-CA and CCI for Analysis and Trending

The workshops will be available in the morning with a repeat presentation in the afternoon. Learn more!

Dates to remember
August 20, 2008: Deadline for reserving hotel guest room at the special conference rate. Get more information.
September 1, 2008: Deadline for early-bird registration. Learn more.

Further details about the conference program, including information on sessions and speakers, will be available by end of June.
Bookmark this page so that you can check back often for all the latest news!


Friday, June 27, 2008

Sustainability Indicators in New Zealand

Check out this summary of a gathering in New Zealand to promote the use of sustainability indicators.

From Rendt Gorter:

The use of sustainability indicators to measure the ‘quality of life’ - Accounting for multiple perspectives

There is growing interest in sustainability indicators by a wide range of thinkers and practitioners. That much was evident from the introductions of participants at a one-day course convened by the Society for Sustainability Engineering and Science.

A need for better understanding about the application and reporting of indicators was stressed by council workers, the quality of indicators in environmental reporting was important to Ministry of Environment staff, researchers expressed interest in effective processes for developing indicators, and practising engineers and architects from the private sector talked of their role in improving sustainable practices and their encounters with divergent standards and languages of government partners. In response to such expectations the facilitators of the workshop – David Kettle and Dave Breuer of Anew NZ – presented an introduction to the complexities of using indicators for planning and administration of government policy. The workshop had been prepared to provide working knowledge of this subject and to stimulate debate. Sure enough, the discussion this generated showed some of the wider issues that the development of indicators must struggle with.

Read the rest of the article here.


Job Opening: Sustainable San Mateo

Project Coordinator - Sustainability Indicators Report
Sustainable San Mateo County, San Mateo, CA

Established in 1992, Sustainable San Mateo County (SSMC) is a nonprofit dedicated to the long-term health of our county’s economy, environment and social equity by educating about sustainability. More information about SSMC can be found at http://www.sustainablesanmateo.org/.

SSMC is a small office with 4 paid employees and over 50 dedicated volunteers. We are seeking a part-time contract professional to manage the Indicators Report project. Position available immediately.

Hours/Terms of Employment: Contract position without benefits. Contractor’s rate of pay $30 - $40 per hour depending on experience, with hours not to exceed 800. Term from present to June 30, 2009, with most hours worked from October 2008 to April 2009.

Responsibilities: The Indicators Project Coordinator is responsible for coordinating the Sustainable San Mateo County 2009 “Indicators for a Sustainable San Mateo County” report card, also known as the Indicators Report. The Coordinator/Editor will also be responsible for ensuring that all elements are written in a consistent style that adheres to generally accepted style guides.

Specific duties include:

• Project management, including recruiting volunteers, preparing formatting and research guidelines for volunteer researchers, coordination of indicators researchers’ submissions and verifying sources and correct uses of information.

• Editing each indicator and all other parts of the report before submitting for review by Indicators Committee.

• Research, write and edit portions of the report including the Introduction and Summary, and sections prepared by volunteers. Review and edit the City Reports which are submitted by the cities.

• Work with SSMC staff to format and upload indicators report onto SSMC website.

• Work with the Charts editor and Graphic Designer to ensure consistent, clear and attractive layout.

• Update Research Guides for 2010 Report.

• Participate in the Indicators Committee and other meetings as requested.

• Identify refinements, additional indicators that would enhance value, and implement refinements as appropriate.

The Coordinator/Editor will act as project leader under the guidance and support of the Indicators Committee, ensuring that report development progresses from inception to completion, and that actions are taken to mitigate any obstacles to completion. The overall goal for the Coordinator is to meet the project deadline and to deliver a high-quality report for publication.

Skills/Experience:

• Bachelor’s degree in related field required, Master’s preferred.

• Minimum 5 years professional experience.

• Demonstrated ability to lead and coordinate the activities of large groups of people; experience in managing volunteers a plus.

• Experience developing publication of similar scope from inception to completion.

• Writing and/or editing experience within a publishing setting.

• Familiarity with style guides, and understanding of style editing, or an ability to assimilate such information independently.

• Ability to research and write sections of the report.

• Understanding and familiarity with sustainability issues, specifically as they relate to San Mateo County and generally to the San Francisco Bay Area.

• Expertise in the use of MS Word and Excel.

To apply, send personal cover letter and resume.

See this link for more information.


Thursday, June 26, 2008

Orange County Indicators and Homelessness

The 2008 Orange County Community Indicators (pdf) report has been released, and there's some interesting information in it. (I like the display titles at the top of each page/section -- great way to get to the point quickly!)

Here's an interesting update from the Poverty in the OC blog. Looking at the 2008 Orange County Community Indicators (pdf) report, Keith Giles writes:

Orange County is second only to Los Angeles for the largest number of homeless people in the State of California.

The difference is that homelessness in LA County is caused by drug addiction, mental illness or substance abuse related factors.

In Orange County, out of our 35,000 homeless, 80% of them were forced into this lifestyle because of the lack of affordable housing and rent controls.

Think about that. 80% of the people in Orange County who are currently homeless could be living in an apartment anywhere else in the Nation. It's only because of the cost of housing here that they remain homeless.

We make our own homeless.

This means we can also un-make them.

What follows is a further discussion of the data to make the point. Check it out!


GASB Seeks Comments on SEA Exposure Draft


Many of you have seen Jay Fountain or Paul Epstein present on the really good work GASB (the Governmental Accounting Standards Board) has done on creating suggestions for more effective performance reporting. You can see their work here -- I particularly recommend Government Service Efforts and Accomplishments Performance Reports: A Guide to Understanding, which needs a better title but is still a useful resource.

Anyway, GASB is looking for comments on revisions to one of their reports, so I thought I'd pass on the information and the links. The good news is that the part about seeking citizen input in identifying government performance measures is still in there, and (if I read it correctly) has been moved closer to the front.

Here's the announcement:

The Governmental Accounting Standards Board (GASB) recently released an Exposure Draft on concepts related to Service Efforts and Accomplishments Reporting, an amendment of GASB Concepts Statement No. 2. GASB is seeking comments to its exposure draft by July 3, 2008. Written comments should be addressed to: Director of Research and Technical Activities Project 20-1.

The document is available for download from the GASB website. In addition, GASB will hold a public hearing on July 29, during AGA's Professional Development Conference & Exposition in Atlanta.

GASB proposes to modify four sections of Concepts Statement No. 2 and eliminate one section, Developing Reporting Standards for SEA Information. GASB is also developing principles-based suggested guidelines for voluntary reporting of SEA performance information.

(Hat tip: AGA's Perspectives on Performance www.agacgfm.org/publications/html/pop062508.aspx)


Community Indicators Consortium Conference

I'm traveling for the next two weeks, so I was unable to attend this year's Community Indicators Consortium conference. If you're there, hello! And I'm sorry I missed you.

If you're interested, drop me a line and tell me how things are going -- what was the best session you attended, what you learned, how you liked the keynote speakers. If you'd like to write a longer message, e-mail me (through the profile here) and I'll post your entry on the blog and give you credit.

Hope you're all having a great time!

(Use the comment section below or my e-mail to send feedback.)


Race Matters

As part of our continuing conversation about indicators of racial disparities, I'd like to draw your attention to some good work by the Annie E. Casey Foundation (from whom we get KIDS COUNT.) The specific project is called Race Matters, and they describe it as a toolkit "designed to help decision-makers, advocates, and elected officials get better results in their work by providing equitable opportunities for all. The approach described in the toolkit deals specifically with policies and practices that contribute to inequitable outcomes for children, families, and communities. The toolkit presents a specific point of view on addressing unequal opportunities by race and simple, results-oriented steps to help you achieve your goals. The following tools are designed to help you make the case, shape the message, and do the work."

A key component of the work is providing data to support your efforts (pdf). They have some interesting counsel on how to present data on race-based indicators that's worth discussing. (Warning: Long post follows!)

The goal of the workbook on data is straightforward:

"The guide hopes to assist you to produce data presentations that more intentionally speak to the circumstances of all children. By lifting up ways in which racial inequities shape opportunities differently, and identifying how to remove barriers to opportunity, your data will be a resource that speaks more clearly for everyone."
How do we do that? Here's what they suggest:
  • Select your indicators carefully. They point out that it's "easier to change what you measure than change what you don't." In particular, they advise using structural indicators over individual indicators. They prefer system-oriented indicators over people-oriented indicators, for at least two reasons. First, because structurally-oriented indicators lend themselves to structurally-oriented solutions, and the framework they're using is that of structural racism. Second, they fear a "blame the victim" approach where indicators are used to reinforce stereotypes or look to individual behavior changes to address needs. When the impacts on individuals need to be shown, then the framework or structure of the data presentation should lend itself easily to a structural interpretation.

    I found this conversation quite interesting. Every effort to measure indicators relies on inherent cultural assumptions and civic biases, as pointed out by David Swain at the recent conference of the National Association of Planning Councils. Race Matters is clear about where its assumptions and values frameworks are, which is important -- you can't do this work without making explicit what your assumptions are. However, I suspect the issue is bigger than that -- you can't measure indicators of racial disparities without serious community work and reflection about your role in the community and your organizational relationship to the issue at hand.

    In a recent effort to describe the development of our Race Relations Progress Report in Jacksonville, I wrote about an earlier attempt to create such an indicators report. Several factors resulted in that report not happening, but the relevant lesson is here:

JCCI had operated under the assumption that its experience with measuring and presenting data objectively was sufficient to tackle this new initiative. Instead, JCCI had to develop the community trust and cultural competency necessary to create a community indicators report specifically focused on the experience of a specific racial or ethnic group. The vision for the report had to come through a different, open community process and be shaped from within the target community in order to combat “skepticism about the benevolent intentions of the larger (white) Jacksonville community in relation to African-Americans’ quality of life” and “distrust of information generated outside the African-American community.”

Measuring the African-American experience against the broader community vision would have meant evaluating what it meant to be black in Jacksonville against an external standard; before JCCI could re-address the issue of race-based community indicators, the organization had to revisit its core assumptions about community indicators.

I suspect that even the most well-intentioned framework can't overcome the need for trust-building and cultural-competency work around issues of race and ethnicity. And given trust and cultural competency, the framework may well be less important (or modified to fit local needs). The assumption that presenting data in a certain way reinforces stereotypes is, in itself, somewhat of a stereotype, after all.

  • Choose asset-focused indicators whenever possible. While deficit-focused data can be "dramatic and alarming", Race Matters points out, and mobilize action, broader support is available with asset-based data. Asset-focused indicators, they suggest, "make it easier not to stigmatize families or individualize explanations for shortcomings." They are more likely to be "aspirational," while the steady drumbeat of negative news may serve to de-motivate the community.

    While I tend to agree in general on asset-based indicators, sometimes the results can get silly. We've tried publishing "employment rates" (rather than unemployment rates) or "percentages of babies born at healthy birth weights" (rather than low-birthweight infants.) Two things happen. People who understand data have to do the math in their heads to get the numbers that they're familiar with. And the result may seem like unnecessary contortions to avoid facing unpleasant news.

    When we released the first Race Relations Progress Report in Jacksonville, some members of the (white) community feared unrest in the streets if the negative data became publicized. The truth was very different. People in our African-American community already knew how bad things were. In general, it was the white community that was surprised, while the black community was surprised that they were surprised.

    Asset-based is fine. Just don't try to sugar-coat unpleasant truths. We have found that most people (white and black) have significant misperceptions about the extent of racial disparities in the community, and that the only way to address them honestly is to present the plain truth. You can't solve a problem until you admit it exists. And attempts to make it seem "not that bad" miss the point.
  • Selections of graphics, photos, and quotes. Race Matters advises you to use illustrations carefully, and "avoid imagery that mobilizes stereotypes, such as picturing a child of color (no matter how precious looking that child is) beside a deficit-focused indicator." Not a bad suggestion, but be careful not to get too cutesy in making every photo appear staged. In Jacksonville, we avoided photos altogether for the first report, since the concern over image was starting to obscure the substance.

    In later efforts, like our targeted Infant Mortality study addressing racial disparities in infant deaths, photos were used but chosen with care. Good advice here.
  • Organization of the text. "The sequence of data matters," says Race Matters. How you structure the report helps tell the story you're intending, and because indicators of race, income, and place tend to be "over-arching" indicators, they should go first. As our Beyond the Talk: Improving Race Relations study put it, "Beyond particular factors related to personal prejudice, institutional practices, or individual choices, the pervasive effects of disparities in education and income mutually reinforce one another and deepen all other disparities." Because of that, we place those two sets of indicators right up front.
  • Opening essay/letter to the reader. Race Matters recommends the letter to the reader found in the 2007 Data Book from Kentucky Youth Advocates (http://www.kyyouth.org/) as a way of setting the tone for the report. The summary/introduction to the report is critical. We have the content of that letter developed by our citizen review panel as they share what they think the most important take-aways are from the data presented. What does this all mean? Frame the data up front, so people will understand what they read as they approach the data. Again, be explicit in your values and purposes approaching the work, and the reader won't have to guess your motivation while reading the data presented. (Our key driver for the first report was to present inarguable information that could not be misinterpreted. That's not an easy aspiration.)
  • Consistent disaggregation. Race Matters points out, and I echo heartily, that where at all possible disaggregate your data by race and ethnicity the same way from indicator to indicator -- and when you can't, explain why! People get offended if they feel like their particular information was withheld or deemed less important than someone else's, and a simple explanation up front keeps you from months of conversations afterwards. Trust me on this one.
  • Deep analysis of disparities within a structural frame. Race Matters says, "the danger of not offering structural explanations as a frame for disparities is that this can mobilize readers' default prejudices about individual- and group-based explanations for inequities." I'll add that the danger of framing the indicators with only structural explanations is that you move far beyond a factual report to an advocacy position, and not all data support that position. A report that isn't willing to consider institutional, individual, and internalized racism as three interdependent components within a collusive system misses an opportunity to stand as the bedrock for community conversations about how to address racial disparities.

    If your report is to move a particular agenda, then by all means structure the data to support that agenda. But be prepared to have your report dismissed out of hand by anyone who does not support your policies. To have a report that transforms the community conversation, present the facts unabashedly without trying to explain why. Don't let your point of view interfere with someone coming to grips with unpleasant truth! The problem in communities tends to be (1) misperceptions about reality and (2) a failure to face that reality. Shared reality-based understanding of the problem is a precondition to shared action, and you can't leapfrog that process. To quote me again:
Generalizing Jacksonville’s experience suggests this: because racial disparities are widespread and significant, understanding them is a necessary first step for any effort in community improvement. Communities that bypass these important measures generally fail to understand, plan for, or address underlying fractures in the foundation of their community, and the efforts are usually not successful because of this. On the other hand, the implications of developing a shared understanding of actual racial disparities in the community are staggering; if lack of progress and arguments about public policy are rooted in misperceptions, reaching a shared, reality-based perception of the problem moves the community much closer to finding solutions. This was the experience in Jacksonville: tangible progress and shifts in public policy became not just possible, but inevitable, with a shared understanding of the problem.
  • Recognition of cultural variation in indicator applicability. Race Matters makes an important point here -- not all numbers mean the same thing in different cultural settings. Be careful. Here's another place where cultural competency is a prerequisite for quality work.
  • Need for solutions bundled with problem description. This depends on what you're trying to accomplish. If you're moving an agenda, then yes; if you're measuring community progress and want to take the role of independent, trusted data source, then by all means NO. In my opinion, the best indicator reports are descriptive, not prescriptive. Prescriptive reports get written backwards -- the recommendations drive the selection of data. Descriptive reports form a shared basis of understanding that allow the community to decide together what to do about the problem. (That shared understanding usually leads to a consensus for action, by the way. See Beyond the Talk for an incredible example.)
  • Getting "picky" about words and charts. I heartily agree. Be very, very careful what words you use. Have people read your report prior to publication and search for opportunities to misunderstand you. If it can be misinterpreted, it will be.
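For those of you who prepare the data tables yourselves, a couple of the suggestions above -- asset framing and consistent disaggregation -- come down to simple, repeatable arithmetic. Here's a minimal sketch of that idea; the indicator names, groups, and numbers are all invented for illustration, not drawn from any real report:

```python
# Hypothetical indicator records: (indicator, group, deficit-framed rate in %)
records = [
    ("unemployment", "White", 4.1),
    ("unemployment", "Black", 8.7),
    ("low_birthweight", "White", 7.2),
    ("low_birthweight", "Black", 13.0),
]

def to_asset_frame(deficit_rate_pct):
    # Asset framing: report the complement of a deficit-framed percentage
    # (e.g., employment rate instead of unemployment rate).
    return round(100.0 - deficit_rate_pct, 1)

# Consistent disaggregation: every indicator reports the same groups,
# in the same order, so gaps stay visible and absences must be explained.
groups = ["White", "Black"]
by_indicator = {}
for indicator, group, rate in records:
    by_indicator.setdefault(indicator, {})[group] = to_asset_frame(rate)

for indicator in sorted(by_indicator):
    row = [f"{g}: {by_indicator[indicator].get(g, 'n/a -- explain why!')}"
           for g in groups]
    print(indicator, "|", " | ".join(row))
```

As the post notes, readers will mentally convert these back to the deficit-framed numbers they know, so the conversion rule should be stated plainly in the report rather than left for them to reverse-engineer.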

I'm interested in your feedback as well. Overall, I really appreciate this effort from Annie E. Casey. Their mission and my organization differ slightly, so some of their recommendations don't fit our work -- but what they've put together is an incredibly important addition to the field. I highly recommend it as a starting point for your community initiative.

Your reactions?



Wednesday, June 25, 2008

Community Indicators Reporting: Beyond the Numbers

Last night I had an interesting conversation with someone from the Quad Cities area (Bettendorf and Davenport, Iowa, and Moline and Rock Island, Illinois) about their community indicators report, and it prompted a train of thought I'd like to share with all of you.

Their report -- the 2007 Quad Cities Community Vitality Snapshot (pdf) -- is an attractive, welcoming piece. The data are presented in a way that is clear, and they avoid the overabundance of charts, graphs, and statistical jumbles that get in the way of telling the story.

But data aren't enough, on their own, to tell the story. Facts need context. Trends provide direction and movement. Visions provide intended destinations. Context, direction, and relationship to intended destination start shaping a story that moves beyond a snapshot into an understanding of where progress is being made -- and where it's not.
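To make that concrete, the same number reads very differently once a trend and a vision target are attached to it. Here's a tiny sketch of that framing; the indicator, history, and target are invented for illustration:

```python
# A hypothetical indicator: high-school graduation rate, in %.
history = [68.0, 69.5, 71.0, 72.4]   # last four years (invented numbers)
vision_target = 85.0                  # the community's stated destination

current = history[-1]
trend = current - history[0]          # direction and movement over the period
gap = vision_target - current         # distance to the intended destination

direction = "improving" if trend > 0 else "worsening" if trend < 0 else "flat"
print(f"Current: {current}% ({direction}, {trend:+.1f} pts over "
      f"{len(history) - 1} yrs; {gap:.1f} pts short of the "
      f"{vision_target}% vision)")
```

A bare "72.4%" is a snapshot; "improving, but still a dozen points from where we said we want to be" is the start of a story.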

We create indicator reports for a number of reasons. Some years ago, the Jacksonville Community Council Inc. listed their reasons why community indicators were important (and how they should be used):

  • To produce an annual report card on community progress
  • To serve as a planning tool for government and private institutions
  • To educate the residents about their community and the factors they consider important to their Quality of Life
  • To increase awareness of the many components of progress and their interrelatedness, the connections between people and their environment
  • To highlight community success stories and give credit for work well done
  • To identify areas of decline or concern where community action is needed
  • To help focus community resources and efforts in the areas of highest priority
  • To encourage residents to take an active part in addressing community problems
  • To promote accountability of local government
  • To stimulate new and better ways of measuring progress
Each of those statements implies an intended audience. For an indicators report to do all of these things, its presentation is at least as important as the information. (See the conversation that began with this blog post.)

How do you tell your community's story in your indicators report? Which reports do you look at as models? Share your ideas and links below!


Tuesday, June 24, 2008

Indexes and Rankings

As many of you know, Mercer's 2008 Quality of Living Survey was released recently, along with a ranking of the major cities of the world. I'm not a big fan of either indexes or rankings, and was pleased to see I wasn't alone in that thought. Robert Kerr from Melbourne, Australia (Go Essendon Bombers!) had this to say after Melbourne ranked the 17th most liveable city in the world:

While we might express our civic pride and incredulity that Melbourne comes in below Sydney, the more important question to ask is whether or not these surveys tell us anything useful.

Is Melbourne really a better place to live than all but one city in the world? Or not as good as 16 others?

Indicators that reduce the concept of liveability to a single number are unlikely to shed much light on just how liveable a place is.

Instead, Victoria's liveability is best judged with a comprehensive set of indicators that reflect what's important to Victorians - not international pollsters.


So what makes a place liveable? The answer is likely to be different for everyone.
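Kerr's point about single-number indexes is easy to make concrete with a small, purely hypothetical sketch (the cities, domains, weights, and scores below are invented for illustration): two places with opposite strengths can earn identical composite scores, so the ranking reveals nothing about either one's actual profile.

```python
# Hypothetical sketch: a weighted composite "liveability" score.
# City names, domains, weights, and scores are invented for illustration.

def composite(scores, weights):
    """Weighted average of domain scores on a 0-100 scale."""
    return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

weights = [1, 1, 1]    # equal weights: housing, transit, safety
city_a = [90, 50, 70]  # strong housing, weak transit
city_b = [50, 90, 70]  # the mirror-image profile

print(composite(city_a, weights))  # 70.0
print(composite(city_b, weights))  # 70.0 -- same score, very different places
```

A comprehensive indicator set would report the domain scores separately, preserving exactly the differences the composite erases.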

What do you think? Do you use a composite index in your community indicators report? Do you rank your community against other comparable cities? If so, why? If not, why not?

Read more ...

Interesting New Indicators Efforts

Several new community indicators efforts are in the works, and I wanted to draw your attention to them. First, I want to tell you about a just-released report you may want to look at.

The Redmond Community Indicators Report, available at Redmond 2022, is a nice example of conveying interesting information simply -- the count of trees is a nice touch. I particularly like the layout of "Go Figure" -- a quick facts-at-a-glance feature -- with three columns: How much/many, Of what, and Trend. Some of the new efforts out there may want to take a look at what they've done.

Northwest Arkansas is developing a community indicators report, and has taken an interesting tack. They've identified a sizable population from the Marshall Islands as part of their community, and are trying to engage them in the process. The project is moving forward with a recent community meeting to identify indicators. They've got a nice mix of partners supporting the effort -- looking forward to good things when the report is released at the end of the summer.

Baton Rouge, Louisiana is moving forward with their CityStats project (doc). More about that project is available here. They had the community identify indicators and are now in the research stage prior to publication -- watch for this one as well!


Montgomery County, Maryland is creating a community indicators document as well, and they're asking for community input via the internet. Here's their announcement:


You are invited to participate in an exciting environmental planning project - jointly sponsored by the Montgomery County Planning Board and the County Executive. Under the Healthy and Sustainable Communities initiative, county staff are crafting environmental policy goals and indicators to measure progress toward them - with your help! While county environmental programs do a great deal toward improving quality of life in Montgomery County, County Executive Ike Leggett and Planning Board Chairman Royce Hanson have identified a missing link: a set of goals and corresponding measures of progress to guide decision making.

Join them at their kick-off workshop June 25-26 at the Universities at Shady Grove, where they want participants like you to help chart the county's progress toward meeting sustainability goals.
RSVP | View draft schedule of events

They have kicked off the project as a virtual document. Give your feedback on how the county should measure environmental progress, even if you can't join the workshop.
Log on to contribute your thoughts to any of the following indicator reports. Simply click on a goal and enter your comments. Visit often to view what others say. And please spread the word!

1. Climate protection
    a. Energy use
    b. Carbon emissions
    c. Waste management
2. Clean air
    a. Air quality index
    b. Travel indicators
3. Clean water
4. Wildlife habitat and open space
5. Smart communities
6. Healthy people
7. Green economy
    a. Jobs
    b. Agriculture
8. Environmental justice

Exciting work all around. Please keep us informed of your new community indicator projects and your report releases. Thanks!

Read more ...

Monday, June 23, 2008

Job Opening: MCIC President

METROPOLITAN CHICAGO INFORMATION CENTER (http://www.mcic.org)
Chicago, IL

PRESIDENT
The Metropolitan Chicago Information Center (MCIC) was founded in 1990 by a consortium of business and philanthropic leaders at the Commercial Club of Chicago. As an independent, nonprofit research and consulting resource, it provides information and insight to enhance program and planning decisions made by civic, social service, and philanthropic organizations and individuals working to improve social conditions and quality of life.

The President will be the visible leader in the organization’s research endeavors. S/he will be the external/public face of MCIC and will report to the Board of Directors and work cooperatively with MCIC’s Executive Director who also reports to the Board. Skills and experience including leadership, superior analytical skills, enthusiasm, mentoring, marketing, charisma, market presence and organization building will characterize the successful candidate. The candidate needs to be a thought leader who has an expansive vision of the range of issues impacting the quality of life in the Metropolitan Chicago area, individual communities and neighborhoods. S/he will need the capability to sell, develop and implement research that illuminates discussion, aids in the formulation of social policy and provides insights not previously or intuitively obvious. As a mentor, the candidate will regularly provide knowledge and skills transfer to the MCIC staff.

The ideal candidate will have a Ph.D. in research or a social science field and 10 years of research experience. S/he will have knowledge of the Chicago area’s significant social, economic and political issues. Evidence of an ability in new business development/grant acquisition and relationship management which demonstrates a platform for successful client development at MCIC will be expected. Previous documented project management, consulting, and supervisory skills are required as are excellent communication skills. S/he will have an enthusiasm to teach and several years of experience doing so. Mastery of PC-based statistical, analytical, database, spreadsheet, financial, and other software is a necessity. All candidates will need to have the ability to combine their experience with a desire to learn about and become immersed in the culture of MCIC in a timely fashion. MCIC recognizes that each applicant for the President position will bring a varied portfolio of these skills and experience and that they will be differently weighted in each case.


To apply for this position please see the Position Guide at http://www.kittleman.net/jobsDetail.php?_page=jobs&id=74, then e-mail your resume with your references and a brief letter focusing on your previous relevant experience to Kittleman & Associates at mcic-pres@kittleman.net

For additional information, please contact Nick Goodban at 312-265-5444.

Read more ...

Call for Sessions: ACHI

Call for Sessions due August 8: Association for Community Health Improvement

The Association for Community Health Improvement (ACHI) is accepting proposals through August 8 for concurrent breakout sessions for its March 11-13, 2009 national conference in Los Angeles, California. Conference sessions will address: community health assessment; community benefit; health in the social and built environment; and building the skills of community health leaders.

This event features more than 450 professionals from hospitals, health systems, foundations, public health, and community health organizations in a gathering that stimulates real change and improvement in how community health programs are planned, delivered, and assessed. Visit the conference Web site (www.communityhlth.org) for information on topic tracks and submission guidelines. Write to communityhlth@aha.org with questions.

Read more ...

Wednesday, June 18, 2008

DataPlace Launches New Beta Site

I thought some of you might be interested in this announcement from DataPlace:

DataPlace™ Introduces New Beta Site for Exploring and Analyzing Neighborhood Data at beta.dataplace.org

For Four Years DataPlace Has Offered Free Statistical Tools for People Interested in Housing Data (i.e., Subprime Lending and Loan Denials), and General Demographic, HUD, and IRS Indicators. DataPlace Now Rolls Out New Strategies To Help People and Organizations Acquire Knowledge from Statistical Information

SAN JOSE, CA, June 17, 2008 - DataPlace™ is a unique, one-stop, online source for comprehensive housing and demographic data. The DataPlace Web site offers free and easy access to many national datasets and thousands of indicators with statistics on mortgage lending, population, income, housing, education, and federal expenditures. DataPlace's powerful geospatial platform allows the user to evaluate housing conditions and demographic composition of neighborhoods, cities, counties, states, zip codes and census tracts across the nation by creating maps, statistical profiles, charts, and rankings. DataPlace is now releasing an advanced, beta version of the site for public view. The free beta site includes several new features:

  • Side-by-Side Maps to compare two indicators at the same time, or see the same indicator change over time
  • A powerful Ranked List generator making it a snap to find places of interest for any place & indicator combination.
  • Simplified Search capabilities - now it's easier than ever to find relevant data
  • Preview Tools to provide quick snapshot views of data and locations
  • Visualization Tools, including easy-to-read histograms, offering unprecedented insight into how local statistics compare

Put DataPlace to Use for Your Data and Communities...Become a DataPlace Customer
DataPlace is providing its innovative geospatial platform to outside organizations through customized license agreements. Customers will be able to quickly and easily replicate any of the tools and/or data currently on Data Place's geospatial platform for their own data, communities, or audiences. Customers can upload data, conduct analyses, draw comparisons and create tailor-made charts, maps, and other graphic presentations tailored to their own audience. Benefits to customers include:

  • A customizable, fully hosted geospatial Web site in the customer's brand and URL
  • Leveraging DataPlace's multi-million dollar platform for pennies on the dollar, avoiding the extensive resources required for customers to develop their own geographic information service (GIS) expertise
  • Packages suitable for internal secure and private research or external outreach
  • User-friendly, intuitive tools for easily creating high-quality maps, charts, presentations, and reports
  • Focusing on content/data, not technology

----------------------------
About KnowledgePlex, Inc.
DataPlace is owned and operated by KnowledgePlex, Inc. KnowledgePlex is a not-for-profit organization that seeks to help transform disadvantaged communities and neighborhoods by creating and maintaining an innovative technology platform for information and data to serve a wide variety of sectors. Our goal is to provide a platform that allows the user to focus on developing data, information, and story, rather than IT and software. By providing information and technology that will empower communities and neighborhoods, KnowledgePlex, Inc. fosters accountability and impact-tracking at the community level. The best way to understand how DataPlace will benefit your organization is to see it in action. Call KnowledgePlex, Inc. at (866) 441-9249 to arrange a brief demonstration. To explore online visit http://www.kplex.org/

Read more ...

Monday, June 16, 2008

2008 KIDS COUNT Released

National trends in child well-being have improved slightly since 2000, according to the 19th annual KIDS COUNT Data Book, but these trends are not on par with the improvements that were seen at the end of the 1990s. The Data Book, released on June 12, 2008, is a national and state-by-state profile of the status of America's children. In addition to tracking 10 indicators of overall child well-being, this year's Data Book features an essay highlighting the urgent need to reform America's juvenile justice system.

Read more ...

Friday, June 6, 2008

"Baby Got Stats"?

I apologize in advance. See Chris Blattman's Blog for a rap song about statistics.

Then go wash your ears out with Mozart.

Read more ...

Thursday, June 5, 2008

Benchmarking Central Ohio 2008

The folks at Community Research Partners in Central Ohio have been busy, and their work is being recognized in the local media. I thought I'd pass along links to their report as well as to the coverage it has received. (If you're in one of the 15 other metro areas benchmarked in their report, you'll appreciate the research done for your community as well! I know we do in Jacksonville -- thanks, y'all!)

Here's the information you'll need, from http://communityresearchpartners.org/14651.cfm?action=detail&id=119#media:

Benchmarking Central Ohio 2008
The Columbus Partnership

On April 2, 2008 Benchmarking Central Ohio 2008 was released at a forum at the Columbus Metropolitan Club. The report assesses how the 8-county Columbus metropolitan area is doing, in comparison to 15 other metro areas, using a panel of 60 diverse indicators. The indicators focus on four broad areas—population vitality, economic strength, personal prosperity, and community well-being—each of which describes a facet of the community that contributes to economic competitiveness. The research was commissioned by The Columbus Partnership, a CEO organization of 30 top business and community leaders in central Ohio whose mission is to improve the economy of central Ohio and be a catalyst for growth in the region.

The 2008 report represents the second year of the project. Although two years do not represent a definitive trend, this report provides the latest data available and builds the foundation for tracking trends over time. An objective of the 2008 report was to keep the content and format as stable as possible to allow comparisons with the 2007 data; however, some changes were made based on feedback from the Benchmarking Advisory Group and data availability. The following are new in the 2008 report:

Patterns across Indicators: This matrix at the beginning of each section lines up the metro areas based on their ranking on a key indicator and shows other indicators that have similar rankings to that key indicator. For example: What is the profile of high growth metro areas, compared to slow growth metro areas?

Columbus trend chart: For indicators where two years of data are available, a new Columbus Trend chart has been added to the indicator page.

National context data: Each of the indicator bar graphs now has a new bar to show the data for the U.S., all metro areas, or other relevant basis of comparison.

New and revised indicators: There have been selective indicator changes (some additions, deletions, and modifications). These are explained in Appendix A.

Data source changes: There are some changes in data definitions, methodologies, and sources that may impact comparisons between the 2007 and 2008 reports. These are explained in the indicator definition and Appendix A of the 2008 report.

Read the 2007 report here.

Benchmarking Central Ohio 2008 PRESS COVERAGE:

This second annual report prepared for the Partnership by CRP was widely discussed in front-page stories, editorials, and letters to the editor in the mainstream media, in alternative publications, and on blogs and electronic forums. One highly charged topic was whether central Ohio “lacks culture” because it ranked last among the 16 metro areas in the study on arts establishments per capita. The accuracy of this ranking was widely debated. The research served as a catalyst for community dialogue, which furthered CRP’s mission to use data for community change.

Articles:
Data from Central Ohio Tell Two Tales, Columbus Dispatch
Grim reality: Perception is that city lacks in arts, Columbus Dispatch
Report: Columbus's growth, progress has ups, downs, Columbus Business First
Benchmarking Report Ranks Central Ohio in Top Tier for Key Workforce Indicators, TechWeek

Editorials:
Editorial, Columbus Business First
Letters to the editor, Columbus Dispatch

Blogs:
UrbanOhio.com
Notes From the Reserve
The Urbanophile

Radio:
How Does Columbus Measure Up?, WOSU's Open Line with Fred Andrle

Publications
Benchmarking Central Ohio 2008
Benchmarking Central Ohio 2008 Highlights
Benchmarking Central Ohio 2008 Press Release

Read more ...

Wednesday, June 4, 2008

Education Week Report Release

News from Education Week:


It’s here! Diplomas Count 2008: School to College has arrived. This third annual report, with support from the Bill & Melinda Gates Foundation, is now available online at edweek.org. During our edweek.org Open House, you can access the whole report for FREE!


The report explores the rapid growth of state-level P-16 councils and how they seek to create a more seamless schooling continuum that prepares students from preschool through college and beyond for life, work, and further education. While you’re at it, be sure to check out our other benchmark reports, Quality Counts and Technology Counts.

Some things you shouldn’t miss in this year’s Diplomas Count:

Interactive media that will allow you to review:

Feature stories and analyses such as:

Our data tables and charts for historical graduation rates, ranked graduation rates for the 50 largest school districts, and the projection of graduates and non-graduates for 2008. You can download these right now and refer to them at your convenience.

Remember, our doors will be wide open through June 10. That means you’ll have access to everything our premium subscribers see daily for only a week!

If you like what you see during the edweek.org open house, get even more by adding a 4-week trial subscription to Education Week. This trial offer of 4 weeks of print and online access is available for only a short time at: http://e-news.edweek.org/ct/3007413:3306394050:m:1:219602297:7973631C425A23950A49BD66024F1786.
To order extra copies of Diplomas Count 2008, simply go online or call 800-788-5692.
Tell your colleagues about the complete, FREE access to Diplomas Count and our Open House!

See you there,

Virginia B. Edwards
Editor and Publisher

Read more ...