Community Indicators for Your Community

Real, lasting community change is built around knowing where you are, where you want to be, and whether your efforts are making a difference. Indicators are a necessary ingredient for sustainable change. And the process of selecting community indicators -- who chooses, how they choose, what they choose -- is as important as the data you select.

The Jacksonville Community Council (JCCI) understands indicators and community change, with more than 25 years of producing the annual Quality of Life Progress Report for Jacksonville and the Northeast Florida region, and two decades of helping other communities develop their own sustainable indicators projects. JCCI consultants give you the information you need to measure progress, identify priorities for action, and assess results.

I'd like to talk with you personally about how we can help. E-mail me, call (904) 396-3052, or visit CommunityWorks for more information. From San Antonio to Siberia, we're ready and willing to assist.

Thursday, June 26, 2008

Race Matters

As part of our continuing conversation about indicators of racial disparities, I'd like to draw your attention to some good work by the Annie E. Casey Foundation (from whom we get KIDS COUNT). The specific project is called Race Matters, and they describe it as a toolkit "designed to help decision-makers, advocates, and elected officials get better results in their work by providing equitable opportunities for all. The approach described in the toolkit deals specifically with policies and practices that contribute to inequitable outcomes for children, families, and communities. The toolkit presents a specific point of view on addressing unequal opportunities by race and simple, results-oriented steps to help you achieve your goals. The following tools are designed to help you make the case, shape the message, and do the work."

A key component of the work is providing data to support your efforts (pdf). They have some interesting counsel on how to present data on race-based indicators that's worth discussing. (Warning: Long post follows!)

The goal of the workbook on data is straightforward:
"The guide hopes to assist you to produce data presentations that more intentionally speak to the circumstances of all children. By lifting up ways in which racial inequities shape opportunities differently, and identifying how to remove barriers to opportunity, your data will be a resource that speaks more clearly for everyone."
How do we do that? Here's what they suggest:
  • Select your indicators carefully. They point out that it's "easier to change what you measure than change what you don't." In particular, they advise structural, system-oriented indicators over individual, people-oriented indicators, for at least two reasons. First, structurally-oriented indicators lend themselves to structurally-oriented solutions, and the framework they're using is that of structural racism. Second, they fear a "blame the victim" approach in which indicators are used to reinforce stereotypes or to look to individual behavior changes to address needs. When the impacts on individuals need to be shown, the framework or structure of the data presentation should lend itself easily to a structural interpretation.

    I found this conversation quite interesting. Every effort to measure indicators relies on inherent cultural assumptions and civic biases, as pointed out by David Swain at the recent conference of the National Association of Planning Councils. Race Matters is clear about its assumptions and values framework, which is important -- you can't do this work without making your assumptions explicit. However, I suspect the issue is bigger than that -- you can't measure indicators of racial disparities without serious community work and reflection about your role in the community and your organizational relationship to the issue at hand.

    In a recent effort to describe the development of our Race Relations Progress Report in Jacksonville, I wrote about an earlier attempt to create such an indicators report. Several factors resulted in that report not happening, but the relevant lesson is here:

JCCI had operated under the assumption that its experience with measuring and presenting data objectively was sufficient to tackle this new initiative. Instead, JCCI had to develop the community trust and cultural competency necessary to create a community indicators report specifically focused on the experience of a specific racial or ethnic group. The vision for the report had to come through a different, open community process and be shaped from within the target community in order to combat “skepticism about the benevolent intentions of the larger (white) Jacksonville community in relation to African-Americans’ quality of life” and “distrust of information generated outside the African-American community.” Measuring the African-American experience against the broader community vision would have meant evaluating what it meant to be black in Jacksonville against an external standard; before JCCI could re-address the issue of race-based community indicators, the organization had to revisit its core assumptions about community indicators.

I suspect that even the most well-intentioned framework can't overcome the need for trust-building and cultural-competency work around issues of race and ethnicity. And given trust and cultural competency, the framework may well be less important (or modified to fit local needs). The assumption that presenting data in a certain way reinforces stereotypes is, in itself, somewhat of a stereotype, after all.

  • Choose asset-focused indicators whenever possible. While deficit-focused data can be "dramatic and alarming" and can mobilize action, Race Matters points out, asset-based data draws broader support. Asset-focused indicators, they suggest, "make it easier not to stigmatize families or individualize explanations for shortcomings." They are more likely to be "aspirational," while the steady drumbeat of negative news may serve to de-motivate the community.

    While I tend to agree in general on asset-based indicators, sometimes the results can get silly. We've tried publishing "employment rates" (rather than unemployment rates) or "percentages of babies born at healthy birth weights" (rather than low-birthweight infants). Two things happen. People who understand data have to do the math in their heads to get back to the numbers they're familiar with. And the result can seem like unnecessary contortions to avoid facing unpleasant news.

    When we released the first Race Relations Progress Report in Jacksonville, some members of the (white) community feared unrest in the streets if the negative data became publicized. The truth was very different. People in our African-American community already knew how bad things were. In general, it was the white community that was surprised, while the black community was surprised that they were surprised.

    Asset-based is fine. Just don't try to sugar-coat unpleasant truths. We have found that most people (white and black) have significant misperceptions about the extent of racial disparities in the community, and that the only way to address them honestly is to present the plain truth. You can't solve a problem until you admit it exists. And attempts to make it seem "not that bad" miss the point.
  • Selections of graphics, photos, and quotes. Race Matters advises you to use illustrations carefully, and "avoid imagery that mobilizes stereotypes, such as picturing a child of color (no matter how precious looking that child is) beside a deficit-focused indicator." Not a bad suggestion, but be careful not to get so cautious that every photo appears staged. In Jacksonville, we avoided photos altogether for the first report, since the concern over image was starting to obscure the substance.

    In later efforts, like our targeted Infant Mortality study addressing racial disparities in infant deaths, photos were used but chosen with care. Good advice here.
  • Organization of the text. "The sequence of data matters," says Race Matters. How you structure the report helps tell the story you're intending, and indicators of race, income, and place tend to be "over-arching" indicators, so they should go first. As our Beyond the Talk: Improving Race Relations study put it, "Beyond particular factors related to personal prejudice, institutional practices, or individual choices, the pervasive effects of disparities in education and income mutually reinforce one another and deepen all other disparities." Because of that, we place those two sets of indicators right up front.
  • Opening essay/letter to the reader. Race Matters recommends the letter to the reader found in the 2007 Data Book from Kentucky Youth Advocates as a way of setting the tone for the report. The summary/introduction to the report is critical. We have the content of that letter developed by our citizen review panel as they share what they think the most important take-aways are from the data presented. What does this all mean? Frame the data up front, so people will understand what they read as they approach the data. Again, be explicit about your values and purposes in approaching the work, and the reader won't have to guess your motivation while reading the data presented. (Our key driver for the first report was to present inarguable information that could not be misinterpreted. That's not an easy aspiration.)
  • Consistent disaggregation. Race Matters points out, and I echo heartily, that wherever possible you should disaggregate your data by race and ethnicity the same way from indicator to indicator -- and when you can't, explain why! People get offended if they feel like their particular information was withheld or deemed less important than someone else's, and a simple explanation up front saves you from months of conversations afterwards. Trust me on this one.
  • Deep analysis of disparities within a structural frame. Race Matters says, "the danger of not offering structural explanations as a frame for disparities is that this can mobilize readers' default prejudices about individual- and group-based explanations for inequities." I'll add that the danger of framing the indicators with only structural explanations is that you move far beyond a factual report to an advocacy position, and not all data support that position. A report that isn't willing to consider institutional, individual, and internalized racism as three interdependent components within a collusive system misses an opportunity to stand as the bedrock for community conversations about how to address racial disparities.

    If your report is to move a particular agenda, then by all means structure the data to support that agenda. But be prepared to have your report dismissed out of hand by anyone who does not support your policies. To have a report that transforms the community conversation, present the facts unabashedly without trying to explain why. Don't let your point of view interfere with someone coming to grips with unpleasant truth! The problem in communities tends to be (1) misperceptions about reality and (2) a failure to face that reality. Shared reality-based understanding of the problem is a precondition to shared action, and you can't leapfrog that process. To quote me again:
Generalizing Jacksonville’s experience suggests this: because racial disparities are widespread and significant, understanding them is a necessary first step for any effort in community improvement. Communities that bypass these important measures generally fail to understand, plan for, or address underlying fractures in the foundation of their community, and the efforts are usually not successful because of this. On the other hand, the implications of developing a shared understanding of actual racial disparities in the community are staggering; if lack of progress and arguments about public policy are rooted in misperceptions, reaching a shared, reality-based perception of the problem moves the community much closer to finding solutions. This was the experience in Jacksonville: tangible progress and shifts in public policy became not just possible, but inevitable, with a shared understanding of the problem.
  • Recognition of cultural variation in indicator applicability. Race Matters makes an important point here -- not all numbers mean the same thing in different cultural settings. Be careful. Here's another place where cultural competency is a prerequisite for quality work.
  • Need for solutions bundled with problem description. This depends on what you're trying to accomplish. If you're moving an agenda, then yes; if you're measuring community progress and want to take the role of independent, trusted data source, then by all means NO. In my opinion, the best indicator reports are descriptive, not prescriptive. Prescriptive reports get written backwards -- the recommendations drive the selection of data. Descriptive reports form a shared basis of understanding that allow the community to decide together what to do about the problem. (That shared understanding usually leads to a consensus for action, by the way. See Beyond the Talk for an incredible example.)
  • Getting "picky" about words and charts. I heartily agree. Be very, very careful what words you use. Have people read your report prior to publication and search for opportunities to misunderstand you. If it can be misinterpreted, it will be.
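Two of the suggestions above -- reframing deficit indicators as asset indicators, and disaggregating every indicator by the same groups -- are mechanical enough to sketch in code. Here is a minimal, hypothetical Python illustration; the group names and rates are invented for the example and are not real data:

```python
# Consistent list of groups to report for every indicator
GROUPS = ["Black", "White", "Hispanic", "Asian"]

def to_asset_frame(deficit_rates):
    """Flip a percentage-based deficit indicator (e.g. unemployment rate)
    into its asset-framed complement (e.g. employment rate)."""
    return {group: round(100.0 - rate, 1) for group, rate in deficit_rates.items()}

def check_disaggregation(indicator, name):
    """Return the expected groups missing from an indicator, so the report
    can explain any gaps up front instead of leaving readers to wonder."""
    missing = [g for g in GROUPS if g not in indicator]
    if missing:
        print(f"{name}: no data for {missing} -- explain why in the report")
    return missing

# Invented example: unemployment rates (deficit frame), disaggregated by group
unemployment = {"Black": 9.2, "White": 4.1, "Hispanic": 6.8, "Asian": 3.5}

employment = to_asset_frame(unemployment)   # asset frame of the same data
check_disaggregation(unemployment, "unemployment")
```

Note that the asset frame carries exactly the same information as the deficit frame -- which is the point I raise above: readers who know the familiar deficit numbers will simply subtract in their heads.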

I'm interested in your feedback as well. Overall, I really appreciate this effort from Annie E. Casey. Their mission and my organization differ slightly, so some of their recommendations don't fit our work -- but what they've put together is an incredibly important addition to the field. I highly recommend it as a starting point for your community initiative.

Your reactions?

