Community Indicators for Your Community

Real, lasting community change is built around knowing where you are, where you want to be, and whether your efforts are making a difference. Indicators are a necessary ingredient for sustainable change. And the process of selecting community indicators -- who chooses, how they choose, what they choose -- is as important as the data you select.

This is an archive of thoughts I had about indicators and the community indicators movement. Some of the thinking is outdated, and many of the links may have broken over time.

Monday, April 30, 2007

Measuring Personal Happiness

Shawn Tuttle, of Project Simplify, suggests that we look at Redefining Progress' Genuine Progress Indicator and adapt it to our personal lives.

Shawn writes:

We can increase the efficacy of our efforts with useful indicators. Something to remind us of our chosen direction, something by which to evaluate the progress made since the last marker. Just as important as the use of markers is what is being measured. Indicators that are aligned with your values unify your efforts, clear your course, and turbo-boost you towards your goals.

The importance of effective indicators is evident when considering how you would know that you are on track if you didn’t have them. When hiking a path for the first time, we depend on landmarks, trail signs, and a cleared path to know that we are on track.

Markers based in your values serve as those landmarks and trail signs. You’ve heard the stories of people who wake up one day, look around them, and realize they are in a career they don’t care about. Where were the markers to keep them on track with their dreams?

For those of us who work with community indicators, how often do we put the same concepts into practice in our own lives? Do you have a personal set of indicators that keeps you on target towards your personal vision of an improved quality of life?

Be sure to read the rest of the article here. Then check out a similar post at Get Shouty's blog.

Again leaning on the experiences of Bhutan, the author here turns not inward, to the self, but outward to the community, in exploring the attempts to create a place of happiness in Australia. Be sure to check out the photo -- I hope that brought a smile to your face.

From the article:

Much closer (for me) to home is a street campaign that I saw when I was last in Melbourne. I was reminded of it today through a post, "Australians taking it upon themselves to create a vibrant community," on Mack-tastic’s The Viral Garden.

This local government has started a series of Sustainable Community Progress Indicators (SCPI) which compares and prompts the sight of smiles across the city.

I believe that this is an expression of an initiative of VicHealth. Suggestions to “smile at the person next to you” at bus stops, “walk instead” in car parks and “catch up with friends” are among 50 messages that will be shared with the population. The health promotion organisation will also stencil streets in the cities of Darebin, Melbourne and Bayside in a bid to encourage people to get active and socialise as part of everyday life.

Is it reasonable to think of "happiness" as part of a set of community indicators? How would you go about measuring it? And what can a community do to change its "happiness index" -- is VicHealth on the right track?

Read more ...

Soda Pop Statistics

The second most important question when ordering a non-alcoholic beverage in the American South is what to call it. (The first, of course, is "sweet or unsweet"? That's iced tea for the non-Southerners reading this.)

Ordering a "Pop" marks you as a tourist, instantly. Depending on how urbanized the environment you're in, you can sometimes get away with "soda." But in the kind of town where you can't call a cab, what you're ordering is a "coke", even if it's a Sprite or a Mountain Dew.

What is this handy bit of information doing on a blog devoted to community indicators? Thanks for asking. The Great Pop vs. Soda Controversy is a website with statistics, mapping, and surveys to track this critical information on a county-level basis across the country. A more detailed map charting 120,464 responses is also available.

There are other fascinating data sets out there if you start looking. The Dialect Survey is interesting. The Common Census Map Project charts the geographic radius of the various cultural and economic centers in America. Richard Florida's Singles Map charts the cities with greater numbers of single men or single women compared to their counterparts. On a sobering note, Maps of War has a series of maps, including the deadliest wars, the history of the Middle East, and the growth of religions over time.

Please share other interesting data sets and maps -- I don't know how useful these are, but wouldn't it be fun to compare social issues and beverage labels to find any correlations? Maybe it's time to buy the world a Coke -- then again, maybe not.

Read more ...

Saturday, April 28, 2007

Emotional Mapping Your Neighborhood

You may have seen this article in USA Today -- Artist uses polygraph, GPS to create 'emotional maps' of cities. The article tells of the work of Christian Nold, who sends volunteers into the community with handheld devices that measure both geography and emotional intensity. He then records that information and plots it on a map. See what it looks like at his San Francisco Emotion Map page.

From the USA TODAY article:

He's the first to acknowledge the intimate portraits that result from his endeavors won't help a confused tourist get from Fisherman's Wharf to Golden Gate Park.

Instead, by taking polygraph technology out of the criminal realm, his goal is to offer a commentary on the subjective nature of reality. Maps, he notes, have always been influenced by whomever makes them, citing as an example the globes that used to show Europe as being considerably larger than Africa.

"There are different ways of mapping the city that aren't strictly about the practicalities or financial sensibilities that we usually guide our urban planning with," said Nold, 31.


I'm not sure if there's a practical aspect to this artistic effort -- yet. But the possibilities are exciting. Go to Christian Nold's website and look at the work he's doing: he has created maps in Greenwich and Fulham, and the Mapping Fulham site (the work continues through June) is especially worth a visit. Also see the map of the U.S. created as the "US geographical [b]oundries of states are reshaped by the amount of news coverage that this state receives." The link to the Charles Booth Poverty Map of Fulham 1898 raises the interesting possibility of using time-series maps to understand community conditions.

Nold adds the following: "I am very keen to collaborate with individuals, groups and institutions that are interested in social, political and environmental innovation and sustainability." If that sounds interesting, get in touch with him. This appears to be a fascinating opportunity to reshape how we communicate data to the community.

Read more ...

Thursday, April 26, 2007

Community Indicators Conference in Spokane

If you haven't been following the story in the news feed (on the left on this blog's home page), The Spokesman-Review has been reporting on a conference on community indicators held at Eastern Washington University on Wednesday, April 18, 2007. The conference highlighted the Spokane Community Indicators Initiative and brought "local, regional, and national speakers for a one-day event to showcase indicators in action." Charlotte Kahn, from the Boston Indicators Project, was the keynote speaker.

There's a write-up (quoting the Spokesman Review story) on EWU's blog. It's an interesting view of indicators from someone not working in the field, and worth a look.

My favorite quote about the community indicators project comes from Patrick Jones, from the Institute for Public Policy and Economic Analysis at EWU, who said to the Spokesman's reporter:

"If we, as a community, don't do much with all this knowledge," Jones wrote, "It will be a major opportunity lost."

One of the aspects I like about the indicators project is its links to other projects -- as you remember from a previous post, these lists can be invaluable when you're searching for a measurement that keeps eluding your grasp. Plus they link to my organization, which shows impeccable taste. Thanks, Patrick.

The other aspect I really like isn't who the project links to, but who links to them. The City of Spokane Office of Neighborhood Services links to the community indicators project on their homepage, as one of their "most frequently requested links." SNAP -- Spokane Neighborhood Action Programs -- also links to the project. And so does Remi, who makes me feel really old and not hip. (Who would have dreamed that one day people would be linking to community indicators initiatives on the same page as links to Booty Go Thump or The Ballad of the Emo Kids?)

If you're checking, those are government, non-profit, and, um, "other" types of organizations linking to the same core set of information to know more about the community. And that's an exciting aspect of community indicators -- shared knowledge of community issues, leading to collective action and shared responsibility for change. What more could you want?

Read more ...

Wednesday, April 25, 2007

Measuring a Vision

Community indicators operate as a critical part of a community change process. David Swain articulated the relationship among a vision for the future, community indicators, and meaningful improvement in his essay, Measuring Progress: Community Indicators and the Quality of Life:

Vision: The impetus toward community improvement originates with how a community values itself and what vision it has for its future. All communities have some sort of vision and at least some shared values, although these may not be consciously articulated. Some communities have sought to define and express their visions through complex collective processes and impressive documents.

Indicators: For an improvement effort to emerge, some knowledge must exist about the current situation. Indicators tell graphic stories (figuratively and actually) about specific aspects of life and wellbeing in the community. If tracked over time, they offer a moving picture of community trends in the recent past. These trends can be followed for understanding. They can also be compared with the community’s vision. The resulting comparison of reality with vision can become the basis for determining improvement goals.

Indicators alone, however, are insufficient to instigate action for improvement. In particular, they often reveal little about the underlying causes of the trends they display. Nor do they usually provide clear direction toward how to accomplish improvements. The most important roles indicators can play at this stage are to raise consciousness among citizens and decision makers, to reconfigure priorities among issues most deserving of community attention, and to shape the agenda for public consideration of action and allocation of resources.

Planning: Once an issue has “arrived” on the community’s agenda, action must be preceded by planning. This planning may include research into causes and solutions and development of strategies and priorities.

Advocacy and action: The results of the planning, along with some key indicators, may become the basis for advocacy efforts. These might include public campaigns by citizens organizations and interest groups, as well as formal lobbying and decision making within the halls of government. Presuming success in advocacy, some form of action follows, whether through a new initiative, program, or organization, or perhaps through implementation of a new law or ordinance.

Outcomes: The actions produce results, both immediate outputs and broader, longer term outcomes. From the perspective of the community’s vision and the indicators that guided its planning, documenting and understanding the outcomes are of paramount importance. They form the basis for measuring success, or at least progress.

Assessment: As articulations of the vision and the basis for community goals, the indicators play a second important role by providing the basis for evaluation of the results. If the planning, advocacy, and action have been consistent with the vision and indicators, the outcomes will reveal progress—or lack thereof. Either is a valuable lesson for a community. Successes deserve celebration, while disappointments deserve attention toward greater improvement.

Feedback: Since the real-life, community-improvement process is incremental and iterative, the primary value of assessment is to set up another round of improvement efforts. Most frequently, the assessment feedback loops back to the planning stage in search of better understanding of causes and development of more effective solutions. In some cases, unexpected results may lead a community to rethink its indicators or even to question its vision.


There are a number of examples of indicator efforts that follow this model. One interesting report is produced by the Kentucky Long-Term Policy Research Center. The report, Visioning Kentucky's Future: Measures and Milestones 2006, outlines vision statements and goals, then measures progress towards those goals and advises the state legislature and governor on priority needs for action. From the report's description:

The report is organized around five sections: communities, education, economy, environment, and government. Within these five areas are 26 long-term goals derived from a citizen vision for the Commonwealth's future. The report includes over 100 benchmarks, trends, or indicators that are measures of the progress made toward each goal and the results of a statewide opinion poll that gauged citizen assessments of progress and the importance of each goal.

You can then see a presentation on Future Trends & Current Public Policy -- pay attention to the planning for the future that the indicators allow/encourage. The indicators themselves report on the success/lack of success of public policy changes to address needs identified.

What are your favorite examples of indicator projects that link community visions to public action?

Read more ...

Tuesday, April 24, 2007

Redefining Progress

There's something new at Redefining Progress. Already well known for their work on the Genuine Progress Indicator, their work on sustainability lets you test your Ecological Footprint and your Office Footprint (or get a more in-depth Ecological Footprint Analysis.) In addition, they have kid-friendly information in their Kids Footprint section.

What's new (to me) is the section on Ecological Fishprints. In their words, this effort seeks to do the following:

In collaboration with Daniel Pauly and the Sea Around Us Project at the University of British Columbia, Redefining Progress is working to adapt the popular ecological footprint tool to more accurately quantify the impact of capture fisheries and aquaculture on marine ecosystems. Current global footprint accounts show that our use of fisheries is sustainable, clearly at odds with the reality of widespread overfishing. Our adaptation—the Ecological Fishprint—is a research tool for measuring the spatial extent of humanity’s appropriation of marine ecosystems that remedies some of the known shortcomings of standard footprint analysis. Our Ecological Fishprint tool can be used to assess the ecological impacts and overall sustainability of fisheries production and consumption at the global and national levels.

Pretty neat stuff. You can read more about it here.

While you're visiting their site, be sure and check out the Community Indicators Handbook. Now in its second edition, this work is a great overview of the field. You can read the introduction at Tyler Norris' website here. The introduction lays out clearly what community indicators are, why we measure them, what they do for communities, and even adds a glossary of terms.

There's more to discover at Redefining Progress, including their sections on Sustainability Indicators and their Environmental Justice and Climate Change Initiative. A number of publications are available in PDF format for download.

So check it out. They've got a new development director, and are looking for support for their work. Armando, good luck with the fund-raising!

Read more ...

Monday, April 23, 2007

Data Source Lists

Every now and then I run across websites that provide a list of data sources for indicators. Sometimes these data source lists are state-specific, but often they provide a wealth of information for someone searching for just what agency or organization holds the information they so desperately need.

Nationally, you can't go wrong if you start with FedStats.gov, which pulls together in one place links to the statistical information provided by U.S. government agencies. You can search the data by topic or by agency, and even see data profiles on MapStats.

In Florida, a nice list is provided by the University of Central Florida Libraries. The page organizes sources into Demographics, Education, Criminal Justice, Health, and Other Topics, then lists Other Resources for finding information. What's really helpful about the site is that they provide the following information about each data source they link to: the name of the resource, counties covered, geographic levels (such as city or census tract) for which data are available, demographic information (such as gender or race) provided with the data, and both topics and dates covered by the data.

California is well served by the Community Services Planning Council in Sacramento, which provides links to 225 data sources covering topics from Agriculture to Transportation. The links again cover national to local data sources, so take a look to see if they have the data you need.

If you're in the Northwestern United States, don't miss the Northwest Area Foundation's Indicators Website to get a lot of information quickly on what you need. You can get information at the county level for the states from Washington/Oregon to Iowa/Minnesota. In addition, the data source information is at the bottom of each indicator page, so even if you're not from those states (or if the indicators aren't exactly what you want) there's a starting place to go find more data.

Do you have any other data source lists that you find useful? If so, please share them.

Read more ...

Thursday, April 19, 2007

Social Indicators

Michael Kruse ran a series on American Social Indicators on his Kruse Kronicle blog that raised some interesting questions.

According to the data he pulled together on Infant Mortality and Life Expectancy, Suicide, Crime, Substance Abuse, Family Formation and Sexuality, Education, Economic Status, Ethnicity, and the Environment, he concluded "that the worst year in American history over the last fifty years was probably about 1981." Since then, the data have been generally trending positively in the areas he identifies, which ought to mean that we should be feeling pretty good about life in the United States. (Be sure to check out the series of posts to see what he means.) He titles his conclusion "American Social Indicators 2006: Getting Better but Feeling Worse."

At the same time, Gallup has been tracking Satisfaction with Personal Life since 1979. The most dissatisfied responses were in 1982, and the most satisfied in 2000 and 2003. The trend lines in satisfaction rates remain largely unchanged since 1996, surprisingly.

Part of the problem is that Kruse relies on objective data to measure some social trends, but then falls back on anecdotes to report that people seem "gloomy" today. Gallup also asks about people feeling happy, and those results are interesting -- 96 percent of Americans describe themselves as "very happy" (49%) or "fairly happy" (47%). Both money and marriage make a difference in how happy and satisfied people describe themselves.

The opportunity is interesting -- to construct a set of indicators that describe social conditions, and run those in parallel with survey questions about personal satisfaction and happiness.
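
If you wanted to experiment with that idea, the arithmetic is simple enough. Here is a minimal sketch in Python of lining up an objective composite against a subjective satisfaction series and checking how closely they track; both series are invented for illustration.

# A minimal sketch of the parallel-tracking idea: line up an objective
# social-conditions composite with a subjective satisfaction series and see
# how (or whether) they move together. Both series here are invented.

def pearson(xs, ys):
    """Pearson correlation coefficient, computed directly."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sum((x - mean_x) ** 2 for x in xs) ** 0.5
    sd_y = sum((y - mean_y) ** 2 for y in ys) ** 0.5
    return cov / (sd_x * sd_y)

objective_index = [92, 95, 97, 101, 104, 106]  # hypothetical composite, 2001-2006
pct_satisfied = [78, 80, 83, 82, 84, 83]       # hypothetical satisfaction survey results

print(f"correlation: {pearson(objective_index, pct_satisfied):.2f}")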

"The world is so full of a number of things, I'm sure we should all be as happy as kings," Robert Louis Stevenson said. But are we?

Read more ...

OECD World Forum

The Organisation for Economic Co-operation and Development (OECD) is holding its Second OECD World Forum on "Statistics, Knowledge and Policy", Measuring and Fostering the Progress of Societies, June 27-30, 2007, in Istanbul, Turkey.

Here's why they're putting on the forum:

Around the World, societies are increasingly concerned with their quality of life. And a consensus is growing around the need to develop a more comprehensive view of progress – one that takes into account social, environmental and economic concerns – rather than focusing mainly on economic indicators like Gross Domestic Product, which while an important measure of economic activity, was not developed to be the sole measure of a nation’s progress.

Click here for a link to the conference agenda, scheduled speakers, and other information.

Enrico Giovannini, Director General of Statistics for the OECD, provided this commentary (PPT) on the importance of indicators and what OECD is trying to accomplish when he spoke to the Community Indicators Consortium conference in March.

In a pre-forum this March in Rome, Italy, they discussed Dynamic Graphics for Presenting Statistical Indicators, which I'm becoming increasingly interested in as the breakthrough technology to overcome natural resistance and latent statistical illiteracy/innumeracy. The presentations given at the pre-forum are available on-line, as are links to several different tools for displaying data. (No rollercoaster game software listed, however.)

The OECD has also put a great deal of thought into social indicators. Here is their description of why and how they measure social indicators:

Social policy covers a great number of issues that do not stand on their own but, as is increasingly recognised, are diverse and interlinked. To provide this broad perspective Social Indicators have been developed to aim to serve the need for a concise overview of social trends and policies while paying due attention to the different national contexts in which such policies are being pursued.

As a result, they publish Society at a Glance: OECD Social Indicators - 2006 Edition, which outlines what they measure and how different nations of the world are doing in meeting the challenges identified.

If you haven't taken the opportunity to see the work OECD is doing, make some time to review the data and the information on their website. If you're going to Istanbul, let me know!

Read more ...

Tuesday, April 17, 2007

AARP's Community Indicators

For those looking for indicators for older persons, AARP has a set of community indicators available on their website. You enter your zip code (or the city/state you're interested in), identify the geographic size of the community you want to learn more about (a radius of anywhere from 5 to 100 miles around the zip code you enter), and select an indicator on the left. Attractive, clear graphs appear.

On the plus side, there's quite a bit of information, the presentation is clear, and the user interface is simple and intuitive.

On the minus side, beyond the assertion that the information is "up-to-date" and that data are provided by OnBoard LLC ("Information is deemed reliable but not guaranteed"), no metadata is presented about the data. You don't know the year of the data, or the sources, or the means of measurement. Some of it appears to be Census data, but because of the radius piece I can't really determine the geographic boundaries being used, and I can't sync it very well with the data sets I'm used to.

I'm sharing this both for the information purposes about measuring quality-of-life aspects that matter to older persons, and for the ease of use factor. Too often indicator websites are horribly daunting to approach for the statistical novice. However, without the necessary metadata, the information is largely useless for decision-making or planning.

Read more ...

Monday, April 16, 2007

Making Data Display More Exciting

The Social Science Statistics Blog hosted by The Institute for Quantitative Social Science at Harvard University is always an interesting read.

A hat tip goes to them for directing me to an exciting way to display data. Speculative Bubble put together a chart of home prices, indexed for inflation, from 1890 to present. Then they used Atari's Roller Coaster Tycoon 3, a game program, to create a roller coaster that follows the trend lines and lets you "ride" the graph. See the video and enjoy the ride. Then check out the actual chart and see if experiencing the chart was more informative than just seeing it.

What other opportunities have we not explored yet for making data more experiential? Are there other exciting options for providing much more immediate experiences with trend lines?

Read more ...

How to Lie with a Graph

This morning, I received a copy of the Heritage Foundation's graph (PDF) "providing the facts" of tax cuts at work. If you've read Edward Tufte's work, then you can see what the problems are with how the data are displayed.

In The Visual Display of Quantitative Information, we learn about the "Lie Factor", which he defines as:

Lie Factor = (size of effect shown in graphic) / (size of effect in data)


He then makes this point (p.57 of his book, if you're following along):

"If the Lie Factor is equal to one, then the graphic might be doing a reasonable job of accurately representing the underlying numbers. Lie Factors greater than 1.05 or less than .95 indicate substantial distortion, far beyond minor inaccuracies in plotting."

Let's look at the Heritage Foundation graph, then. It's a PDF file, so the proportions aren't skewed from the intended display.

The graph shows the number of jobs from January 2000 to January 2007. The January data points are provided, along with a drawing of a worker. The y-axis is scaled from 132 million to 146 million.

The low point on the graph is January 2002, with 135.7 million jobs. The high point is January 2007, with 146 million jobs. The percentage change is 7.59%.

The figures, however, tell a different story. The January 2002 worker is 3/4 inch tall; the January 2007 figure is 3 inches tall, a 300% increase. The "Lie Factor" (300/7.59) is 39.5, far outside the acceptable range of 0.95 to 1.05.
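
If you want to check the arithmetic yourself, here is a minimal sketch in Python. The measurements are the ones quoted above; the function and variable names are mine.

# Tufte's Lie Factor arithmetic, applied to the graph described above.

def pct_change(new, old):
    """Percentage change from old to new."""
    return (new - old) / old * 100

# Effect in the data: total jobs, in millions
effect_in_data = pct_change(146.0, 135.7)        # roughly 7.6%

# Effect shown in the graphic: heights of the worker figures, in inches
effect_in_graphic = pct_change(3.0, 0.75)        # 300%

lie_factor = effect_in_graphic / effect_in_data  # roughly 39.5

print(f"Effect in data:    {effect_in_data:.2f}%")
print(f"Effect in graphic: {effect_in_graphic:.0f}%")
print(f"Lie Factor:        {lie_factor:.1f}")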

There's an important story to tell with the data on job creation. But distorting the data doesn't provide the facts or tell the story accurately.

Read more ...

Friday, April 13, 2007

Another Blog and a Free Video about Data

Another new blog about community indicators has been created -- this one in Sarasota, Florida. It's sponsored by SCOPE, a really neat organization that brings the community together for positive change, and uses indicators as part of a toolkit to bring sustainable improvements to Sarasota. And the people that work there are just good folk.

So drop on over to the SCOPE Community Report Card Blog and see what's happening.

Check out the video they link to -- it's Dr. Hans Rosling (of the Gapminder Foundation) discussing data. (Or go directly to gapminder.org to see his website.)

If you have a community indicators blog you'd like us to link to, please let us know.

Read more ...

Thursday, April 12, 2007

Blogger's Code of Conduct

There's an effort led by Tim O'Reilly to develop an online Blogger's Code of Conduct. Because I hope this blog is a model for civil public discourse (not that community indicator folks are known for anything different -- very few discussions about metadata standards invoke Godwin's Law, for example), I'm sharing the link to the Blogger's Code of Conduct Wiki and invite all who post here to adhere to its standards of decorum.

If there are any questions about the Code of Conduct or the responsibilities it places on those who host or post to this blog, please let me know.

Read more ...

Wednesday, April 11, 2007

New HUD/USPS Data Set

From Huduser.org:

HUD Aggregated USPS Administrative Data On Address Vacancies

HUD has entered into an agreement with the United States Postal Service (USPS) to receive quarterly aggregate data on addresses identified by the USPS as having been “vacant” or “No-Stat” in the previous quarter. HUD is making these data available for researchers and practitioners to explore their potential utility for tracking neighborhood change on a quarterly basis. The potential power of these data is that they represent the universe of all addresses in the United States and are updated every three months. Under the agreement with the USPS, HUD can make the data available publicly at the Census Tract level provided users agree to the terms and conditions of the click-on sublicense.

The basic data being provided by the USPS are:

  • Total Number of Addresses - This reflects all addresses (residential and commercial) that USPS has recorded in their database.
  • Total Vacant Addresses - These are addresses that delivery staff on urban routes have identified as being vacant (not collecting their mail) for 90 days or longer.
  • Total No-Stat Addresses - There are many reasons an address can be classified as No-Stat, including:
    Rural Route addresses vacant for 90 days or longer
    Addresses for businesses or homes under construction and not yet occupied
    Addresses in urban areas identified by a carrier as not likely to be active for some time

While HUD is still exploring the utility of these data, it has identified the following items that may be of use to other researchers and practitioners:

  • Vacation/Resort areas have very high rates of vacant addresses.
  • Areas with high growth have high rates of No-Stat addresses as do areas of significant decline. One way to distinguish these two areas is by comparing Total Count of AMS Addresses between quarters. An increase in AMS addresses with a similar increase in No-Stat addresses likely reflects new construction/additions. No-Stats with a stable or reduced number of addresses probably reflect long-term vacant addresses.
  • In distressed areas, a reduction in total AMS addresses from quarter-to-quarter appears to be a strong indicator of where demolition is occurring. (Note that if a building is demolished to be replaced by another building, the address will likely be moved to No-Stat status and not be removed from the total number of addresses).

HUD is very interested in what other researchers/practitioners learn from using these data. Please send questions or comments to Todd Richardson with the subject line USPS Data.
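
For those inclined to script it, here is a hedged sketch of the quarter-over-quarter reading HUD describes. The field names and the one-percent tolerance are my own illustrative assumptions, not part of the HUD/USPS data specification.

# A rough classification of a census tract from two consecutive quarters of
# aggregate USPS counts, following the interpretive notes above.

def classify_tract(prev, curr, tolerance=0.01):
    """prev/curr are dicts like {"total_ams": ..., "no_stat": ...}."""
    ams_change = (curr["total_ams"] - prev["total_ams"]) / prev["total_ams"]
    no_stat_change = curr["no_stat"] - prev["no_stat"]

    if ams_change > tolerance and no_stat_change > 0:
        return "likely new construction/additions"
    if abs(ams_change) <= tolerance and no_stat_change > 0:
        return "likely long-term vacant addresses"
    if ams_change < -tolerance:
        return "possible demolition activity"
    return "no clear signal"

# Example: more addresses and more No-Stats suggests growth
print(classify_tract({"total_ams": 1500, "no_stat": 40},
                     {"total_ams": 1560, "no_stat": 70}))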

Thanks to Peter Tatian and the NNIP listserve for the heads-up.

Read more ...

Tolerance Indicators

In this post I discussed some of the available data to support indicators around the demographics and quality-of-life aspects of the gay, lesbian, bisexual, and transgendered (GLBT) population in the community. Besides Census information, which is inadequate at best, there's not much to go on, and national estimates don't work at the community level because what is available suggests concentrated populations in relatively few cities, likely due to differences in perceived tolerance levels among communities.

A different way of measuring tolerance towards the GLBT community might be through attitude and perception surveys of the general population. The Gallup organization has been polling Americans about these issues since at least 1977, and some interesting trends are available for comparison in local communities.

At GallupPoll.com we find some interesting trends tucked behind the question of a constitutional amendment defining marriage. Trendlines are available for questions like:

  • Do you think homosexual relations between consenting adults should or should not be legal? (In the 1980s, 57% of those surveyed said illegal; in 2006, 56% said legal)
  • In general, do you think homosexuals should or should not have equal rights in terms of job opportunities? (In 1977, 56% were in favor of equal rights; in 2006, the number was 89%)
  • Do you feel that homosexuality should be an acceptable lifestyle or not? (In 1992, 57% said not; in 2006, 54% said yes, acceptable)
  • Would you like to see homosexuality be more widely accepted in this nation, less widely accepted, or is the acceptance of homosexuality in this nation today about right?
  • In your view, is homosexuality – [ROTATED: something a person is born with, (or is homosexuality) due to factors such as upbringing and environment]? (1977 -- 13% born with; 2006, 42% born with)

There are more questions and interesting data sets available from Gallup. But they're not the only ones with data sets and survey questions that might prove useful to a local community trying to understand indicators of tolerance beyond Richard Florida's Gay Index. The Pew Forum on Religion and Public Life provides other survey information and analysis, including a question about whether acceptance of homosexuality would be good for the country or bad for the country. That's an interesting question to adapt to the community level.

The Pew Research Center on People and the Press is also asking interesting questions. Here they analyze attitudes towards homosexuality in the arenas of marriage, military service, and adoption.

In a different approach, the Kaiser Family Foundation surveyed the homosexual community and the larger public to examine issues of acceptance and discrimination. The 2001 study, Inside-OUT: A Report on the Experiences of Lesbians, Gays and Bisexuals in America and the Public’s Views on Issues and Policies Related to Sexual Orientation (PDF file), provides a perspective and an opportunity for local communities to measure tolerance differently.

What other survey questions do you know of (or would you suggest) to explore questions of tolerance in a community?

Read more ...

Tuesday, April 10, 2007

How "Tax-Friendly" Is Your City and State?

While discussing economic indicators, I received a link to a site that may prove interesting to some of you. CNNMoney.com provided a 2006 data set ranking states on tax-friendliness (measured as per capita tax burden as a percentage of per capita income).

The data set also includes the 51 largest cities, and is provided in sortable tables.

The 2007 data set includes state-by-state and city comparisons, but adds to that income tax, sales tax, property tax, and retiree breaks information. The family tax burden comparison ranks 51 cities by the income, property, sales, and auto taxes on a family of three making $100,000 per year. (Jacksonville, Florida, where I live, ranks 49 out of the 51 cities.)

However, in this chart they don't scale per capita taxes by per capita income, which I found a more useful comparison in the 2006 data set (though Jacksonville's ranking didn't change between the two formats).

You may find this data useful -- though you may have some suggestions on how to think about tax data as part of a community indicator set. A ratio of per capita taxes compared to per-child education funding may be interesting, for example -- the same data labeled as "tax burden" could also be labeled "government services support" or something similar, after all.
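
For what it's worth, here is a minimal sketch of the two framings, with invented dollar figures. "Tax-friendliness" here just means per capita taxes as a share of per capita income, as in the 2006 data set; the education-funding ratio is the hypothetical alternative suggested above.

def tax_burden_pct(per_capita_taxes, per_capita_income):
    """Per capita tax burden as a percentage of per capita income."""
    return per_capita_taxes / per_capita_income * 100

def taxes_per_education_dollar(per_capita_taxes, per_pupil_funding):
    """Hypothetical alternative: per capita taxes relative to per-child education funding."""
    return per_capita_taxes / per_pupil_funding

print(f"{tax_burden_pct(3200, 38000):.1f}% of income")        # roughly 8.4%
print(f"{taxes_per_education_dollar(3200, 7500):.2f} ratio")  # roughly 0.43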

Read more ...

Economic Indicators

Last week, a White House spokesperson made the claim that "the economic surge that began five and a half years ago on President Bush's watch is more robust than the much-touted expansion during the Clinton administration."

"This is a much stronger expansion in a lot of ways," White House spokesman Tony Fratto told The Examiner. "It's much deeper and more measured."

The Examiner provides "dueling data points" that examine the question:

From the Bush camp:

  • Real wages rose 1.8 percent over the 12 months through February. This is substantially faster than the average rate of wage growth in the late 1990s.
  • Since the first quarter of 2001, productivity growth has averaged 2.8 percent. This is well above average productivity growth during the Clinton years.

From the Clinton camp:

  • Under Clinton, the economy created 3.5 times more jobs after 74 months than it did over the same period of time under Bush.
  • During the Bush years, the number of Americans below the poverty line has increased by 5.37 million, while under Clinton the number fell by 7.68 million.

An editorial in the Investor's Business Daily looks at the same claim, and asks: But how do you measure "stronger"?

The economic indicators that they use are the unemployment rate, new job creation, real after-tax income, and real wages. And from that they conclude the Bush economy is "stronger" than the Clinton economy.

The Economic Policy Institute uses different indicators to argue the picture is different:

This much is clear: the current recovery substantially lags the historical average in GDP growth, employment growth, investment in equipment and software, and, with the deflating housing market, even in residential investment. Conversely, corporate profit growth in the current recovery (despite a 3% dip in the last quarter of 2006) has been almost twice as rapid as in the past.

Dr. Larry Mishel of the Economic Policy Institute showed some of the limitations of economic indicators at a joint conference of the National Association of Planning Councils and the Community Indicators Consortium in 2005. His PowerPoint presentation is here.

How do you measure economic vitality? How do you measure local or community economic vitality? In Jacksonville, Florida, an economist at the local university provides a set of local economic indicators covering local industry stock performance, unemployment rates, a consumer price index, and a leading economic index. Dr. Paul Mason provides a page of links to international, national, state, and local economic data. A separate community effort provides a broader set of community economic indicators that includes measures of poverty and government assistance as well as average wages and unemployment, but it lacks the timeliness of Dr. Mason's work.

What are the best indicators to measure the strength of the economy at the community level?

Read more ...

Monday, April 9, 2007

2010 Census Proposed Criteria

From the NNIP listserve:

The Census Bureau has published proposed criteria for census tracts, block groups, census designated places (CDPs), and census county divisions (CCDs—defined in 22 states as the statistical equivalents of minor civil divisions) for the 2010 Census in the Federal Register on April 6, 2007. All interested individuals and organizations are invited to review and comment, as appropriate, on the proposed criteria for these statistical areas. Each of the Federal Register notices is available on the Census Bureau’s Participant Statistical Areas Program website at <http://www.census.gov/geo/www/psap2010/psapcriteria.html> as well as via the Federal Register’s website at <http://www.gpoacess.gov/fr/index.html>.

General information about the 2010 Participant Statistical Areas Program is available on the Census Bureau’s website at <http://www.census.gov/geo/www/psap2010/psap2010_main.html>.

Requests for additional information about these statistical areas as well as copies of the proposed criteria Federal Register notices should be directed to Michael Ratcliffe, Chief, Geographic Standards and Criteria Branch, Geography Division, U.S. Census Bureau, via e-mail at geo.psap.list@census.gov or telephone at 301-763-3056.

Comments on the proposed criteria for these statistical areas should be provided in writing to the Director, U.S. Census Bureau, Room 8H001, Mail Stop 0100, Washington, DC 20233-0001. Written comments must be submitted on or before July 5, 2007.

In summary, the proposed changes to the criteria for census tracts, block groups, CDPs, and CCDs are:

Census Tracts

  • Lower the minimum population threshold for most tracts to 1,200.
  • Housing unit counts may also be used to meet tract thresholds.
  • All types of populated tracts should meet the same thresholds.
  • Wherever possible census tracts should conform to American Indian reservations.
  • Special tracts may be created for large special land use areas without housing units or population.

Block Groups:

  • Increase the minimum population threshold to 1,200.
  • Housing unit counts may be used to meet block group thresholds.
  • All types of populated block groups must meet the same threshold.
  • Wherever possible block groups should conform to American Indian reservations.
  • Special BGs may be created for large special land use areas without housing units or population.

CDPs:

  • A CDP cannot have zero population and zero housing units.
  • A CDP cannot be coextensive with a governmentally active minor civil division (i.e., town, township, charter township, plantation). This change will reduce redundancy in place and county subdivision data tabulations for the following states: Connecticut, Maine, Massachusetts, Michigan, Minnesota, New Hampshire, New Jersey, New York, Pennsylvania, Rhode Island, Vermont, and Wisconsin.
  • A CDP must represent a single, distinct community. A CDP that represents multiple, distinct communities, and the hyphenated name typically assigned to represent such CDPs, will not be permitted. Exceptions will be made for communities whose identities have merged and in which both names commonly are used together.

CCDs:

  • The Census Bureau is questioning whether to retain or eliminate CCDs as geographic entities. If eliminated, CCDs would not be replaced by other sub-county geographic entities.

Read more ...

Saturday, April 7, 2007

Open Source GIS Tools

More and more open source GIS tools are becoming available, and keeping track of what's what and what the latest and greatest options are is tough.

So special thanks go to OpenSourceGIS, which is attempting to build a complete index of all open source and free GIS projects out there.

As of March 27, 2007, they have 238 projects listed. JasPer, GeoJasPer, MapWindow, Splat, OGLE, Maya 2 Google Earth, Portfolio Explorer, GeOxygene, JCS, RoadMatcher, Kalypso-Simulation-Platform, KIDS, Rgeo, GEOS, p.mapper, NetTopologySuite, GeoVista Studio, Kosmo have all been added since 6/1/06.

If that's too much, a shorter list of free GIS resources is available here.

If your project uses an open source or free GIS product, please let us know (and send us the link). Thanks!

Read more ...

Friday, April 6, 2007

Community Indicators in Connecticut

If you're in Connecticut and are interested in community indicators, check out this new blog and let them know if you're interested in developing a communications network.

If you're not in Connecticut, drop by anyway and lend your moral support to the cause.

Read more ...

Local Government Performance Benchmarks

The link between citizen-driven community indicators projects and government-driven performance benchmarks is often understated or overlooked. Community indicators projects generally rely on local government for at least some of the data sets that they report, and benchmarking data are often included in that mix. For example, both the internal benchmarking processes of the local police department and our community's indicator report use the same data on police response times. The government agency and the community both want the same thing -- improved performance -- and using the same data helps deliver a shared message of when improvements have been made (or are needed).

The work on how to better tie together citizen-driven community indicators and government benchmarks is progressing rapidly. Results That Matter is both the name of a book and a website that outlines how effective community governance ties together the notions of community problem solving, citizens reaching for results, and organizations managing for results, into communities governing for results. I highly recommend the book, and the website provides a clear overview of the message it contains.

The Governmental Accounting Standards Board (GASB) Service Efforts and Accomplishments (SEA) Program provides a guide (Government Service Efforts and Accomplishments Performance Reports: A Guide to Understanding) and a report (Special Report: Reporting Performance Information: Suggested Criteria for Effective Communication) that help governments understand how to report out information in ways that are useful for the communities they serve.

Last month I told you about a national effort to connect local and statewide efforts into a network for public performance measurement. A similar effort is happening in the Southeastern United States, led by the Community Research Council of Chattanooga, Tennessee. If you're interested, drop them a line to see how you might get involved.

Read more ...

Thursday, April 5, 2007

Indicators of Caring

The United Way's State of Caring Index includes 36 indicators covering the topics of financial security, health, education, safety, charitable giving, volunteerism, civic engagement and the natural environment. Data is available for the nation as a whole and for all 50 states.

When I use the tool to measure how my state (Florida) is doing, I find Florida consistently ranking in the bottom 10 states. (That's sure to be influenced by per capita giving to the United Way, which hovers between 37th and 43rd in the years covered in the data.)

I was a little disappointed that the data for the volunteerism rate, which I would suspect to be a key component of a Caring Index, were either missing or seemed to have a national rate superimposed on each state. Local surveys show Jacksonville's volunteer rate to be much higher than the national average -- local folk give more time than they do money -- and the Index would have been more helpful if it could have captured that difference.

(I would have also liked to see philanthropic giving, even if limited to United Way contributions, measured against local average household income, rather than in straight per capita dollars. Five percent giving in one area is a lot more than five percent giving in another, though it would be hard to argue that the level of "caring" is substantially different.)
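
To make that point concrete, here is a tiny sketch with invented income figures: identical giving rates, very different dollar amounts.

# Identical giving as a share of income looks very different in straight
# per capita (or per household) dollars. The income figures are invented.

communities = {
    "Community A": {"avg_household_income": 85_000, "giving_rate": 0.05},
    "Community B": {"avg_household_income": 42_500, "giving_rate": 0.05},
}

for name, c in communities.items():
    dollars = c["avg_household_income"] * c["giving_rate"]
    print(f"{name}: {c['giving_rate']:.0%} of income = ${dollars:,.0f} per household")

# Both give 5% of income -- arguably the same level of "caring" -- yet the
# dollar figures differ by a factor of two.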

I did like the inclusion of "municipal solid waste recovered" (which I think is a measure of recycling in a community) as part of a Caring Index -- it was a nice way to broaden the perspective. Voter turnout was another nice surprise -- a measure of civic caring. Per pupil expenditures in the school appeared to be a measure of how much the community cares about education -- an interesting try, though perhaps other measures may have been more illustrative.

Some of the measures, however, were of social needs and problems, such as teen birth rates or infant mortality, which seemed out of place. And student achievement scores seemed to force the same assumption -- does poor academic performance necessarily mean that the community doesn't care about the student? I can see the argument, though it appears to be a stretch. Under that premise, what indicators could you exclude from a "Caring Index"?

If you were to design a set of indicators that measured "caring" in your community, what would you include? Do you currently measure these kinds of indicators? If so, what do they say about your community?

Read more ...

Beyond the "Gay Index"

Recent local work in understanding the status and impact of the Gay, Lesbian, Bisexual, and Transgender (GLBT) community in our area resulted in a data query (as such explorations often do). Do I know of a good indicator or set of indicators that would help measure the demographics and quality of life of the local GLBT community?

Richard Florida, in his book The Rise of the Creative Class, advocated the use of a "Gay Index" as a key measure of tolerance in a community. (You'll remember that it was the three T's of Talent, Tolerance, and Technology that, for Dr. Florida, predicted a Creative Community.)

So do we have a good indicator for the GLBT community? Not quite. The "Gay Index" used same-sex unmarried partner household data from the U.S. Census as a proxy for gay couples in the community as a percentage of total households. Similar measures (such as this measure of the gayest zip codes in America from gaydemographics.org) use the same data set, which places the national estimate of partnered gay households at 0.99% of the population in 2000 (1.16% in 2004, using American Community Survey estimates). The data are self-reported survey information.

As a conversation piece or a comparative indicator of tolerance, the data have their uses, but as a quality-of-life or demographic measure of the local GLBT community they are extraordinarily limiting. The data do not measure non-partnered individuals, for example.
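
For illustration only, here is a rough sketch of the kind of proxy calculation the "Gay Index" rests on: same-sex unmarried-partner households as a share of all households, compared against a national share. The household counts are invented, and Richard Florida's published index may be computed differently.

def partnered_share_pct(same_sex_partner_households, total_households):
    """Same-sex unmarried-partner households as a percentage of all households."""
    return same_sex_partner_households / total_households * 100

def index_vs_nation(local_share_pct, national_share_pct=0.99):
    """Local share relative to a comparable national share (the post cites 0.99% for 2000)."""
    return local_share_pct / national_share_pct

local = partnered_share_pct(3_100, 310_000)  # hypothetical county: about 1.0%
print(f"local share: {local:.2f}%  index vs. nation: {index_vs_nation(local):.2f}")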

The problems with the data available keep growing as definitional questions arise. The National Center for Health Statistics reports the 2002 National Survey of Family Growth found 6.5 percent of men ages 25-44 have had sex with another man, and (in a differently worded question) 11 percent of women reported having had a sexual experience with another woman.

However, these experiences do not appear to correlate with self-identification as gay or lesbian. The National Health and Social Life Survey found the rates of self-reported homosexuality to be 1.3% for women within the prior year, and 4.1% since 18 years, compared to 2.7% for men within the prior year and 4.9% since 18 years.

In Demographics of the Gay and Lesbian Population in the United States: Evidence from Available Systematic Data Sources, better demographic estimates are available, with some possible measures of quality-of-life factors (such as home ownership serving as a possible proxy indicator for wealth.) The datasets, however, remain national, and aren't available on a local level for understanding either demographics or quality-of-life, except by imputing national averages to local conditions. And since the authors point out that 60 percent of partnered homosexual couples live in only 20 U.S. cities, national averages for either demographics or for quality-of-life measures likely mean little at a local level.

How can a community move beyond the crude calculation of the "Gay Index" to begin to understand the demographic characteristics of its GLBT population? What indicators have you seen that might answer questions of social connectedness, tolerance, civic engagement, or other quality-of-life aspects of a local GLBT community?

Read more ...

Measuring Community Health

The Social Synergy blog has a discussion about measuring community health, and which indicators (in what categories) ought to be considered. Interestingly enough, it's part of a project by a group called Citizen Agency, which bills itself as "an Internet consultancy that specializes in developing community-centric strategies around product research, design, development and marketing." The circle of those involved with community indicators is much broader than I expected.

Incredible work is being done in the area of healthy communities and measuring community health. The Association for Community Health Improvement lists a series of resources available for help in creating indicators here -- add that page to your bookmarks!

Read more ...

Free Mapping Tools

Run, don't walk, to Maitri's Vatulblog for a review of available OpenSource and free visual mapping tools (and a series of comments that are also quite useful). For those on a limited budget (or who want to support OpenSource applications), this is a must.

What mapping tools do you use to display your data?

Read more ...