The American Planning Association's 100th National Conference will be held this year in Las Vegas, Nevada, April 27 through May 1, 2008. From the conference website:
Keynote speakers are the highlight of the National Conference. UN Undersecretary and UN-Habitat Executive Director Anna Tibaijuka looks ahead to the future of sustainable development. ESRI President Jack Dangermond looks at the future from a geographic perspective. And New Yorker architectural critic Paul Goldberger looks back on the seminal work Learning from Las Vegas.
So why is this of particular interest to community indicators practitioners? I'm glad you asked.
Several sessions are devoted to indicators and measurement:
Session S454, Measuring City Performance and the Quality of Life, will be held on Monday afternoon. Clive Graham and Lisa M. Voight will "Explore a World Bank initiative to develop indicators that measure city performance and quality of life in a standardized format across the globe. The Internet-based system allows cross-city comparisons and third-party verification, and enables cities to share best practices and to learn from each other."
Session S610, Green Community Indicators, Diagnostic Methods, and Programs, will feature Jeffrey L. Soule saying "A variety of approaches can guide communities in their effort to become greener. From LEED-ND, to the National Association of Counties sustainability indicator program, to simple checklists, communities are figuring out where they stand. This session looks at the pros and cons of different approaches and offers a look at APA’s tools and techniques."
You've also got S018 Neighborhood Analysis, Visioning, and Planning for Action, S022 How Large Is Your Carbon Footprint?, and S302 An Economic Atlas for Indiana, which features "regional economic indicators and using, interpreting, and applying data from GIS mapping."
Anyone going to conference who'd be willing to take notes and report back?
Community Indicators for Your Community
Real, lasting community change is built around knowing where you are, where you want to be, and whether your efforts are making a difference. Indicators are a necessary ingredient for sustainable change. And the process of selecting community indicators -- who chooses, how they choose, what they choose -- is as important as the data you select.
This is an archive of thoughts I had about indicators and the community indicators movement. Some of the thinking is outdated, and many of the links may have broken over time.
Friday, February 29, 2008
CUES (the Catanese Center for Urban & Environmental Solutions at Florida Atlantic University) just sent me an announcement of their Florida Planning Toolbox. I went immediately to the section on benchmarking tools to see what they had to say.
Here's their announcement:
CUES, in partnership with the Florida Department of Community Affairs, is proud to announce the release of the Florida Planning Toolbox, an effort to further regional visioning initiatives in Florida by providing descriptions and examples of planning tools designed to protect and enhance natural resources, promote economic prosperity for all residents, and enable a sustainable quality of life. Tools have been compiled in sixteen broad categories, including agricultural land conservation, benchmarking, climate change, coastal planning, diversity and social equity, economic development, education and health, fiscal analysis & financing, housing, infill and redevelopment, land use planning & development, military-community growth planning, natural systems conservation, public involvement & education, transportation planning, and water resource planning. This issue of CUES News highlights some of the tools contained in the toolbox. The entire toolbox can be accessed in print and web-friendly versions at www.cuesfau.org/toolbox.
Here's what I saw:
The section on benchmarking tools distinguishes between community scorecards and audits on the one hand and community indicators on the other.
They define a "community scorecard or audit" as "a qualitative monitoring tool used by citizens and public officials to evaluate how well existing policies, projects, and plans meet a set of defined principles or to monitor progress in selected topic areas."
On the other hand, "Community indicators enable a community to understand where it has been and where it is going and identify areas for improvement to achieve a different outcome. Indicators projects use measurable data to shed light on trends (both positive and negative) for a current issue or, more typically, for a combination of issues that affects a community’s quality of life and economic well-being."
In any case, their new site is a nice tool and helps promote the use of community indicators (and scorecards!) in planning, which is A Good Thing. Congratulations, CUES!
Thursday, February 28, 2008
First of all, I like what he says about data in the introduction. We've talked about statistical illiteracy before, and it's important to recognize the issue in your intended audience. Here's what the introduction says:
A Word About Data: Numbers and statistics can be very useful, but they can also seem bewildering. While charts and graphs cannot explain the essence of a city as experienced by its citizens and visitors, numbers and data do provide us with insights that are important in setting public policy. Readers can look at the trends of the various topics and develop a better knowledge of how well the city functions and performs.
Secondly, he includes a map of the area, showing both what is and what is not included within city limits. I like that approach, and think I'll copy it for my next report. (Though what the good people of Jacksonville are going to do with a map of Santa Fe is a bit of a puzzlement ....)
The third thing I liked was the conversational style in which the data are presented. The tone strikes me as just right -- neither too detailed nor too dumbed-down. The report feels like the reader is being helped to understand the information presented without being talked down to.
Overall, I thought this report had some interesting pieces that readers of this blog may want to pay attention to.
Wednesday, February 27, 2008
There's a working paper by J. Ram Pilarisetti and Jeroen C.J.M. van den Bergh called Sustainable Nations: What Do Aggregate Indicators Tell Us? The paper, published by the Tinbergen Institute in the Netherlands, examines the different results achieved when using three common measures of nation sustainability, and explores what this might mean.
The abstract says:
What is a ‘sustainable nation’ and how can we identify and rank ‘sustainable nations’? Are nations producing and consuming in a sustainable way? Aggregate indicators have been proposed to answer these questions. This paper quantitatively compares three aggregate indicators of sustainability: the World Bank’s ‘Genuine Savings’ measure, the ‘Ecological Footprint’ and the ‘Environmental Sustainability Index’. It is concluded that rankings of sustainable nations vary significantly among these indicators. Implications of this disagreement for analysis and policy are suggested.
I found the paper interesting -- you might enjoy it as well. It does a good job of pointing out the strengths and weaknesses of each of the three measures, and it notes that some countries fail on all three. But it also points out that you can get opposite results using different measures, which is a good warning to all of us using indicators to pay careful attention to the measurement tools that we use.
From the conclusion:
The questions of sustainability of humanity’s consumption and identifying sustainable nations can not be conclusively answered using the three considered indicators. All indicators reflect methodological and measurement problems, and using each of them to rank sustainable nations or commenting on humanity’s consumption may yield erroneous results. Despite the limitations and lack of agreement among the various indicators, it might be worthwhile to check which nations are ranked low according to all indexes, according to EF and ESI, or EF and GS or ESI and GS. Besides the above 11 nations identified as the bottom performers by all indexes, EF and ESI also jointly identify 42 nations as unsustainable; EF and GS jointly consider 14 countries as unsustainable; and ESI and GS jointly view 17 countries as unsustainable. These nations perhaps most urgently would need to critically examine their economic development and environment policies.
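The paper's headline finding -- that different aggregate indicators can order the same nations very differently -- can be sketched with a quick rank-correlation check. The scores below are invented for illustration (they are not the paper's data), and the simple Spearman formula used here assumes no tied ranks:

```python
# Toy illustration: two aggregate indicators ranking the same nations.
# All scores are hypothetical, not taken from the paper.

def rank(scores):
    """Return 1-based ranks (higher score = rank 1)."""
    order = sorted(scores, key=scores.get, reverse=True)
    return {nation: i + 1 for i, nation in enumerate(order)}

def spearman(r1, r2):
    """Spearman rank correlation between two rankings (no ties)."""
    n = len(r1)
    d2 = sum((r1[k] - r2[k]) ** 2 for k in r1)
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

genuine_savings = {"A": 9.0, "B": 7.5, "C": 4.0, "D": 2.0}
eco_footprint   = {"A": 3.0, "B": 8.0, "C": 9.0, "D": 4.5}  # rescaled so higher = better

gs_rank = rank(genuine_savings)
ef_rank = rank(eco_footprint)
print(spearman(gs_rank, ef_rank))  # a low or negative value means the indicators disagree
```

With these invented numbers the correlation comes out negative -- exactly the kind of disagreement between "sustainability" rankings that the paper warns about.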
Hat tip: Sustainable Options
There's more information available about the State of the USA project. While the project isn't finished, a new website provides an introduction to the work that's going on.
From the site:
People are hungry for objective sources of information. In today’s polarized society, Americans want ways to cut through biased agendas and find reliable facts about the issues that matter to them.
The State of the USA, Inc. (SUSA), a new nonprofit, will offer a website—as a public service—where every American can get the best available facts drawn from the country’s most respected sources. The site will be easy to use and available around the clock, so that people can find credible, relevant data in minutes or hours.
SUSA’s mission is to unite nonprofits, the media, government decision makers, business leaders, scientists, educators and citizens around a single goal: to deepen our knowledge and understanding of the country’s most pressing issues. SUSA will offer Americans a new tool to help them assess where our nation is moving forward and where it has stalled.
Knowing the state of the USA today and acting on that knowledge is our best opportunity to improve America for all generations.
They also have sample pages from the demonstration site you can look at. Take a peek!
The Brookings Institution has just released the Index of State Weakness in the Developing World. Together with the Center for Global Development, they ranked 141 developing countries by their degree of "weakness," as measured by 20 indicators in four areas -- economic, political, security, and social welfare.
Under their "economic basket" they measure a country's GNI per capita, GDP growth, income inequality, inflation, and regulatory quality.
Under the "political basket" they measure government effectiveness, rule of law, voice and accountability, control of corruption, and freedom.
Under the "security basket" they measure conflict intensity, gross human rights abuses, territory affected by conflict, incidence of coups, and political stability.
Under the "social welfare basket" they measure child mortality, access to improved water, undernourishment, primary school completion, and life expectancy.
The countries that ranked as the "weakest states" were Somalia, Afghanistan, Congo, Iraq, and Burundi. Not a lot of surprises there. But the report itself (and the scores per country) is fairly interesting, and you ought to take a look.
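The basket structure described above can be sketched in a few lines. Note that the scoring rule here -- rescale each indicator to a 0-10 scale, average within each basket, then average the baskets -- is my assumption for illustration, not necessarily the report's exact methodology, and all the raw values and bounds are invented:

```python
# Sketch of a basket-style composite index, loosely modeled on the
# report's structure (indicators grouped into baskets). Scoring rule
# and data are assumptions for illustration only.

def rescale(value, worst, best):
    """Map a raw indicator value onto 0-10, where 0 is weakest."""
    return 10 * (value - worst) / (best - worst)

country = {
    "economic": [rescale(620, 0, 40000),   # GNI per capita (US$), hypothetical bounds
                 rescale(-1.5, -10, 10)],  # GDP growth (%), hypothetical bounds
    "security": [rescale(2, 10, 0)],       # conflict intensity, 10 = worst
}

basket_scores = {b: sum(v) / len(v) for b, v in country.items()}
overall = sum(basket_scores.values()) / len(basket_scores)
print(round(overall, 2))  # a lower overall score indicates a "weaker" state
```

Inverted indicators (like conflict intensity, where a high raw value is bad) are handled by swapping the `worst` and `best` bounds, so every rescaled value points the same direction.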
Here's an example of an amusing trend -- folks who are using graphing and charting tools to illustrate popular songs. You may recognize this song on the left as California Dreaming. (Click on each picture to enlarge.)
This site has 140 more songs similarly illustrated, including the great classic from Evita shown below using mapping technology:
Check out the graphs, and see how many you can identify. (Warning: Most are apparently current popular music, which left me feeling old and out of touch when I didn't recognize some of the songs.)
Hat tips: Information Aesthetics and Gawker.com
Here's a note from some folks over at GeoAge that I thought some of you might be interested in.
Rapid data collection and data sharing enables multi-disciplinary approach to community preservation research projects
Traditionally, university research projects involve reams of paper forms that require many hours of data entry, re-entry, printing and filing. In most cases, the printed reports remain in boxes or cabinets … the data isn’t shared or used as it could have been. Following hurricanes Katrina and Rita, the CADGIS lab at LSU decided to capture research data on New Orleans historical / cultural preservation with the latest mobile technologies; and further, to make that data available to other departments within LSU.
LSU’s CADGIS lab, under the direction of Dr. Barrett Kennedy, comprises design, art and anthropology. Its focus in New Orleans included studies of historical, cultural and community preservation and recovery in the Lakeview and Holy Cross neighborhoods – nearly 2500 historic structures. To collect data quickly, then report and map the data, CADGIS used GeoAge FAST Designer software on its laptops and GeoAge FAST PDA on its Dell PDAs in the field. The research team captured digital photos with Ricoh cameras, GPS points and a wealth of condition data via FAST data collection forms. Back at the CADGIS lab, data was downloaded for instant reports and maps.
“A critical element of this data is the ability to share with other disciplines at LSU,” noted Dr. Kennedy. “Researchers in sociology can access and extend what we’ve already collected.” LSU’s CADGIS team has been successful developing a rapid-response, GIS-based solution for managing cultural heritage resource data and using GeoAge’s FAST software to integrate that data with natural resource data for more effective disaster response, mitigation, and recovery actions. According to Dr. Kennedy, “We’re confident that LSU’s ongoing research efforts will spawn significant opportunities to develop efficient, cost-effective, integrated approaches to future natural and cultural resource management situations.”
Louisiana State University’s CADGIS team applied a multi-disciplinary method using FAST software on their Dell PDAs … deploying electronic data forms for rapid data collection, digital photo attachment and GPS information in several New Orleans neighborhoods. Data is downloaded in CADGIS’ Baton Rouge lab, exported to several reporting and mapping applications, as well as shared with other academic departments involved in post-Katrina community preservation research projects.
For more information, see http://www.geoage.com/solutions.html and the videos at http://www.geoage.com/videos.html.
Tuesday, February 26, 2008
You're probably familiar with the work that Clifford Cobb and Craig Rixford did while over at Redefining Progress. Ten years ago, they published Lessons Learned from the History of Social Indicators (PDF) that did a great job of summarizing what we knew. The lessons are still highly relevant, so I was pleased when I caught a nice summary of their work at Rashid's Blog. I encourage you to read the article (it's only 40 pages), but if you're pressed for time here's the summary.
- Obtaining a figure does not equate to establishing a good indicator. This is because quantities should reveal qualities, but qualities are “always ambiguous and that any statements about them are provisional rather than final.”
- Effective indicators require a clear conceptual basis. Ideally, concepts should be defined before data is collected, but in practice that is not easy. On the other hand, “although measurement can help clarify a concept, the concept itself will not simply emerge from data.”
- No indicators are free from values, because “all serious indicators work is political.” Value judgements enter at every stage, from selecting indicators to formulating survey questions.
- Comprehensiveness may come at the cost of effectiveness: historically, the most effective indicators tended to focus on a single issue, guiding people to consider deeper questions. In addition, interpreting indicators is more important than simply describing them. Indicators covering a small area for specific audiences tend to be more effective.
- The symbolic value of an indicator may outweigh its value as a literal measure; indicators often serve as metaphors, not just statistics.
- Indicators must not be confused with reality, because “even the best indicator is only a fractional measurement of the underlying reality”. Multiple indicators measuring the same social phenomenon may overcome this problem.
- Democratic establishment of indicators requires more than good public participation processes. An emphasis on procedural justice alone can effectively stifle changes to the status quo; perhaps substantive justice, such as equal opportunity, should be emphasised instead.
- Measurement does not necessarily lead to appropriate action.
- Better information can lead to better decisions and improved outcomes, but the link is not straightforward: indicators have only indirect effects on policy making, and behaviour plays a larger part.
- Resolving a particular social problem often requires innovative thinking about its causes; indicators serve an enlightenment function and should lead people to reconsider the common understanding of the problem.
- Indicators that reveal the causes, not just the symptoms, of a particular social problem should be sought in order to take action. Mere description of indicators without insight into trends is unlikely to lead to remedial action.
- Indicators provide the basis for setting outcomes only when someone has control over resources -- that is, if developers of indicators “have a connection to those with the power to make substantive changes.” This raises the question of who actually has the power to take action.
Thursday, February 21, 2008
My other project, indicatorscommunity.com, is now moving into its beta version. Thanks to those who provided feedback so far on which tools/widgets to provide, the interlinkages with www.communityindicators.net, and the need for clearer Help and FAQ pages.
Now I need people willing to register a user ID and start commenting on posts, adding their own information, promoting their own projects, creating their own groups, starting their own wiki pages, launching a new forum, pulling their blog in through RSS feeds, or anything else that strikes their fancy. The goal of the project is to have a one-stop networking hub that links all of the good work out there on the web in relation to community indicators (and government performance measures and sustainability indicators and triple bottom line work and all the interrelated ways we're using data and indicators to make change and measure performance).
This past not-quite-a-year on this blog has convinced me that there's a lot of folks doing a lot of really good work who ought to be able to connect with each other. Much of that work is happening outside my own country (US), and so I'm hoping the language tools built into the new site will allow people to connect and discuss their work in their language(s) of choice -- the platform should allow that, but again I'm looking for people to join in and help me see where any bugs might be.
The formal roll-out of the site is expected to happen in June. Between now and then, I need some hardy pioneers -- some folks who want to get in early on the conversation and help create the network's "culture" of interactions.
If you're interested, go to indicatorscommunity.com, pick a user name, and sign in. It should be that easy. (If it's not, let me know how I can make it easier.)
Thanks for the help!
Sunday, February 17, 2008
We talked about data on religions last year, and even had an example where a community indicator on religion told important information about the kind of community people lived in. With the recent emphasis on religion and voting in the primary elections in the United States presidential race, this map on regionalism and religiosity may be interesting.
There are other maps that use similar data to describe religious adherents at the county level in the United States. This map shows percent of the population that is Jewish, and this one Muslim, and this one Catholic; they're part of a series of maps found at the Glenmary Research Center.
The methodology of the research, also with some cautionary notes, is available here.
Someone else recreated the map using this set of data and methodology, which he explains along with the coding he used. He says, "The one advantage of my SAS/Graph map is that you can hover your mouse over the counties, and see the county name & the numeric value!"
For more data at an international level, you might find this interesting.
Do you include indicators of religiosity or religious service attendance as part of your community indicators project? I think it might be interesting to see which communities do, and if there is a geographical pattern in those communities that consider religious beliefs a critical measure of the quality of life in their community.
Friday, February 15, 2008
I read an interesting blog post this morning from Joe Swanson, who's apparently a candidate for political office in Wisconsin. I am not in Wisconsin, and do not publicly endorse candidates, and have never met Joe. I'm sharing with you what he wrote because of the clear way in which he outlined the need for community-based indicators of sustainability as a key influence in policy-making and political decisions.
In a post called simply "Sustainable Community", Joe wrote:
If we are to change the outcome of governmental process we must find a way to introduce logical and factual data into each and every decision. The process would need to be governed by criteria/indicator developed locally. This process would remove political consideration from the decision making process. It would evaluate decisions concerning communities by checking each decision against criteria designed by the community to best utilize human and natural resources lessening the burden on infrastructure and environment. This holistic approach would help insure that our communities remain sustainable, therefore insuring the economic, environmental, and human viability of our region.
He then defines "sustainability" for his readership as follows:
Sustainable development is often misinterpreted as focusing solely on environmental issues. In reality, it is a much broader concept as sustainable development policies encompass three general policy areas: economic, environmental and social. In support of this, several United Nations texts, most recently the 2005 World Summit Outcome Document, refer to the “interdependent and mutually reinforcing pillars” of sustainable development as economic development, social development, and environmental protection.
He also identifies fourteen steps in developing community indicators of sustainability. His list demonstrates a thoughtfulness in including a broad set of quality-of-life indicators in understanding trends in the community.
In all, it's a pretty good argument for locally-developed community indicators as necessary resources for public decision-making. Do your elected leaders feel the same way?
Thursday, February 14, 2008
I needed to show you two more examples of how information is conveyed visually to open eyes, minds, and hearts. Take a moment to check out both of these sites and think about how you share information with your audience(s).
The first is What We Eat, a photo essay from Relevant Magazine. A simple chart of food expenditures per person in different countries around the world is interesting -- there's one in the article comments you can use as a comparison table -- but the photos tell the story with much more vibrancy.
The second is from the Information Aesthetics blog, which I've mentioned before (and linked to in the left-hand column -- a must-read, in my opinion, for their way of showing how data visualization could be so much more than what we usually do). The article is called Rice Population Demographics, and it has photos from a museum exhibition using grains of rice to physically demonstrate different population demographics. See a video here using this concept.
From infosthetics: [This is] a physical visualization of the world population demographics, by mapping 1 grain of rice to represent 1 human being. in the "of all the people in all the world" installation, population statistics are separated out in different piles to juxtapose compelling social phenomena, such as the comparison of all the prisoners in the world versus all the people in gated communities (roughly equal).
Check it out.
Last year, we talked about Michael Kruse's series on social indicators. He's now launching his American Social Indicators, 2007, and you may be interested in what he has to say.
The question he asks is, "Is America in decline?" He describes his methodology thusly:
I'm going to look at some statistical indicators from demography, sociology, economics and other fields that social scientists look to as broad indicators of quality of life. Depending on the indicator, we will be looking at timeframes of 30-50 years to get a sense about what trajectory things are on. I am sure you have heard the expression, “It’s easy to lie with statistics.” That's true. But it is even easier to lie without them! Quantified observations are a good place to start a conversation about social forces at work in the culture.
What I like about his work (besides the great line -- "it's even easier to lie without them!") is the sense of perspective he adds to the debate. For example, in our community we're working to address the infant mortality rate. It's a critical task, and the problem is not to be taken lightly. Kruse points out, and rightly so, that the infant mortality rate has dropped more than 75% over the past fifty years. That's worth noting, even as we wrestle with racial disparities in infant mortality rate and the knowledge that in the U.S. (especially in comparison with the developed world) the rates could (and should) be lower.
So take a look at Kruse's series. You may find something surprising in the mix. It's certainly food for thought.
Wednesday, February 13, 2008
We've been talking about how to use information to tell community stories more effectively. One key aspect is engaging the media so that they can use your community indicators reports in shaping community understanding of the issues and promoting accountability for action. (You did read this post, didn't you? I'll link to it again to give you no excuse not to click through and see it.)
Swivel's blog, Tasty Data Goodies, has a nice piece on journalists and data. I want to highlight a few sentences from their article that I think we need to pay attention to.
[N]ewspapers, magazines and blogs have increased their use of data. This has resulted in some very creative and beautiful uses of visualizations by journalists to tell their story. ... With journalists having easy access to data and analysis tools we hope to see more of this.
And as the demand from journalists increases, we are observing an increased willingness among data providers to be more creative about how they are giving access in the first place. This is a good trend where both sides benefit. Journalists are able to give compelling evidence and data providers see their data used more broadly, potentially impacting more people.
Working with the media as a trusted source for both data and key stories they should be reporting on is, I think, a critical responsibility of community indicators practitioners. Data need to be seen and heard. Data need to change people's minds and government's policies. The media needs to be our partner in making that happen.
Read this article as well. Matt Croydon says, "One thing that I’ve been seeing a lot recently in my interactions with the newsroom is that we’re no longer exchanging Excel spreadsheets, Word files, and other binary blobs via email. Instead we’re sending invites to spreadsheets and documents on Google docs, links to data visualization sites like Swivel and ManyEyes, and links to maps created with Google MyMaps."
How are you working with the media? How do you share data more effectively? How have you made the media your ally in community change?
The Community Indicators Consortium (CIC) just sent out its February 2008 newsletter, and I thought I'd share some highlights with you. (CIC's newsletters are available online here -- and you can ask them to be put on the e-mail list to get future newsletters as released.)
The newsletter includes a write-up of the latest ISQOLS (International Society for Quality-of-Life Studies) conference in San Diego, the OECD World Forum in Istanbul, and the Beyond GDP Conference in Brussels. It also includes a summary of the 2007 accomplishments of the Community Indicators Consortium in a message from the president.
There's an additional message to pay attention to. King County, Washington, has been doing some really interesting work in integrating performance measures and community indicators projects. Their effort, King County AIMs High: Annual Indicators & Measures, is a big step forward in this area. Read the article by Michael Jacobson, King County's Performance Management Director, and check out the website. This intersection of citizen governance and government accountability is one of the puzzle pieces that we should be working to put together over the next few years.
There's an exciting new website out there you should know about -- Track-n-Graph. It describes itself as "a FREE web-based service for friends, family, and co-workers to track and graph information." It's straightforward and simple to use -- I'll be exploring it further in the next few days.
I found out about it through the Information Aesthetics blog, which says the following about Track-n-Graph:
a web-based service to track & graph information. users can create, share & embed their own data trackers as well as customize existing visualization templates, such as a weight tracker, a child growth tracker, a calorie counter, a workout tracker & a blood pressure monitor. currently offered visualizations methods include bar, line & area graphs.
track-n-graph is another social visualization website, quietly filling the space between Many Eyes & Swivel & the charting APIs from Google or Yahoo.
I am firmly convinced that technological advances in the way we obtain, display, and share data are quietly revolutionizing the field we work in, and the only way to maintain relevancy is to stay on top of this work. My organization has been publishing an annual community indicators report since 1985. Our reports changed little between 1985 and 2001 in format, structure, design, or display -- they were consistent, professional, clear, and accurate. But no one accused them of being exciting.
We've been working on improving the usability and readability of the reports every year. But the last 12 months has seen an explosion in alternative technologies for presenting information to overcome fear of statistics, and has also seen a rapid increase in the use of data in everyday life. We've been chronicling some of that in this blog. This should be seen as a call to action to do more to connect information with people in meaningful and provocative ways so that data can transform communities.
Keep sending me your tips, news flashes, blog articles, project reports. Let's talk about them and what the next 12 months may bring us.
Tuesday, February 12, 2008
Today seems like a good day to look around and see what other people are saying about indicators. A quick blog scan shows the following:
Nancy Kress looks at quality of life indicators, their relationship to oil consumption, and whether we are surprised by the correlations.
The United Nations Environment Programme has a new chart available on the global human development indicators, a portion of which is reproduced below:
The Jurga Report reminds us that Progressive Farmer magazine has named the top rural places to live in America, using a series of 10 indicators. Fran Jurga says, "The top 10 rural counties are ranked based on rural quality-of-life indicators such as great schools, access to health care, low crime and affordable farmland. In 2008, the editors of The Progressive Farmer added extra criteria by focusing on counties that have been able to protect farmland, control growth pressure from urban and suburban areas, and strike a good balance between agriculture, manufacturing and modern conveniences."
There's a new article on Woodpeckers as reliable indicators of bird richness, forest health and harvest which I'm trying to think about as a possible companion to the warning about frog-dumping.
EcoSpace Conscious Community discusses "creating the conditions for sustainability to happen", based on Permaculture Design. "Leading designer and permaculture teacher, Larry Santoyo, calls these characteristics “the indicators of sustainability”. “If we use these indicators as a checklist in our own lives, we, too can start to become sustainable”, he says. By examining the systems in which we participate (home, school, work, play, community, etc) and seeing where improvements can be made, we can create the conditions for sustainability to happen."
Christian Renaud suggests that New York merchants accepting euros is a leading indicator of a global currency shift. Another article on gender indicators at policy, programme, and project levels deserves a look. Yet another discusses New Energy Indicators for Transport: The Way Forward.
And that's just a sampling from this week. The web is abuzz with indicators. This suggests, to me, the need to bring these conversations together into something more than isolated threads. What do you think?
Read more ...
Monday, February 11, 2008
It probably doesn't surprise the Gentle Reader of this blog that I pay attention when people bring up how they could use data to make better decisions. What continues to astonish me is how easily data-driven decision making and the use of metrics to measure progress have made their way into popular culture and everyday conversation. There's an important lesson for community indicators practitioners here: people aren't as afraid of data as they used to be, nor do they relegate data to the province of "experts."
That being said, I need to share a couple of paragraphs from a recent article just in time for Valentine's Day. In the Washington Post today, in an article titled A Dater's Bill of Rights, we read the following:
You have the right to regular status updates
That is, the "where do we stand?" discussion. I despise uncertainty. I think the problem is that in dating, there are no metrics. In sports or business or politics, there are clear ways to measure success -- won-loss records, quarterly earnings, vote totals. In the early days of a budding relationship, when you don't yet know the other person's signals or how best to communicate your own feelings, sussing out where things are going can be like reading skywriting on a windy day. Skywriting written in Sanskrit. And you're not even looking up.
To remedy this, I think both parties should exchange formal reports once a week, detailing attraction levels, the effects of various factors, etc.: "I'm happy to report that my affection for Ryan is up 31 percent this week, continuing a steady three-week growth of increases (see chart on p. 34). Picking me up, with flowers no less, in the first quarter of week three caused a 25 percent increase alone. Since we have become intimate, Ryan has shown a lot of potential and a real go-getter attitude in the sack, as well (see chart labeled "Satisfaction" p. 36)."
Charting affection levels? Metrics for dating? Welcome to Valentine's Day for the data-driven.
Anyone have any stories to share?
Sunday, February 10, 2008
Over the last year, we've covered some amazing ways to visualize data on this blog -- click the "data display" tag at the end of this post to refresh your memory, or check out some of the examples here, here, and here. Now I've got a new website to talk about, and I'm quite excited (and awed) by it.
We've mentioned Chris Jordan's work before, in the context of telling stories with very large numbers. How do you connect your audience with the data when the numbers are so big?
Chris shows us how.
Whether it's recreating Seurat's A Sunday Afternoon on the Island of La Grande Jatte or simply showing us how disposable our society has become, the artwork displays numbers and scales that both call our attention to the individual (a cigarette, a Barbie doll, a folded prison uniform) and the sheer magnitude of the number of individuals affected.
Please check out the work. Then let's think together about what this means for our community indicator reports and how we communicate statistics in ways that are meaningful.
Friday, February 8, 2008
We've been talking about measuring happiness for quite a while -- from global conferences on happiness to measuring prosperity and well-being. A new book, Against Happiness, appears to challenge the movement to measure happiness in interesting ways.
Railing against the psychology of positive thinking, the "science of happiness," the author says:
What are we to make of this American obsession with happiness, an obsession that could well lead to a sudden extinction of the creative impulse, that could result in an extermination as horrible as those foreshadowed by global warming and environmental crisis and nuclear proliferation? What drives this rage for complacency, for the innocuous smile? What fosters this desperate contentment?
The argument is that sadness, suffering, melancholy, whatever you name the non-happy, non-complacent times in our lives, is where we find meaning, creative impulses, strength, poetry, and more. If happiness is our goal, chemical shortcuts can get us there quickly, to our ultimate detriment.
In an essay titled "Please Don't Have a Nice Day," printed in the Wall Street Journal, Colin McGinn says:
What about the stronger thesis -- that misery can still have value even when it leads to deeper misery, so long as wisdom is the outcome? A more convincing argument against happiness might defend the view that knowledge of the true condition of the universe, and of our place in it, necessarily gnaws at the heart and that such gnawing is good in itself.
What do you think? Are we inadvertently denying the human spirit when we measure happiness indicators and encourage public policy that reinforces happiness? Is the New Economics Foundation's Happy Planet Index as destructive as the climate change it hopes to solve? What do you make of the argument that promoting a misery index might be a good thing?
Thursday, February 7, 2008
I'm working on a new social networking site concentrating on community indicators. I'm looking for anyone who would like to help build this online community -- while coding help is nice, what I'm really looking for is people to test-drive the site and tell me what works and what doesn't, and how I can make it easier to navigate, simpler to use, and more useful for gathering and sharing conversations around community indicators.
If you're interested, come check out www.indicatorscommunity.com. The site is being built on an open source platform, which should give us more flexibility to make changes as we go and to add/adapt widgets and interface with other sites as we move forward.
I'd appreciate your feedback on the site. Specific suggestions can be made here or in the site development group on the site.
Thanks so much! We'll do a more formal launch later for those who want us to get it right before they join in. But I suspect there are a few readers of this blog who would be interested in being in on the ground floor of building an online community and helping shape not just the tools but the community norms as we move forward.
If you have any questions about the project, please drop me an e-mail. Thanks again for your support.
Posted by Ben Warner at 5:59 AM
This is a bit of a departure, and perhaps not all of you will find this post useful in your community indicators projects. That's OK. One of the key messages we share with our communities is the importance of data, and a second is that data are everywhere and we need not be afraid to make decisions based on data.
That message has now been embraced by fans of the pop culture phenomenon known as American Idol. If, as the show's judges comment, song choice is critical to a singer's ability to move forward in the competition, wouldn't data about song choice be nice to know? Glad you asked! For all those whom Ken Barnes, USA Today's Idol Chatter columnist, calls "number enthusiasts" (preferring the term to "data geeks"), here's a website for you: What NOT to Sing.
The database contains all the songs sung on American Idol in seasons 2-6, with who sang them, what the results were, and more -- data sets sorted by contestant, artist, song, performance, season, and episode. So a contestant could look at the data and find a song that hasn't been overperformed, has been generally well received by the judges, and has not yet had a "signature performance" against which all future performances would be judged.
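For the number enthusiasts, here's a minimal sketch in Python of the kind of query such a database makes possible. The field names and sample records below are invented for illustration -- they are not taken from the actual What NOT to Sing site:

```python
# Hypothetical catalog of song performances; fields and data are made up.
songs = [
    {"song": "Imagine", "times_performed": 6, "avg_judge_score": 4.1, "signature": True},
    {"song": "Black Velvet", "times_performed": 2, "avg_judge_score": 4.5, "signature": False},
    {"song": "My Heart Will Go On", "times_performed": 5, "avg_judge_score": 2.8, "signature": True},
    {"song": "Summertime", "times_performed": 1, "avg_judge_score": 4.7, "signature": False},
]

def promising_choices(catalog, max_performances=3, min_score=4.0):
    """Songs not yet overperformed, generally well received by the judges,
    and without a 'signature performance' to be judged against."""
    return [
        s["song"] for s in catalog
        if s["times_performed"] <= max_performances
        and s["avg_judge_score"] >= min_score
        and not s["signature"]
    ]

print(promising_choices(songs))  # → ['Black Velvet', 'Summertime']
```

The interesting part isn't the three filter conditions -- it's that a fan community articulated its decision criteria precisely enough to write them down.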
OK, you may not find the actual database interesting if you're not a fan of the show. But the desire for data that drove the creation of the database, and the methodology in creating the resource -- those are interesting.
So what's the point? Data are everywhere. Data are being used in non-traditional ways in non-traditional fields by non-traditional data users. And they're finding data to be fun and useful in making decisions.
What's not to like about that?
Wednesday, February 6, 2008
A new study, Tracking Health and the Environment: A Pilot Test of Environmental Public Health Indicators, looks at the relationship between environmental indicators and public health measures. In the study, the Johns Hopkins Center for Excellence in Environmental Public Health Tracking assembled three sets of indicators to test the relationships.
From the abstract:
To advance the use of indicators, the Johns Hopkins Center for Excellence in Environmental Public Health Tracking piloted three pairs of indicators: 1) air toxics and leukemia in New Jersey, 2) mercury emissions and fish advisories in the United States, and 3) urban sprawl and obesity in New Jersey. These analyses illustrate the feasibility of creating environmental hazard, exposure, and health outcome indicators, examining their temporal and geographic trends, and identifying their temporal and geographic relationships. They also show the importance of including appropriate caveats with the findings. The authors' investigations demonstrate how existing environmental health data can be used to create meaningful indicator measures to further the understanding of environment-related diseases and to help prioritize and guide interventions. Indicators are the foundation of environmental public health tracking, and increased use and development of them are necessary for the establishment of a nationwide tracking network capable of linking environmental exposures and health outcomes.
Not surprisingly, I like the idea of wider use of indicators to understand trends and relationships. I also like the idea of trying to get outside of a single-field area of focus to look at the broader set of indicators to see the interrelationships.
This is an important step, but I suspect only a first step. As the work progresses, I would like to see further study of how public policy decisions -- about land use and transportation planning, for example, or social service provision, or economic development -- affect both environmental and public health indicators.
The message that community indicators practitioners understand instinctively and are trying to share with the world is that we live in an interconnected system where our actions have consequences. Only a comprehensive look at indicators of the quality of life in a community has a chance to understand where we are making progress, where we struggle, and how our decisions affect the future of our community.
Here's how the authors conclude their study:
Several lessons can be drawn from the development of the indicators presented in this paper. First, indicator development is restricted by the availability, reliability, and consistency of data. Second, multidisciplinary expertise and collaboration are needed to design and track indicators that will be useful for policy. Finally, because indicator projects may not be controlled studies, their results are often difficult to interpret. Great care must be taken in the communication of findings about environmental exposure and disease relationships to the public. Words should be carefully chosen, caveats should be highlighted and repeated, and clear legends should be placed on every graphic. For linkage indicators examining both hazard/exposure and outcomes, it must be emphasized that conjunction or lack thereof provides only exploratory and potentially suggestive data about distributions and trends. Controlled analyses are generally needed to draw firmer conclusions.
Read over the study, and keep forwarding these kinds of articles!
Read more ...
Tuesday, February 5, 2008
There's a great article by The Numbers Guy on How Setting Goals Can Improve Data. The article is of particular relevance to community indicators practitioners, and I urge you to read it and think about what it means to the goal-setting (or target-setting) efforts you might have associated with your indicators reports.
The point he makes is simple, and it's based on the example of the United Nations' Millennium Development Goals:
About two-thirds of the way to the target date of 2015, the world is behind pace to reach most of these goals. Yet the setting of the goals has ensured, at least, an effort to improve the data collection needed to monitor the progress.
He then outlines some of the progress made in developing stronger measures to ensure better data. The same reasoning tends to hold true in local communities: increased attention to an area of emphasis, with reporting of data and setting of community goals, has in our case led to significant improvements in data-gathering and data-reporting capacities in local government.
For more information, check out what The Numbers Guy "wrote last year about measurement of another millennium goal, reducing world poverty. A U.N. Web site offers maps tracking progress toward the millennium goals, by country."
Saying, "You cannot manage what you do not measure," Governor Charlie Crist launched Florida Performs, a website that reports "how Florida is doing in areas that affect the quality of life for you, your family, and your neighbors."
The website covers performance goals and indicators in the areas of Public Safety, Health & Family, Education, Economy & Taxes, Transportation, and Environment/Conservation.
The site also provides a useful Scorecard at a Glance, using red, green, and yellow arrows to give a quick visual picture of performance in each of the areas measured.
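As a rough sketch of how such a traffic-light scorecard might work behind the scenes -- the function, thresholds, and labels here are my own invention, not Florida Performs' actual method:

```python
def scorecard_arrow(percent_change, goal_direction):
    """Map an indicator's recent trend to a red/yellow/green arrow.

    percent_change: change in the indicator over the reporting period
    goal_direction: "up" if higher values are better, "down" if lower are better
    Thresholds (+/- 1 percent) are illustrative assumptions.
    """
    # Normalize so positive progress always means "moving toward the goal."
    progress = percent_change if goal_direction == "up" else -percent_change
    if progress > 1.0:
        return "green"   # clearly improving
    if progress < -1.0:
        return "red"     # clearly worsening
    return "yellow"      # roughly flat

print(scorecard_arrow(3.2, "up"))    # → green (e.g., graduation rate rising)
print(scorecard_arrow(2.5, "down"))  # → red (e.g., crime rate rising)
```

The real work, of course, is deciding the goal direction and thresholds for each indicator -- which is exactly the kind of judgment the site makes transparent.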
Under each area is a stated goal, a section on "Why is this important?", "How is Florida doing?", a scorecard with a set of indicators, a section that discusses what influences this area, a section called "What is the State's role?", and links to where the reader can go for more information.
Under each indicator is the measure label, measure title, department responsible for the data, data code, measure definition, graph of the data, goal direction, posting frequency, calculation type, raw data, measure owner (name and e-mail and phone contact information), and other little tidbits. It is as nice an example of metadata as I've ever seen -- not just where the data come from, but who to talk to if you have any questions. Very, very nice.
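For those who like to see structure made explicit, here's one hypothetical way to capture that kind of per-indicator metadata in code. The field names and example values are illustrative, not the site's actual schema:

```python
from dataclasses import dataclass

@dataclass
class IndicatorMetadata:
    """Illustrative metadata record for a single published indicator."""
    measure_title: str
    department: str           # who is responsible for the data
    definition: str
    goal_direction: str       # e.g. "higher is better"
    posting_frequency: str    # e.g. "annual"
    calculation_type: str     # e.g. "percentage"
    owner_name: str           # who to talk to with questions
    owner_email: str
    owner_phone: str

example = IndicatorMetadata(
    measure_title="High school graduation rate",
    department="Department of Education",
    definition="Percentage of students graduating within four years",
    goal_direction="higher is better",
    posting_frequency="annual",
    calculation_type="percentage",
    owner_name="Jane Analyst",
    owner_email="jane.analyst@example.gov",
    owner_phone="555-0100",
)
```

The point is less the code than the discipline: every indicator carries its definition, its update schedule, and a named human contact.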
Check it out!
Saturday, February 2, 2008
I am very excited about the new publication from Claudia J. Coulton titled Catalog of Administrative Data Sources for Neighborhood Indicators (PDF). This is not only a tremendous resource for neighborhood-level statistical data sources, but also a sign of the growing maturity of the neighborhood indicator movement.
The report begins with a discussion of recent developments in neighborhood indicators, and then launches into the reasons for using administrative data as a key source for very-local community indicators (one of the most important reasons being that the data are often available at the neighborhood level, which is not true of all data out there). The report then walks the reader through a series of issues in using administrative data for neighborhood indicators, including those of geographic boundaries, confidentiality, data accuracy, metadata, matched and longitudinal files, mobility, and commercial data products.
The bulk of the report is concentrated on 42 administrative data sources on topics ranging from economy and environment to health and public safety.
I highly recommend this report as a standard for anyone working with community indicators. Go download this report right now. For an understanding of why this information is so critical, check out Neighborhoods at the Tipping Point.
Not convinced yet? Here's a portion of the introduction to the Catalog of Administrative Data Sources for Neighborhood Indicators:
THERE IS A LONG TRADITION OF USING data collected for administrative purposes to produce social and economic indicators (Rossi 1972; Annie E. Casey Foundation 2005). Indicators are measures of the condition or status of populations or institutions that can be compared over time or between places and groups. In recent years, there has been growing interest in developing indicators for communities and neighborhoods that can be used to improve local conditions or support action by groups and organizations that work at that level.
Community indicators are employed by neighborhood associations, local governments, businesses, nonprofit agencies, researchers, youth groups, and other individuals and organizations. Indicators have been successfully used to identify problems, plan programs, stimulate action, advocate for change, target investments, evaluate initiatives, and otherwise inform the community about itself (Cowan and Kingsley forthcoming).
The data used to craft neighborhood indicators often come from administrative agencies. Administrative records are particularly useful for community indicators because they are timelier or can be applied to smaller areas than government surveys. Moreover, the application of geographic information system (GIS) technology to these records makes it feasible to calculate many indicators for small areas and to display them in useful ways. Many sources and types of data from administrative agencies can be used to produce measures useful to neighborhoods and communities.
This monograph describes these data sources because such information is not readily available in a comprehensive review elsewhere. Most databases described here are maintained by local agencies, but a few state and federal databases can also be used for small-area measures.
Check it out, and let me know your reaction to the Catalog of Administrative Data Sources for Neighborhood Indicators.
Hat tip: American Federation of State, County & Municipal Employees (AFSCME) Information Highway