Community Indicators for Your Community

Real, lasting community change is built around knowing where you are, where you want to be, and whether your efforts are making a difference. Indicators are a necessary ingredient for sustainable change. And the process of selecting community indicators -- who chooses, how they choose, what they choose -- is as important as the data you select.

This is an archive of thoughts I had about indicators and the community indicators movement. Some of the thinking is outdated, and many of the links may have broken over time.

Sunday, September 2, 2007

Data Driven Decision Making: Seven Tips

I was reading an interesting piece titled Seven Steps to Creating a Data Driven Decision Making Culture and thought there were some ideas that would apply to the field of community indicators. The article is focused on using web analytic tools to improve the bottom line, but the lessons about how to use information to make decisions are broadly applicable.

Number seven is Go for the Bottom Line (Outcomes). Community indicators efforts often are tempted to measure inputs -- per-pupil spending, for example, or number of police officers on the street -- which are easier to change but may not result in the outcomes we're looking for.

Number six is my favorite -- Reporting is not Analysis. I like the breakout on how data funnels down:

Data: petabytes
Reports: terabytes
Excel: gigabytes
PowerPoint: megabytes
Insights: bytes
One business decision based on actual data: Priceless

I also like this point:

Reporting = the art of finding 3 errors in a thousand rows
Analysis = the art of knowing 3 errors in a thousand are irrelevant

The article continues: Analysis in our world is hard to do -- data, data everywhere, and nary an insight anywhere. Reporting is going into your favorite tool and creating a bazillion reports in the hope that one of them will tell you, or your users, something that will spark action. That is rarely the case.
An additional challenge is that both reporting and analysis can take over your life; you will have to make an explicit choice about what you want to spend time on. Remember: if at the end of x hours of work your table / graph / report is not screaming out the action you need to take, then you are doing reporting and not analysis.

Number five is Depersonalize Decision Making. The stand-out comment to me was "HiPPOs rule the world" -- the Highest Paid Person's Opinion. When we deal with community change, you often find yourself with a BFD -- the Biggest Funder Decides. You can't have both a personality-led and a data-driven decision-making process -- which one decides?

Number four is Proactive Insights rather than Reactive. I know this is obvious, and I'm sure you probably feel like I do that the next person who says "we want to be proactive rather than reactive" needs to be shoved in the same box with the "let's try out-of-the-box thinking on this one" person. So I'm skipping past it.

Number three is Empower Your Analysts. Yes! And pay them better, too. 'Nuff said.

Number two is Solve for the Trinity. No, we're not getting all Dan Brown here. This is a model of problem-solving that looks incredibly useful. Data is only the What. Research adds the Why. Outcomes are the How Much. Here's the tip: Start with the How Much, evolve to the What, then strive for the Why (or the why not, if that is where you find yourself : ).

Number one is Got Process? Data driven decision making demands paying attention to process. Here's how the article puts it:

This is perhaps the single biggest difference between cultures that achieve the mythical status of being data driven and those that languish. Process helps create frameworks that people can understand, follow and, most importantly, repeat. Process Excellence (Six Sigma) can also help guide you and ensure that you are actually focusing on the Critical Few metrics, and help establish goals and control limits for your metrics so that it becomes that much easier to stay focused and execute successfully.

Processes don’t have to be complex, scary things. The picture shared was of a simple PowerPoint slide that, using a very visual flow, illustrated exactly what the process for executing an A/B or multivariate test was, end to end. It showed who was responsible for each step and what deliverables were expected. Very easy to do. But now not just you but everyone knows what to do. At the end of the day it is process that creates culture -- do you have structured processes in your company?

And that's why I thought an article about using clickstream data, presented at an eMetrics conference, had such applicability to the community indicators field.

Read the article, and then drop me a comment -- does this make sense? Or am I stretching things too far?

