Six common errors when using data: Part one

Published: 23 Apr 2018

With such a volume of information available to planners nowadays, it's essential to be aware of the pitfalls data can present.

Data is exerting an ever-stronger influence over the work of planners. Whether you’re putting a proposal to a committee or client, or presenting evidence to an appeal, the chances are you rely on information gathered from a variety of sources.

Research and evidence are the backbone of what planners do. But we live in the age of ‘big data’, with information coming at us from all angles and multiple sources. All this insight is useful, but how do we sort, interpret, and act on such a volume fairly and accurately?

It seems that planners will have to become more discerning in their analysis and more disciplined than ever – not least because plenty of others will be using the same data to support conclusions that suit their agenda.

One starting point is to know what not to do. Here are the first three of six common pitfalls to avoid when using data.

1. Treating all sources as equal
When consulting on a proposal, should you give the same weight to the personal anecdote of an individual, however important they are in their community, as you would to a survey of 50 residents? Should you value equally data gathered by a commercial business and information drawn from a peer-reviewed study? You can only answer that by knowing context – the who, what, why and how of the data and its origin. There’s room for all kinds of evidence when compiling a case, but not all of it is of equal value. Fairness in decision-making is contingent on recognising the relative quality and relevance of your sources.

2. Cognitive bias
We all have assumptions and preferences that influence our decision-making – and social scientists have identified a range of different types of cognitive bias, from ‘anchoring bias’ (reliance on the first piece of information) to ‘conservatism bias’ (reluctance to accept new evidence) and ‘confirmation bias’ (prioritising results that confirm one’s world view over those that do not). Planning is full of biases, many of which are passionately held and can cloud decision-making (arguments about green belt are a good example). Use strict, impartial methodologies to stick to what the data tells you, not what you want it to say.

3. Making exact forecasts from data
Planning is implicitly about anticipating scenarios based on past data and current challenges. Forecasting is an integral element of what planners do – but no matter how good your figures, any ‘static’ forecast is limited. There are unpredictable events; the model you use may be too rigid for the data; the decisions you make may actually change the forecast. Then there’s our natural tendency to favour ‘expert’ forecasts over the predictions of others – even though research has shown that government officials, academics, journalists et al perform worse than ‘dart-throwing monkeys’ (1) when predicting outcomes. How do you accommodate uncertainty within forecasts?
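One practical answer to that closing question is to report a range rather than a single number. The sketch below is purely illustrative (the function, the bootstrap approach and the housing figures are assumptions for this example, not from the article): it resamples past period-on-period changes many times to express next year's figure as an interval instead of a 'static' point forecast.

```python
import random

def interval_forecast(history, n_sims=10_000, seed=42):
    """Project the next value as a range, not a point.

    Resamples observed period-on-period changes (a simple
    bootstrap), so the spread of past data carries through
    into the forecast instead of being averaged away.
    """
    rng = random.Random(seed)
    # Differences between consecutive periods in the history.
    changes = [b - a for a, b in zip(history, history[1:])]
    # Simulate many possible next-period outcomes.
    outcomes = sorted(history[-1] + rng.choice(changes) for _ in range(n_sims))
    # Take the middle 90% of simulated outcomes as the interval.
    low = outcomes[int(0.05 * n_sims)]
    high = outcomes[int(0.95 * n_sims)]
    return low, high

# Illustrative figures: annual housing completions for six years.
history = [410, 445, 430, 470, 455, 500]
low, high = interval_forecast(history)
print(f"Next year: between {low} and {high} (90% of simulations)")
```

Presenting the result as "between X and Y" keeps the uncertainty visible to a committee or client, which a single headline figure conceals – though a bootstrap this simple still shares the limits described above: it cannot anticipate events with no precedent in the data.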

 

Data quality

The quality of outputs in planning is contingent on the quality of inputs (otherwise known as ‘garbage in, garbage out’).

Furthermore, we are bound to plan for the future based on past events and present needs. When using data to inform planning, there are inevitably unknowns.

Writing in Data Points: Visualization That Means Something (2), statistician Nathan Yau suggests that data should perhaps have its own “golden rule”: “Treat others’ data as you would want your data treated.”

He adds: “Data is an abstraction of real life and can be complicated, but if you gather enough context, you can at least put forth a solid effort to make sense of it.” 

The Alliance for Useful Evidence’s guide to using research evidence cites Professor Philip E Tetlock’s study of some 80,000 expert predictions, which found that the majority were wrong and that ‘dart-throwing monkeys’ were better at forecasting the future.


Six common errors when using data: Part two

