The Future Cities Catapult aims to improve policymaking – be it for better flood defences, banking or medicine

When London’s Soho was hit by a cholera outbreak in the summer of 1854, it took a maverick to pinpoint the cause and come up with a way to contain it.

Dr John Snow rejected the established belief that cholera was airborne, and set out to disprove the miasma - or “bad air” - theory by plotting cases on a map. His findings supported his own theory that the disease was in fact being spread through contaminated water, and he traced the outbreak back to a single water pump.

Snow’s story is one of the oldest examples of someone asking the right questions, analysing a dataset and uncovering new insights into previously stubborn challenges.

As such it is also one of the favourite tales told by Peter Madden, the man heading the government’s Future Cities Catapult. The centre was established just a stone’s throw from where Snow made his discovery to help modern-day innovators find ways to make smarter decisions about how to run cleaner and more efficient cities in the UK and beyond.

The hope is that in 2016, so-called “smart cities” will map and analyse all manner of statistics, from household waste volumes to social care needs, to better direct their services. For anyone who fancies trying their hand at this blending of big data and big decisions, Future Cities Catapult has created a game for the Big Bang Data exhibition, now running at London’s Somerset House until March.

....

In an era of spending cuts and pressing environmental problems, the potential rewards are huge, but reaping them is by no means simple. There are four obvious challenges.

....

Firstly, there is the sheer quantity of data. As the Big Bang exhibition’s title implies, there has been an information explosion in recent years, as we increasingly use our phones and computers to manage our day-to-day lives and leisure time. With every click, search and selfie, the mass of data expands. Experts predict a 4,300% increase in annual data generation by 2020.

The second challenge is the quality of data. In the ever-growing hoard of information, some bits are more useful than others. Some are more reliable. In the past, researchers and policymakers could navigate this patchy world by putting datasets into a sort of hierarchy, ranging from a dodgy sponsored survey at the bottom up to official statistics at the top.

....

The government has asked Charlie Bean, the former deputy governor of the Bank of England, to lead a review of the country’s economic statistics, and he reports back in March. His interim report, published last month, found “statistics have failed to keep pace with the impact of digital technology” and called for the Office for National Statistics to be more “proactive” in producing figures.

Bean’s interim report brings us to accessibility, the third challenge for those wanting to harness the economic power of big data.

....

The fourth challenge is the toughest to tackle, but also the most important - working out how best to use the growing mine of available data. When Snow found a pattern on his map of cholera cases, he already knew what he was looking for. He had a theory.

Exploiting big data will take investment in training and a serious dose of open-mindedness. It will also require some unlikely collaborations - pairings such as that between a biomedical research institution and the number cruncher who led a data revolution in the world of baseball.