From: The Economist

by K.N.C.

THE idea that representative governments ought to make their data open to the public for reuse is by now well established. But what about the processing power of the state? Less consideration has been given to what role and responsibility the state has in crunching its own numbers. A report released this week by the British think-tank Policy Exchange, sponsored by the digital-storage company EMC, takes a few steps towards an answer.

First, it calls for the British government to set up an analytics unit in the Cabinet Office to undertake “big data” projects across government departments. “Fully capturing the big data opportunity to drive up efficiency and cut out waste in the UK public sector could be worth a total of between £16 billion and £33 billion a year,” Policy Exchange estimates (see chart, below).

[Chart: Policy Exchange’s estimate of the annual value of big data to the UK public sector, £16 billion-£33 billion]

Second, the report urges the government to adopt a “Code of Responsible Analytics” to ensure “the highest ethical standards.” This is absolutely the right idea, but an utterly mealy-mouthed and impotent way to achieve it. Big data in the hands of the state could be not only a boon to public services but also a nightmare for individual liberty and privacy. After all, when the Nazis jackbooted into the Netherlands, they rounded up Jews simply by snatching the country’s famously comprehensive census data. Rather than a “code” or other empty rhetorical flourishes, what is probably needed is law: meaningful checks and balances, transparency of process, independent judicial and political review, and muscular penalties for violations.

Examples of the ways that big data can transform government appear in a terrific article by Alex Howard of O’Reilly Media on New York City’s analytics programme. In an interview, Michael Flowers, the director of analytics for the mayor’s Office of Policy and Strategic Planning, shed light on a number of initiatives. One was a programme to crack down on overcrowded residences. The city gets more than 20,000 complaints a year. But how to prioritise them to focus on the most severe cases?

Mr Flowers took a file of all 900,000 buildings in the city and cross-tabulated it with data from numerous agencies: whether property taxes were in arrears, whether foreclosure proceedings were under way, the buildings’ ages, and the like. He then compared this against five years of historical fire data, coded for severity. The resulting correlations revealed the most problematic cases, such as fire hazards. Before, inspectors would find high-risk conditions in 13% of cases; now they find them more than 70% of the time, a more than fivefold increase in efficiency. This has even led to a decrease in firefighters’ injuries, since such buildings were 15 to 17 times more likely to result in a firefighter being injured or killed, Mr Flowers told O’Reilly Media.
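For a sense of how such a cross-tabulation might work in practice, here is a minimal sketch in Python using the pandas library. The file names, column names and scoring weights are hypothetical illustrations, not the city’s actual model; a real system would derive its weights from the historical fire data rather than hard-coding them.

```python
import pandas as pd

# Hypothetical inputs (illustrative only): a register of buildings,
# per-agency flags, and five years of fire incidents coded for severity.
buildings = pd.read_csv("buildings.csv")        # building_id, age
tax = pd.read_csv("tax_arrears.csv")            # building_id, in_arrears (0/1)
foreclosures = pd.read_csv("foreclosures.csv")  # building_id, in_foreclosure (0/1)
fires = pd.read_csv("fire_history.csv")         # building_id, severity (0-5)

# Cross-tabulate: one row per building, with flags joined in from each agency.
merged = (buildings
          .merge(tax, on="building_id", how="left")
          .merge(foreclosures, on="building_id", how="left")
          .fillna({"in_arrears": 0, "in_foreclosure": 0}))

# Summarise historical fire severity per building.
fire_risk = (fires.groupby("building_id", as_index=False)["severity"]
             .mean()
             .rename(columns={"severity": "past_severity"}))
merged = merged.merge(fire_risk, on="building_id", how="left")
merged["past_severity"] = merged["past_severity"].fillna(0)

# A crude illustrative risk score: weight the factors that correlate with
# severe outcomes. The weights here are made up for the example.
merged["risk_score"] = (2.0 * merged["in_arrears"]
                        + 1.5 * merged["in_foreclosure"]
                        + 1.0 * merged["past_severity"]
                        + 0.01 * merged["age"])

# Send inspectors to the highest-scoring buildings first.
priority = merged.sort_values("risk_score", ascending=False)
print(priority[["building_id", "risk_score"]].head(20))
```

The point is the structure rather than the particulars: join disparate agency records on a common building identifier, fold in historical outcomes, and rank the complaints by a single score so that scarce inspectors see the worst cases first.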

The O’Reilly story is silent on the issue of the potential misuse of big data by government authorities. But perhaps not for long: the firm’s publishing arm is poised to release a book in August called “The Ethics of Big Data: Balancing Risk and Innovation” by Kord Davis and Doug Patterson.