Treasury may mine sentiment data

From: ITWire

By Beverley Head

According to Treasury chief information officer Peter Alexander, a trial of mining sentiment data is on the cards.

Speaking at Cebit’s big data conference in Sydney today, Mr Alexander said it was important that organisations distinguish between the “biggering” of data – traditional analysis of ever larger collections of data – and true big data analysis, which involves analysing a much broader range of often unstructured, external data sources.

He said that for an organisation like Treasury, whose main objective was forecasting in order to inform policy making, it might be possible to get better trending information by bringing real time retail, crime or traffic statistics into the mix. “You start getting better trending based on real time data,” said Mr Alexander.

He said that at present Treasury models were largely based on historical data, and that it might be possible to use sentiment analysis to get better lead indicators to shape policy direction.

He offered the example of standard of living data: “If we look at sentiment…if the structured data is telling us one thing and the sentiment is telling us something else, we could get a nice lead indicator.”

Mr Alexander said that he did not want to develop an internal search engine, or attempt to index social media himself, but instead believed that services would be developed that could plug in via an API, such as the CSIRO’s sentiment analysis tools.
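The “lead indicator” idea can be sketched in a few lines: when a structured data series and a sentiment series trend in opposite directions, flag the divergence for an analyst. This is purely illustrative – the function names, the sample figures and the crude trend rule are assumptions, not Treasury’s or CSIRO’s method.

```python
# Hypothetical sketch of the divergence-as-lead-indicator idea.
# All names, figures and the trend rule are illustrative assumptions.

def trend(series):
    """Crude trend: sign of the change from first to last value."""
    change = series[-1] - series[0]
    return (change > 0) - (change < 0)  # 1 up, -1 down, 0 flat

def divergence_flag(structured, sentiment):
    """True when the two series trend in opposite directions."""
    return trend(structured) * trend(sentiment) < 0

# Structured data trending up while social-media sentiment falls:
retail_index = [100.0, 100.4, 100.9, 101.3]
sentiment_score = [0.20, 0.10, -0.05, -0.15]

print(divergence_flag(retail_index, sentiment_score))  # True: worth a look
```

In practice the sentiment series would come from an external service plugged in via an API, rather than being computed in-house.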

CSIRO’s Vizie service has already been trialled by the Department of Human Services to see how its insights could be used to provide more targeted information to people. Mr Alexander acknowledged that it was important to pay careful attention to privacy concerns when mining big data for value.

He also said that there could be value in exploring the potential role of national and international data sets in terms of their ability to inform better policy decisions.

Mr Alexander, however, pointed to the multiplicity of data formats that existed even within the Australian public service, which could make inter-agency data sharing challenging. Some agencies, he said, even published data as PDFs.

“Most of the roadblocks are not people so much as formats,” he noted. He said that even the Australian Bureau of Statistics used a broad range of different data formats, although there was a plan to migrate to a single data standard in the future.

While the CSIRO sentiment analysis service and inter-agency data sharing may be some way off full exploitation, Mr Alexander said that Treasury had made progress with regard to its analysis of “biggering” data.

He said that Project Odysseus (so called because it brought to an end a “ten year Greek tragedy” of attempts to update data analysis in the department), which was completed last year, was still being rolled out across the organisation.

Mr Alexander said that until Odysseus, Treasury “had been a bit stuck in the ’90s”, with analysts using Excel spreadsheets for most data analysis. “The record was one with 140,000 links to other tables,” he said.

Under Project Odysseus, Treasury data had been loaded into a data warehouse and checked, and an “Excel style” front end had been provided to Treasury analysts to allow them to work on the data. He said that some reports which had taken two people a week to generate were now available at the push of a button.
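The article does not name Treasury’s warehouse platform, so the pattern can only be sketched under assumptions: here SQLite stands in for the warehouse, and the table and figures are invented. The point is the workflow – data checked into a central store, with a report produced by a single query rather than chains of linked spreadsheets.

```python
# Illustrative only: SQLite stands in for an unnamed warehouse, and the
# table name, columns and figures are invented for the sketch.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE receipts (quarter TEXT, category TEXT, amount REAL)")
rows = [
    ("2013Q1", "income_tax", 55.2),
    ("2013Q1", "gst", 12.1),
    ("2013Q2", "income_tax", 57.0),
    ("2013Q2", "gst", 12.6),
]
conn.executemany("INSERT INTO receipts VALUES (?, ?, ?)", rows)

# The "push of a button": one query replaces manually linked spreadsheets.
report = conn.execute(
    "SELECT quarter, ROUND(SUM(amount), 1) FROM receipts "
    "GROUP BY quarter ORDER BY quarter"
).fetchall()
print(report)  # [('2013Q1', 67.3), ('2013Q2', 69.6)]
```

A 140,000-link spreadsheet encodes the same joins and sums implicitly; moving them into queries over a checked central store is what makes the report repeatable on demand.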
