EXPERTS in data journalism have spoken about the importance of the discipline at the National Council for the Training of Journalists’ annual Journalism Skills Conference being held at Harlow College.
Data journalism refers to the growing use of numerical data in reporting and distributing news in the digital era, and reflects the interaction between journalism and fields such as computer science and statistics.
David Ottewell, head of data journalism for Reach, believes the skill should be learned by all journalists, not only specialists.
He said: “Without these skills you are effectively doing journalism with one hand tied behind your back.”
Pete Sherlock, head of the Shared Data Unit at the BBC, lamented the amount of publicly available data that remains unused.
He said: “There are thousands of data sets sitting untouched. We should be analysing these to hold the authorities to account.”
Mr Sherlock explained that there are 36,000 datasets on data.gov.uk, 93% of which have been opened fewer than ten times.
Leila Haddou, data journalism editor at The Times, gave an insight into how she tackles the large datasets she is often confronted with.
She said: “The first step is to understand what I’m looking at, who collected it, and why?
“If I have a list of 10,000 names, I think who would we want to write about if they were on this list, and then look for that name.
“What I love about data journalism is it allows you to be creative.”
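The lookup Ms Haddou describes — starting from a long list of names and checking it for people worth writing about — can be sketched in a few lines of Python. The filename, column name, and target names below are hypothetical, purely for illustration; this is a minimal sketch of the approach, not her actual workflow or tooling.

```python
import csv

def find_names_of_interest(dataset_path, names_of_interest):
    """Return rows whose 'name' column matches someone we might write about.

    names_of_interest should be a set of lowercase names, so matching is
    case-insensitive and each lookup is O(1).
    """
    matches = []
    with open(dataset_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if row.get("name", "").strip().lower() in names_of_interest:
                matches.append(row)
    return matches

# Hypothetical usage: a leaked list of 10,000 names, and the people
# a newsroom would want to check for.
targets = {"jane doe", "john smith"}
# rows = find_names_of_interest("leaked_list.csv", targets)
```

The point, as in the quote, is that the first question is not technical at all — what is this data, who collected it, and why — and only then does the mechanical search for notable names begin.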
Bev Holder, chief reporter at the Stourbridge News, who has covered the same patch for the past 20 years, has enjoyed the contrast between data journalism and the day-to-day rush of news reporting.
She said: “Data journalism is about slowing down, stepping back, and looking at what you’ve got.”
Martin Stabe, head of interactive news at the Financial Times, warned about the dangers of bending data to fit a story.
He said: “You should ask your data questions to undermine your story and to test the dataset.”