Dark Net, Panama Papers, Surveillance: Data Journalism 2016

June 29, 2016 • Digital News, Short stories, Specialist Journalism


Highlights of the Nordic Data Journalism Conference, Helsinki, 2016

Many British local journalists are dangerously ignorant of contemporary counter-surveillance methods and routines, initial findings from a Birmingham City University study suggest.

The research into the information security skills of UK journalists reveals that most interviewees (88%) were unaware of what, if anything, their employer provides in terms of information security. Yet many interviewees trusted that some security was in place in their workplace to protect them and their sources. Over one-fifth of interviewees even admitted to ignoring one of the most basic countermeasures: using different passwords for different accounts and services.

Paul Bradshaw, who founded the online journalism masters degree at Birmingham City University’s School of Media, interviewed journalists working in regional UK newspapers for the study. Many of them felt their positions were too lowly to require counter-surveillance measures. Others simply thought it was their sources’ task to look after themselves. Bradshaw, who presented his findings at the recent Nordic Data Journalism Conference in Helsinki, said he also encountered a defeatist argument: many appeared resigned to the view that the authorities can breach any security defences, making resistance futile.

The Nordic Data Journalism Conference (NODA2016) began with an academic pre-conference, featuring a total of 17 presentations. Here are some highlights of the conference.

The biggest leak in history

The conference hall was packed with journalists and academics when Bastian Obermayer, a German reporter with the investigative unit of Süddeutsche Zeitung, the Munich-based newspaper, discussed the biggest data leak in history – the Panama Papers. It was the first time he had spoken publicly about the leak.

Last year Obermayer received around 2.6 terabytes of documents from an anonymous source. Analysing such a gigantic amount of information required a year’s work from many people, including journalists and data experts. Obermayer’s team also had to learn new skills to investigate the database. The newspaper even had to buy new high-performance computers to handle the data, Obermayer explained.

Together with colleagues, Obermayer analysed the data in cooperation with the International Consortium of Investigative Journalists (ICIJ). Several government authorities have demanded that the documents be handed over, but the ICIJ has refused to comply in order to protect the source. According to the ICIJ website, the source behind the Panama Papers has recently offered to make the documents available to government authorities as well.

Obermayer explained that the analysis and reporting based on the Panama Papers are still ongoing, as new topics for stories continue to be discovered.

The dark side of the internet

A large share of the world’s population lives under internet censorship, with access to many websites restricted. In addition to censorship, we are all subject to surveillance, and information is collected from us in many ways, often with our consent. Juha Nurmi, a Fellow at Hermes, the Centre for Transparency and Digital Human Rights, introduced ways around censorship and surveillance. His conference presentation concerned the dark net and Tor technology.

The dark web is often portrayed as synonymous with illegal activities, such as drug and weapon trafficking. Nurmi advocated the benefits that anonymity provides: privacy from internet giants gathering information; enabling data leaks, as in the cases of Edward Snowden and the Panama Papers; and enabling free speech in countries where it is otherwise not possible.

There are vulnerabilities in every computer network, but the Tor network has so far resisted compromise by government agencies. Nurmi also demonstrated how drug trafficking operates over the Tor network. He has gathered statistics on drug trading via the dark net and is currently producing research based on them.

Nurmi’s presentation can be found here.

Journalists need to be data scientists

Reijo Sund of the University of Helsinki encouraged journalists to think like data scientists. To be able to do this journalists need basic statistics skills and a good understanding of the nature of data.

The problem with open data (from governments, companies and so on) is that it is always secondary in nature. It differs from primary data, collected by the researchers themselves, because someone else has defined the collection methods and what to publish. As a consequence, the journalist cannot control the shape of the data.

Data requires a lot of pre-processing before the actual analysis and its use in a story. This involves several phases: cleaning, reduction and abstraction. Pre-processing is often the most time-consuming phase in a data project, Sund said.
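As a rough illustration of those three phases, here is a minimal Python sketch. The table, field names and values are invented for illustration; the point is only to show how cleaning, reduction and abstraction each transform the data before any analysis happens.

```python
# Hypothetical rows as they might arrive in a messy open-data export:
# inconsistent whitespace, an unparseable value and an exact duplicate.
raw_rows = [
    {"municipality": "Helsinki ", "spending_eur": "1200.50"},
    {"municipality": "Espoo",     "spending_eur": "n/a"},      # unusable value
    {"municipality": "Vantaa",    "spending_eur": "980.00"},
    {"municipality": "Helsinki ", "spending_eur": "1200.50"},  # duplicate row
]

# 1. Cleaning: normalise text fields, parse numbers, drop unusable rows.
cleaned = []
for row in raw_rows:
    name = row["municipality"].strip()
    try:
        spending = float(row["spending_eur"])
    except ValueError:
        continue  # drop rows whose spending field cannot be parsed
    cleaned.append({"municipality": name, "spending_eur": spending})

# 2. Reduction: remove exact duplicates so each observation counts once.
seen, reduced = set(), []
for row in cleaned:
    key = (row["municipality"], row["spending_eur"])
    if key not in seen:
        seen.add(key)
        reduced.append(row)

# 3. Abstraction: summarise the table into a figure a story can actually use.
average = sum(r["spending_eur"] for r in reduced) / len(reduced)
print(f"{len(raw_rows)} raw rows -> {len(reduced)} usable rows")
print(f"average spending: {average:.2f} EUR")
```

Even in this toy case, most of the code is pre-processing and only one line is analysis, which mirrors Sund's point about where the time goes.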

Sund also warned of hazards during data projects, which face data journalists and data scientists alike. Effective analysis is one: how do you analyse massive amounts of data when manual inspection is unfeasible? Another is the journalist’s own preconceptions: how do you avoid fishing for the things you want to find? These are all questions worth considering before starting a data journalism project.

In summary, the conference was a fascinating take on different aspects of data journalism, from both practical and academic viewpoints. The next NODA conference will be held in Odense, Denmark, on 27-28 January 2017.

Full videos of all academic presentations and selected written summaries are available on the Journalism Research News website.


Pic credit: Mike Tegas, CC Flickr


