Julius Reimer and colleagues from the Hans-Bredow-Institut were chosen to present their research on "Opening the Black Box: Investigating the Algorithmization of Journalism" at the 2018 annual conference of the International Communication Association (ICA) in Prague, in the panel for excellent German communication research organised by the German Communication Association (DGPuK). To showcase German communication studies to the international community at its best, the DGPuK selects particularly compelling research projects for presentation at the ICA.
As an association member, the DGPuK has a free time slot at its disposal at the ICA's annual conference, which it can fill with a panel of its own that is not subject to the ICA's general review process. The aim of this slot is "to represent excellent communication research from the ranks of DGPuK members internationally and to document a self-contained topic area through substantial, new research achievements".
For this panel, the DGPuK conducted a peer-review process that compared complete panel proposals from its members and used the results to determine the association's programme contribution. The proposal by Julius Reimer and his colleagues was selected and judged highly relevant.
The panel will bring together different perspectives on the topic in order to approach the algorithmization of journalism as holistically as possible:
- Stephan Dreyer, Amélie Pia Heldt and Wiebke Loosen propose, in their conceptual-analytical contribution, a typology of the different forms of algorithmization in journalism and, on that basis, analyse the consequences that could arise for the legal protection of journalism;
- The study by Julius Reimer, Folker Hanusch (University of Vienna) and Edson Tandoc, Jr. (Nanyang Technological University Singapore) examines how algorithmically produced metrics influence journalists' work compared with user comments and other "qualitative" feedback;
- Sascha Hölig and Lisa Merten address the audience's perspective and present their research on users' preferences and concerns regarding automatically selected news;
- Carl-Gustav Lindén and Hannu Toivonen (both University of Helsinki) illustrate the potential of automatically generated journalism in terms of personalisation, transparency and coverage of neglected topics, based on their development of a news bot for election reports.
In a response to the four presentations, the sociologist Elena Esposito (Universities of Bielefeld and Modena-Reggio Emilia) will show the connections to the algorithmization of society in general and its social consequences.
Opening the Black Box: Investigating the Algorithmization of Journalism
Panel Organizer: Julius Reimer
Chair: Julius Reimer
Respondent: Elena Esposito, University of Bielefeld & University of Modena-Reggio Emilia
1. Four Forms of Algorithmized Journalism and Regulatory Implications
Stephan Dreyer, Amélie Heldt & Wiebke Loosen, all Hans Bredow Institute for Media Research, Hamburg
2. The Influence of Algorithmic vs Qualitative Audience Feedback on Journalists’ Work
Julius Reimer, Hans Bredow Institute for Media Research, Hamburg,
Folker Hanusch, University of Vienna,
Edson Tandoc Jr., Nanyang Technological University Singapore
3. User Preferences and Concerns Regarding Editor vs Algorithm Based News Selection
Sascha Hoelig & Lisa Merten, both Hans Bredow Institute for Media Research, Hamburg
4. Opening the Black Box: Demystifying News Automation
Carl-Gustav Lindén & Hannu Toivonen, both University of Helsinki
In an increasingly digitized and networked world, algorithms are progressively employed to process and interpret the vast and diverse amounts of data that are perpetually produced in various public and private domains of life. In doing so, these algorithms prepare decisions by human and institutional actors or even make them themselves, shaping social practices and reality. These developments, however, do not occur at the same speed or in similar ways in different social domains. We consider the field of journalism to be the ideal case study for expanding our understanding of algorithmization, because journalism is both deeply concerned with covering the consequences of algorithmization and, at the same time, itself profoundly affected by them. We can observe an escalating implementation of computational processes at all stages of (often new forms of) journalistic production and reception: from the research and use of (big) data for stories (data journalism), through automatically produced content (automated journalism) and audience measurement (metrics), to the distribution and (personalized) selection and presentation of news items for the audience (social media, search engines, etc.). As a consequence, algorithmization alters the foundations of public communication: the actors, practices, and norms involved and, not least, the agenda of controversial topics as well as the media content that is the basis of the social discourse about them.
To offer a holistic interpretation of news algorithmization, this panel brings together research by DGPuK-members and their international partners, combining a variety of perspectives united by their focus on the advancing algorithmization of journalism and its societal consequences. By that, it gives a prime example of the topicality, societal relevance, high quality and international compatibility of German communication research.
At the micro-level, two empirical studies look at journalists and audience members, respectively, applying actor-theoretic, repertoire-oriented approaches that account for the complexity and contingency of the modern multi-optional media environment. At the meso-level, journalism studies and computer science are combined to investigate a system for automated news production, revealing its social constructedness and demonstrating ways to make algorithms like these more transparent. These three talks are embedded in two theoretical-analytical contributions that assume macro-level perspectives: another talk that analyses how algorithmization as a socio-technological development challenges how journalism is understood by and structurally coupled with the legal system, and a response from a sociologist researching algorithmization, who will reflect on the four presentations and relate them to algorithmization’s implications for society in general.
The panel brings together research from multiple perspectives to clarify what algorithmization in journalism is and entails: a typology of its different forms and analysis of how they might challenge the legal protection of journalism, comparative research into how algorithmically produced metrics influence journalists’ work, an investigation into users’ preferences and concerns regarding algorithmically selected news, and the study of a news bot provide a holistic account of this controversial phenomenon and its societal consequences.
Four Forms of Algorithmized Journalism and Regulatory Implications
Due to journalism’s vital role for democratic societies, human rights frameworks grant thorough legal protection to journalistic activities. However, advancing algorithmization challenges the very foundations of these legal frameworks, raising questions about what counts as journalism and about ‘journalism-like’ functional equivalents provided by non-journalistic services. This interdisciplinary contribution brings together the perspectives of legal and journalism studies to identify areas in journalism where regulatory caution or even regulatory consequences might be necessary. The analysis is based on a ‘typology of four forms of datafied journalism’ (data, algorithmed, automated, and metrics-driven journalism). For each of these forms we will discuss whether normative obligations (implicitly) assume a “human in the loop” in news selection, production, curation and (personalized) distribution, where the scope of protection of media freedoms ends in the age of algorithmic news, and what practical requirements result from these insights for algorithm-based systems in journalism. (143 words)
The Influence of Algorithmic vs Qualitative Audience Feedback on Journalists’ Work
Audience metrics, i.e. aggregated data on users’ behaviour on news sites and social media, are probably the most prevalent aspect of the advancing algorithmization in newsrooms and have already been found to affect journalistic work (e.g., Cherubini/Nielsen, 2016; Tandoc, 2014). Research has, however, mostly looked at metrics as an isolated factor. Approaching metrics as only one of several sources of knowledge about the audience (including circulation figures, TV/radio ratings, audience mail, user comments), we investigate how journalists integrate this ambivalent, sometimes contradictory information. The comparative findings from online surveys (n=358; n=222) and interviews (n=21; n=34) among Australian and German journalists show that metrics help form a highly differentiated audience image that guides editorial decisions. However, their impact remains limited in comparison to qualitative data such as user comments and social media interactions. Implications for journalism and potential future developments are discussed. (141 words)
User Preferences and Concerns Regarding Editor vs Algorithm-Based News Selection
People usually get news from a mix of different sources, including traditional media with stories selected by journalists and social media or news aggregators where the selection is based on algorithms (Newman et al., 2016, 2017). Assuming the audience’s perspective, we present unpublished data from the Reuters Institute Digital News Survey 2016 on users’ news selection preferences in 26 countries. Findings suggest users favor combining expert selections for a general news overview with algorithmic choices that address more individual interests; users who prefer algorithmic selections display a particular sociodemographic profile. The more users prefer algorithmic news selection, the more they worry about missing important information or challenging viewpoints, and the greater their data privacy concerns. We deepen the quantitative results with data from six focus groups and eighteen interviews with German users, which offer explanations for differences in preferences, interests, concerns and algorithmic awareness. (147 words)
Inside the News Bot: Demystifying Automated Journalism
The automated production of news stories can be considered the culmination of algorithmization in journalism, with the algorithms involved often portrayed as ‘opaque’. This talk combines journalism studies and computer science to open this ‘black box’. Through a case-study – the building of a news bot that automatically generates reports about municipal elections in Finland from official data – it reveals how news automation systems are socially constructed: although the bot autonomously decides what to report and how, its selections are based on traditional news values and story-formats ‘built into’ its code. The bot showcases automation’s potential for personalization, as readers used it to provide them with highly individualized news in terms of geographical area, party and candidate of interest; its value for ‘translating’ difficult-to-access data to formats understandable for the public; and its general capability of being transparent rather than opaque, since the system allows users to retrace its decisions.