71st Annual Conference of the Deutsche Gesellschaft für Publizistik- und Kommunikationswissenschaft (DGPuK)
18 to 20 March 2026, Dortmund
Conference Program
P 11: #Demokratie: Propaganda and Disinformation - Democracies Under Pressure
Presentations
Propaganda and Disinformation: Democracies Under Pressure

Today's unprecedented global connectivity and technological advancement have brought a range of changes and challenges for democracies. Chief among them is the shift of political discourse onto social media platforms and other digital media. Although the growing use of digital media offers opportunities for democracies, such as broader political participation, it also fuels polarization and populism and weakens trust in governments, politics, and the media (Lorenz-Spreen et al., 2023). Democratic societies across the globe are thus confronted with mounting threats that erode the foundations of governmental institutions and democratic processes.

Among these threats is the spread of disinformation, which autocratic regimes have systematically used as a propaganda tool to sustain stability and prevent pro-democratic forces from emerging (Del Real & Menjívar, 2024). Because the spread of disinformation increases the likelihood of societal polarization in democracies, its targeted use also presents an opportunity for autocrats to seize and maintain power (Sato & Wiebrecht, 2024). These vulnerabilities are exploited not only by domestic autocratic forces but also by foreign powers seeking to influence electoral outcomes or destabilize political systems, as observed in the disinformation campaigns of Russian intelligence operatives during the 2016 US presidential election (U.S. Senate Select Committee on Intelligence, 2020). Potential hazards of such campaigns include the devaluation of professional journalism and scientific sources, the normalization of moral vilification, and the unjustified injection of fake and foreign identities into political discourse (McKay & Tenove, 2021).
In addition to these existing issues, AI-related developments (e.g., advances in and widespread access to text, image, and video generation) will affect both the quality and the ease of distributing disinformation, as well as related countermeasures (López-Borrull, 2025). This impact must be investigated further in order to realistically assess the potential repercussions of rapidly evolving AI-based tools.

This panel comprises four contributions that examine precisely this topic of propaganda and disinformation in the political discourses of today's democratic societies. The contributions set different foci, examining empirical data as well as presenting a dataset for further research in the field. The first contribution aims to support automated efforts to measure and study disinformation and stigma in vaccine-related discussions by providing a new, manually annotated and synthetically supplemented dataset. The second contribution reviews social media warfare by state actors and proposes a conceptual framework for it. The third contribution focuses on disinformation in electoral contexts and on how generative AI might influence its production and distribution. Finally, the fourth contribution offers a detailed comparative, mixed-methods examination of what happens after trust declines, spending on local media shrinks, and news deserts emerge: so-called "pink slime", a form of pseudojournalistic propaganda content that has been observed at the local level in Germany and Switzerland.

Contributions to the Symposium

A Dataset for Detecting Vaccine Positions and Vaccine-Related Stigma on Social Media

Vaccines are a proven method of disease eradication and control (National Health Service, 2025).
Yet vaccine hesitancy causes 1.5 million preventable deaths annually and remains a major hindrance to achieving herd immunity (World Health Organization, 2020). The associated vaccine-related stigma risks alienating vaccine-hesitant groups and further reducing global uptake (Mendonça & Hilário, 2023). Amid increasing polarisation and misinformation disseminated online, understanding and addressing vaccine-related stigma has become crucial to maximising public health outcomes (van Wees & Ström, 2024). In response to the need for nuanced analyses of online vaccine discourse, we designed a blended dataset of annotated English-language social media samples for testing, training, and fine-tuning large language models to classify vaccine positions and detect related stigma. 2,174 English-language tweets were manually annotated for pro-vaccine, anti-vaccine, and vaccine-hesitant positions, as well as for stigmatising content. From this base dataset, we generated 2,721 synthetic paraphrases and annotated them for consistency with the original data. BERT-based models trained on the blended dataset showed improved performance in both binary and multi-class classification tasks. Ultimately, this unique resource supports ongoing research into vaccine-related stigma, enabling efforts to identify and analyse stigmatising language and ultimately to promote vaccine acceptance.

Democracy Dies in Darkness: A Conceptual Exploration of Social Media Warfare

Information warfare and influence operations (IWIO) have emerged as critical concerns in international security studies, particularly following high-profile incidents such as Russian interference in the 2016 U.S. presidential election. This essay provides a comprehensive review of the evolving literature on IWIO, situating it within the broader field of security studies and offering a conceptual framework for understanding its key components.
It distinguishes between strategic-level information warfare and the tactical deployment of influence operations, including misinformation, disinformation, malinformation, and propaganda, especially as executed through social media platforms. Through brief case analyses of X (formerly Twitter), Facebook, and TikTok, the essay explores how state actors such as Russia, China, and Iran have weaponized digital platforms to disrupt democratic processes, polarize societies, and undermine institutional trust. The study concludes by identifying gaps in current research and proposing directions for future inquiry, emphasizing the need for theoretical development and robust empirical datasets to advance the study of IWIO in international relations.

From Traditional to AI-Generated Disinformation: Dynamics and Implications in Electoral Contexts

In recent years, scholarly attention to disinformation has grown, with a strong focus on elections in different countries (e.g., Das & Schroeder, 2021; Starbird et al., 2023; Zimmermann & Kohring, 2020). Research addresses sources and channels of disinformation (Benaissa Pedriza, 2021), drivers and consequences of belief in it (Vaccari et al., 2024), as well as legal (Marsden et al., 2020) and technical (Pherson et al., 2021) countermeasures. Since the rise of generative AI, scholars have increasingly asked whether it exacerbates disinformation and its impact on elections. It is argued that AI-generated content differs from traditional forms in that it can be created faster, in larger quantities, in personalized ways, and with little expertise, while also being harder to detect (Feuerriegel et al., 2023). Others caution that its power is overstated, pointing to the limits of mass persuasion and the importance of individual factors in voting decisions (Simon & Altay, 2025).
This literature review takes a step back and (i) provides a systematic overview of the forms of electoral disinformation, its distribution channels, and the actors involved, (ii) analyzes how generative AI may affect its production and spread, and (iii) discusses potential consequences.

Pink Slime – A Potential Threat to Democracies

The phenomenon of 'pink slime' is a recent one, and little research has been conducted on it to date (e.g., initial work by Aljebreen et al., 2024; Lepird, 2024; Shahriar, 2025). 'Pink slime' describes news content that appears to be local but is in fact of poor quality, not necessarily representative of the local area, and often strongly influenced by ideology (e.g., Cohen, 2015; Kennedy, 2012); it can be considered propaganda rather than legitimate news. The emergence of such outlets masquerading as local news sources poses a threat to democratic integrity. This pioneering study for Germany and Switzerland therefore aims to examine the phenomenon, to propose a definition covering both printed and online pink slime content, and to expand the existing body of knowledge on Swiss and German pink slime. To this end, we conducted 11 guided expert interviews in addition to a content analysis of German and Swiss pink slime outlets. The hypothesis that advanced pink slime content is more prevalent in Germany than in Switzerland must be rejected: the evidence suggests that the situation is similar in both countries, albeit with different ideological emphases. Beyond our empirical results, at the DGPuK conference 2026 we would also like to present a series of recommendations aimed at preventing the local media sector from being overrun by pink slime content.
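The dataset-blending step described in the first contribution (manually annotated tweets supplemented with label-consistent synthetic paraphrases) can be sketched as follows. This is a minimal, hypothetical illustration, not the study's actual code: all function names, variable names, and example texts are assumptions.

```python
# Hypothetical sketch of blending manual annotations with synthetic
# paraphrases that inherit the labels of their source tweets.
import random

LABELS = ("pro-vaccine", "anti-vaccine", "vaccine-hesitant")

def blend_dataset(annotated, paraphrases):
    """Merge manual annotations with label-consistent synthetic samples.

    annotated:   list of (text, position_label, is_stigmatising) tuples
    paraphrases: dict mapping an original text to its paraphrased variants
    """
    blended = list(annotated)
    for text, label, stigma in annotated:
        for para in paraphrases.get(text, []):
            # Each synthetic sample inherits the annotation of its source
            # tweet; in the study these were additionally checked manually
            # for consistency with the original data.
            blended.append((para, label, stigma))
    random.shuffle(blended)
    return blended

annotated = [
    ("Vaccines saved my family.", "pro-vaccine", False),
    ("I'm not sure the new shot is safe.", "vaccine-hesitant", False),
]
paraphrases = {
    "Vaccines saved my family.": ["My family was saved by vaccines."],
}
blended = blend_dataset(annotated, paraphrases)
print(len(blended))  # 3
```

The blended list could then be split into train/test portions and fed to any sequence classifier (e.g., a BERT-based model, as in the abstract) for binary stigma detection or multi-class position classification.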
