Conference Agenda

Overview and details of the sessions of this conference. Select a date or location to show only the sessions on that day or at that location. Select a single session for a detailed view (with abstracts and downloads, if available).

Session Overview
Session
(Papers) Data II
Time:
Friday, 27/June/2025:
3:35pm - 4:50pm

Session Chair: Sage Cammers-Goodwin
Location: Auditorium 5


Presentations

Epistemology of ignorance and datafication – To interrogate the necessity for secrecy in AI through marginalised groups’ experiences

Marilou Niedda

Utrecht University, The Netherlands

This paper presentation seeks to articulate ignorance as an epistemic concept in the context of datafication within AI systems. To delve into this topic, my argument proceeds as follows: (i) algorithmic biases are prevalent in the application of AI technologies, where “erroneous” calculations are produced in machine learning algorithms, and result in long-lasting discriminatory outcomes; (ii) addressing these biases often involves diversifying datasets; (iii) diversification implies that additional data must be collected from marginalised populations – with women and racialised individuals at the forefront – to mitigate the (re)creation of structural inequalities.

However, I contend that this approach raises two critical issues. Firstly, datafication practices reify one’s identity, as the classificatory and rule-based nature of AI systems perpetuates a form of essentialism towards one’s experiences, which may have adverse implications for identity politics (Scott, 1991). Secondly, and this is the main argument I develop in this talk, marginalised groups have historically resisted the sharing of personal data to avoid politics of surveillance (D’Ignazio and Klein, 2020). Recent examples include African American communities in the United States facing heightened law-enforcement scrutiny through facial recognition technologies, or individuals who menstruate abandoning period-tracking apps in states where abortion became illegal after the 2022 overturning of Roe v. Wade.

Drawing on the work of Linda Alcoff and Shannon Sullivan (2007), I argue that secrecy bears epistemic virtue, and I articulate a two-level epistemology of ignorance in datafication. (i) The type of ignorance that designers of AI perpetuate is usually considered harmful, as they may introduce discriminatory biases into their technological artefacts. (ii) However, ignorance allows (historically) oppressed groups to interrogate the omnipresent reality of AI systems in both their private and public lives, by potentially refusing to share data about their experiences. I conclude that this resistance to datafication triggers a reconsideration of ignorance as an epistemic concept in the classic epistemology of AI, whilst allowing us to rethink the use of AI technologies altogether and inspiring community-centred approaches to data creation and usage.



Reclaiming control of thought and behavior data through the right to freedom of thought

Kristina Pakhomchik

University of Vienna, Austria

Current data collection practices violate the fundamental human right to freedom of thought, with existing legal frameworks proving inadequate for its protection. Advances in behavioral science, psychology, and the understanding of external manifestations of thought underscore the urgent need to safeguard this right. The use of behavioral data facilitates technologies that enable manipulation, and with the scale and capabilities of AI and neurotechnologies, poses significant risks to human dignity and personal autonomy.

The paper will first examine the existing legal landscape of freedom of thought, analyzing its status across various jurisdictions and the limitations of relying on "neighboring" rights such as privacy and speech (Shaheed, 2021). It will then explore the philosophical foundations of freedom of thought, defining its core components—freedom from interference, manipulation, and coercion—and examining the relationship between thought, speech, and the external manifestations of internal states (McCarthy-Jones, 2019).

The core argument will demonstrate how the widespread collection of data, from interaction data to behavioral information, amounts to the collection of "thought data." Users often lack awareness of this sharing and have limited means to prevent it. Such data enables the manipulation of attention, inference of emotions, and prediction of future behaviors, undermining independent thought and decision-making. This raises critical questions about the possibility of truly informed consent for behavioral data collection (Breen, Ouazzane and Patel, 2020).

The paper concludes by advocating for a re-evaluation of legal frameworks to explicitly recognize and protect the absolute right to freedom of thought in the digital age. While some argue that current GDPR regulations sufficiently protect mental data under the “special categories” provision, practical application reveals significant shortcomings (Ienca and Malgieri, 2022). Effective protection requires a paradigm shift in data collection practices, emphasizing strict limitations on certain types of data, meaningful control over behavioral personal data, and robust safeguards against interference and manipulation of individual thought processes.

This approach addresses the root cause of emotional manipulation, prioritizing user control over mental data rather than merely mitigating its consequences. The research emphasizes empowering individuals by enhancing their control over mental data and protecting their human dignity and autonomy. By tackling the potential for AI-driven manipulation, it seeks to safeguard human autonomy and preserve individuals’ capacity for self-determination, while also addressing the growing risks posed by invasive neurotechnologies increasingly integrated into daily life.

Shaheed, A. (2021) ‘Interim report of the Special Rapporteur on freedom of religion or belief: Freedom of thought’, A/76/380, OHCHR. Available at: https://www.ohchr.org/en/documents/thematic-reports/a76380-interim-report-special-rapporteur-freedom-religion-or-belief (Accessed: 15 January 2025).

Breen, S., Ouazzane, K. and Patel, P. (2020) ‘GDPR: Is your consent valid?’, Business Information Review, 37(1), pp. 19–24. Available at: https://doi.org/10.1177/0266382120903254.

Ienca, M. and Malgieri, G. (2022) ‘Mental data protection and the GDPR’, Journal of Law and the Biosciences, 9(1), p. lsac006. Available at: https://doi.org/10.1093/jlb/lsac006.

McCarthy-Jones, S. (2019) ‘The Autonomous Mind: The Right to Freedom of Thought in the Twenty-First Century’, Frontiers in Artificial Intelligence, 2, p. 19. Available at: https://doi.org/10.3389/frai.2019.00019.

O’Callaghan, P. and Shiner, B. (2021) ‘The Right to Freedom of Thought in the European Convention on Human Rights’, European Journal of Comparative Law and Governance, 8(2–3), pp. 112–145. Available at: https://doi.org/10.1163/22134514-bja10016.



Contact and Legal Notice
Privacy Statement · Conference: SPT 2025
Conference Software: ConfTool Pro 2.6.154
© 2001–2025 by Dr. H. Weinreich, Hamburg, Germany