2026-05-19, Cineac
AI technology is infiltrating the full tech stack of research workflows. Research is being done on AI, with AI, and by AI — within existing software, through the creation of new software, on existing hardware, and on new kinds of infrastructure.
Researchers are expected to, and wish to, act responsibly; however, clear guidance is hard to give in this rapidly evolving landscape of AI technology.
We have developed a presentation that examines responsible AI in the context of pushing research boundaries through the metaphor of “free soloing”, i.e. rock climbing without the safety protections normally provided by climbing gear.
This presentation is being used within the university to facilitate open discussions on AI literacy, AI infrastructure, and specifically responsibility and risk awareness, targeting questions such as: what are the existing and emerging risks in the AI tech stack, whose responsibility is it to be aware of these risks, and whose responsibility is it to establish or provide mitigations in different kinds of research (and educational) environments?
At SURF Research Day we wish to bring some archetypal cases that have emerged from these sessions within the university. We will present these for an open discussion on responsible AI, the role of the researcher in taking responsibility, and the role of service and AI infrastructure providers in taking co-responsibility when providing (components of) the tech stack to do research on AI, with AI, or even by AI.
Infrastructure providers, policymakers
What is the key takeaway of your session?: Moving forward by discussing and taking responsibility across the full AI tech stack
I work in the domain of research support and research infrastructure at Eindhoven University of Technology. My focus is on organising and optimising organisational and IT processes that help researchers scale up their research (data) workflows.
