<?xml version='1.0' encoding='utf-8' ?>
<!-- Made with love by pretalx v2025.2.2. -->
<schedule>
    <generator name="pretalx" version="2025.2.2" />
    <version>0.19</version>
    <conference>
        <title>National XR Day 2025</title>
        <acronym>national-xr-day-2025</acronym>
        <start>2025-07-02</start>
        <end>2025-07-02</end>
        <days>1</days>
        <timeslot_duration>00:05</timeslot_duration>
        <base_url>https://pretalx.surf.nl</base_url>
        
        <time_zone_name>Europe/Amsterdam</time_zone_name>
        
        
        <track name="Technology" slug="92-technology"  color="#3f61ef" />
        
        <track name="Impact" slug="91-impact"  color="#da362d" />
        
        <track name="Collaboration" slug="94-collaboration"  color="#079f30" />
        
    </conference>
    <day index='1' date='2025-07-02' start='2025-07-02T04:00:00+02:00' end='2025-07-03T03:59:00+02:00'>
        <room name='Aula (Erasmus building)' guid='22d315bb-bf9b-5033-8b55-e71aebc25413'>
            <event guid='03c18591-a097-593a-8afa-af7aeb594cf4' id='2512'>
                <room>Aula (Erasmus building)</room>
                <title>Opening</title>
                <subtitle></subtitle>
                <type>Plenary presentation</type>
                <date>2025-07-02T10:00:00+02:00</date>
                <start>10:00</start>
                <duration>00:25</duration>
                <abstract>Opening by hosts Barry Fitzgerald &amp; G&#252;l Akcaova, and Erasmus University Rotterdam rector magnificus Jantine Schuit.</abstract>
                <slug>national-xr-day-2025-2512-opening</slug>
                <track></track>
                
                <persons>
                    <person id='20'>G&#252;l Akcaova</person><person id='337'>Barry W. Fitzgerald</person>
                </persons>
                <language>en</language>
                <description>Followed by an energizer, and the opening keynote by Frank Buytendijk, Distinguished Vice President and Research Fellow at Gartner.</description>
                <recording>
                    <license></license>
                    <optout>false</optout>
                </recording>
                <links></links>
                <attachments></attachments>

                <url>https://pretalx.surf.nl/national-xr-day-2025/talk/C3W33S/</url>
                <feedback_url>https://pretalx.surf.nl/national-xr-day-2025/talk/C3W33S/feedback/</feedback_url>
            </event>
            <event guid='fbdcc2f0-7186-5d91-8286-599c8653e3ab' id='2514'>
                <room>Aula (Erasmus building)</room>
                <title>Gartner Futures Lab: Immersive Experiences in the Digital Society</title>
                <subtitle></subtitle>
                <type>Plenary presentation</type>
                <date>2025-07-02T10:25:00+02:00</date>
                <start>10:25</start>
                <duration>00:30</duration>
                <abstract>Every 5 years, Gartner publishes a set of global scenarios. What will the world at large look like in 2040? It feels like we are at a crossroads in deciding what we want our society to look like: empowering individuals or focusing on shared experiences. But &#8212; more importantly &#8212; how can we achieve both?</abstract>
                <slug>national-xr-day-2025-2514-gartner-futures-lab-immersive-experiences-in-the-digital-society</slug>
                <track>Impact</track>
                
                <persons>
                    <person id='1081'>Frank Buytendijk</person>
                </persons>
                <language>en</language>
                <description>Frank is a Distinguished Vice President and Research Fellow at research firm Gartner, and the Chief of Research for Gartner Futures Lab.</description>
                <recording>
                    <license></license>
                    <optout>false</optout>
                </recording>
                <links></links>
                <attachments></attachments>

                <url>https://pretalx.surf.nl/national-xr-day-2025/talk/L7J9XQ/</url>
                <feedback_url>https://pretalx.surf.nl/national-xr-day-2025/talk/L7J9XQ/feedback/</feedback_url>
            </event>
            
        </room>
        <room name='Demo area (Theil building hallway)' guid='78fc3213-fc13-5079-a488-8a07a89a22da'>
            <event guid='9aa2b8dd-795c-54c1-ad02-47ea878fc71c' id='2527'>
                <room>Demo area (Theil building hallway)</room>
                <title>Demonstrations (continued during lunch)</title>
                <subtitle></subtitle>
                <type>Demonstration</type>
                <date>2025-07-02T11:00:00+02:00</date>
                <start>11:00</start>
                <duration>01:15</duration>
                <abstract>A selection of XR demonstrations in the central demo area of the Theil building. Click on the name of a demo for more information.</abstract>
                <slug>national-xr-day-2025-2527-demonstrations-continued-during-lunch</slug>
                <track></track>
                
                <persons>
                    
                </persons>
                <language>en</language>
                <description>**[De virtuele wereld: Cyberpunk of Utopie?](https://pretalx.surf.nl/national-xr-day-2025/talk/HSHFUC/)**

Experience the virtual city of Rotterdam in VR! In this demo, the XR-lab of Rotterdam University of Applied Sciences provides a suggestive representation of the virtual city of the future, in which the user takes part in conversations with (AI-driven) residents. The XR-lab of CMI (Institute for Communication, Media and Information Technology) developed this demo in collaboration with the City of Rotterdam and uses it as a template for various experiments and applications. Take part and have yourself 3D scanned and take a walk through the virtual city of the future. What opportunities does AI offer in XR and how do we use this responsibly? How do you think technology should work and what possible applications do you think of?



**[ECDA - Digiderius and his shadow](https://pretalx.surf.nl/national-xr-day-2025/talk/Y8Q3TX/)**

Virtual humans are growing in popularity, becoming more lifelike, convincing, and easier to interact with. They can be a convenient way to get an answer to a specific question or an explanation of a complex topic, acting as a tutor, guide, or friendly face to talk to. However, their ingenuity and convenience often make us forget a very important question: at what cost?


**[Erasmus MC XR Experience](https://pretalx.surf.nl/national-xr-day-2025/talk/KF9KBV/)**

VR applications for radiology and the operating room.


**[Grafisch Lyceum](https://pretalx.surf.nl/national-xr-day-2025/talk/EZJQAT/)**

Inspired by *Erik of het klein insectenboek* (Erik, or the Little Insect Book), immerse yourself in Rotterdam's urban nature. Data collected by (amateur) biologists is visualized in this immersive installation. Step into the painting called nature and see what normally goes unnoticed.

A collaboration between ECDA and the GLR.


**[Interactive Annotation and Automatic Feature Boundary Segmentation of Point Clouds in VR](https://pretalx.surf.nl/national-xr-day-2025/talk/8XKXGF/)**

Automatic boundary detection and segmentation in point-cloud data support user-driven annotations in interactive virtual reality, streamlining the annotation process and enhancing precision, efficiency, and intuitiveness in 3D data interpretation and interaction.

Point clouds provide a way to visualize 3D data by combining high detail and scalability, which is amplified when viewed in virtual reality (VR), where the user gains spatial awareness and can perceive depth more accurately. Being able to detect spatial relationships, as well as navigate through them, can give the user a better and more immersive insight into their data and open up new perspectives.

Recognizing these advantages, industries increasingly use point clouds to digitize real-world spaces and objects, frequently requiring manual (or semi-manual) annotations for training machine learning models. This project aims to develop a VR-based system to support and assist user-driven annotations of large point clouds in VR. By detecting and segmenting features in the cloud, the system seeks to streamline the annotation process, thus enhancing efficiency, precision, and consistency. 

Additionally, to add to the intuitiveness and ease of use, the project leverages the potential of VR to make use of optional gesture- and voice-based commands to traverse and interact with the point cloud in an accessible and user-friendly manner.


**[Lumi: AI-gedreven Augmented Reality software voor de Chirurgie van Vandaag en Morgen](https://pretalx.surf.nl/national-xr-day-2025/talk/S7XEFF/)**

Discover how Lumi is innovating surgery. In this demo you will experience how our medical software converts 2D images into intuitive 3D holograms, using a powerful set of artificial-intelligence algorithms that automatically recognize anatomical structures. The 3D holograms are projected directly into the real world, for higher precision, fewer avoidable errors, and demonstrably improved surgical outcomes.

Lumi is used for preparing operations, training surgeons in residency, patient education, and, in the newest version, live support during surgery with Augmented Reality in the operating room.


**[Make-Your-Own-Street: procedural tool for fast generation of participatory VR in education &amp; research](https://pretalx.surf.nl/national-xr-day-2025/talk/FLTBHB/)**

MYOS (Make-Your-Own-Street) is a user-friendly, UE-based tool developed at TU Eindhoven&#8217;s UBeX lab for fast procedural generation of customizable 3D urban streetscapes. Built on a growing library of standardized Dutch urban elements, MYOS enables researchers and students to explore citizen preferences and urban scenarios in VR without manual modeling or programming, promoting participatory design.

Researchers and students in Urban Planning and Design can benefit greatly from the ability to quickly generate and modify 3D digital VR environments that replicate real-world urban streetscapes. These environments are useful, for example, to assess the impact of various urban elements on residents&apos; behaviour, wellbeing and perceptions.

However, creating and adjusting these environments manually is a time-consuming process. Even simple modifications, like changing the width of a road or altering the type of buildings, become tedious when they require manually repositioning numerous other elements. At the UBeX (Urban Behaviour eXtended reality) lab of TU Eindhoven, we offer a solution. Our UE-based model, Make-Your-Own-Street (MYOS), allows for the procedural generation of realistic and customizable urban streetscapes tailored to a specific study or research objective.

MYOS is built around a growing library of standardized 3D components that represent Dutch urban elements, such as roads, sidewalks, street furniture, vegetation, and buildings. These components, along with the procedural code behind them, allow users to easily replicate real-world scenarios or experiment with hypothetical urban settings. Through a simple interface, users select the elements to include in the desired environment, while the procedural backend automatically assembles the streetscape. MYOS requires no programming knowledge to use, and can be extended with minimal coding effort.

At TU Eindhoven, MYOS serves as a participatory design tool for students and researchers in urban systems and real estate. MYOS enables these users to actively explore and address citizen preferences across a wide variety of applications - without the intensive effort of designing and modelling an urban environment from scratch. During the demonstration, we will showcase MYOS in educational settings and allow visitors to interact with the tool by creating and adjusting their own streetscapes.


**[New Media centre - XRZone - TU Delft](https://pretalx.surf.nl/national-xr-day-2025/talk/VXJ8NF/)**

Demo booth with various demos from the New Media Centre&#8217;s XRZone at TU Delft. We will show what we do and which projects we are working on. Some of the main demos are: the XRScale kit library, the XR tutorial level, the AI Classroom (Fontys/HAN), and more.

Demo booth with various demos from the New Media Centre&#8217;s XRZone at TU Delft. We will show what we do and which projects we are working on, including:

- XRScale kit library
- XR tutorial level
- AI Classroom (Fontys/HAN), developed by TU Delft
- Geoscience safety protocol
- and more&#8230;


**[Npuls stand](https://pretalx.surf.nl/national-xr-day-2025/talk/KMFKFM/)**

Npuls stimulates collaboration between educational institutions to jointly investigate how XR technology can be valuable for education. We explore and realize the preconditions for deploying XR well, in service of continuously delivering the best education.

One of the projects is the XR framework and its accompanying toolbox: practical tools for anyone who wants to get started with XR but does not know where to begin. Think of step-by-step plans, checklists, and templates. This is how we make XR concrete and achievable. Drop by and discover how you can put XR to use.


**[Prototyping with XR &amp; gaming technologies](https://pretalx.surf.nl/national-xr-day-2025/talk/UH7PGA/)**

The Saxion XR Lab will present a variety of prototypes, including a sandbox VR operating room designed for anesthesiology training, an immersive VR experience that simulates the feeling of burnout, and an innovative VR Gaussian Splatting prototype.


**[The Human Pour: Human(e) Interaction and Making](https://pretalx.surf.nl/national-xr-day-2025/talk/HSHFUC/)**

Step behind the bar in our interactive VR game. Mix drinks and serve them to the right guests&#8212;easy, right? Or maybe not&#8230; Beneath the surface lies an experience that quietly gets you thinking about who&#8212;and what&#8212;we consider &#8220;human.&#8221; Representation in VR goes far beyond visual likeness. It reflects what we value as a society and how we treat other persons humanely. Whether everyone feels welcome and recognized in digital worlds matters more than you might think: if we get representation wrong, or carry biases into it, we can unconsciously reinforce stereotypes, exclude people, or make everyone look the same.


**[Try a Device! - SURF XR Developer Network](https://pretalx.surf.nl/national-xr-day-2025/talk/TVDDKB/)**

Organized by the SURF XR Developer Network, this session gives visitors to National XR Day the opportunity to try out and experience a variety of XR devices.

Note: this session is only available during the lunch slot (12:15 - 13:45).

We will have the following devices on hand:

* Varjo XR-4: a high-resolution wired VR and MR headset
* Lynx R-1: designed in France, the Lynx R-1 is one of the few standalone VR/MR headsets that is built as an open platform, and is not based on a data-selling business model. It is somewhat older by now (design from 2021), but a Lynx R-2 based on AndroidXR is being developed.
* Play For Dream MR: a standalone XR headset from China with some impressive specs
* Magic Leap 2: Mixed Reality glasses, very similar in features to the HoloLens 2
* Snap Spectacles: standalone AR glasses from Snap, which allow for all kinds of so-called &quot;Lenses&quot; (Snap&apos;s name for AR apps and experiences)
* Even Realities G1: standalone smartglasses with built-in single-color display, with AI features such as real-time automatic translation of audio

(Many thanks to our contacts from the SURF community for providing some of the devices.)


**[eXtended Realities - Intraverse Toolkit (XR-IT) - live demo of high-fidelity Networked XR](https://pretalx.surf.nl/national-xr-day-2025/talk/ZW88EB/)**

Live networked demonstration of XR-IT (the eXtended Realities - Intraverse Toolkit), showing how multiple physical locations can easily collaborate in high-fidelity networked XR scenarios, including the use of motion capture, game engines, HMDs, and low-latency audio/video streaming across campuses.

XR-IT enables remote engagement in complex XR environments through a distributed computing model that supports local instancing of game engines (Unreal and Unity) and low-latency geographic distribution of mocap data, audio/video streams, camera parameters, and facial capture data, using a secure global virtual LAN model of connectivity. Beyond connecting distributed production facilities lies the challenge of coalescing physical and virtual spaces across multiple nodes of engagement into a seamless hybrid environment where all partners can work with common spatial references and orientations. XR-IT supports such scenarios by providing automated tools that enable spatial coalescence within new plugins for game engines.

XR-IT was developed in the Trans Realities Lab (TRL) at Design Academy Eindhoven (DAE) with financial support from the European Union&#8217;s European Media and Immersion Lab programme and Horizon Europe.

https://transrealitieslab.com/</description>
                <recording>
                    <license></license>
                    <optout>false</optout>
                </recording>
                <links></links>
                <attachments></attachments>

                <url>https://pretalx.surf.nl/national-xr-day-2025/talk/FBK9JH/</url>
                <feedback_url>https://pretalx.surf.nl/national-xr-day-2025/talk/FBK9JH/feedback/</feedback_url>
            </event>
            <event guid='b602869f-c18e-51a7-ab70-bf9f28235ffd' id='4383'>
                <room>Demo area (Theil building hallway)</room>
                <title>Demonstrations (last chance)</title>
                <subtitle></subtitle>
                <type>Demonstration</type>
                <date>2025-07-02T13:45:00+02:00</date>
                <start>13:45</start>
                <duration>00:15</duration>
                <abstract>The final 15 minutes in which you can try a demo</abstract>
                <slug>national-xr-day-2025-4383-demonstrations-last-chance</slug>
                <track></track>
                
                <persons>
                    
                </persons>
                <language>en</language>
                <description>**[De virtuele wereld: Cyberpunk of Utopie?](https://pretalx.surf.nl/national-xr-day-2025/talk/HSHFUC/)**

Experience the virtual city of Rotterdam in VR! In this demo, the XR-lab of Rotterdam University of Applied Sciences provides a suggestive representation of the virtual city of the future, in which the user takes part in conversations with (AI-driven) residents. The XR-lab of CMI (Institute for Communication, Media and Information Technology) developed this demo in collaboration with the City of Rotterdam and uses it as a template for various experiments and applications. Take part and have yourself 3D scanned and take a walk through the virtual city of the future. What opportunities does AI offer in XR and how do we use this responsibly? How do you think technology should work and what possible applications do you think of?



**[ECDA - Digiderius and his shadow](https://pretalx.surf.nl/national-xr-day-2025/talk/Y8Q3TX/)**

Virtual humans are growing in popularity, becoming more lifelike, convincing, and easier to interact with. They can be a convenient way to get an answer to a specific question or an explanation of a complex topic, acting as a tutor, guide, or friendly face to talk to. However, their ingenuity and convenience often make us forget a very important question: at what cost?


**[Erasmus MC XR Experience](https://pretalx.surf.nl/national-xr-day-2025/talk/KF9KBV/)**

VR applications for radiology and the operating room.


**[Grafisch Lyceum](https://pretalx.surf.nl/national-xr-day-2025/talk/EZJQAT/)**

Inspired by *Erik of het klein insectenboek* (Erik, or the Little Insect Book), immerse yourself in Rotterdam's urban nature. Data collected by (amateur) biologists is visualized in this immersive installation. Step into the painting called nature and see what normally goes unnoticed.

A collaboration between ECDA and the GLR.


**[Interactive Annotation and Automatic Feature Boundary Segmentation of Point Clouds in VR](https://pretalx.surf.nl/national-xr-day-2025/talk/8XKXGF/)**

Automatic boundary detection and segmentation in point-cloud data support user-driven annotations in interactive virtual reality, streamlining the annotation process and enhancing precision, efficiency, and intuitiveness in 3D data interpretation and interaction.

Point clouds provide a way to visualize 3D data by combining high detail and scalability, which is amplified when viewed in virtual reality (VR), where the user gains spatial awareness and can perceive depth more accurately. Being able to detect spatial relationships, as well as navigate through them, can give the user a better and more immersive insight into their data and open up new perspectives.

Recognizing these advantages, industries increasingly use point clouds to digitize real-world spaces and objects, frequently requiring manual (or semi-manual) annotations for training machine learning models. This project aims to develop a VR-based system to support and assist user-driven annotations of large point clouds in VR. By detecting and segmenting features in the cloud, the system seeks to streamline the annotation process, thus enhancing efficiency, precision, and consistency. 

Additionally, to add to the intuitiveness and ease of use, the project leverages the potential of VR to make use of optional gesture- and voice-based commands to traverse and interact with the point cloud in an accessible and user-friendly manner.


**[Lumi: AI-gedreven Augmented Reality software voor de Chirurgie van Vandaag en Morgen](https://pretalx.surf.nl/national-xr-day-2025/talk/S7XEFF/)**

Discover how Lumi is innovating surgery. In this demo you will experience how our medical software converts 2D images into intuitive 3D holograms, using a powerful set of artificial-intelligence algorithms that automatically recognize anatomical structures. The 3D holograms are projected directly into the real world, for higher precision, fewer avoidable errors, and demonstrably improved surgical outcomes.

Lumi is used for preparing operations, training surgeons in residency, patient education, and, in the newest version, live support during surgery with Augmented Reality in the operating room.


**[Make-Your-Own-Street: procedural tool for fast generation of participatory VR in education &amp; research](https://pretalx.surf.nl/national-xr-day-2025/talk/FLTBHB/)**

MYOS (Make-Your-Own-Street) is a user-friendly, UE-based tool developed at TU Eindhoven&#8217;s UBeX lab for fast procedural generation of customizable 3D urban streetscapes. Built on a growing library of standardized Dutch urban elements, MYOS enables researchers and students to explore citizen preferences and urban scenarios in VR without manual modeling or programming, promoting participatory design.

Researchers and students in Urban Planning and Design can benefit greatly from the ability to quickly generate and modify 3D digital VR environments that replicate real-world urban streetscapes. These environments are useful, for example, to assess the impact of various urban elements on residents&apos; behaviour, wellbeing and perceptions.

However, creating and adjusting these environments manually is a time-consuming process. Even simple modifications, like changing the width of a road or altering the type of buildings, become tedious when they require manually repositioning numerous other elements. At the UBeX (Urban Behaviour eXtended reality) lab of TU Eindhoven, we offer a solution. Our UE-based model, Make-Your-Own-Street (MYOS), allows for the procedural generation of realistic and customizable urban streetscapes tailored to a specific study or research objective.

MYOS is built around a growing library of standardized 3D components that represent Dutch urban elements, such as roads, sidewalks, street furniture, vegetation, and buildings. These components, along with the procedural code behind them, allow users to easily replicate real-world scenarios or experiment with hypothetical urban settings. Through a simple interface, users select the elements to include in the desired environment, while the procedural backend automatically assembles the streetscape. MYOS requires no programming knowledge to use, and can be extended with minimal coding effort.

At TU Eindhoven, MYOS serves as a participatory design tool for students and researchers in urban systems and real estate. MYOS enables these users to actively explore and address citizen preferences across a wide variety of applications - without the intensive effort of designing and modelling an urban environment from scratch. During the demonstration, we will showcase MYOS in educational settings and allow visitors to interact with the tool by creating and adjusting their own streetscapes.


**[New Media centre - XRZone - TU Delft](https://pretalx.surf.nl/national-xr-day-2025/talk/VXJ8NF/)**

Demo booth with various demos from the New Media Centre&#8217;s XRZone at TU Delft. We will show what we do and which projects we are working on. Some of the main demos are: the XRScale kit library, the XR tutorial level, the AI Classroom (Fontys/HAN), and more.

Demo booth with various demos from the New Media Centre&#8217;s XRZone at TU Delft. We will show what we do and which projects we are working on, including:

- XRScale kit library
- XR tutorial level
- AI Classroom (Fontys/HAN), developed by TU Delft
- Geoscience safety protocol
- and more&#8230;


**[Npuls stand](https://pretalx.surf.nl/national-xr-day-2025/talk/KMFKFM/)**

Npuls stimulates collaboration between educational institutions to jointly investigate how XR technology can be valuable for education. We explore and realize the preconditions for deploying XR well, in service of continuously delivering the best education.

One of the projects is the XR framework and its accompanying toolbox: practical tools for anyone who wants to get started with XR but does not know where to begin. Think of step-by-step plans, checklists, and templates. This is how we make XR concrete and achievable. Drop by and discover how you can put XR to use.


**[Prototyping with XR &amp; gaming technologies](https://pretalx.surf.nl/national-xr-day-2025/talk/UH7PGA/)**

The Saxion XR Lab will present a variety of prototypes, including a sandbox VR operating room designed for anesthesiology training, an immersive VR experience that simulates the feeling of burnout, and an innovative VR Gaussian Splatting prototype.


**[The Human Pour: Human(e) Interaction and Making](https://pretalx.surf.nl/national-xr-day-2025/talk/HSHFUC/)**


Step behind the bar in our interactive VR game. Mix drinks and serve them to the right guests&#8212;easy, right? Or maybe not&#8230; Beneath the surface lies an experience that quietly gets you thinking about who&#8212;and what&#8212;we consider &#8220;human.&#8221; Representation in VR goes far beyond visual likeness. It reflects what we value as a society and how we treat other persons humanely. Whether everyone feels welcome and recognized in digital worlds matters more than you might think: if we get representation wrong, or carry biases into it, we can unconsciously reinforce stereotypes, exclude people, or make everyone look the same.


**[Try a Device! - SURF XR Developer Network](https://pretalx.surf.nl/national-xr-day-2025/talk/TVDDKB/)**

Organized by the SURF XR Developer Network, this session gives visitors to National XR Day the opportunity to try out and experience a variety of XR devices.

Note: this session is only available during the lunch slot (12:15 - 13:45).

We will have the following devices on hand:

* Varjo XR-4: a high-resolution wired VR and MR headset
* Lynx R-1: designed in France, the Lynx R-1 is one of the few standalone VR/MR headsets that is built as an open platform, and is not based on a data-selling business model. It is somewhat older by now (design from 2021), but a Lynx R-2 based on AndroidXR is being developed.
* Play For Dream MR: a standalone XR headset from China with some impressive specs
* Magic Leap 2: Mixed Reality glasses, very similar in features to the HoloLens 2
* Snap Spectacles: standalone AR glasses from Snap, which allow for all kinds of so-called &quot;Lenses&quot; (Snap&apos;s name for AR apps and experiences)
* Even Realities G1: standalone smartglasses with built-in single-color display, with AI features such as real-time automatic translation of audio

(Many thanks to our contacts from the SURF community for providing some of the devices.)


**[eXtended Realities - Intraverse Toolkit (XR-IT) - live demo of high-fidelity Networked XR](https://pretalx.surf.nl/national-xr-day-2025/talk/ZW88EB/)**

Live networked demonstration of XR-IT (the eXtended Realities - Intraverse Toolkit), showing how multiple physical locations can easily collaborate in high-fidelity networked XR scenarios, including the use of motion capture, game engines, HMDs, and low-latency audio/video streaming across campuses.

XR-IT enables remote engagement in complex XR environments through a distributed computing model that supports local instancing of game engines (Unreal and Unity) and low-latency geographic distribution of mocap data, audio/video streams, camera parameters, and facial capture data, using a secure global virtual LAN model of connectivity. Beyond connecting distributed production facilities lies the challenge of coalescing physical and virtual spaces across multiple nodes of engagement into a seamless hybrid environment where all partners can work with common spatial references and orientations. XR-IT supports such scenarios by providing automated tools that enable spatial coalescence within new plugins for game engines.

XR-IT was developed in the Trans Realties Lab (TRL) at Design Academy Eindhoven (DAE) with financial support of the European Union&#8217;s European Media and Immersion Lab programme and Horizon Europe.

https://transrealitieslab.com/</description>
                <recording>
                    <license></license>
                    <optout>false</optout>
                </recording>
                <links></links>
                <attachments></attachments>

                <url>https://pretalx.surf.nl/national-xr-day-2025/talk/HVWG9S/</url>
                <feedback_url>https://pretalx.surf.nl/national-xr-day-2025/talk/HVWG9S/feedback/</feedback_url>
            </event>
            <event guid='165bcfc4-f494-5c7e-a159-d90b47129de2' id='2526'>
                <room>Demo area (Theil building hallway)</room>
                <title>Introduction Anouk &amp; Rafaele</title>
                <subtitle></subtitle>
                <type>Plenary presentation</type>
                <date>2025-07-02T15:45:00+02:00</date>
                <start>15:45</start>
                <duration>00:05</duration>
<abstract>Our hosts Barry &amp; G&#252;l introduce the closing act.</abstract>
                <slug>national-xr-day-2025-2526-introduction-anouk-rafaele</slug>
                <track></track>
                
                <persons>
                    <person id='337'>Barry W. Fitzgerald</person><person id='20'>G&#252;l Akcaova</person>
                </persons>
                <language>en</language>
                <description>Please all join in the main Theil hallway to enjoy the show!</description>
                <recording>
                    <license></license>
                    <optout>false</optout>
                </recording>
                <links></links>
                <attachments></attachments>

                <url>https://pretalx.surf.nl/national-xr-day-2025/talk/PWEFVL/</url>
                <feedback_url>https://pretalx.surf.nl/national-xr-day-2025/talk/PWEFVL/feedback/</feedback_url>
            </event>
            <event guid='e1ed8ffc-59cb-5c3f-8dc8-691a71764520' id='2513'>
                <room>Demo area (Theil building hallway)</room>
                <title>Fashion show</title>
                <subtitle></subtitle>
                <type>Plenary presentation</type>
                <date>2025-07-02T15:50:00+02:00</date>
                <start>15:50</start>
                <duration>00:30</duration>
                <abstract>An act on technology and interaction</abstract>
                <slug>national-xr-day-2025-2513-fashion-show</slug>
                <track>Technology</track>
                
                <persons>
                    
                </persons>
                <language>en</language>
                <description>(more details to be provided)</description>
                <recording>
                    <license></license>
                    <optout>false</optout>
                </recording>
                <links></links>
                <attachments></attachments>

                <url>https://pretalx.surf.nl/national-xr-day-2025/talk/KYSG7V/</url>
                <feedback_url>https://pretalx.surf.nl/national-xr-day-2025/talk/KYSG7V/feedback/</feedback_url>
            </event>
            
        </room>
        <room name='Theil CB-2 (118p)' guid='8fb75cf1-6203-583b-b248-2318587005d2'>
            <event guid='6a4a2c9a-a8a8-50d6-ac51-140f69b7b828' id='2502'>
                <room>Theil CB-2 (118p)</room>
                <title>The Augmented Gaze: data, privacy and the right to alter reality (panel discussion)</title>
                <subtitle></subtitle>
                <type>Panel discussion</type>
                <date>2025-07-02T11:15:00+02:00</date>
                <start>11:15</start>
                <duration>00:45</duration>
<abstract>What happens to our data and privacy if AI-powered XR glasses become everyday wear? What happens to our perception of ourselves and others if we can each customise and augment our reality &#8212; or be augmented by others?

Our panel of lawyers, policy researchers and industry practitioners will examine the data infrastructures of global-scale spatial computing and ask: who will own YOUR reality?</abstract>
                <slug>national-xr-day-2025-2502-the-augmented-gaze-data-privacy-and-the-right-to-alter-reality-panel-discussion</slug>
                <track>Impact</track>
                
                <persons>
                    <person id='1015'>Alina Kadlubsky</person><person id='1040'>Rob Morgan</person><person id='1041'>Kelsey Farish</person><person id='1076'>Kelsey Farish</person>
                </persons>
                <language>en</language>
                <description>A recent controversy over a &#8220;chubby filter&#8221; on TikTok sparked calls for a ban. But what if other people could apply the &#8220;chubby&#8221; filter to YOUR body, live and in real-time, as they passed you on the street? What if they could do so in the privacy of their own glasses, without your knowledge or permission?

The world&#8217;s most powerful companies are investing billions into physical-digital infrastructures built on scans of the real world. Companies like Niantic are already training world-mapping AIs on datasets contributed by millions of users. And soon, all-day-wear reality-augmenting glasses may enable consumers to view the world through AI-powered filters and reskins. In the words of Niantic&#8217;s CEO, consumers might be able to &#8220;theme the world like it&#8217;s Nintendo everywhere.&#8221; 

But unlike the protocols underpinning the Internet, the infrastructure for global-scale spatial computing will likely be proprietary and profit-driven, and it is already being built.

Companies that supply the technology to augment reality are likely to provide it cheaply or freely in exchange for user data: data about users&#8217; biometrics, data about the world around them, and maybe even data about any bystanders nearby. Will these spatial computing platforms have any incentive to regulate their technology, or to protect the privacy, safety and identity rights of users and bystanders?

In this context, our panel will debate the power dynamic between individuals and businesses, and discuss rights: not only rights regarding the datasets themselves, but rights over how the world around us is graphically augmented, and whether we can effectively control how others augment us.

Bringing perspectives from industry, policy and law from across Europe, our panel will examine the sociocultural impact of the current geospatial data &#8220;land grab&#8221; and the XR+AI technologies it will power. Drawing on the work of XR4Human and EU initiatives, we&#8217;ll provide concrete calls to action for attendees to help shape European policy.

Moderator: Barry Fitzgerald (BW Science)
                <recording>
                    <license></license>
                    <optout>false</optout>
                </recording>
                <links></links>
                <attachments></attachments>

                <url>https://pretalx.surf.nl/national-xr-day-2025/talk/ERTLAW/</url>
                <feedback_url>https://pretalx.surf.nl/national-xr-day-2025/talk/ERTLAW/feedback/</feedback_url>
            </event>
            <event guid='bb0ac067-43e0-59d5-b2a1-5e16d9636432' id='2460'>
                <room>Theil CB-2 (118p)</room>
                <title>&#127475;&#127473; Wanneer is VR (g)een goede leeromgeving? (panel discussion)</title>
                <subtitle></subtitle>
                <type>Panel discussion</type>
                <date>2025-07-02T13:45:00+02:00</date>
                <start>13:45</start>
                <duration>00:45</duration>
<abstract>Virtual Reality offers unprecedented possibilities, but how do you ensure that VR actually enriches and strengthens your students&#8217; learning? What should you definitely not do? These questions are becoming increasingly relevant now that more VR content is being shared openly. In this panel discussion, experienced education innovators discuss the impact, challenges and limits of VR as a didactic learning tool.</abstract>
                <slug>national-xr-day-2025-2460-wanneer-is-vr-g-een-goede-leeromgeving-panel-discussion</slug>
                <track>Impact</track>
                
                <persons>
                    <person id='1017'>Erik-Jan Smits</person><person id='1087'>Gijs Terlouw</person><person id='203'>Jolien Mouw</person><person id='457'>Roel Peijs</person>
                </persons>
                <language>en</language>
<description>Virtual Reality offers unprecedented possibilities for post-secondary education, but how do you ensure that VR truly enriches students&#8217; learning? When does VR really contribute to effective education? How can VR offer a powerful yet safe learning environment? What are the limits of VR as a didactic learning tool? What should you definitely not do? These questions will become even more relevant in the coming years as more and more VR content is shared openly.

In this interactive panel discussion, experienced education innovators from different education sectors (mbo, hbo and wo) and fields (healthcare, teaching) discuss the opportunities, challenges and limits of VR as a didactic learning tool. In their roles as teacher, education researcher, professor (lector), innovator, blended coach and advisor, they have successfully implemented VR at their institutions. Two of the projects have even been awarded the Onderwijspremie (Dutch Education Award). They share successes, instructive pitfalls and challenges they have encountered in recent years in the targeted use of VR at their own institutions and beyond (including the XR Groeiraster).

Expect a sharp, inspiring discussion with practice at its center. We will zoom in on similarities and differences between sectors when it comes to how VR should (or should not) be implemented if it is to make a real impact on curricula. The panelists also look ahead: what new challenges does the near future bring? Audience questions will be answered as well. Participants will leave with fresh ideas and concrete tips to realize the full potential of VR as a learning environment!

The panelists are:
-	Erik-Jan Smits, Professor (lector) Zin in ICT (Christelijke Hogeschool Ede)
-	Gijs Terlouw, Associate Professor Health Innovation &amp; Simulation Learning (NHL Stenden)
-	Jolien Mouw, Educational Scientist, VR kindergarten classroom (Rijksuniversiteit Groningen)
-	Roel Peijs, Blended Coach XR (Vista College)/Innovatiehub

Moderator: G&#252;l Akcaova (SURF)</description>
                <recording>
                    <license></license>
                    <optout>false</optout>
                </recording>
                <links></links>
                <attachments></attachments>

                <url>https://pretalx.surf.nl/national-xr-day-2025/talk/XCLAK3/</url>
                <feedback_url>https://pretalx.surf.nl/national-xr-day-2025/talk/XCLAK3/feedback/</feedback_url>
            </event>
            <event guid='6d4159e1-1415-5ae1-8ac1-91f136f830af' id='2516'>
                <room>Theil CB-2 (118p)</room>
                <title>Npuls, DUTCH &amp; CIIIC - Collaborating on XR with Impact - Inclusive, Sustainable and Public Values Driven (panel discussion)</title>
                <subtitle></subtitle>
                <type>Panel discussion</type>
                <date>2025-07-02T14:55:00+02:00</date>
                <start>14:55</start>
                <duration>00:45</duration>
<abstract>Three Growth Fund programs &#8211; each with their own focus on XR, immersive technology and simulation content &#8211; are joining forces. In this panel discussion, we will explore how we can make XR technology and content more accessible, inclusive and scalable.

After a brief introduction to the programs, we will discuss public and private collaboration, sustainable implementation, awareness of public values and opportunities for mutual reinforcement of the various programs. We will also share examples of concrete collaboration, such as the SimuLearn platform.</abstract>
                <slug>national-xr-day-2025-2516-npuls-dutch-ciiic-collaborating-on-xr-with-impact-inclusive-sustainable-and-public-values-driven-panel-discussion</slug>
                <track>Impact</track>
                
                <persons>
                    <person id='269'>Jeffrey Lemmers</person><person id='79'>Esther van der Linde</person><person id='1073'>Heleen Rouw</person>
                </persons>
                <language>en</language>
                <description>Moderator: Annette Langedijk (SURF)</description>
                <recording>
                    <license></license>
                    <optout>false</optout>
                </recording>
                <links></links>
                <attachments></attachments>

                <url>https://pretalx.surf.nl/national-xr-day-2025/talk/UBHJSE/</url>
                <feedback_url>https://pretalx.surf.nl/national-xr-day-2025/talk/UBHJSE/feedback/</feedback_url>
            </event>
            
        </room>
        <room name='Theil C1-1 (60p)' guid='7073b9d0-f8c8-5661-a1d5-b1f1e6ce3064'>
            <event guid='0981074a-bd5c-5692-83de-1854e8417b17' id='2486'>
                <room>Theil C1-1 (60p)</room>
                <title>User Authentication and Security in XR Applications using SURFConext</title>
                <subtitle></subtitle>
                <type>Presentation [20 + 5 Q&amp;A]</type>
                <date>2025-07-02T11:15:00+02:00</date>
                <start>11:15</start>
                <duration>00:25</duration>
<abstract>Do you recognise this problem? &#8220;I want to identify my students in my XR application, so I know who is doing what.&#8221; Over the past two years, TU Delft has been evaluating various approaches to implementing user authentication in XR: what works best, and why. In this session, we will share our findings, as well as our plans to work together with SURF to make this easier for you.</abstract>
                <slug>national-xr-day-2025-2486-user-authentication-and-security-in-xr-applications-using-surfconext</slug>
                <track>Technology</track>
                
                <persons>
                    <person id='231'>Luuk Goossen</person>
                </persons>
                <language>en</language>
                <description>There are various ways to handle user authentication, some are considered good practices, some are considered bad practices, but what they all have in common is that none of them were designed with XR technology in mind. However, as XR in education is becoming more mature and the demand for connected applications and data sharing rises, user authentication is becoming a highly requested feature.

In this presentation, you will learn the various approaches to user authentication and single-sign-on (SSO) that we tested over the past 2 years, what their benefits are, what their downsides are, and most importantly, what approach we recommend for SSO in XR Applications with SURFConext.

Since security is of paramount importance once you are working with real student data, you will also learn some general best practices for building secure and safe connected XR applications. This includes topics such as:

- Authentication vs Authorization
- How to connect to databases from XR content
- Secure and insecure contexts
- Data validation and integrity

Our final proposal for an XR-optimised SSO flow is a slightly modified version of the common and well-tested OAuth Device Flow. You will see this flow in action, as we demonstrate how we use this at TU Delft to sign in students with their TU Delft account using SURFConext.

Finally, we will look towards the future. TU Delft is collaborating with SURF and NPuls to further test this proposal, with the goal of providing a standardised solution for handling SSO with SURFConext in XR, including a common authentication server and plug-ins for Unity and Unreal Engine.</description>
                <recording>
                    <license></license>
                    <optout>false</optout>
                </recording>
                <links></links>
                <attachments></attachments>

                <url>https://pretalx.surf.nl/national-xr-day-2025/talk/3BAQAL/</url>
                <feedback_url>https://pretalx.surf.nl/national-xr-day-2025/talk/3BAQAL/feedback/</feedback_url>
            </event>
            <event guid='b3382869-d315-5767-8ea9-6628cb967555' id='2495'>
                <room>Theil C1-1 (60p)</room>
                <title>Npuls XR - XR Mobile Device Management progress update</title>
                <subtitle></subtitle>
                <type>Presentation [20 + 5 Q&amp;A]</type>
                <date>2025-07-02T11:50:00+02:00</date>
                <start>11:50</start>
                <duration>00:25</duration>
<abstract>Within Npuls we have an MDM track. We know an MDM is very important for getting content onto the headset, and we would like to share what we have learned and where we are in this project. We will discuss what the market is doing, what the possible solutions are, and what is next.</abstract>
                <slug>national-xr-day-2025-2495-npuls-xr-xr-mobile-device-management-progress-update</slug>
                <track>Technology</track>
                
                <persons>
                    <person id='339'>Jeroen Boots</person>
                </persons>
                <language>en</language>
<description>Mobile device management (MDM) software is the last mile of scaling up and using XR. It helps to maintain large numbers of headsets, activate kiosk mode, apply settings and download apps. In short, it is a very useful tool for using XR at scale.

However, a lot is changing in the field of XR and MDMs. Therefore we are exploring the options that we have as educational institutions to manage our hardware and software. During this presentation we will share the findings of our user study, where we are now in the process, and what is coming next.</description>
                <recording>
                    <license></license>
                    <optout>false</optout>
                </recording>
                <links></links>
                <attachments></attachments>

                <url>https://pretalx.surf.nl/national-xr-day-2025/talk/JSNGTE/</url>
                <feedback_url>https://pretalx.surf.nl/national-xr-day-2025/talk/JSNGTE/feedback/</feedback_url>
            </event>
            <event guid='3a961e1f-3e27-5d61-9a3f-b8464b6ea1fe' id='2479'>
                <room>Theil C1-1 (60p)</room>
                <title>High-Fidelity Collaboration in the Metaverse with XR-IT</title>
                <subtitle></subtitle>
                <type>Presentation [20 + 5 Q&amp;A]</type>
                <date>2025-07-02T13:45:00+02:00</date>
                <start>13:45</start>
                <duration>00:25</duration>
                <abstract>XR-IT: a software and systems platform for high-fidelity collaboration in the Metaverse. New tools enabling inter-institutional virtual worlds, for education, research and immersive content creation. Preview a live action mocap animated film made simultaneously between studios in three EU countries using XR-IT.</abstract>
                <slug>national-xr-day-2025-2479-high-fidelity-collaboration-in-the-metaverse-with-xr-it</slug>
                <track>Technology</track>
                
                <persons>
<person id='207'>Professor Dr. Ian Biscoe</person>
                </persons>
                <language>en</language>
<description>XR-IT has been developed to radically simplify collaboration in shared XR worlds situated across geographically distributed nodes of engagement. It supports the creation and operation of immersive, coalesced spaces by integrating existing computing and specialist XR resources across multiple locations into a seamless cohabited world for creation, education and research.

XR-IT enables remote engagement in complex XR environments through a distributed computing model that supports local instancing of game engines (Unreal and Unity) and low-latency geographic distribution of mocap data, audio/video streams, camera parameters, facial capture data, and more, using a secure global virtual LAN model of connectivity.

We will present a recent test case in which XR-IT was used to create a live motion-capture-generated film, working simultaneously with actors and crew in studios in the Netherlands, Finland and Germany.

XR-IT was developed in the Trans Realities Lab (TRL) at Design Academy Eindhoven (DAE) with financial support from the European Union&#8217;s European Media and Immersion Lab programme and Horizon Europe.

https://transrealitieslab.com/</description>
                <recording>
                    <license></license>
                    <optout>false</optout>
                </recording>
                <links></links>
                <attachments></attachments>

                <url>https://pretalx.surf.nl/national-xr-day-2025/talk/WBQYQL/</url>
                <feedback_url>https://pretalx.surf.nl/national-xr-day-2025/talk/WBQYQL/feedback/</feedback_url>
            </event>
            <event guid='4d956f60-310f-5675-8427-b1bc7a57b303' id='2492'>
                <room>Theil C1-1 (60p)</room>
                <title>Co-creation: Make technical jobs more attractive</title>
                <subtitle></subtitle>
                <type>Presentation [20 + 5 Q&amp;A]</type>
                <date>2025-07-02T14:20:00+02:00</date>
                <start>14:20</start>
                <duration>00:25</duration>
<abstract>In 2024, a new project was launched in which the MBO institution Graafschap College collaborated with STO Arnhem and VRinSCHOOL. To address the challenges of the energy transition and increase the appeal of technical professions, the partners co-created new VR courses.</abstract>
                <slug>national-xr-day-2025-2492-co-creation-make-technical-jobs-more-attractive</slug>
                <track>Collaboration</track>
                
                <persons>
                    <person id='1092'>Ellis Bodde</person><person id='628'>Christel de Winter</person><person id='73'>Tom Aerts</person>
                </persons>
                <language>en</language>
<description>During the development phase, VRinSCHOOL created two courses (on the installation of electrical systems and of solar panels) and also contributed 50% of the investment to ensure commitment from all parties involved. Throughout the process, teachers shared their expertise, and VRinSCHOOL developed a detailed script and designed an easy-to-use VR experience. Both courses are now ready to use, and plans are underway to create a new course on the installation of charging stations.</description>
                <recording>
                    <license></license>
                    <optout>false</optout>
                </recording>
                <links></links>
                <attachments></attachments>

                <url>https://pretalx.surf.nl/national-xr-day-2025/talk/QA8HJE/</url>
                <feedback_url>https://pretalx.surf.nl/national-xr-day-2025/talk/QA8HJE/feedback/</feedback_url>
            </event>
            <event guid='261d0222-c355-5cd4-be0c-42ff9f4b01b4' id='2481'>
                <room>Theil C1-1 (60p)</room>
                <title>Implementation of XR in Healthcare and Social Work</title>
                <subtitle></subtitle>
                <type>Presentation [20 + 5 Q&amp;A]</type>
                <date>2025-07-02T14:55:00+02:00</date>
                <start>14:55</start>
                <duration>00:25</duration>
                <abstract>Do you want to innovate education programs in Healthcare and Social Work with XR, but have no idea where to start? Or would you like to learn from experiences from the implementation of XR at Saxion University of Applied Sciences? Then come to the presentation on &#8216;Implementation of XR in Healthcare and Social Work&#8217;!</abstract>
                <slug>national-xr-day-2025-2481-implementation-of-xr-in-healthcare-and-social-work</slug>
                <track>Impact</track>
                
                <persons>
                    <person id='292'>Richard Evering</person>
                </persons>
                <language>en</language>
<description>Saxion started the XR-hub in Healthcare &amp; Social Work in 2023, through which XR is systematically implemented in the educational programs of the academies of Healthcare and Social Work. A prerequisite was a strong vision and mission from the organization on innovation with XR, including structural financial resources and support from the academies involved. From there, the XR-hub was developed step by step:
1.	Bringing together teachers and researchers involved in existing initiatives to further develop the vision and mission of the XR-hub.
2.	Setting up a project group, including linking pins to the educational programs involved, and a steering committee.
3.	Sharpening the vision and mission with the project team and drawing up an annual program of activities, as well as setting up a communication and collaboration environment.
4.	Emphasizing the added value of XR by joining curriculum committees and organizing inspiration and practice sessions with teachers.
5.	Continuously collecting current developments in the field of XR for implementation in education.
6.	Setting up a three-phase strategy for curriculum renewal with XR: 1) the XR-hub takes care of the implementation while the teacher observes; 2) XR education is offered jointly by the teacher and the XR-hub; 3) the teacher uses XR independently, and the XR-hub only facilitates technology and space.
By going through the above steps, XR has been successfully implemented. Teachers and students report positive experiences. A number of points for improvement have also emerged: 1) securing the implementation in phase 3; 2) the didactic approach when using multiple XR glasses simultaneously; 3) organizing a lesson within the usual time frame. In the coming period, the focus will be on research into the effectiveness and efficiency of implementing XR, in order to contribute to knowledge development and the innovation of educational programs with XR.</description>
                <recording>
                    <license></license>
                    <optout>false</optout>
                </recording>
                <links></links>
                <attachments></attachments>

                <url>https://pretalx.surf.nl/national-xr-day-2025/talk/ALRLN7/</url>
                <feedback_url>https://pretalx.surf.nl/national-xr-day-2025/talk/ALRLN7/feedback/</feedback_url>
            </event>
            
        </room>
        <room name='Theil C1-2 (60p)' guid='0b5c9d8b-faa1-5155-8f9b-29f498ba80f0'>
            <event guid='3789e573-fbe0-58bc-838d-dd6ad2009645' id='2505'>
                <room>Theil C1-2 (60p)</room>
                <title>Guiding the Green Mind through Personality-based Feedback in VR</title>
                <subtitle></subtitle>
                <type>Presentation [20 + 5 Q&amp;A]</type>
                <date>2025-07-02T11:15:00+02:00</date>
                <start>11:15</start>
                <duration>00:25</duration>
                <abstract>This study explores how personality and feedback influence eco-friendly decisions in immersive VR. By simulating online shopping, it aims to unlock personalized nudges that drive sustainable consumption. The findings offer a new approach to designing effective, tailored interventions that can shape real-world behavior and boost our efforts toward sustainability.</abstract>
                <slug>national-xr-day-2025-2505-guiding-the-green-mind-through-personality-based-feedback-in-vr</slug>
                <track>Impact</track>
                
                <persons>
                    <person id='243'>Funda Yildirim</person>
                </persons>
                <language>en</language>
                <description>This study investigates how personality traits and feedback types influence eco-friendly decisions in immersive virtual environments. We examine how positive and negative feedback interact with personality factors such as locus of control and behavioral inhibition/activation systems (BIS/BAS). BIS individuals are sensitive to negative cues, while BAS individuals are driven by rewards. Additionally, those with a high internal locus of control tend to make more sustainable choices.

Participants engage in a VR shopping task where their selections impact a virtual environment. Eco-friendly choices improve the environment, while unsustainable ones cause degradation, providing immediate feedback.

By combining VR with psychological profiling, we gain insights into how feedback and personality influence decision-making. These findings offer valuable guidance for designing personalized interventions that promote sustainable behavior and inform targeted marketing and policy strategies.</description>
                <recording>
                    <license></license>
                    <optout>false</optout>
                </recording>
                <links></links>
                <attachments></attachments>

                <url>https://pretalx.surf.nl/national-xr-day-2025/talk/QRFBRH/</url>
                <feedback_url>https://pretalx.surf.nl/national-xr-day-2025/talk/QRFBRH/feedback/</feedback_url>
            </event>
            <event guid='7dee3e91-3f3f-5af8-940d-3faf59f961ae' id='2483'>
                <room>Theil C1-2 (60p)</room>
                <title>Promoting Pro-environmental Behavior through Immersive Animal Perspective-Taking</title>
                <subtitle></subtitle>
                <type>Presentation [20 + 5 Q&amp;A]</type>
                <date>2025-07-02T11:50:00+02:00</date>
                <start>11:50</start>
                <duration>00:25</duration>
                <abstract>This study examined how VR perspective-taking as animals promotes pro-environmental behaviors. Participants (N=69) experienced life as a deer facing environmental challenges through either VR (N=35) or video-watching (N=34). We measured multi-modal self-reported attitudes, biofeedback, and donation behaviors. Results showed that sense of presence mediated the relationship between stimulus type and donation behavior, suggesting VR&apos;s immersive qualities enhance environmental behavior through embodied animal experiences.</abstract>
                <slug>national-xr-day-2025-2483-promoting-pro-environmental-behavior-through-immersive-animal-perspective-taking</slug>
                <track>Impact</track>
                
                <persons>
                    <person id='1011'>Yue Li</person>
                </persons>
                <language>en</language>
<description>Despite extensive efforts to promote pro-environmental behaviors, behavioral adoption remains challenged by knowledge gaps and insufficient emotional engagement. Visual perspective-taking in immersive contexts offers a promising solution by reducing psychological distance between individuals and environmental situations, potentially bridging understanding and action.

This study explores how animal perspective-taking experiences in Virtual Reality (VR) influence pro-environmental attitudes and behavioral changes. Participants adopted a deer&apos;s perspective, experiencing its daily life through wandering, foraging, and social interactions while witnessing environmental challenges including forest fires and human rescue efforts.

Our experiment (N=69) used a between-subjects design comparing first-person VR perspective-taking with traditional video-watching. We measured self-reported animal attitudes and ecological behaviors pre-intervention and one week post-intervention. During the experiment, we recorded heart rate and behavioral data, along with immediate post-intervention measures of empathy and sense of presence. Following the session, participants received &#8364;5.00 (ten 50-cent coins) as compensation and could donate any portion to local or global environmental protection organizations, with donation behavior recorded as a key outcome measure.

Results showed that both VR and video groups reported increased environmental behavior one week later, indicating lasting influence (p &lt; .001). However, donation behavior was significantly influenced by sense of presence (p &lt; .001), which was significantly higher in the VR group (p &lt; .001). Mediation analysis revealed that sense of presence significantly mediates 70% of the total effect (p &lt; .05), suggesting VR&apos;s immersive qualities enhance environmental behavior through embodied animal experiences. Additionally, role adoption and empathy were influenced by pro-environmental attitudes (p &lt; .05), while inclusion level was affected by both stimulus condition and role adoption (p &lt; .01).

These preliminary findings suggest that while both stimuli produced pro-environmental changes, immersive ecological perspective-taking more effectively promotes pro-environmental behavioral outcomes, particularly for immediate effects like charitable giving.</description>
                <recording>
                    <license></license>
                    <optout>false</optout>
                </recording>
                <links></links>
                <attachments></attachments>

                <url>https://pretalx.surf.nl/national-xr-day-2025/talk/HVNJNC/</url>
                <feedback_url>https://pretalx.surf.nl/national-xr-day-2025/talk/HVNJNC/feedback/</feedback_url>
            </event>
            <event guid='e1471c1d-be6d-5858-9096-accba647c02c' id='2497'>
                <room>Theil C1-2 (60p)</room>
                <title>Co-Creating an XR App Store for Education in the Netherlands</title>
                <subtitle></subtitle>
                <type>Presentation [20 + 5 Q&amp;A]</type>
                <date>2025-07-02T13:45:00+02:00</date>
                <start>13:45</start>
                <duration>00:25</duration>
                <abstract>Within Npuls, we are co-developing Simulearn: a supplier-independent XR platform for Dutch education, built in collaboration with DUTCH and SURF. Based on extensive co-creation, it tackles challenges like fragmented content, LMS integration, device management, and above all: enabling education professionals to get started and experiment with XR right away.</abstract>
                <slug>national-xr-day-2025-2497-co-creating-an-xr-app-store-for-education-in-the-netherlands</slug>
                <track>Impact</track>
                
                <persons>
                    <person id='1035'>Jeroen Kelder</person>
                </persons>
                <language>en</language>
                <description>What can you expect? A live demo of the Proof of Concept &apos;Simulearn&apos; platform and its key functionalities. We&#8217;ll take you behind the scenes of our collaborative journey, from early co-creation workshops and use case exploration to the cross-organisational partnership between Npuls, DUTCH, and SURF. Simulearn is being developed as part of Npuls&#8217; mission to accelerate digital transformation in Dutch education, with a strong focus on public values and practical usability. Along the way, we&#8217;ve tackled critical obstacles such as the absence of a central, high-quality XR content hub, the complexity of managing devices across institutions, and the need for a platform that enables sharing of user experiences and good practices. 

This session is for educators, policymakers, and tech innovators who want to see how XR can be embedded meaningfully in everyday education, and who want to help shape the future of immersive learning in the Netherlands.</description>
                <recording>
                    <license></license>
                    <optout>false</optout>
                </recording>
                <links></links>
                <attachments></attachments>

                <url>https://pretalx.surf.nl/national-xr-day-2025/talk/XKKHQE/</url>
                <feedback_url>https://pretalx.surf.nl/national-xr-day-2025/talk/XKKHQE/feedback/</feedback_url>
            </event>
            <event guid='352cb374-080d-5e9e-ac29-0ebfad4fa739' id='2488'>
                <room>Theil C1-2 (60p)</room>
                <title>Device Management and Content Sharing with the XRScaleKit</title>
                <subtitle></subtitle>
                <type>Presentation [20 + 5 Q&amp;A]</type>
                <date>2025-07-02T14:20:00+02:00</date>
                <start>14:20</start>
                <duration>00:25</duration>
                <abstract>Managing XR devices at scale is difficult, especially with all the changes in providers over the last year. Moreover, none of the existing solutions are built around community and content sharing. With the XRScaleKit Library and Launcher, TU Delft aims to change that. In this presentation, we will share progress updates on the various tools in the XRScaleKit.</abstract>
                <slug>national-xr-day-2025-2488-device-management-and-content-sharing-with-the-xrscalekit</slug>
                <track>Collaboration</track>
                
                <persons>
                    <person id='1023'>Arno Freeke</person><person id='231'>Luuk Goossen</person>
                </persons>
                <language>en</language>
                <description>The XRScaleKit is a collection of projects and concepts that resulted in a toolkit for scalable XR development. At TU Delft, we are working hard to test and finetune these tools ahead of publishing them as Open Source Software. A key goal of the XRScaleKit is to make it easier to find, share and manage XR content at your institution.

In this presentation, you will get a quick overview of what the XRScaleKit is and what tools it contains, including progress updates since the last National XR Day and insight into the roadmap. Then, you will learn how the XRScaleKit Library and XRScaleKit Launcher work together to make it easy to get an overview of all XR content available at an institution.

The XRScaleKit Library contains features like:

- A website to manage and display your XR content
- A way to manage your XR devices
- A way to manage XR device reservations and prepare them for lessons
- A roadmap to share content using the SURF Content Exchange Platform (CXP)

The XRScaleKit Launcher contains features like:

- A way to find and install content on your XR device
- A way to start content remotely
- Details and usage statistics of the XR device

Both feature:

- Open Source Software
- An extensive API to integrate these tools with other platforms

We will end the presentation with a live demo showcasing our currently available content and how to add new content to the XRScaleKit Library and share it with the community. To boost content sharing in our community, TU Delft will also share a large selection of its custom content.</description>
                <recording>
                    <license></license>
                    <optout>false</optout>
                </recording>
                <links></links>
                <attachments></attachments>

                <url>https://pretalx.surf.nl/national-xr-day-2025/talk/NEE8B7/</url>
                <feedback_url>https://pretalx.surf.nl/national-xr-day-2025/talk/NEE8B7/feedback/</feedback_url>
            </event>
            <event guid='44ff552a-87ba-54af-88cb-e92fb5061cca' id='2494'>
                <room>Theil C1-2 (60p)</room>
                <title>Towards an Open 3D asset platform</title>
                <subtitle></subtitle>
                <type>Birds of a feather</type>
                <date>2025-07-02T14:55:00+02:00</date>
                <start>14:55</start>
                <duration>00:45</duration>
                <abstract>Curious how to manage and share 3D content for XR in a visually attractive yet sustainable and interoperable way at an institution or university? Join us to explore a TU Delft proof of concept using Smithsonian Voyager, linked via API to the university repository. Models get DOIs, SSO, and work in web, Unity, or Unreal. Let&#8217;s discuss next steps&#8212;your input shapes the future!</abstract>
                <slug>national-xr-day-2025-2494-towards-an-open-3d-asset-platform</slug>
                <track>Collaboration</track>
                
                <persons>
                    <person id='1033'>Roland van Roijen</person><person id='1077'>Wouter Lucifer</person><person id='1079'>Vincent Cellucci</person>
                </persons>
                <language>en</language>
                <description>Do you need an attractive, reliable way to manage and share 3D models for XR use in your institution or lab? We do too&#8230;

In this open session, you&apos;ll explore a prototype of an open 3D asset platform at TU Delft by the NewMedia Centre and Library Learning Centre. This platform connects Smithsonian&#8217;s Voyager 3D viewer with the TU Delft Repository via a custom API. When users upload models, the system uses SURF SSO to handle governance, and the API makes it possible to link 3D assets to related publications, departments, and DOIs for referencing.

This session gives you a chance to:
&#8226;	See a working demo of OPEN3D, including Voyager with live 3D annotations and storytelling
&#8226;	Discover how the platform enables reuse on the web &amp; XR tools like Unity and Unreal
&#8226;	Think more broadly about presentation and curation possibilities for 3D assets
&#8226;	Ask questions, share your own challenges, and contribute ideas for improvement or cooperation  
This session is beneficial if you&#8217;re wondering how to integrate 3D pipelines into XR workflows, ensure long-term access to 3D assets, or support your heritage/science content in XR with additional content through the API.

Bring your thoughts, use cases, plans or problems&#8212;and let&#8217;s build something useful together.

Who it&apos;s for:
If you&apos;re into XR design or development and work with 3D models or scans at a university or institution, this is for you. Same goes for folks in research infrastructure, heritage orgs with 3D scans, educators, data stewards, or anyone into digital preservation or curation&#8212;especially if you&apos;re looking for real, workable open 3D/XR solutions.</description>
                <recording>
                    <license></license>
                    <optout>false</optout>
                </recording>
                <links></links>
                <attachments></attachments>

                <url>https://pretalx.surf.nl/national-xr-day-2025/talk/7TCVSG/</url>
                <feedback_url>https://pretalx.surf.nl/national-xr-day-2025/talk/7TCVSG/feedback/</feedback_url>
            </event>
            
        </room>
        <room name='Theil C1-3 (60p)' guid='9c37491f-53a2-5bac-8e4b-0aa93e08a0c6'>
            <event guid='0cff8ad8-9941-5675-b93c-50836027dfc3' id='2506'>
                <room>Theil C1-3 (60p)</room>
                <title>VR-Based Experiential Learning: Enhancing Understanding of Health Inequalities in Higher Education</title>
                <subtitle></subtitle>
                <type>Presentation [20 + 5 Q&amp;A]</type>
                <date>2025-07-02T11:15:00+02:00</date>
                <start>11:15</start>
                <duration>00:25</duration>
                <abstract>This session presents a VR-based experiential learning workshop designed to teach psychology students about systemic and community-level barriers to mental health. Evaluation and exam results suggest that immersive learning enhances understanding and retention of health inequalities and may reduce disparities in student performance.</abstract>
                <slug>national-xr-day-2025-2506-vr-based-experiential-learning-enhancing-understanding-of-health-inequalities-in-higher-education</slug>
                <track>Impact</track>
                
                <persons>
                    <person id='244'>Jeanette Hadaschik</person><person id='1045'>Nina Krohne</person>
                </persons>
                <language>en</language>
                <description>Can VR-based experiential learning foster empathy and perspective-taking to improve students&apos; understanding of complex social issues?

We present a VR-based workshop concept designed to help bachelor&apos;s and master&apos;s psychology students understand how systemic and community-level barriers contribute to socio-economic mental health disparities. In five workshops, students were briefly immersed in contrasting VR neighbourhoods&#8212;half experienced a deprived, the other half a middle-class setting&#8212;and asked to imagine living in the community while being the target of a mental health intervention (e.g. an online mindfulness programme). They reflected on how neighbourhood conditions might support or hinder their ability to benefit from the intervention and influence required behaviour change.

Afterwards, students participated in guided group reflection, using their VR experiences to explore how community-level factors (e.g. crime, safety, social cohesion, access to green space) affect the success of interventions&#8212;and how these may unintentionally reinforce inequalities by favouring those in middle-class environments.

Evaluation data from 24 bachelor&#8217;s and 15 master&#8217;s students showed high scores in empathy, systemic understanding, and the perceived value of VR for learning about health inequalities. Notably, there were no differences between students who experienced the deprived versus the middle-class VR neighbourhood. 

We also analysed exam performance from two sessions of third-year bachelor students (n=15 and n=16) in the Public Mental Health module. The question directly related to the VR workshop had the highest average score (93.75%) and a low standard deviation (0.34), indicating strong retention and reduced variability in student performance compared to questions based on traditional teaching methods.

These findings suggest that immersive VR learning not only deepens understanding of complex social issues, but may also promote more equitable learning outcomes.</description>
                <recording>
                    <license></license>
                    <optout>false</optout>
                </recording>
                <links></links>
                <attachments></attachments>

                <url>https://pretalx.surf.nl/national-xr-day-2025/talk/LYLUWA/</url>
                <feedback_url>https://pretalx.surf.nl/national-xr-day-2025/talk/LYLUWA/feedback/</feedback_url>
            </event>
            <event guid='a9d893e1-7cd5-5c33-b38d-4fa56167a8ea' id='4201'>
                <room>Theil C1-3 (60p)</room>
                <title>(CANCELLED) XR-enhanced learning in healthcare: four Experimental Labs as drivers of innovation</title>
                <subtitle></subtitle>
                <type>Presentation [20 + 5 Q&amp;A]</type>
                <date>2025-07-02T11:50:00+02:00</date>
                <start>11:50</start>
                <duration>00:25</duration>
                <abstract>Note: unfortunately, this presentation has been cancelled

In this presentation, I&#8217;ll take you inside the four cutting-edge Experimental Labs launched in 2025 under the DUTCH program. These labs test VR, AR, and other XR tools in real-world settings&#8212;bridging innovation and education to build a smarter, faster, and more hands-on way to train future healthcare professionals.</abstract>
                <slug>national-xr-day-2025-4201-cancelled-xr-enhanced-learning-in-healthcare-four-experimental-labs-as-drivers-of-innovation</slug>
                <track>Impact</track>
                
                <persons>
                    <person id='1105'>Femke Dijkstra</person>
                </persons>
                <language>en</language>
                <description>To address the growing demand for well-trained healthcare professionals&#8212;such as operating room assistants, anesthesia technicians, and radiology technologists&#8212;it is essential to expand educational capacity through the integration of emerging learning technologies. To accelerate this process, four Experimental Labs were established in early 2025 as part of DUTCH, a program funded by the Dutch National Growth Fund. These labs serve as a vital link between innovation and real-world implementation.

In this presentation, I will provide an inside look into the four labs and how they contribute to the development of an adaptive learning platform for healthcare education. The labs act as experimental spaces where existing and newly developed XR-based learning tools are tested in realistic settings for their educational value, usability, and scalability. Examples include VR surgical simulations, AR-supported training modules, and physical simulators integrated with digital feedback systems.

The goal is to create an effective and scalable learning platform that enables students to train faster, more effectively, and with greater exposure to practical, hands-on experiences. The approach is interdisciplinary: educators, technology partners, and healthcare institutions work closely together to validate, refine, and integrate these learning tools into the educational practice.</description>
                <recording>
                    <license></license>
                    <optout>false</optout>
                </recording>
                <links></links>
                <attachments></attachments>

                <url>https://pretalx.surf.nl/national-xr-day-2025/talk/7KNKXU/</url>
                <feedback_url>https://pretalx.surf.nl/national-xr-day-2025/talk/7KNKXU/feedback/</feedback_url>
            </event>
            <event guid='7061f06b-990d-5308-9531-e2fdb0ecbcaa' id='2524'>
                <room>Theil C1-3 (60p)</room>
                <title>Unite! Professional Learning Group to promote immersive learning in Higher Education</title>
                <subtitle></subtitle>
                <type>Presentation [20 + 5 Q&amp;A]</type>
                <date>2025-07-02T13:45:00+02:00</date>
                <start>13:45</start>
                <duration>00:25</duration>
                <abstract>How do you keep immersive learning alive in times of budget cuts, GenAI FOMO, busy schedules and Big Tech monopolies? Learn how pioneering lecturers and support staff of Hogeschool Rotterdam come together in a professional learning community. Together they learn new skills, discuss their experiences and even create DIY XR learning interventions.</abstract>
                <slug>national-xr-day-2025-2524-unite-professional-learning-group-to-promote-immersive-learning-in-higher-education</slug>
                <track>Impact</track>
                
                <persons>
                    <person id='1051'>Maaike Compagnie</person>
                </persons>
                <language>en</language>
                <description>Immersive learning, if done correctly, promotes a deeper understanding of the subject matter. Unfortunately, gaining access to high-quality content can be a challenge: it may be expensive or not tailored to the specific context. At Hogeschool Rotterdam, we started a Professional Learning Group (PLG) to create our own content. A group of pioneering lecturers and support staff come together to learn new skills (step 1), discuss experiences (step 2), and build an XR learning intervention such as a 360 video or AR application (step 3). Every member has their own educational challenge to tackle, but it is a journey we take on together. We communicate our findings to the rest of Hogeschool Rotterdam via podcasts and posters. Learn what we learned and join us in this interactive session!</description>
                <recording>
                    <license></license>
                    <optout>false</optout>
                </recording>
                <links></links>
                <attachments></attachments>

                <url>https://pretalx.surf.nl/national-xr-day-2025/talk/HWKEWT/</url>
                <feedback_url>https://pretalx.surf.nl/national-xr-day-2025/talk/HWKEWT/feedback/</feedback_url>
            </event>
            <event guid='a514d0ae-d567-5a59-ac51-abf530224188' id='2529'>
                <room>Theil C1-3 (60p)</room>
                <title>&#127475;&#127473; Getijden: A multidisciplinary residency as an immersive teaching format</title>
                <subtitle></subtitle>
                <type>Presentation [20 + 5 Q&amp;A]</type>
                <date>2025-07-02T14:20:00+02:00</date>
                <start>14:20</start>
                <duration>00:25</duration>
                <abstract>No timetable, no separate courses or instruction moments, but two weeks of working on location together with students and teachers from other departments on an interactive multi-sensory installation, culminating not in an assessment but in an exhibition.</abstract>
                <slug>national-xr-day-2025-2529-getijden-een-multi-disciplinaire-residentie-als-immersieve-lesvorm</slug>
                <track>Impact</track>
                
                <persons>
                    <person id='1103'>Oskar Maan</person>
                </persons>
                <language>en</language>
                <description>In today&#8217;s &#8220;war for attention&#8221;, companies compete at every level for the attention of their customers (read: students), and in its current form, education is losing that battle. Banning social media or phones in class only fights the symptoms.

Getijden is a pilot in which we worked full-time on location for two weeks with students of the Grafisch Lyceum Rotterdam. A location-based immersive installation was developed in co-creation with teachers. All students were fully released from their timetables for two weeks, which almost never happens in vocational education (MBO), to collaborate with students from four different departments.</description>
                <recording>
                    <license></license>
                    <optout>false</optout>
                </recording>
                <links></links>
                <attachments></attachments>

                <url>https://pretalx.surf.nl/national-xr-day-2025/talk/DRMQU7/</url>
                <feedback_url>https://pretalx.surf.nl/national-xr-day-2025/talk/DRMQU7/feedback/</feedback_url>
            </event>
            <event guid='2f5f1228-8de7-5144-9ed3-39b5579e1a76' id='2499'>
                <room>Theil C1-3 (60p)</room>
                <title>VR in the Library: Mindfulness Meets Privacy</title>
                <subtitle></subtitle>
                <type>Presentation [20 + 5 Q&amp;A]</type>
                <date>2025-07-02T14:55:00+02:00</date>
                <start>14:55</start>
                <duration>00:25</duration>
                <abstract>How can university libraries offer VR headsets for wellbeing while safeguarding privacy and data security? This session provides a practical roadmap for responsibly implementing VR in public spaces, based on Maastricht University Library&#8217;s privacy-by-default approach.</abstract>
                <slug>national-xr-day-2025-2499-vr-in-the-library-mindfulness-meets-privacy</slug>
                <track>Impact</track>
                
                <persons>
                    <person id='1038'>Katja Shcherbakova</person><person id='1091'>Meike Kerkhofs</person>
                </persons>
                <language>en</language>
                <description>Curious how your institution can offer VR for wellbeing while taking user privacy into account? You&#8217;re not alone. With the increasing interest in VR headsets for education and mental health, libraries face the challenge of creating safe, inclusive spaces that balance innovation with responsibility.

In this session, you will learn how Maastricht University Library designed a VR offering that supports student wellbeing &#8212; without ignoring privacy and GDPR concerns. Discover how the library applies the privacy by default approach to make immersive technologies like the Meta Quest 3 available in open spaces while keeping user and environmental data protected. From policy development and secure implementation to ethical decision-making, you&apos;ll gain insight into each step of the roadmap. 

Specifically, you&#8217;ll walk away with:

- A clear understanding of how to apply privacy-by-default principles to public VR use
- Practical considerations for implementing VR in semi-open environments
- Lessons from an academic library balancing accessibility, wellbeing, and data protection

This session is especially relevant for library professionals, data stewards, digital learning coordinators, and anyone working with XR or wellbeing in higher education. If you&apos;re navigating similar questions about how to responsibly integrate new technologies in your institution, this session offers concrete examples, pitfalls to avoid, and inspiration to get started. 

You&#8217;ll also hear early feedback from end users and learn how the project continues to evolve &#8212; providing a real-world look at the future of wellbeing in academic libraries.</description>
                <recording>
                    <license></license>
                    <optout>false</optout>
                </recording>
                <links></links>
                <attachments></attachments>

                <url>https://pretalx.surf.nl/national-xr-day-2025/talk/SKGGKD/</url>
                <feedback_url>https://pretalx.surf.nl/national-xr-day-2025/talk/SKGGKD/feedback/</feedback_url>
            </event>
            
        </room>
        <room name='Theil C1-4 (60p)' guid='c5efa9f0-c775-57c7-9601-cd6bc9f8bb0e'>
            <event guid='25a5f535-837d-5450-9d17-eef6248f05f4' id='2477'>
                <room>Theil C1-4 (60p)</room>
                <title>Exploring the future of XR for transport and beyond</title>
                <subtitle></subtitle>
                <type>Presentation [20 + 5 Q&amp;A]</type>
                <date>2025-07-02T11:15:00+02:00</date>
                <start>11:15</start>
                <duration>00:25</duration>
                <abstract>Do you want to know how XR is going to help revolutionize the future of transportation? Don&apos;t miss this session! You&apos;ll learn how Human Factors, social science, applied psychology, cognitive research, immersive systems, and XR development together drive research and education in human interaction within transport environments.</abstract>
                <slug>national-xr-day-2025-2477-exploring-the-future-of-xr-for-transport-and-beyond</slug>
                <track>Impact</track>
                
                <persons>
                    <person id='24'>Jan-Willem van &apos;t Klooster</person><person id='1026'>Simone Borsci</person>
                </persons>
                <language>en</language>
                <description>How do humans interact with complex systems like automated and non-automated vehicles? How can AI and robotics assist drivers and passengers? And how can you explore these interactions in a realistic, immersive way?

We are exploring that at the Oost Simulation Centre, supported by the University of Twente BMS LAB and Green Dino.

You are invited to join us in exploring a frontier of simulation-based social science research. Launched in 2025, this national initiative brings together top-tier cognitive science expertise, companies and institutions, and advanced XR simulation technology, offering a unique space to investigate how people behave in high-tech environments&#8212;whether in cars, trains, or digital worlds.

Through access to fully equipped simulators and immersive tools, you dive into real-time scenarios that reveal how humans engage with digital systems. From piloting simulations to broader educational XR activities in transport settings, you will broaden your perspective on technology&apos;s role in society.

By the end of this session, you:

-	Understand how simulation and XR technology can be used to study real-world human behavior in controlled, repeatable environments, both in educational and in research projects.
-	Gain insight into designing human-centered systems, from transportation to AI interfaces.
-	Discover how simulation research can directly involve and benefit citizens through technology and public engagement.
-	Gain insight into how you can measure the above.

Explore the Oost Simulation Hub and connect with a growing network of experts in transport, higher education, industry, and beyond. This is your gateway to next-level multidisciplinary simulation research and education!</description>
                <recording>
                    <license></license>
                    <optout>false</optout>
                </recording>
                <links></links>
                <attachments></attachments>

                <url>https://pretalx.surf.nl/national-xr-day-2025/talk/UCKRVR/</url>
                <feedback_url>https://pretalx.surf.nl/national-xr-day-2025/talk/UCKRVR/feedback/</feedback_url>
            </event>
            <event guid='a8d24549-7df3-53dd-b7f9-a8c60d8658dd' id='2453'>
                <room>Theil C1-4 (60p)</room>
                <title>&#127475;&#127473; Exergaming with XR: Impact on Exercise Behavior and Health</title>
                <subtitle></subtitle>
                <type>Presentation [20 + 5 Q&amp;A]</type>
                <date>2025-07-02T11:50:00+02:00</date>
                <start>11:50</start>
                <duration>00:25</duration>
                <abstract>This study, conducted in collaboration with students, shows that XR and full-body tracking through exergaming (a combination of exercise &amp; gaming) can promote physical activity and a positive attitude towards sport. In the Active Esports Arena (AEA), participants found the technology motivating and user-friendly, indicating potential for a more active lifestyle, especially among young adults.</abstract>
                <slug>national-xr-day-2025-2453-exergaming-met-xr-impact-op-beweeggedrag-en-gezondheid</slug>
                <track>Impact</track>
                
                <persons>
                    <person id='989'>Bertrik Berfelo</person><person id='996'>Jasper van Wetten</person>
                </persons>
                <language>en</language>
                <description>This study examined the influence of Extended Reality (XR) and full-body tracking on sports participation and attitudes towards sport. XR encompasses Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR): technologies that combine the physical and virtual worlds. Exergaming, a combination of exercise and gaming, is expected to promote physical activity and fitness.
In the test case, the Active Esports Arena (AEA) by PWXR was used. Nearly 74% of participants felt that the AEA makes sport accessible, and 90% rated its user-friendliness highly. Over 78% had never experienced physical gaming before, and 91% did not know the term exergaming.
The intensity was rated as medium by 42% and as high by 37% of participants. Exergaming motivated 57% of respondents to exercise more. Key motivators were lower costs (31%) and better access to equipment (30%), with benefits such as enjoyment of sport (38%) and increased motivation (28%). Expected effects included improved fitness, coordination, mental health, and weight loss.
VR fitness games were the most popular, followed by AR sports games and dance and rhythm games. VR is the most desired technology, followed by AR and MR. In conclusion: XR-based exergaming is promising for promoting physical activity and encouraging a healthier lifestyle, especially among young adults. The AEA proved to be safe, fun, and effective.</description>
                <recording>
                    <license></license>
                    <optout>false</optout>
                </recording>
                <links></links>
                <attachments></attachments>

                <url>https://pretalx.surf.nl/national-xr-day-2025/talk/MHQBWZ/</url>
                <feedback_url>https://pretalx.surf.nl/national-xr-day-2025/talk/MHQBWZ/feedback/</feedback_url>
            </event>
            <event guid='02ea2446-802e-579a-a75c-d95382644bbd' id='2454'>
                <room>Theil C1-4 (60p)</room>
                <title>&#127475;&#127473; An inspiring vision of the future: XR in post-secondary education in 2032</title>
                <subtitle></subtitle>
                <type>Birds of a feather</type>
                <date>2025-07-02T13:45:00+02:00</date>
                <start>13:45</start>
                <duration>00:45</duration>
                <abstract>From the Npuls XR pilot hub, we have written an inspiring vision of the future for XR in post-secondary education in 2032. We did this by gathering input from Npuls, educational institutions and other partners, and processing it into a short description and visuals. We would like to discuss the results with the XR community, as well as how we can jointly follow up on them.</abstract>
                <slug>national-xr-day-2025-2454-een-inspirerend-toekomstbeeld-xr-in-het-vervolgonderwijs-van-2032</slug>
                <track>Impact</track>
                
                <persons>
                    <person id='339'>Jeroen Boots</person><person id='995'>Angel Schols</person><person id='1009'>Diane Smits</person>
                </persons>
                <language>en</language>
                <description>From the Npuls XR pilot hub, we have written an inspiring vision of the future for XR in post-secondary education in 2032. We did this by gathering input from Npuls, educational institutions, and partners such as SURF, DUTCH and OASIS, and processing it into a short description and visuals. In an interactive session with the XR community, we would like to discuss the results. We will present the vision and then open the conversation about it. We also want to discuss with the group what is needed to actually realise this vision. Central to the session is therefore the question: what do you think of this vision, and how do we make it a reality?
No specific XR expertise is needed for this session. It is well suited to professionals working on the sound application of XR in post-secondary education and on the future of post-secondary education. In addition to their work for Npuls, all three speakers also work for an educational institution in MBO, HBO or WO.</description>
                <recording>
                    <license></license>
                    <optout>false</optout>
                </recording>
                <links></links>
                <attachments></attachments>

                <url>https://pretalx.surf.nl/national-xr-day-2025/talk/DBRNFG/</url>
                <feedback_url>https://pretalx.surf.nl/national-xr-day-2025/talk/DBRNFG/feedback/</feedback_url>
            </event>
            <event guid='b585cb34-b515-5733-aa47-8ec941f8850c' id='4210'>
                <room>Theil C1-4 (60p)</room>
                <title>Added Value of AI based Mixed-Reality in Cranial Neurosurgery</title>
                <subtitle></subtitle>
                <type>Presentation [20 + 5 Q&amp;A]</type>
                <date>2025-07-02T14:55:00+02:00</date>
                <start>14:55</start>
                <duration>00:25</duration>
                <abstract>Accurate three-dimensional (3-D) appreciation of cranial anatomy is essential yet difficult to achieve with conventional two-dimensional (2-D) imaging. Lumi, an AI-driven, cloud-based mixed-reality (MxR) platform that originated from and is studied in close cooperation with the UMC Utrecht, automatically converts DICOM data into patient-specific holograms that can be explored on a head-mounted display or 2-D screen. In this presentation we summarise the current bench-to-bedside evidence base to quantify Lumi&#8217;s effect on spatial insight, surgical planning and task performance.</abstract>
                <slug>national-xr-day-2025-4210-added-value-of-ai-based-mixed-reality-in-cranial-neurosurgery</slug>
                <track>Technology</track>
                
                <persons>
                    <person id='1124'>Tristan van Doormaal</person>
                </persons>
                <language>en</language>
                <description>Methods
Evidence was drawn from an ex-vivo phantom study (n = 36), five prospective clinical series (n = 202) and a randomised crossover phantom trial of external-ventricular-drain (EVD) placement (n = 236). The predefined clinical end-point was the proportion of cases in which the surgical plan (derived from 2-D imaging) was meaningfully modified after review of the Lumi hologram in Mixed Reality (change in positioning, incision, craniotomy, or trajectory). Technical accuracy, workflow metrics and user-reported workload were also recorded.

Results
Ex-vivo study: Mean tumour-overlap improved by 50&#8211;72 % versus MRI alone and by 26&#8211;52 % versus a flat-screen 3-D viewer, irrespective of participant expertise.
Mixed-indication cohort (n = 107): A consecutive study of a mixed case series in neurosurgery (oncology, vascular, functional, skull base). Mixed Reality preparation prompted plan modification in 2.8 % without prolonging theatre preparation.
Posterior-fossa tumours (n = 21): Head position or craniotomy was altered in 3 patients (14 %), contributing to &gt; 90 % resection in 81 % of cases (71 % mRS 0&#8211;1).
Extracranial&#8211;intracranial bypass (n = 10): Skin-incision adaptation occurred in 70 %, shortening arterial mapping and reducing donor-vessel injury to 0 % versus 30 % in historical controls.
Carotid endarterectomy (n = 39): Incision tailoring was noted in 13 patients (33 %), caused by depiction of bifurcation height and plaque length.
MCA aneurysm clipping (n = 25): Lumi altered the approach or head orientation in 8 %. The mean skin-to-skin time was 26 min shorter than in a matched historical cohort (203 &#177; 64 vs 229 &#177; 78 min).
EVD placement phantom trial (n=236): Optimal catheter positioning (Kakarla-1) rose from 37.3 % to 57.6 % and gross misplacement fell from 40.7 % to 21.2 %; distance-to-target and angular error were significantly lower, at the cost of a 6-min median time penalty. NASA-TLX workload and SUS usability scores remained favourable throughout studies.

Conclusions
Across diverse cranial indications, Lumi mixed-reality provides three consistent advantages: (i) objectively superior 3-D anatomical insight, (ii) clinically meaningful modifications of operative strategy in 3&#8211;39 % of cases, and (iii) quantifiable gains in technical accuracy without workflow penalty. The platform thereby showed a low risk profile. Ongoing multicentre trials and structured post-market surveillance will further delineate long-term outcome effects, but current data already position Lumi with its combination of AI and MxR as a valuable adjunct in modern neurosurgery.</description>
                <recording>
                    <license></license>
                    <optout>false</optout>
                </recording>
                <links></links>
                <attachments></attachments>

                <url>https://pretalx.surf.nl/national-xr-day-2025/talk/3WUDPQ/</url>
                <feedback_url>https://pretalx.surf.nl/national-xr-day-2025/talk/3WUDPQ/feedback/</feedback_url>
            </event>
            
        </room>
        <room name='Theil C1-6 (60p)' guid='a7bd6591-f9da-5346-9529-2600e5121cdb'>
            <event guid='ceb7a196-440b-55e9-a25b-928dea433842' id='2523'>
                <room>Theil C1-6 (60p)</room>
                <title>We don&apos;t need very much: Stories of innovation for impact in simulation education from the African Simulation Network</title>
                <subtitle></subtitle>
                <type>Featured presentation</type>
                <date>2025-07-02T11:15:00+02:00</date>
                <start>11:15</start>
                <duration>00:45</duration>
                <abstract>Simulation-based education in healthcare gives learners the opportunity to experience a replication of a healthcare event and then to learn through facilitated reflection in a debrief. Commonly, simulation is used to help healthcare workers prepare for emergencies, but it can also be used, among many other applications, to test healthcare systems, improve patient safety, and help healthcare teams work better together.</abstract>
                <slug>national-xr-day-2025-2523-we-don-t-need-very-much-stories-of-innovation-for-impact-in-simulation-education-from-the-african-simulation-network</slug>
                <track>Impact</track>
                
                <persons>
                    <person id='1057'>Jocelyn Park-Ross</person>
                </persons>
                <language>en</language>
                <description>Much of what we know about simulation, and how we practise it, is based on practices and knowledge created in high-income, high-resource settings; there is no upper limit on how much can be spent on simulation equipment, and this cost is a barrier to growth. Within low-resource settings, innovation is key to providing high-quality, contextual educational experiences that grow healthcare workers who are equipped through their training to provide high-quality healthcare in African healthcare systems.</description>
                <recording>
                    <license></license>
                    <optout>false</optout>
                </recording>
                <links></links>
                <attachments></attachments>

                <url>https://pretalx.surf.nl/national-xr-day-2025/talk/3HJWQ3/</url>
                <feedback_url>https://pretalx.surf.nl/national-xr-day-2025/talk/3HJWQ3/feedback/</feedback_url>
            </event>
            <event guid='dbed07a6-210e-5272-a7a5-a3fb6988864e' id='2472'>
                <room>Theil C1-6 (60p)</room>
                <title>3 Years of Experience with Unreal Fluid Dynamics in Higher Education: The Ups and Downs</title>
                <subtitle></subtitle>
                <type>Presentation [20 + 5 Q&amp;A]</type>
                <date>2025-07-02T13:45:00+02:00</date>
                <start>13:45</start>
                <duration>00:25</duration>
                <abstract>As a Civil Engineering lecturer, I developed a 3D VR multiplayer learning environment in Unreal Engine to teach fluid mechanics in Civil Engineering. Hosted on a server at Avans Hogeschool, it has been used in education for three years. I&#8217;ll discuss the original idea, development process, educational impact, challenges, and future plans.</abstract>
                <slug>national-xr-day-2025-2472-3-years-of-experience-with-unreal-fluid-dynamics-in-higher-education-the-ups-and-downs</slug>
                <track>Impact</track>
                
                <persons>
                    <person id='1014'>Henk Massink</person>
                </persons>
                <language>en</language>
                <description>The multiplayer project is built in Unreal Engine and runs on a server at Avans Hogeschool. I will present a demo.

Topics I want to discuss:
&#8226; The original idea
&#8226; The development process
&#8226; The &quot;summative&quot; use in education (over three years)
&#8226; The ups and downs
&#8226; The future

https://fluiddynamics.itch.io/unreal-fluid-dynamics</description>
                <recording>
                    <license></license>
                    <optout>false</optout>
                </recording>
                <links></links>
                <attachments></attachments>

                <url>https://pretalx.surf.nl/national-xr-day-2025/talk/UD9SBQ/</url>
                <feedback_url>https://pretalx.surf.nl/national-xr-day-2025/talk/UD9SBQ/feedback/</feedback_url>
            </event>
            <event guid='3a25c051-9958-558e-8855-dc95910a6b32' id='2464'>
                <room>Theil C1-6 (60p)</room>
                <title>Mixed Reality meets geospatial data: Exploring a new era of immersive collaborative planning</title>
                <subtitle></subtitle>
                <type>Presentation [20 + 5 Q&amp;A]</type>
                <date>2025-07-02T14:20:00+02:00</date>
                <start>14:20</start>
                <duration>00:25</duration>
                <abstract>Physical maps have long played a crucial role in planning human activities; we aim to bring this experience into the 21st century by projecting them into a shared space using MR headsets. We tap into a set of GIS data to visualize and interact with data from a variety of sea basins, using virtual tools to explore how MR can support collaboration in the planning of marine environments.</abstract>
                <slug>national-xr-day-2025-2464-mixed-reality-meets-geospatial-data-exploring-a-new-era-of-immersive-collaborative-planning</slug>
                <track>Technology</track>
                
                <persons>
                    <person id='142'>Wilco Boode</person>
                </persons>
                <language>en</language>
                <description>Immerse yourself in the future of collaborative planning as we explore the transformative potential of Mixed Reality (MR) technologies. With recent advancements in technical capabilities and reduced costs, MR is now more accessible than ever, offering unprecedented opportunities to bridge the gap between digital content and physical spaces.
In this session, we delve into our groundbreaking project that digitizes traditional geospatial planning using cutting-edge XR technology. By leveraging publicly available GIS data, we bring the intricate details of a sea basin to life within a shared physical environment. Our goal isn&apos;t just to visualize data; we aim to empower users to collaborate and interact with it in innovative ways, unlocking new possibilities for human activities.
Join us as we navigate the challenges and triumphs of this ambitious endeavor. From user experience design to data management and spatial synchronization, we&apos;ll share our journey and the insights gained from our initial research. Discover how MR can revolutionize collaborative planning and pave the way for more intuitive and effective decision-making processes.
Don&apos;t miss this opportunity to witness the future of geospatial planning and be inspired by the potential of Mixed Reality technologies. Hear more about the application we developed, and gain insight into the results of our first study!</description>
                <recording>
                    <license></license>
                    <optout>false</optout>
                </recording>
                <links></links>
                <attachments></attachments>

                <url>https://pretalx.surf.nl/national-xr-day-2025/talk/YYEM3R/</url>
                <feedback_url>https://pretalx.surf.nl/national-xr-day-2025/talk/YYEM3R/feedback/</feedback_url>
            </event>
            <event guid='e6bbb70c-ea81-53ce-813a-dc192f009676' id='2470'>
                <room>Theil C1-6 (60p)</room>
                <title>Gaussian Splatting in XR: Photoreal 6DoF</title>
                <subtitle></subtitle>
                <type>Birds of a feather</type>
                <date>2025-07-02T14:55:00+02:00</date>
                <start>14:55</start>
                <duration>00:45</duration>
                <abstract>Since its initial release in 2023, Gaussian splatting has fundamentally altered the way we can create photoreal 3D scenes for XR experiences. This workshop discusses its applications in an educational context. The evolving technical aspects of splat creation and workflows will be covered in depth, followed by a live demonstration that showcases an ongoing criminology project at Leiden University.</abstract>
                <slug>national-xr-day-2025-2470-gaussian-splatting-in-xr-photoreal-6dof</slug>
                <track>Technology</track>
                
                <persons>
                    <person id='1008'>Nathan Saucier</person>
                </persons>
                <language>en</language>
                <description>3D Gaussian Splatting (3DGS), first released by Inria in 2023, is a novel method of constructing 3D scenes from simple photos or videos. Belonging to the broader category of radiance fields, 3DGS creates photoreal environments that can be quickly rendered in real time, marking a departure from previous methods like NeRF.  

Across 3D applications, this means that scenes or objects may be quickly constructed, providing a new means of rapid prototyping. VR creators have long had to contend with lengthy 3D modeling times in order to achieve results that approach photorealism, or compromise with lower-fidelity techniques such as photogrammetry. 3DGS aims to solve these problems, delivering highly realistic renderings with far fewer working hours, making it an attractive solution for XR. 

Nonetheless, the field is quite new and not without a number of stumbling blocks. These can largely be solved through 3DGS-optimized workflows and (of course) by adding significantly more compute capabilities with the use of virtual machines. This workshop covers these aspects in depth, and culminates in a live demonstration of a strong potential use case: urban planning interventions. 

Together with the City of Leiden and the Leiden Law School, the Leiden Learning and Innovation Centre has created a prototype that explores urban safety research in criminology by simulating various interventions using 3DGS scenes of municipal locations. These simulations allow planners and policymakers to &#8220;pre-visualize&#8221; proposed interventions and test them with research participants, providing a model to evaluate efficacy prior to physical implementation.</description>
                <recording>
                    <license></license>
                    <optout>false</optout>
                </recording>
                <links></links>
                <attachments></attachments>

                <url>https://pretalx.surf.nl/national-xr-day-2025/talk/BSTVCH/</url>
                <feedback_url>https://pretalx.surf.nl/national-xr-day-2025/talk/BSTVCH/feedback/</feedback_url>
            </event>
            
        </room>
        <room name='Theil C2-3 (22p)' guid='76befb32-950c-56d3-bf32-a80bba5ef3d6'>
            <event guid='83d0e4e4-c592-5e0e-bc93-034f5bc07a6d' id='2498'>
                <room>Theil C2-3 (22p)</room>
                <title>Finding the Spark: Designing for Resonance and Transformation (workshop)</title>
                <subtitle></subtitle>
                <type>Workshop [60]</type>
                <date>2025-07-02T11:15:00+02:00</date>
                <start>11:15</start>
                <duration>01:00</duration>
                <abstract>Transformation starts with resonance; resonance begins with a spark.
In this workshop, we&#8217;ll explore what makes immersive experiences emotionally impactful. Together we&#8217;ll examine the concept of transformative experiences, compare practical insights, and gain new perspectives on designing the XR of the future.</abstract>
                <slug>national-xr-day-2025-2498-finding-the-spark-designing-for-resonance-and-transformation-workshop</slug>
                <track>Impact</track>
                
                <persons>
                    <person id='1043'>Nick Degens</person><person id='1036'>Christian Roth</person><person id='1037'>Joanneke Weerdmeester</person>
                </persons>
                <language>en</language>
                <description>What makes artistic immersive experiences emotionally impactful? In this workshop, we will invite participants to use our Transformative Spark framework to explore different XR-works and identify moments that help create a personal &#8216;spark&#8217;.
By reflecting on and comparing practical experiences with others, we will uncover patterns in how resonance and meaning emerge from interactivity.
Rather than focusing solely on theory, this hands-on workshop invites participants to step into the roles of interactor, methodologist, and designer. Through experience, guided discussion and analysis, participants will engage deeply with XR, asking: what resonates with them, and why? How do their insights and experiences compare to those of others? 
Using the Transformative Spark framework, we will sharpen our understanding about how design choices and interactions shape meaningful and transformative artistic experiences. By the end, participants will leave with new perspectives on designing for resonance and impact and the diverse ways people connect with XR and themselves.</description>
                <recording>
                    <license></license>
                    <optout>false</optout>
                </recording>
                <links></links>
                <attachments></attachments>

                <url>https://pretalx.surf.nl/national-xr-day-2025/talk/CZBNVP/</url>
                <feedback_url>https://pretalx.surf.nl/national-xr-day-2025/talk/CZBNVP/feedback/</feedback_url>
            </event>
            <event guid='9d01a13f-5288-54cc-991d-5b0fc564933d' id='2476'>
                <room>Theil C2-3 (22p)</room>
                <title>How to build a Dutch street in VR (workshop)</title>
                <subtitle></subtitle>
                <type>Workshop [45]</type>
                <date>2025-07-02T13:45:00+02:00</date>
                <start>13:45</start>
                <duration>00:45</duration>
                <abstract>In this workshop we will explore the technical challenges of building a large-scale Dutch virtual urban environment in Unity, something that does not yet exist. We will discuss our approach to making development efficient through scalable modularity, building custom editor tools, populating the environment with virtual humans, and more.</abstract>
                <slug>national-xr-day-2025-2476-how-to-build-a-dutch-street-in-vr-workshop</slug>
                <track>Technology</track>
                
                <persons>
                    <person id='1018'>Timon Verduijn</person><person id='1151'>Thomas Ginn</person>
                </persons>
                <language>en</language>
                <description>It is difficult to perform research in complex urban environments such as a shopping street, because they cannot be controlled. Yet for research results to be valid, experiments must reflect real-world scenarios. To resolve this tension, a virtual world is needed. 

In this workshop we will explore the technical challenges of building a large-scale Dutch virtual urban environment in Unity, something that does not yet exist. We will discuss our approach to making development efficient through scalable modularity, building custom editor tools, populating the environment with virtual humans, and more.

This workshop is hosted by the Wander XR Experience Lab at Wageningen University &amp; Research and will focus on one of our current projects: the Immersive Virtual Reality (IVR) Food Environment. 

We will begin with a brief introduction to the context and background of the project, explaining the motivation behind creating a virtual shopping street. 

Following this, we will dive into the technical challenges encountered during development. Through a step-by-step breakdown, we will showcase the technical and artistic decisions we made to overcome these challenges. 

This session is ideal for developers, researchers, and designers interested in building large-scale virtual environments, procedural content generation, and interactive urban simulations. Attendees will gain insights into practical solutions for optimizing workflows, enhancing interactivity, and managing complexity when building VR environments.</description>
                <recording>
                    <license></license>
                    <optout>false</optout>
                </recording>
                <links></links>
                <attachments></attachments>

                <url>https://pretalx.surf.nl/national-xr-day-2025/talk/MKAMZF/</url>
                <feedback_url>https://pretalx.surf.nl/national-xr-day-2025/talk/MKAMZF/feedback/</feedback_url>
            </event>
            <event guid='f8ec9557-a688-5fe1-a44d-78b7b5590e6c' id='2525'>
                <room>Theil C2-3 (22p)</room>
                <title>&#127475;&#127473; Teaching with VR: experience it yourself! (workshop)</title>
                <subtitle></subtitle>
                <type>Workshop [45]</type>
                <date>2025-07-02T14:55:00+02:00</date>
                <start>14:55</start>
                <duration>00:45</duration>
                <abstract>Note: this session will be in Dutch!

Discover how the Dutch Police Academy uses VR in education.</abstract>
                <slug>national-xr-day-2025-2525-lesgeven-met-vr-ervaar-het-zelf-workshop</slug>
                <track>Impact</track>
                
                <persons>
                    <person id='1053'>Giny Verschoor</person><person id='1100'>Bart Sanders</person>
                </persons>
                <language>en</language>
                <description>Curious about the possibilities of virtual reality (VR) in education? In this interactive workshop by the Dutch Police Academy, you&#8217;ll literally step into the world of VR and experience what it&#8217;s like to teach and learn in a virtual environment.

We&#8217;ll begin with a brief introduction on how we use VR within police education. You&#8217;ll get insight into our development approach, the training goals we aim to achieve, and practical examples of how VR is integrated into learning scenarios. Then it&#8217;s time to dive in.

Put on the VR headset and take part in a short VR lesson, just as a student or trainee would. Police academy instructors will guide the session, so you&#8217;re not just observing &#8211; you&#8217;re actively participating. You&apos;ll experience firsthand how VR can increase engagement, stimulate reflection, and enhance interaction.

After the VR experience, we&#8217;ll reflect together. What was it like to be in the middle of a VR training? How did it affect your perspective on teaching? How can VR help spark meaningful conversations, address behavior, or offer new ways to achieve learning objectives?

This workshop is hands-on and ideal for anyone interested in exploring innovative teaching methods, no matter your level of VR experience.

Note: Active participation is expected &#8211; yes, you really will be putting on the VR headset!</description>
                <recording>
                    <license></license>
                    <optout>false</optout>
                </recording>
                <links></links>
                <attachments></attachments>

                <url>https://pretalx.surf.nl/national-xr-day-2025/talk/CUUQJM/</url>
                <feedback_url>https://pretalx.surf.nl/national-xr-day-2025/talk/CUUQJM/feedback/</feedback_url>
            </event>
            
        </room>
        <room name='Theil C2-4 (22p)' guid='b8d9c131-fc97-5bfe-ae27-50b130a74a4f'>
            <event guid='9c2add38-9334-5393-89dc-3b45ecf1c060' id='2457'>
                <room>Theil C2-4 (22p)</room>
                <title>Looking for Togetherness: Designing and Developing Immersive Experiences for Connectedness and Belonging (workshop)</title>
                <subtitle></subtitle>
                <type>Workshop [60]</type>
                <date>2025-07-02T11:15:00+02:00</date>
                <start>11:15</start>
                <duration>01:00</duration>
                <abstract>This session explores how immersive experiences can foster connectedness and belonging in an increasingly individualized digital era. We will share insights from our research, engage participants in a fun, interactive activity that embodies togetherness, and collaboratively brainstorm how to design XR applications that enhance collectivity.</abstract>
                <slug>national-xr-day-2025-2457-looking-for-togetherness-designing-and-developing-immersive-experiences-for-connectedness-and-belonging-workshop</slug>
                <track>Impact</track>
                
                <persons>
                    <person id='670'>Mark Jacobs</person><person id='393'>Pawan Bhansing</person><person id='180'>Shant Bayramian</person>
                </persons>
                <language>en</language>
                <description>Virtual Reality (VR) offers rich, immersive environments but often emphasizes individual experiences, potentially exacerbating loneliness and social fragmentation. Our research explores how VR can be designed to foster collective engagement and social connectedness by prioritizing public values such as inclusivity, empathy, and collaboration. We collaborate with industry professionals in the arts and culture sector, integrate findings into education, and co-create XR products with students. 

This session invites participants to share insights, challenges, and ideas on leveraging XR for togetherness. We will present findings from phase one of our research&#8212;a literature review identifying key challenges and opportunities in balancing individuality with collectivity. Insights from our collaboration with Another Kind of Blue (AKOB), a dance company pioneering XR performance integration, will also be shared. 

To ensure togetherness is explored in the session itself, we will facilitate an unplugged, low-tech interactive activity that fosters collaboration and connection without digital tools.

This approach will make collectivity tangible and spark discussion on integrating these principles into XR design. Attendees will brainstorm strategies for fostering belonging in both general and virtual contexts, generating insights that inform future research and XR development.

We will also discuss efforts to translate research insights into educational curricula, including workshops and interdisciplinary projects at Inholland and Amsterdam University of Applied Sciences. By engaging students and industry partners in co-creating XR experiences, we aim to refine human-centered XR design methodologies. 

The session will conclude with a reflection on best practices for developing immersive experiences that cultivate a sense of belonging.</description>
                <recording>
                    <license></license>
                    <optout>false</optout>
                </recording>
                <links></links>
                <attachments></attachments>

                <url>https://pretalx.surf.nl/national-xr-day-2025/talk/JLZNXN/</url>
                <feedback_url>https://pretalx.surf.nl/national-xr-day-2025/talk/JLZNXN/feedback/</feedback_url>
            </event>
            <event guid='9e6b8ba0-b478-5d9a-83cd-bf7b581a58e0' id='2473'>
                <room>Theil C2-4 (22p)</room>
                <title>Play and Build Your Way to Safer XR with Polder Perspectives and XRSI (workshop)</title>
                <subtitle></subtitle>
                <type>Workshop [60]</type>
                <date>2025-07-02T13:45:00+02:00</date>
                <start>13:45</start>
                <duration>01:00</duration>
                <abstract>How can we talk about serious topics of safety and ethics in XR tech in a fun way? Let&#8217;s play &quot;Polder Perspectives XR&quot;, a workshop game designed to explore the ethical design and use of XR technologies. Team up as fictional XR companies, work alongside the XRSI Europe team, and tackle real-world assignments. Learn about responsible XR and XR safety through experience in a fun, collaborative session!</abstract>
                <slug>national-xr-day-2025-2473-play-and-build-your-way-to-safer-xr-with-polder-perspectives-and-xrsi-workshop</slug>
                <track>Collaboration</track>
                
                <persons>
                    <person id='1019'>Valentino Megale</person><person id='188'>John Walker</person><person id='1015'>Alina Kadlubsky</person>
                </persons>
                <language>en</language>
                <description>Join this workshop game session to experience how, through play, we can illuminate critical ethical, safety, and public-value considerations in XR technology for education. In &quot;Polder Perspectives XR,&quot; participants form teams acting as fictional XR companies tasked with designing and pitching educational XR solutions. Supported by our guests from XRSI Europe, teams will navigate challenging ethical scenarios and identify practical approaches to responsible XR development.
During this special session you&#8217;ll get to work with the XRSI team! 

&#8226; Learn about Polder Perspectives XR and see how you can play it in your own institution or organisation. Maybe you&#8217;ll win a deck to take home? 
&#8226; Play with nuanced risks like biometric data handling, psychological safety, avatar identity protection, and &#8216;inclusive&#8217; access.
&#8226; Discover best practices, regulatory compliance requirements, and ethical mitigation strategies that ensure safe and responsible XR.</description>
                <recording>
                    <license></license>
                    <optout>false</optout>
                </recording>
                <links></links>
                <attachments></attachments>

                <url>https://pretalx.surf.nl/national-xr-day-2025/talk/CNCRPG/</url>
                <feedback_url>https://pretalx.surf.nl/national-xr-day-2025/talk/CNCRPG/feedback/</feedback_url>
            </event>
            <event guid='06a2caa2-2c77-5e9a-b934-590e4706d667' id='2480'>
                <room>Theil C2-4 (22p)</room>
                <title>Co-Located XR Design Education Laboratories (workshop)</title>
                <subtitle></subtitle>
                <type>Workshop [45]</type>
                <date>2025-07-02T14:55:00+02:00</date>
                <start>14:55</start>
                <duration>00:45</duration>
                <abstract>We present and demonstrate the early phases of a virtually co-located collaboration between HKU Utrecht&apos;s Artistic XR Lab and Design Academy Eindhoven&apos;s Trans Realities Lab. We are exploring the potential for XR-mediated colocation to deliver new interdisciplinary models for educational art and design collaboration across geographic and institutional boundaries.</abstract>
                <slug>national-xr-day-2025-2480-co-located-xr-design-education-laboratories-workshop</slug>
                <track>Collaboration</track>
                
                <persons>
                    <person id='191'>Joris Weijdom</person><person id='207'>Professor Dr. Ian Biscoe</person>
                </persons>
                <language>en</language>
                <description>This dynamic presentation showcases the early phases of an innovative collaboration between HKU University of the Arts Utrecht&apos;s Artistic Extended Reality Lab and Design Academy Eindhoven&apos;s Trans Realities Lab. 
Our session highlights how educational institutions might meaningfully connect physical and virtual laboratory spaces through XR technologies for art and design research and education, demonstrating a pioneering approach to collaborative design education. Building on the Trans Realities Lab&apos;s established eXtended Reality &#8211; Intraverse Toolkit (XR-IT), we are testing integration of a hybrid sketching environment based on the Social VR platform Resonite.
We showcase an embodied multi-user workflow between Utrecht and Eindhoven as interdisciplinary teams design &quot;The Hybrid Design Studio of the Future,&quot; unpacking how art and design students and researchers collaboratively use MoCap, 3D scanning, and virtual object manipulation to ideate, design, and expand spatial narratives based on the laboratories&#8217; physical limitations and the boundaryless virtual environment.

Key takeaways:
&#8226;	Practical insights into connecting physical labs through XR technologies
&#8226;	Emerging framework implementing cross-institutional collaborative design environments
&#8226;	Embodied interdisciplinary collaborative design strategies within hybrid physical/virtual spaces
&#8226;	Technical considerations for synchronizing inputs/outputs across locations
&#8226;	Discussion of future development of interdisciplinary art and design collaborations

As educational institutions explore new collaboration modalities, this presentation provides a concrete example of how networked XR transforms design education by creating shared spaces that transcend physical limitations while maintaining the tactile qualities of studio-based experiential learning. We see this as an opportunity to engage with the XR community at the early stages of our collaboration, intending to return to National XR Day in 2026 for a progress report.</description>
                <recording>
                    <license></license>
                    <optout>false</optout>
                </recording>
                <links></links>
                <attachments></attachments>

                <url>https://pretalx.surf.nl/national-xr-day-2025/talk/PZEK3D/</url>
                <feedback_url>https://pretalx.surf.nl/national-xr-day-2025/talk/PZEK3D/feedback/</feedback_url>
            </event>
            
        </room>
        
    </day>
    
</schedule>
