2025-07-02 – Demo area (Theil building hallway)
A selection of XR demonstrations in the central demo area of the Theil building. Click on the name of a demo for more information.
De virtuele wereld: Cyberpunk of Utopie? (The virtual world: Cyberpunk or Utopia?)
Experience the virtual city of Rotterdam in VR! In this demo, the XR-lab of Rotterdam University of Applied Sciences presents a suggestive rendering of the virtual city of the future, in which the user strikes up conversations with (AI-driven) residents. The XR-lab of CMI (Institute for Communication, Media and Information Technology) developed this demo in collaboration with the City of Rotterdam and uses it as a template for various experiments and applications. Take part: have yourself 3D-scanned and walk through the virtual city of the future. What opportunities does AI offer in XR, and how do we use them responsibly? How do you think technology should work, and what possible applications can you think of?
ECDA - Digiderius and his shadow
Virtual humans are growing in popularity, becoming more lifelike, convincing and easier to interact with. They can be seen as a convenient way to get an answer to a specific question or an explanation of a complex topic, acting as a tutor, guide or friendly face to talk to. However, their ingenuity and convenience often make us forget a very important question: at what cost?
VR applications for radiology and operating room
Inspired by Erik of het klein insectenboek, you can immerse yourself in Rotterdam's urban nature. Data collected by (amateur) biologists is visualized in this immersive installation. Step into the painting called nature and see what normally goes unnoticed.
A collaboration between ECDA and GLR.
Interactive Annotation and Automatic Feature Boundary Segmentation of Point Clouds in VR
This project applies automatic boundary detection and segmentation to point-cloud data to support user-driven annotation in interactive virtual reality, streamlining the annotation process and enhancing precision, efficiency and intuitiveness in 3D data interpretation and interaction.
Point clouds provide a way to visualize 3D data that combines high detail with scalability, and their value is amplified in virtual reality (VR), where the user gains spatial awareness and can perceive depth more accurately. Being able to detect spatial relationships, and to navigate through them, gives the user a deeper, more immersive insight into their data and opens up new perspectives.
Recognizing these advantages, industries increasingly use point clouds to digitize real-world spaces and objects, frequently requiring manual (or semi-manual) annotations for training machine learning models. This project aims to develop a VR-based system to support and assist user-driven annotations of large point clouds in VR. By detecting and segmenting features in the cloud, the system seeks to streamline the annotation process, thus enhancing efficiency, precision, and consistency.
Additionally, to add to the intuitiveness and ease of use, the project leverages the potential of VR to make use of optional gesture- and voice-based commands to traverse and interact with the point cloud in an accessible and user-friendly manner.
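As a rough illustration of the kind of boundary detection involved (a minimal sketch, not the project's actual code), per-point surface normals can be estimated from local neighbourhoods, and points where neighbouring normals disagree strongly mark creases and edges that can seed segmentation:

```python
# Minimal sketch of normal-based feature-boundary detection in a point cloud.
# Everything here (neighbourhood size, threshold, the toy scene) is an
# illustrative assumption, not the demo's actual pipeline.
import numpy as np
from scipy.spatial import cKDTree

def estimate_normals(points, k=16):
    """Estimate per-point normals from the k-nearest-neighbour covariance."""
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)
    normals = np.empty_like(points)
    for i, nbrs in enumerate(idx):
        centered = points[nbrs] - points[nbrs].mean(axis=0)
        # The right singular vector with the smallest singular value
        # approximates the local surface normal.
        _, _, vt = np.linalg.svd(centered, full_matrices=False)
        normals[i] = vt[-1]
    return normals, idx

def boundary_mask(points, k=16, angle_deg=25.0):
    """Flag points whose neighbours' normals deviate beyond a threshold."""
    normals, idx = estimate_normals(points, k)
    cos_thresh = np.cos(np.radians(angle_deg))
    # |dot| handles the sign ambiguity of estimated normals.
    dots = np.abs(np.einsum('ij,ikj->ik', normals, normals[idx]))
    return (dots < cos_thresh).any(axis=1)

# Toy scene: a floor meeting a wall; the crease shows up as boundary points.
rng = np.random.default_rng(0)
floor = np.c_[rng.uniform(0, 1, (500, 2)), np.zeros(500)]
wall = np.c_[rng.uniform(0, 1, 500), np.ones(500), rng.uniform(0, 1, 500)]
cloud = np.vstack([floor, wall])
print(f"{boundary_mask(cloud).sum()} boundary candidates out of {len(cloud)}")
```

Segments grown between such boundary points could then be offered to the VR user as snap-to targets during annotation.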
Lumi: AI-driven Augmented Reality software for the surgery of today and tomorrow
Discover how Lumi is innovating surgery. In a demo you will experience how our medical software converts 2D images into intuitive 3D holograms using a powerful set of artificial-intelligence algorithms that automatically recognize anatomical structures. The 3D holograms are projected directly into the real world, for higher precision, fewer avoidable errors and demonstrably improved surgical outcomes.
Lumi is used to prepare operations, train surgeons in training, educate patients and, in the latest version, provide live support during surgery with Augmented Reality in the operating room.
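As a rough illustration of the generic 2D-to-3D step this class of software builds on (a minimal sketch, not Lumi's implementation; the synthetic sphere stands in for an AI-segmented anatomical structure), per-slice segmentation masks can be stacked into a volume from which a renderable surface mesh is extracted:

```python
# Minimal sketch: from stacked 2D segmentation masks to a 3D surface mesh.
# The sphere below is synthetic; in practice each slice would be an
# AI-produced mask from CT/MRI data.
import numpy as np
from skimage import measure

z, y, x = np.mgrid[-32:32, -32:32, -32:32]
volume = (x**2 + y**2 + z**2 < 24**2).astype(np.float32)  # 64 stacked "masks"

# Marching cubes turns the binary volume into a triangle mesh that an
# AR headset can render as a hologram.
verts, faces, normals, _ = measure.marching_cubes(volume, level=0.5)
print(f"mesh: {len(verts)} vertices, {len(faces)} triangles")
```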
MYOS Make-Your-Own-Street
MYOS (Make-Your-Own-Street) is a user-friendly, UE-based tool developed at TU Eindhoven's UBeX lab for fast procedural generation of customizable 3D urban streetscapes. Built on a growing library of standardized Dutch urban elements, MYOS enables researchers and students to explore citizen preferences and urban scenarios in VR without manual modeling or programming, promoting participatory design.
Researchers and students in Urban Planning and Design can benefit greatly from the ability to quickly generate and modify 3D digital VR environments that replicate real-world urban streetscapes. Such environments are useful, for example, to assess the impact of various urban elements on residents' behaviour, wellbeing and perceptions.
However, creating and adjusting these environments manually is a time-consuming process. Even simple modifications, like changing the width of the road or altering the type of buildings, become tedious when they require manually repositioning numerous other elements. At UBeX, the Urban Behaviour eXtended reality lab of TU Eindhoven, we offer a solution: our UE-based model, Make-Your-Own-Street (MYOS), allows for the procedural generation of realistic and customizable urban streetscapes tailored to a specific study or research objective.
MYOS is built around a growing library of standardized 3D components that represent Dutch urban elements, such as roads, sidewalks, street furniture, vegetation, and buildings. These components, along with the procedural code behind them, allow users to easily replicate real-world scenarios or experiment with hypothetical urban settings. Through a simple interface, users select the elements to include in the desired environment, while the procedural backend automatically assembles the streetscape. MYOS requires no programming knowledge to use, and can be extended with minimal coding effort.
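The procedural idea can be sketched outside Unreal Engine; in the following Python sketch the element names and widths are illustrative assumptions (not MYOS's actual component library), but it shows how a cross-section assembled from standardized components reflows automatically when one element changes:

```python
# Minimal sketch of procedural street assembly from a component library.
# Element names and widths are made up for illustration.
from dataclasses import dataclass

@dataclass
class Element:
    name: str      # a standardized asset in the component library
    width: float   # cross-section width in metres

LIBRARY = {
    "sidewalk": Element("sidewalk", 2.0),
    "bike_lane": Element("bike_lane", 1.8),
    "road_lane": Element("road_lane", 3.0),
    "tree_strip": Element("tree_strip", 1.5),
}

def assemble_street(spec):
    """Place elements side by side; changing one width repositions the rest."""
    offset = 0.0
    placements = []
    for key in spec:
        el = LIBRARY[key]
        placements.append((el.name, offset, offset + el.width))
        offset += el.width
    return placements

# A hypothetical Dutch street: widen a lane in LIBRARY and everything reflows.
for name, start, end in assemble_street(
        ["sidewalk", "tree_strip", "bike_lane", "road_lane",
         "road_lane", "bike_lane", "tree_strip", "sidewalk"]):
    print(f"{name:>10}: {start:5.1f} – {end:5.1f} m")
```

The point of the design is that the user only edits the spec or a single library entry; all downstream repositioning is recomputed rather than done by hand.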
At TU Eindhoven, MYOS serves as a participatory design tool for students and researchers in urban systems and real estate. MYOS enables these users to actively explore and address citizen preferences across a wide variety of applications - without the intensive effort of designing and modelling an urban environment from scratch. During the demonstration, we will showcase MYOS in educational settings and allow visitors to interact with the tool by creating and adjusting their own streetscapes.
New Media Centre - XRZone - TU Delft
Demo booth with various demos from the New Media Centre's XRZone at TU Delft. We will show what we do and which projects we are working on, including:
- XRScale kit library
- XR Tutorial level
- AI Classroom (Fontys/HAN), developed by TU Delft
- Geoscience safety protocol
- and more…
Npuls encourages collaboration between educational institutions to jointly investigate how XR technology can be of value to education. We explore and put in place the preconditions for deploying XR well, in the service of continuously delivering the best education.
One of the projects is the XR framework and its accompanying toolbox: practical tools for anyone who wants to get started with XR but does not know where to begin. Think of step-by-step plans, checklists and formats. This is how we make XR concrete and achievable. Come by and discover how you can put XR to use.
Prototyping with XR & gaming technologies
The Saxion XR Lab will present a variety of prototypes, including a sandbox VR operating room designed for anesthesiology training, an immersive VR experience that simulates the feeling of burnout, and an innovative VR Gaussian Splatting prototype.
The Human Pour: Human(e) Interaction and Making
Step behind the bar in our interactive VR game. Mix drinks and serve them to the right guests—easy, right? Or maybe not… Beneath the surface lies an experience that quietly gets you thinking about who—and what—we consider “human.” Representation in VR goes far beyond visual likeness. It reflects what we value as a society and how we treat other persons humanely. Whether everyone feels welcome and recognized in digital worlds matters more than you might think.
How people are represented in digital worlds matters: it influences how we approach ourselves and others. If we get this wrong, or bring certain biases with us, we can unconsciously perpetuate stereotypes, exclude people, or, conversely, make everyone look the same.
Try a Device! - SURF XR Developer Network
Organized by the SURF XR Developer Network, this session gives visitors to National XR Day the opportunity to try out and experience a variety of XR devices.
Note: this session is only available during the lunch slot (12:15–13:45).
The following devices will be present:
- Varjo XR-4: a high-resolution wired VR and MR headset
- Lynx R-1: designed in France, the Lynx R-1 is one of the few standalone VR/MR headsets built as an open platform, and not based on a data-selling business model. It is somewhat older by now (a 2021 design), but a Lynx R-2 based on Android XR is in development.
- Play For Dream MR: a standalone XR headset from China with some impressive specs
- Magic Leap 2: Mixed Reality glasses, very similar in features to the HoloLens 2
- Snap Spectacles: standalone AR glasses from Snap, which allow for all kinds of so-called "Lenses" (Snap's name for AR apps and experiences)
- Even Realities G1: standalone smartglasses with a built-in single-color display, with AI features such as real-time automatic translation of audio
(Many thanks to our contacts from the SURF community for providing some of the devices.)
eXtended Realities - Intraverse Toolkit (XR-IT) - live demo of high-fidelity Networked XR
Live networked demonstration of XR-IT (the eXtended Realities - Intraverse Toolkit), showing how multiple physical locations can easily collaborate in high-fidelity networked XR scenarios, including the use of motion capture, game engines, HMDs and low-latency audio/video streaming across campuses.
XR-IT enables remote engagement in complex XR environments through a distributed computing model that supports local instancing of game engines (Unreal and Unity) and low-latency geographic distribution of mocap data, audio/video streams, camera parameters and facial-capture data, using a secure global virtual LAN model of connectivity. Beyond the issue of connecting distributed production facilities lies the challenge of coalescing physical and virtual spaces across multiple nodes of engagement into a seamless hybrid environment where all partners can work with common spatial references and orientations. XR-IT supports such scenarios by providing automated tools for spatial coalescence within new plugins for game engines.
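One ingredient of such spatial coalescence can be sketched independently of XR-IT's actual plugins: if two sites measure the positions of shared reference markers in their own local frames, the rigid transform aligning the frames follows from the Kabsch algorithm (the marker coordinates below are made up for illustration):

```python
# Minimal sketch of frame alignment between two sites via shared markers.
# Marker positions and the example transform are illustrative assumptions.
import numpy as np

def rigid_align(src, dst):
    """Return rotation R and translation t with R @ src_i + t ≈ dst_i (Kabsch)."""
    src_c, dst_c = src - src.mean(axis=0), dst - dst.mean(axis=0)
    u, _, vt = np.linalg.svd(src_c.T @ dst_c)
    d = np.sign(np.linalg.det(u @ vt))        # guard against reflections
    r = (u @ np.diag([1.0, 1.0, d]) @ vt).T
    t = dst.mean(axis=0) - r @ src.mean(axis=0)
    return r, t

# Three shared markers seen from site A, and the same markers in site B's frame.
markers_a = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0]])
theta = np.radians(90)
rot = np.array([[np.cos(theta), -np.sin(theta), 0],
                [np.sin(theta),  np.cos(theta), 0],
                [0, 0, 1]])
markers_b = markers_a @ rot.T + np.array([2.0, 0.5, 0])
R, t = rigid_align(markers_a, markers_b)
print(np.allclose(markers_a @ R.T + t, markers_b))  # True: frames coalesced
```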
XR-IT was developed in the Trans Realities Lab (TRL) at Design Academy Eindhoven (DAE) with financial support from the European Union's European Media and Immersion Lab programme and Horizon Europe.