TULIP: A Prototype for Open, Locally Hosted LLM Infrastructure
2025-12-04, Progress

Large Language Models are becoming core research tools, yet dependence on commercial APIs raises issues of privacy, compliance, and long-term cost. At TU Delft, REIT and ICT are prototyping TULIP, a Kubernetes-based platform for locally hosted open LLMs. We’ll share design choices that prioritize responsible innovation: containerized serving with an OpenAI-compatible API, cluster-native scaling, and transparent monitoring.
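An OpenAI-compatible API means researchers can keep their existing client code and change only the base URL. A minimal sketch of what that looks like from the researcher's side (the endpoint hostname and model name below are illustrative placeholders, not the actual TULIP configuration):

```python
import json
import urllib.request

# Hypothetical values for illustration; a real deployment would supply its own
# endpoint URL and the names of the open models it serves.
TULIP_BASE_URL = "http://tulip.example.tudelft.nl/v1"
MODEL = "example-open-model"


def build_chat_request(prompt: str, model: str = MODEL) -> dict:
    """Build an OpenAI-style chat-completions request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.2,
    }


def chat(prompt: str) -> str:
    """POST the request to the OpenAI-compatible endpoint and return the reply text."""
    body = json.dumps(build_chat_request(prompt)).encode()
    req = urllib.request.Request(
        f"{TULIP_BASE_URL}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return data["choices"][0]["message"]["content"]
```

Because the request and response shapes follow the widely used chat-completions convention, the same client code works whether it targets a commercial service or a locally hosted open model.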

While national initiatives like SURF’s WiLLMa focus on shared capacity, TULIP explores the campus-level space: providing researchers with reproducible endpoints, model governance, and early feasibility metrics for institutional hosting. We will share early lessons, governance implications, and practical guidance for universities and labs aiming to offer sustainable, open alternatives to proprietary AI services.


TULIP is TU Delft’s prototype for open, locally hosted LLM infrastructure. This session will highlight:

- Why a local pilot like TULIP matters for researcher engagement
- How it complements WiLLMa, SURF’s AI hub initiative
- Early lessons on balancing technical feasibility with governance and sustainability
- Open discussion on how institutional platforms can provide sustainable, open alternatives to proprietary AI services

The session is aimed at researchers, research engineers, and infrastructure managers curious about first steps in hosting open LLMs on institutional hardware.