Evidence-informed use of learning analytics (ENG)
12-11-2025, Foyer 3

The use of EdTech and learning analytics is growing rapidly, but how do you know whether they really contribute to better education? In this interactive session, we explore how evaluation supports the evidence-informed implementation of technology in education. Using concrete examples and joint discussion, you will work with other participants on case studies, share experiences and formulate insights. The aim is to strengthen institutions in building a learning, research-based culture of innovation. Please note: we ask you to bring a use case to the session to discuss in groups. The use case should concern an application of technology or learning analytics in education.


The rise of learning analytics, AI tools and EdTech in education is undeniable. Within secondary vocational education, higher professional education and university education, technology is increasingly being used with the aim of improving learning, teaching and organisation. Yet crucial questions remain: how do you know if these applications are truly effective? How do you ensure that you implement these tools in an evidence-informed manner? How do we create a learning culture together and promote joint learning and innovation?

This session ties in with the growing awareness that an evidence-informed approach is essential for educational innovation. Evaluation plays a key role in this. At the same time, it is not easy in practice to design evaluations in a meaningful and feasible way, especially when stakeholders with different interests and perspectives work together.

In this interactive workshop, we explore how evaluation of EdTech and learning analytics applications can contribute to better and fairer education. We show how evaluation can play a role not only retrospectively, but also during the development and implementation process. By working together on case studies and practical examples, we explore how institutions and EdTech suppliers can build a shared evaluation culture.

The session is intended for a wide audience: education professionals, policy advisors, researchers, developers, IT professionals and project leaders involved in the use of technology in education. Whether you work at an educational institution or an EdTech company, this session offers new insights, tools and opportunities for collaboration. During this themed session, we will delve into the power of evaluation. We will show you why evaluation is indispensable, how to approach it in a structured way, and how you are already (or could be) working in an evidence-informed manner.

Participants will leave the session with:

  • concrete examples of evidence-informed implementation;

  • insight into evaluating technology in their own practice;

  • inspiration to approach evaluation as an integral part of innovation.

Short practical examples

We’ll start the session with three short pitches in which practical experts share how they shape evidence-informed work in their organisations:

  • Anouschka van Leeuwen (Utrecht University) presents a roadmap for designing and evaluating learning analytics interventions.

  • Nils Siemens (Amsterdam University of Applied Sciences) talks about the approaches at the AUAS that enable and supervise experiments in faculties.

  • Manika Garg (The Hague University of Applied Sciences) introduces the Dutch ‘3E framework’ (Evidence-informed Evaluation of EdTech), which helps institutions and suppliers make shared decisions about technology use. She translates this into application within learning analytics.

Interactive case discussion in groups

The participants form small groups. Each group chooses a case from their own practice: for example, a learning analytics implementation or an EdTech intervention within an institution or company. In your group, you discuss which solution has been investigated, how it has been approached so far, and how its effects have (or have not) been evaluated. Questions such as the following are central to this discussion: what worked? What was difficult? And how could it be improved? The group discusses this process and the opportunities and pitfalls they have experienced.

Plenary feedback and reflection

In the plenary closing session, each group shares its most important insight or question. We reflect together on what is needed to grow towards a sustainable evaluation culture within institutions.

Senior Researcher at The Hague University of Applied Sciences
PhD in Education Technology

I am the coordinator of the LA data platform and data steward for the educational domain.
Our LA team focuses on implementing LA at Utrecht University (UU), for which we developed an LA policy, a data platform, and an LA roadmap. The LA roadmap is used to ensure that each LA project complies with the LA policy, the GDPR (AVG) and, more recently, the AI Act. Information about the policy, the roadmap, and the LA projects we are involved in can be found on our website: https://www.uu.nl/en/education/learning-analytics.