Call for Posters

The ELA AI Triangle (EAISI, Leuven.AI, and RWTH AI Center) workshop series on Explainable AI (XAI) will kick off on May 23rd at TU Eindhoven. The first workshop in the series will focus on theories, methods, and applications of (neuro-)symbolic AI that enable and/or leverage explainability. However, all researchers interested in this and other XAI sub-areas are welcome to attend and contribute. Thanks to the sponsorship of TAILOR and CLAIRE, participation in the workshop is free. Come to TU Eindhoven to hear from invited speakers, participate in a design challenge, contribute a poster presentation, and help build the community!

We invite researchers at all levels working on XAI, including MSc thesis students, PhD candidates, postdocs, and more established researchers, to present their recent work at one of the two poster sessions. There is no requirement for presenters to be affiliated with KU Leuven, TU/e, or RWTH Aachen. We especially encourage members of TAILOR and CLAIRE to contribute. You can bring your printed poster (A0 or A1 size recommended) with you or request that it be printed on-site free of charge. A link to upload your poster will be provided on 17/05/2024.

Workshop Program

Time | Venue | Title | Description
9:15 – 10:15 | Blauwe Zaal | Networking Breakfast | Join us early to get to know each other while enjoying the provided breakfast and coffee
10:15 – 10:30 | Auditorium 6 | Welcome | Short introduction of the three centers and the workshop by the organizers
10:30 – 11:00 | Auditorium 6 | Invited Talk | Devendra Dhami: Shedding Light on the Black Box: Explainable AI in Diverse Machine Learning Landscapes
11:00 – 11:30 | Auditorium 6 | Invited Talk | Joost Vennekens: Explainability and Symbolic AI: Observations from two Applications
11:30 – 12:30 | Auditorium 6 | Keynote Talk | Vaishak Belle: Loss functions with symbolic constraints in deep learning
12:45 – 14:15 | Next to Blauwe Zaal | Poster Session and walking lunch | See section Posters
14:15 – 16:00 | Auditorium 15 | Afternoon challenge | See section Design Challenge
16:00 – 16:15 | Auditorium 6 | Presentation of results | Presentations of challenge results and closing
16:15 – 17:00 | Next to Blauwe Zaal | Closing Notes

Keynote Speaker

Vaishak Belle

Dr Vaishak Belle (he/him) is a Reader at the University of Edinburgh, an Alan Turing Fellow, and a Royal Society University Research Fellow. His research career has focused on the science and technology of AI. He has published close to 100 peer-reviewed articles, won best paper awards, and consulted with banks on explainability. As PI and Co-I, he has secured grant income of close to 8 million pounds.

Posters

  • Alexander Liu (TU/e) – SpaCE-VAE: Sparse Confident Explanations using Variational Autoencoders
  • Anniek Jansen (TU/e) – Personalization of child-robot interaction through reinforcement learning and user classification
  • Antonia Wüst (TU Darmstadt) – Pix2Code: Learning to Compose Neural Visual Concepts as Programs
  • Carlos Zednik (TU/e) – Does XAI need Cognitive Models?
  • Céline Budding (TU/e) – What do large language models know? Tacit knowledge as a potential causal-explanatory structure
  • Christian Fleiner (KU Leuven) – Preprocessing the Knowledge Base: Managing Verbal Probability Expressions
  • Duo Yang (KU Leuven) – TorchicTab: Semantic Table Annotation with Wikidata and Language Models
  • Fahad Sarfraz (TU/e) – Beyond Unimodal Learning: The Importance of Integrating Multiple Modalities for Lifelong Learning
  • François Leborgne (TU/e) – Contextualizing the "Why": The Potential of Using Visual Map As a Novel XAI Method for Users with Low AI-literacy
  • Jo Devriendt (Nonfiction Software) – ManyWorlds: an accessible combinatorial language with dual explanation support
  • Keivan Shariatmadar (KU Leuven) – A generalisation of the Bellman Equation in Epistemic Reinforcement Learning
  • Konstantinos Tsiakas (TU Delft) – Explainable AI to Support Self-Regulated Learning
  • Kylian Van Dessel (KU Leuven) – A First Order Logic conflict explanation framework
  • Lucas Van Laer (KU Leuven) – SLI: a reasoning engine for FO(.)
  • Marjolein Deryck (KU Leuven) – Combining expert knowledge and process mining to create a recommendation system
  • Robin De Vogelaere (KU Leuven) – KeBAP: Knowledge-Base API for Python
  • Soroush Ghandi (TU/e) – Probabilistic Circuits with Constraints
  • Till Hofmann (RWTH) – Using Platform Models for a Guided Explanatory Diagnosis Generation for Mobile Robots
  • Zlatan Ajanovic (RWTH) – Planning, Learning and Control for Hybrid Discrete-Continuous Robotic Problems

Design Challenge

In the afternoon, you can join the following design challenge.

Explainability and IDP-Z3

IDP-Z3 is a reasoning engine that operates on a formal specification of domain knowledge and can perform many useful inference tasks to drive intelligent behavior and to support users in decision making. Outcomes are currently explained by listing the related constraints, and non-expert users might struggle to understand explanations that result from larger knowledge bases. The objective of the design challenge is to improve the quality of these explanations by modifying the knowledge base, improving the user interface, or using other AI models to add additional explainability. A short tutorial on IDP-Z3 will be given in the first hour. The challenge will be done in teams, and the developed solutions must be presented at the end. Solutions can be anything presentable, such as flow charts, mock-ups, or working code. A laptop is required to attend this challenge.
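
IDP-Z3 is built on top of the Z3 SMT solver. As optional background, the minimal sketch below uses the z3-solver Python package (not the IDP-Z3 API itself, and with made-up constraint labels) to illustrate the kind of constraint-based explanation the challenge starts from: each constraint is asserted under a label, and the solver's unsat core reports which labelled constraints jointly cause a conflict.

```python
# Sketch of "explain an outcome by listing the related constraints" using
# plain Z3. The labels and constraints are illustrative, not part of IDP-Z3.
from z3 import Bool, Implies, Not, Solver, unsat

raining = Bool("raining")
wet_grass = Bool("wet_grass")

s = Solver()
# Assert each constraint under a label so it can appear in the unsat core.
s.assert_and_track(Implies(raining, wet_grass), "rain_makes_grass_wet")
s.assert_and_track(raining, "it_is_raining")
s.assert_and_track(Not(wet_grass), "grass_observed_dry")

if s.check() == unsat:
    # The core lists the labels of the constraints that jointly cause the
    # conflict; translating these labels into readable text is the hard part.
    print("Conflicting constraints:", s.unsat_core())
```

Turning such a raw list of constraint labels into an explanation that a non-expert user can follow is exactly the kind of gap the challenge asks you to address.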

Venue

The workshop takes place at TU Eindhoven in Auditorium 6, 5612 AZ Eindhoven, The Netherlands.

By Car. Please read the article about Parking on campus.

By Train. From Eindhoven Centraal station, the venue is reachable within 15 minutes on foot.

Organizations

TU/e EAISI

EAISI aims to connect AI researchers within TU/e with industrial partners and to help fund their projects by obtaining grants and subsidies as well as industry funding.

Leuven.AI

The Leuven.AI initiative was started in 2018. Its purpose is to bring together the wealth of AI expertise that exists at KU Leuven, spread over multiple faculties and departments, and to provide a single access point to it.

RWTH AI Center

The RWTH Center for Artificial Intelligence was founded with the goal of strengthening and focusing the research activities in Artificial Intelligence and Machine Learning at RWTH Aachen University.

TAILOR

TAILOR is an EU project that aims to build the capacity to provide the scientific foundations for Trustworthy AI in Europe. TAILOR develops a network of research excellence centres, leveraging and combining learning, optimisation, and reasoning (LOR) with the key concepts of Trustworthy AI (TAI).

CLAIRE

The Confederation of Laboratories for Artificial Intelligence Research in Europe (CLAIRE) is an international non-profit association (AISBL) created by the European AI community that seeks to strengthen European excellence in AI research and innovation, with a strong focus on human-centred AI.

Organizing Committee

Leuven.AI

Christian Fleiner
Marjolein Deryck
Joost Vennekens