The DIG team is part of Télécom Paris, a member of Institut Polytechnique de Paris, France. We work on knowledge graphs, language models, foundation models, learning over tabular data, graph representation learning, graph mining, and stream mining. The team develops methods for representing, integrating, and reasoning over complex, dynamic data to enable interpretable and trustworthy AI. Applications range from general-purpose AI to domain-specific areas such as healthcare and law.
News
Apr 01, 2026: Islem Joins DIG as a PhD Student
Islem is joining the DIG team for a PhD to work on verifying the outputs of language models with the help of knowledge bases.
Short Bio: Islem is a machine learning researcher with experience in LLMs, code optimization, and applied AI systems. He holds an M2 in Machine Learning from Université Paris Cité, has research experience at NYU Abu Dhabi and Rakuten Tech Europe, and has published at WWW 2025 and PACT 2025. His recent projects focused on LLM-guided compiler optimization and recommender systems.
Islem will be advised by Yanzhu Guo and Fabian Suchanek, and is based in office 4C20.
Welcome, Islem!
Mar 16, 2026: Welcome Chenwei Wan to the DIG Team
We’re happy to welcome Chenwei Wan to the DIG team as a research engineer! Chenwei will work on the representation of non-named entities in knowledge bases, with the goal of starting a CIFRE PhD with Schlumberger. Welcome, Chenwei!
Mar 09, 2026: Paper accepted at EACL 2026
Congratulations to Cristian Santini, Marieke van Erp, and Mehwish Alam for their paper “It’s All About the Confidence: An Unsupervised Approach for Multilingual Historical Entity Linking using Large Language Models”!
Feb 18, 2026: DIG has five articles accepted at ICLR 2026
- TabStruct: Measuring Structural Fidelity of Tabular Data. Xiangjian Jiang, Nikola Simidjievski, Mateja Jamnik
- Query-Level Uncertainty in Large Language Models. Lihu Chen, Fabian M. Suchanek, Gaël Varoquaux, Gerard de Melo
- Efficient Resource Constrained Training of Vision Transformers via Subspace Optimization. Le-Trung Nguyen, Enzo Tartaglione, Van-Tam Nguyen
- Study of Training Dynamics for Memory-Constrained Fine-tuning. Aël Quélennec, Nour Hezbri, Pavlo Mozharovskyi, Van-Tam Nguyen, Enzo Tartaglione
- INSTANT: Compressing Gradients and Activations for Resource-Efficient Training. Tuan-Kiet Doan, Trung-Hieu Tran, Enzo Tartaglione, Nikola Simidjievski, Van-Tam Nguyen
Jan 26, 2026: IMT Pedagogy Prize honors free software course
Congratulations to Marc Jeanmougin and Théo Zimmermann for receiving the “Engagement, Pedagogy, and Teaching” award (emerging initiative category) from Institut Mines-Télécom (IMT) for their innovative course on open-source contributions. This program provides students with hands-on experience by having their code modifications integrated into real-world software projects. (News Source)
Dec 01, 2025: Yanzhu Guo joined DIG
We’re happy that Yanzhu Guo has joined the team as an assistant professor! Welcome, Yanzhu!
Nov 01, 2025: Best Paper Award at ISWC 2025
Yiwen Peng, Thomas Bonald, and Fabian Suchanek received the Best Paper Award at ISWC 2025 for their paper “FLORA: Unsupervised Knowledge Graph Alignment by Fuzzy Logic.”