Latest news

[Mar 26] Article accepted for publication at IEEE Transactions on Circuits and Systems for Video Technology: Compression in 3D Gaussian Splatting: A Survey of Methods, Trends, and Future Directions. Muhammad Salman Ali, Chaoning Zhang, Marco Cagnazzo, Giuseppe Valenzise, Enzo Tartaglione, Sung-Ho Bae.

[Feb 26] Paper accepted at CVPR 2026: Bias In, Bias Out? Finding Unbiased Subnetworks in Vanilla Models.

[Dec 25] I have been promoted to the rank of Full Professor.

[Nov 25] Article accepted for publication at Neurocomputing: TEP-ones: A Simple yet Effective Approach for Transferability Estimation of Pruned Backbones

[Nov 25] Article accepted for publication at IEEE Transactions on Multimedia: Security and Real-time FPGA integration for Learned Image Compression

Professorship, PhD, and internship applications are now open!

Two assistant professor positions (on traditional compression and AI-based animation) will open soon!


PhD positions spanning multimodality and efficiency are now available! Reach out for more info.


Looking for interns! Email for more info!


Mission

In a world where deep learning increasingly defines the state of the art, and where the race for computational capability drives new technologies, it is crucial to open the black box that deep learning is. Many researchers are already taking important steps in this direction, drawing on a wide variety of scientific backgrounds. This is good: this is progress!

We pursue this goal in the long term by developing techniques that simplify these models. Some models are easier to prune than others: why? How is information processed inside a deep model, from a macroscopic perspective? These are a few of the questions to answer in order to move in the right direction.

Green AI

Removal of unnecessary neurons and/or synapses to reduce power consumption.

Model debiasing

Understand biases in the data and mitigate them in the trained model.

Privacy in AI

Guaranteeing privacy in AI will be an important theme in the coming years.

Understanding the information flow

Modeling how information is processed in deep models is our ultimate goal.

Currently working with

Ekaterina Iakovleva

PostDoc
Deep Learning Unlearning

Frédéric Lauron

Research Engineer
Energy Consumption of Deep Learning

Imad Eddine Marouf

PhD student
Efficient transformers for computer vision
Previously: M2 internship, Feb-Sep 2022

Victor Quétu

PhD student
Regularization for deep learning


Yinghao Wang

PhD student
Foundation models for EEG processing


Zhu Liao

PhD student
Deep Neural Network pruning



Dorian Gailhard

PhD student
Generative models and Graph Neural Networks for SoCs

Lê Trung Nguyen

PhD student
Methods for on-device training

Ivan Luiz De Moura Matos

PhD student
Debiasing through NAS
Previously: M2 internship, Oct 2023-Feb 2024

Leonardo Magliolo

PhD student
Information flow in Deep Neural Networks

Rayyan Ahmed

PhD student
Explaining and Removing Social Biases in Text-to-Image Generative AI

Cem Eteke

Invited PhD student
Compression of 3D Gaussian Splatting from a Frame Restoration Perspective

Charles Herr

Research path student
Learning alternatives to backpropagation

Lorenza Martins

PRIM project student
Learn how to sample debiasing masks for foundation models

Formerly advised