Learn to Compress & Compress to Learn

Workshop at the International Symposium on Information Theory (ISIT) 2026

Recent advances in machine learning and artificial intelligence have brought compression back to the forefront of information science. Once regarded primarily as a tool for efficient data storage and transmission, compression has now emerged as a unifying principle linking representation learning, generalization, and efficient communication. This workshop explores how classical information-theoretic concepts—such as the rate–distortion tradeoff, minimum description length, and universal compression—are being reimagined and extended in modern contexts like neural source coding, model compression, semantic communication, and generative AI. It aims to foster dialogue between information theorists and machine learning researchers to examine how compression not only enables efficient inference and transmission but also offers a powerful lens for explaining and designing intelligent systems. The program will feature invited talks, contributed presentations, and panel discussions that bridge theory and practice, laying the groundwork for the next generation of compression-inspired learning and communication paradigms.

The workshop will be held on July 3, 2026 at the Guangzhou Yuexiu International Congress Center.

Submissions are now open on EDAS!

Important Dates

  • Submission of Workshop Papers: April 7, 2026
  • Notification of Acceptance: April 21, 2026
  • Final Manuscripts: April 28, 2026
  • Workshop Date: July 3, 2026

Keynote Speakers

Bo Bai

Huawei

Zoe (Yuxin) Liu

Visionular

Aaron Wagner

Cornell University

Invited Speakers

Yanjun Han

New York University

Xueyan Niu

Huawei

Wenqi Shao

Shanghai AI Lab

Organizers

Jun Chen

McMaster University

Ezgi Ozyilkan

New York University

Yong Fang

Chang'an University

Elza Erkip

New York University

Webmaster

Gergely Flamich

Imperial College London

Questions

Contact us at learn.to.compress.workshop@gmail.com.