Till Aczel

PhD student at ETH Zurich

I am a PhD student at ETH Zurich under the supervision of Prof. Dr. Roger Wattenhofer. My research sits at the intersection of neural image compression and perceptual quality, aiming to build AI systems that are both efficient and aligned with human judgment of visual fidelity.

A key focus of my work is on evaluation: measuring the quality of generative AI is far from straightforward, because standard metrics often fail to capture what humans actually perceive. I develop methods and benchmarks to better assess perceptual fidelity, ensuring that model improvements translate into reconstructions that real people judge as high-quality and faithful.

Another focus is on efficiency and deployment: most generative models, including diffusion models, GANs, and VAEs, are computationally expensive. I design networks that "speak the language of computers", using architectures optimized for binary computation and hardware efficiency. This approach helps close the gap between learned and traditional codecs by reducing model complexity and runtime while improving perceptual quality and robustness.
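To make the binary-computation idea concrete, here is a minimal toy sketch of a single differentiable logic gate in the spirit of differentiable logic gate networks (an illustration of the general technique, not code from my papers; the gate ordering and demo values are my own choices): the gate holds a learnable softmax distribution over the 16 two-input Boolean functions, each relaxed to a real-valued formula, so gradient descent can select the gate.

```python
import numpy as np

def soft_gates(a, b):
    """Real-valued relaxations of all 16 two-input Boolean functions.
    For a, b in {0, 1} these reduce to the exact truth tables."""
    return np.array([
        0 * a,                 # FALSE
        a * b,                 # AND
        a - a * b,             # A AND NOT B
        a,                     # A
        b - a * b,             # NOT A AND B
        b,                     # B
        a + b - 2 * a * b,     # XOR
        a + b - a * b,         # OR
        1 - (a + b - a * b),   # NOR
        1 - (a + b - 2 * a * b),  # XNOR
        1 - b,                 # NOT B
        1 - b + a * b,         # A OR NOT B
        1 - a,                 # NOT A
        1 - a + a * b,         # NOT A OR B
        1 - a * b,             # NAND
        1 + 0 * a,             # TRUE
    ])

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# A single differentiable gate: a learnable distribution over the
# 16 gates, mixed into one smooth, differentiable output.
logits = np.zeros(16)
logits[6] = 5.0  # for this demo, strongly prefer XOR

def gate_output(a, b, logits):
    return float(softmax(logits) @ soft_gates(a, b))

print(round(gate_output(1.0, 0.0, logits), 2))  # → 0.95, close to XOR(1, 0)
```

After training, each gate's distribution is discretized to its argmax, yielding a network of hard Boolean gates that maps directly onto hardware; the gap between the soft mixture above and that hard argmax is exactly the discretization gap our NeurIPS 2025 paper addresses.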

News

Sep 25, 2025

Our work, 'Mind the Gap: Removing the Discretization Gap in Differentiable Logic Gate Networks', was accepted at NeurIPS 2025.

Apr 1, 2025

Joined the CLIC 2025 organizing team, where I am carrying out the human evaluation study for the challenge.

Nov 26, 2024

Our work, 'Conditional Hallucinations for Image Compression', was accepted at the 2025 Data Compression Conference (DCC).

Jun 13, 2024

Successfully passed my Aptitude Colloquium, securing my official status as a PhD student at ETH Zurich and taking the next step in my research journey.

Dec 5, 2023

Became head teaching assistant for the Hands-On Deep Learning course, completely overhauling and modernizing the curriculum to make it more engaging, challenging, and relevant for students.

Jul 1, 2023

Officially began my PhD at ETH Zurich under the mentorship of Roger Wattenhofer.

Selected Publications

Efficient Bayesian Inference from Noisy Pairwise Comparisons

Till Aczel; Lucas Theis; Roger Wattenhofer

arXiv preprint arXiv:2510.09333, 2025

TL;DR

By modeling rater noise and downweighting unreliable annotators, BBQ makes human evaluation of AI-generated content more accurate, stable, and cost-effective.

Paper

Mind the Gap: Removing the Discretization Gap in Differentiable Logic Gate Networks

Shakir Yousefi; Andreas Plesner; Till Aczel; Roger Wattenhofer

Neural Information Processing Systems (NeurIPS), 2025

TL;DR

Gumbel noise is introduced into differentiable logic gate networks, reducing the discretization gap and enabling much faster model convergence.

Paper

Conditional Hallucinations for Image Compression

Till Aczel; Roger Wattenhofer

2025 Data Compression Conference (DCC)

TL;DR

ConHa is a learned image compression method that adjusts hallucinations to keep images realistic and artifact-free, better matching human preferences.

Paper

IC-DLC: Differentiable Logic Circuits for Hardware-Friendly Image Compression

Till Aczel; David F. Jenny; Simon Jonas Bührer; Andreas Plesner; Antonio Di Maio; Roger Wattenhofer

Machine Learning for Wireless Communication and Networks @ AAAI, 2026

TL;DR

IC-DLC performs hardware-friendly image compression with differentiable logic circuits, enabling efficient deployment on edge devices while maintaining high compression ratios and perceptual quality.

The Unwinnable Arms Race of AI Image Detection

Till Aczel; Lorenzo Vettor; Andreas Plesner; Roger Wattenhofer

What Makes a Good Video: Next Practices in Video Generation and Evaluation Workshop @ NeurIPS, 2025

TL;DR

The arms race between AI image generators and detectors is unwinnable, with medium-complexity distributions the easiest to detect.

Paper

Light Differentiable Logic Gate Networks

Lukas Rüttgers; Till Aczel; Andreas Plesner; Roger Wattenhofer

arXiv preprint arXiv:2510.03250, 2025

TL;DR

A new reparametrization of differentiable logic gate networks reduces model size, speeds up training, and stabilizes accuracy, enabling deeper, more efficient networks without losing performance.

Paper

HyperCool: Reducing Encoding Cost in Overfitted Codecs with Hypernetworks

Pep Borrell-Tatché; Till Aczel; Théo Ladune; Roger Wattenhofer

Machine Learning for Wireless Communication and Networks @ AAAI, 2025

TL;DR

HyperCool uses hypernetworks to reduce encoding costs in overfitted neural codecs, enabling more efficient compression without sacrificing reconstruction quality.

Paper

Teaching

Hands-On Deep Learning

A practical deep learning course where students move from theory to real-world implementation. They learn to build models in PyTorch across image classification and generation, audio denoising, natural language processing, graph neural networks, and reinforcement learning.

Course Overhaul: When I took over the course, enrollment was around 30 students. I scaled it to 150, modernized the content, and reworked the assignments to ensure they meaningfully test understanding.

Key Improvements

  • Scalable Infrastructure: Built automated GPU-backed grading on the CodeExpert platform using SLURM. This infrastructure allows the course to scale from 30 to 150 students while keeping evaluations reliable and fair.
  • Discussions: Introduced interactive 2-on-1 sessions with students and TAs. These ensure genuine understanding despite ChatGPT assistance and give students space to clarify concepts through real dialogue.
  • Modernized Content: Added topics such as diffusion models, LoRA fine-tuning, next-token prediction pretraining, and other contemporary deep learning techniques to keep the course aligned with current research.
  • Competitive Challenge: Introduced biweekly challenges with a public leaderboard, where students compete for the top 3 spots, making the course more engaging and motivating.

Thesis Supervision

If you want to pursue a thesis with me, please check my university page. I welcome original ideas.

In Progress · Semester Project · Spring 2026

Normalized Vision Transformers

Student: Ivanovas Anselm

Co-supervised by: Andreas Plesner

In Progress · Bachelor's Thesis · Fall 2025

Automated Foosball Commentary

Student: Victor Willems

Co-supervised by: Joël Mathys

In Progress · Master's Thesis · Fall 2025

Cool-chic

Student: Jakub Parada

In Progress · Master's Thesis · Fall 2025

DiffLUT

Student: Simon Bührer

Co-supervised by: Andreas Plesner

In Progress · Master's Thesis · Fall 2025

Exploring normalized transformers

Student: Shakir Yousefi

Co-supervised by: Andreas Plesner

In Progress · Semester Project · Spring 2025

Faster DiffLogic

Student: Joshua Durrant

Co-supervised by: Andreas Plesner

In Progress · Semester Project · Spring 2025

Scalar Quantization for Audio Compression

Student: Fei Gao

Co-supervised by: Luca Lanzendörfer

In Progress · Master's Thesis · Spring 2025

View-Specific Video Compression

Student: Niklas Pohl

✓ Completed · Bachelor's Thesis · Spring 2025

Eigenvector-Masked Autoencoders

Student: Anja Buchmann

Co-supervised by: Andreas Plesner

✓ Completed · Semester Project · Spring 2025

Deep Differentiable Logic Gate Networks: Neuron Collapse Through a Neural Architecture Search Perspective

Student: Shakir Yousefi

Co-supervised by: Andreas Plesner

✓ Completed · Semester Project · Spring 2025

Recurrent Deep Differentiable Logic Gate Networks

Student: Simon Bührer

Co-supervised by: Andreas Plesner

✓ Completed · Bachelor's Thesis · Spring 2025

Effectiveness of Multi-Scale Aggregation for Adversarial Robustness

Student: Emerson Leonardo Azevedo Aguiar

Co-supervised by: Andreas Plesner

✓ Completed · Bachelor's Thesis · Spring 2025

Evaluating AI-Generated Image Detection Across Resolution and Complexity

Student: Lorenzo Alessandro Vettor

Co-supervised by: Andreas Plesner

✓ Completed · Bachelor's Thesis · Spring 2025

The Impact of Training Data on Adversarial Examples

Student: Marco Zimmerli

Co-supervised by: Andreas Plesner

✓ Completed · Semester Project · Spring 2025

Light Differentiable Logic Gate Networks

Student: Lukas Rüttgers

Co-supervised by: Andreas Plesner

✓ Completed · Semester Project · Spring 2025

Scalability and Expressiveness of the Group-Sum Layer in Differentiable Logic Gate Networks

Student: Sven Brändle

Co-supervised by: Andreas Plesner

✓ Completed · Master's Thesis · Spring 2025

From E-Commerce to Editorials: Garment Retrieval for Virtual Photo-Shoot Applications

Student: Yannick Hauri

Co-supervised by: Luca Lanzendörfer

✓ Completed · Master's Thesis · Spring 2025

Fine-tuning Data Extraction from Large Language Models

Student: Samuel Räber

Co-supervised by: Andreas Plesner

✓ Completed · Bachelor's Thesis · Spring 2025

A Dual Study on Analyzing Image Recompression Effects and Realism Assessment via Kolmogorov Complexity

Student: Elif Özsoy

✓ Completed · Bachelor's Thesis · Spring 2025

Synthetic Data Augmentation in Medical Image Classification: How Architecture Determines Utility

Student: Lars Ruschak

Co-supervised by: Andreas Plesner

✓ Completed · Bachelor's Thesis · Fall 2024

Automated Visual Foosball Tracking

Student: Linus Baumberger

Co-supervised by: Joël Mathys

✓ Completed · Semester Project · Fall 2024

Human-aligned Compression for Robust Models

Student: Samuel Räber

Co-supervised by: Andreas Plesner

✓ Completed · Master's Thesis · Fall 2024

Beyond Overfitting: Encoding Shortcuts for Overfitted Image Codecs

Student: Josep Borrell Tatche

✓ Completed · Bachelor's Thesis · Fall 2024

View-Specific Video Compression

Student: Sven Steffen

✓ Completed · Semester Project · Fall 2024

Wave Function Collapse for Graph Generation

Student: Soufiane Barrada

Co-supervised by: Joël Mathys

✓ Completed · Semester Project · Spring 2024

Jigsaw Puzzle Solver using Machine Learning

Student: Lea Künstler

Co-supervised by: Andreas Plesner

✓ Completed · Semester Project · Fall 2023

Exploring Typical and Uncertainty-Driven Active Learning on DINO Embeddings to Enhance Versatility

Student: Paul Doucet

Co-supervised by: Benjamin Estermann

✓ Completed · Group Project · Fall 2023

DataComp Challenge

Student: Dustin Brunner

Co-supervised by: Benjamin Estermann

✓ Completed · Semester Project · Fall 2023

SUPClust: Active Learning at the Boundaries

Student: Yuta Ono

Co-supervised by: Benjamin Estermann
