Till Aczel
PhD student at ETH Zurich
I am a PhD student at ETH Zurich under the supervision of Prof. Dr. Roger Wattenhofer. My research sits at the intersection of neural image compression and perceptual quality, aiming to build AI systems that are both efficient and aligned with human judgment of visual fidelity.
A key focus of my work is on evaluation: measuring the quality of generative AI is far from straightforward, because standard metrics often fail to capture what humans actually perceive. I develop methods and benchmarks to better assess perceptual fidelity, ensuring that model improvements translate into reconstructions that real people judge as high-quality and faithful.
Another focus is on efficiency and deployment: most generative models, including diffusion models, GANs, and VAEs, are computationally expensive. I design networks that "speak the language of computers", using architectures optimized for binary computation and hardware efficiency. This approach helps close the gap between learned and traditional codecs by reducing model complexity and runtime while improving perceptual quality and robustness.
News
Our work, 'Mind the Gap: Removing the Discretization Gap in Differentiable Logic Gate Networks', is accepted at NeurIPS 2025.
Joined the CLIC 2025 organizing team, where I am carrying out the human evaluation study for the challenge.
Our work, 'Conditional Hallucinations for Image Compression', is accepted at the 2025 Data Compression Conference (DCC).
Successfully passed my Aptitude Colloquium, securing my official status as a PhD student at ETH Zurich and taking the next step in my research journey.
Became head teaching assistant for the Hands-On Deep Learning course, completely overhauling and modernizing the curriculum to make it more engaging, challenging, and relevant for students.
Officially began my PhD at ETH Zurich under the mentorship of Roger Wattenhofer.
Selected Publications
Efficient Bayesian Inference from Noisy Pairwise Comparisons
Till Aczel; Lucas Theis; Roger Wattenhofer
arXiv preprint arXiv:2510.09333 • 2025
TL;DR
By modeling rater noise to downweight unreliable annotators, BBQ makes human evaluation of AI-generated content more accurate, stable, and cost-effective.
Mind the Gap: Removing the Discretization Gap in Differentiable Logic Gate Networks
Shakir Yousefi; Andreas Plesner; Till Aczel; Roger Wattenhofer
Neural Information Processing Systems (NeurIPS) • 2025
TL;DR
Introducing Gumbel noise into differentiable logic gate networks reduces the discretization gap and enables much faster model convergence.
Conditional Hallucinations for Image Compression
Till Aczel; Roger Wattenhofer
2025 Data Compression Conference (DCC) • 2025
TL;DR
ConHa is a learned image compression method that adjusts hallucinations to keep images realistic and artifact-free, better matching human preferences.
IC-DLC: Differentiable Logic Circuits for Hardware-Friendly Image Compression
Till Aczel; David F. Jenny; Simon Jonas Bührer; Andreas Plesner; Antonio Di Maio; Roger Wattenhofer
Machine Learning for Wireless Communication and Networks @ AAAI • 2026
TL;DR
IC-DLC performs hardware-friendly image compression with differentiable logic circuits, enabling efficient deployment on edge devices while maintaining high compression ratios and perceptual quality.
The Unwinnable Arms Race of AI Image Detection
Till Aczel; Lorenzo Vettor; Andreas Plesner; Roger Wattenhofer
What Makes a Good Video: Next Practices in Video Generation and Evaluation Workshop @ NeurIPS • 2025
TL;DR
The arms race between AI image generators and detectors is unwinnable, with medium-complexity distributions the easiest to detect.
Light Differentiable Logic Gate Networks
Lukas Rüttgers; Till Aczel; Andreas Plesner; Roger Wattenhofer
arXiv preprint arXiv:2510.03250 • 2025
TL;DR
A new reparametrization of differentiable logic gate networks reduces model size, speeds up training, and stabilizes accuracy, enabling deeper, more efficient networks without losing performance.
HyperCool: Reducing Encoding Cost in Overfitted Codecs with Hypernetworks
Pep Borrell-Tatché; Till Aczel; Théo Ladune; Roger Wattenhofer
Machine Learning for Wireless Communication and Networks @ AAAI • 2025
TL;DR
HyperCool uses hypernetworks to reduce encoding costs in overfitted neural codecs, enabling more efficient compression without sacrificing reconstruction quality.
Teaching
Hands-On Deep Learning
A practical deep learning course where students move from theory to real-world implementation. They learn to build models in PyTorch across image classification and generation, audio denoising, natural language processing, graph neural networks, and reinforcement learning.
Course Overhaul: When I took over the course, enrollment was around 30 students. I scaled it to 150, modernized the content, and reworked the assignments to ensure they meaningfully test understanding.
Key Improvements
- Scalable Infrastructure: Built automated GPU-backed grading on the CodeExpert platform using SLURM. This infrastructure allows the course to scale from 30 to 150 students while keeping evaluations reliable and fair.
- Discussions: Introduced interactive 2-on-1 sessions between students and TAs. These ensure genuine understanding despite ChatGPT assistance and give students space to clarify concepts through real dialogue.
- Modernized Content: Added topics such as diffusion models, LoRA fine-tuning, next-token prediction pretraining, and other contemporary deep learning techniques to keep the course aligned with current research.
- Competitive Challenge: Introduced biweekly challenges with a public leaderboard, where students compete for the top 3 spots, making the course more engaging and motivating.
Thesis Supervision
If you want to pursue a thesis with me, please check my university page. I welcome original ideas.
Normalized Vision Transformers
Student: Ivanovas Anselm
Co-supervised by: Andreas Plesner
Automated Foosball Commentary
Student: Victor Willems
Co-supervised by: Joël Mathys
Cool-chic
Student: Jakub Parada
DiffLUT
Student: Simon Bührer
Co-supervised by: Andreas Plesner
Exploring normalized transformers
Student: Shakir Yousefi
Co-supervised by: Andreas Plesner
Faster DiffLogic
Student: Joshua Durrant
Co-supervised by: Andreas Plesner
Scalar Quantization for Audio Compression
Student: Fei Gao
Co-supervised by: Luca Lanzendörfer
View-Specific Video Compression
Student: Niklas Pohl
Eigenvector-Masked Autoencoders
Student: Anja Buchmann
Co-supervised by: Andreas Plesner
Deep Differentiable Logic Gate Networks: Neuron Collapse Through a Neural Architecture Search Perspective
Student: Shakir Yousefi
Co-supervised by: Andreas Plesner
Recurrent Deep Differentiable Logic Gate Networks
Student: Simon Bührer
Co-supervised by: Andreas Plesner
Effectiveness of Multi-Scale Aggregation for Adversarial Robustness
Student: Emerson Leonardo Azevedo Aguiar
Co-supervised by: Andreas Plesner
Evaluating AI-Generated Image Detection Across Resolution and Complexity
Student: Lorenzo Alessandro Vettor
Co-supervised by: Andreas Plesner
The Impact of Training Data on Adversarial Examples
Student: Marco Zimmerli
Co-supervised by: Andreas Plesner
Light Differentiable Logic Gate Networks
Student: Lukas Rüttgers
Co-supervised by: Andreas Plesner
Scalability and Expressiveness of the Group-Sum Layer in Differentiable Logic Gate Networks
Student: Sven Brändle
Co-supervised by: Andreas Plesner
From E-Commerce to Editorials: Garment Retrieval for Virtual Photo-Shoot Applications
Student: Yannick Hauri
Co-supervised by: Luca Lanzendörfer
Fine-tuning Data Extraction from Large Language Models
Student: Samuel Räber
Co-supervised by: Andreas Plesner
A Dual Study on Analyzing Image Recompression Effects and Realism Assessment via Kolmogorov Complexity
Student: Elif Özsoy
Synthetic Data Augmentation in Medical Image Classification: How Architecture Determines Utility
Student: Lars Ruschak
Co-supervised by: Andreas Plesner
Automated Visual Foosball Tracking
Student: Linus Baumberger
Co-supervised by: Joël Mathys
Human-aligned Compression for Robust Models
Student: Samuel Räber
Co-supervised by: Andreas Plesner
Beyond Overfitting: Encoding Shortcuts for Overfitted Image Codecs
Student: Josep Borrell Tatche
View-Specific Video Compression
Student: Sven Steffen
Wave Function Collapse for Graph Generation
Student: Soufiane Barrada
Co-supervised by: Joël Mathys
Jigsaw Puzzle Solver using Machine Learning
Student: Lea Künstler
Co-supervised by: Andreas Plesner
Exploring Typical and Uncertainty-Driven Active Learning on DINO Embeddings to Enhance Versatility
Student: Paul Doucet
Co-supervised by: Benjamin Estermann
DataComp Challenge
Student: Dustin Brunner
Co-supervised by: Benjamin Estermann
SUPClust: Active Learning at the Boundaries
Student: Yuta Ono
Co-supervised by: Benjamin Estermann
