James (Jamie) Townsend
I am a machine learning researcher based at the Amsterdam Machine Learning Lab (AMLab). I completed my PhD on lossless compression with latent variable models in 2020, supervised by Professor David Barber at the UCL AI Centre in London. Most of my research to date has been on deep generative models and lossless compression. I'm also interested in unsupervised learning more generally, as well as approximate inference, Monte Carlo methods, optimization, and the design of machine learning software systems.
During my PhD I spent a lot of time working on Autograd, the Python/NumPy automatic differentiation library. In Spring 2018 I interned under the tutelage of Matthew Johnson at Google Brain in San Francisco, where I was fortunate enough to work on JAX during the early stages of the project.
I use the name James on publications and official documents; for everything else, I use Jamie.
GitHub: @j-towns
Twitter: @_j_towns
Google Scholar: James Townsend
Email: j.h.n.townsend@uva.nl
Publications and preprints
- Julius Kunze, Daniel Severo, Giulio Zani, Jan-Willem van de Meent, and James Townsend, Entropy Coding of Unordered Data Structures, International Conference on Learning Representations (ICLR), 2024.
- Daniel Severo, James Townsend, Ashish Khisti, and Alireza Makhzani, Random Edge Coding: One-Shot Bits-Back Coding of Large Labeled Graphs, International Conference on Machine Learning (ICML), 2023. ArXiv preprint: https://arxiv.org/abs/2305.09705.
- James Townsend and Jan-Willem van de Meent, Verified Reversible Programming for Verified Lossless Compression, 2022. Presented at the Languages for Inference (LAFI) workshop at POPL 2023. ArXiv preprint: https://arxiv.org/abs/2211.09676. Video: https://youtu.be/w8st4mOajgs?t=5742.
- Daniel Severo*, James Townsend*, Ashish Khisti, Alireza Makhzani, and Karen Ullrich, Your Dataset is a Multiset and You Should Compress it Like One, 2021. Awarded best paper at the Deep Generative Models and Downstream Applications Workshop. OpenReview: https://openreview.net/forum?id=vjrsNCu8Km. *Equal contribution.
- Julius Kunze, James Townsend, and David Barber, Adaptive Optimization with Examplewise Gradients, OPT2021: 13th Annual Workshop on Optimization for Machine Learning, 2021. ArXiv preprint: https://arxiv.org/abs/2112.00174.
- Daniel Severo*, James Townsend*, Ashish Khisti, Alireza Makhzani, and Karen Ullrich, Compressing Multisets with Large Alphabets, Data Compression Conference (DCC), 2022. ArXiv preprint: https://arxiv.org/abs/2107.09202. *Equal contribution.
- Yangjun Ruan, Karen Ullrich, Daniel Severo, James Townsend, Ashish Khisti, Arnaud Doucet, Alireza Makhzani, and Chris J. Maddison, Improving Lossless Compression Rates via Monte Carlo Bits-Back Coding, International Conference on Machine Learning (ICML), 2021. ArXiv preprint: https://arxiv.org/abs/2102.11086.
- James Townsend and Iain Murray, Lossless Compression with State Space Models Using Bits Back Coding, Neural Compression: From Information Theory to Applications -- Workshop @ ICLR 2021.
- James Townsend, Lossless Compression with Latent Variable Models, PhD Thesis, 2021. ArXiv preprint: https://arxiv.org/abs/2104.10544.
- James Townsend*, Thomas Bird*, Julius Kunze, and David Barber, HiLLoC: Lossless Image Compression with Hierarchical Latent Variable Models, International Conference on Learning Representations (ICLR), 2020. *Equal contribution.
- James Townsend, A Tutorial on the Range Variant of Asymmetric Numeral Systems, 2020. ArXiv preprint: https://arxiv.org/abs/2001.09186.
- James Townsend, Thomas Bird, and David Barber, Practical Lossless Compression with Latent Variables Using Bits Back Coding, International Conference on Learning Representations (ICLR), 2019.
- Jonathan So, James Townsend, and Benoit Gaujac, EP Structured Variational Autoencoders, 1st Symposium on Advances in Approximate Bayesian Inference, 2018.
- James Townsend, Niklas Koep, and Sebastian Weichwald, Pymanopt: A Python Toolbox for Optimization on Manifolds using Automatic Differentiation, Journal of Machine Learning Research, vol. 17, no. 137, pp. 1–5, 2016.