
Lily H. Zhang

Machine Learning and Statistics

Hi! I'm a PhD candidate at New York University, fortunate to be advised by Professors Rajesh Ranganath and Kyle Cranmer. I'm grateful to be a DeepMind Scholar.

I'm interested in improving the reliability of machine learning, with broad interests in deep learning, generative modeling, and out-of-distribution detection and generalization. I also work on applications of machine learning in health and science.

I graduated from Harvard University with a bachelor's in statistics (primary) and computer science (secondary). There, I had the pleasure of working with Professors Don Rubin (statistics), Jukka-Pekka "JP" Onnela (biostatistics), and Gary King (quantitative social science). I have also worked at several machine learning startups and on Google Search.

Feel free to contact me for research collaborations or other engagements.

Publications

Robustness to Spurious Correlations Improves Semantic Out-of-Distribution Detection
Lily H. Zhang and Rajesh Ranganath.
Association for the Advancement of Artificial Intelligence (AAAI), 2023.
[Paper] [Code]

When More Is Less: Incorporating Additional Datasets Can Hurt Performance by Introducing Spurious Correlations
Rhys Compton, Lily H. Zhang, Aahlad Puli, and Rajesh Ranganath.
Machine Learning for Healthcare (MLHC), 2023.
[Paper] [Code]

Set Norm and Equivariant Residual Connections: Putting the Deep in Deep Sets
Lily H. Zhang*, Veronica Tozzo*, John M. Higgins, Rajesh Ranganath.
International Conference on Machine Learning (ICML), 2022.
[Paper] [Code]

Out-of-Distribution Generalization in the Presence of Nuisance-Induced Spurious Correlations
Aahlad Puli, Lily H. Zhang, Eric Oermann, Rajesh Ranganath.
International Conference on Learning Representations (ICLR), 2022.
[Paper] [Code]

Understanding Out-of-Distribution Detection with Deep Generative Models
Lily H. Zhang, Mark Goldstein, Rajesh Ranganath.
International Conference on Machine Learning (ICML), 2021.
[Paper] [Talk]

Rapid Model Comparison by Amortizing Across Models
Lily H. Zhang and Michael Hughes.
2nd Symposium on Advances in Approximate Bayesian Inference (AABI), 2020.
[Paper] [Code]

Education

New York University, New York, NY. Candidate for Doctor of Philosophy in Data Science. Aug. 2020 – Present

Harvard College, Cambridge, MA. Bachelor of Arts in Statistics and Computer Science. Magna Cum Laude with High Honors. Aug. 2013 – May 2017

Honors & Awards

DeepMind Fellow, 2020.
Phi Beta Kappa, 2017.

Patents

Graphical user interface systems for generating hierarchical data extraction training dataset.