Aakanksha Chowdhery

STAFF RESEARCH SCIENTIST, GOOGLE DEEPMIND

Lead Researcher on pre-training, scaling, and fine-tuning of Large Language Models. Technical Lead of the 540B PaLM model. Core contributor to the Gemini and Pathways projects at Google. Prior to joining Google, I was the technical lead for interdisciplinary research initiatives across machine learning and distributed systems at Microsoft Research and Princeton University.

I completed my PhD in Electrical Engineering at Stanford University in 2013 and was awarded the Paul Baran Marconi Young Scholar Award for the outstanding scientific contributions of my dissertation in the field of communications and the Internet.

Selected Honors & Awards: Outstanding Paper Award MLSys 2023, Outstanding Paper Award MLSys 2022, Paul Baran Marconi Young Scholar Award 2012.

Selected Publications

MACHINE LEARNING

  • Gemini Team et al., “Gemini: A Family of Highly Capable Multimodal Models,” arXiv preprint arXiv:2312.11805, Dec 2023.

  • A. Chowdhery, S. Narang, J. Devlin, et al., “PaLM: Scaling Language Modeling with Pathways,” Journal of Machine Learning Research (JMLR), 2023.

  • H. W. Chung, L. Hou, J. Wei, et al., “Scaling Instruction-Finetuned Language Models,” Journal of Machine Learning Research (JMLR), 2024.

  • D. Driess, F. Xia, P. Florence, et al., “PaLM-E: An Embodied Multimodal Language Model,” International Conference on Machine Learning (ICML), 2023.

  • K. Singhal, S. Azizi, et al., “Large Language Models Encode Clinical Knowledge,” Nature 620, 172–180, 2023.

  • X. Wang, J. Wei, et al., “Self-Consistency Improves Chain of Thought Reasoning in Language Models,” International Conference on Learning Representations (ICLR), 2023.

  • S. Jaszczur, A. Chowdhery, A. Mohiuddin, L. Kaiser, W. Gajewski, H. Michalewski, and J. Kanerva, “Sparse is Enough in Scaling Transformers,” NeurIPS 2021.

  • H. Hazimeh, Z. Zhao, A. Chowdhery, M. Sathiamoorthy, et al., “DSelect-k: Differentiable Selection in the Mixture of Experts with Applications to Multi-Task Learning,” NeurIPS 2021.

SYSTEMS

  • R. Pope, S. Douglas, A. Chowdhery, et al., “Efficiently Scaling Transformer Inference,” Proceedings of Machine Learning and Systems (MLSys), 2023, Outstanding Paper Award.

  • P. Barham, A. Chowdhery, J. Dean, S. Ghemawat, S. Hand, D. Hurt, M. Isard, H. Lim, R. Pang, S. Roy, R. Sepassi, et al., “Pathways: Asynchronous Distributed Dataflow for ML,” Proceedings of Machine Learning and Systems (MLSys), 2022, Outstanding Paper Award.

  • Y. Lu, A. Chowdhery, S. Kandula, and S. Chaudhuri, “Accelerating Machine Learning Queries with Probabilistic Predicates,” ACM SIGMOD 2018.

  • Y. Lu, A. Chowdhery, and S. Kandula, “Optasia: A Relational Platform for Efficient Large-Scale Video Analytics,” ACM Symposium on Cloud Computing (SoCC), Santa Clara, CA, 2016.

Recent Talks

Stanford CS 528 Talk on Multimodal Reasoning: Gemini & PaLM-E, 2024 (YouTube)

Invited talks at ICLR 2023, ICML 2023, and NeurIPS 2022 workshops

Keynote Panelist at NeurIPS 2023 Panel on “Beyond Scaling”

Keynote Speaker at PIMRC 2023 and ITA 2023

Keynote Speaker at Department of Energy Advanced Scientific Computing Research Meeting 2024

Berkeley CS 294 Talk on Scaling Large Language Models, 2023