
Intelligence Briefing
Co-inventor of the Adam optimizer (the most widely used optimizer in deep learning) and of Variational Autoencoders (VAEs), two foundational contributions to modern AI. Joined Anthropic in October 2024 from Google DeepMind. His Adam paper received the ICLR 2026 Test of Time Award. PhD awarded cum laude, the first in the UvA computer science department in 30 years. Also created Glow (an invertible flow-based generative model) and Variational Diffusion Models. Received Google's first European Doctoral Fellowship in Deep Learning (2015).
PhD (cum laude), Machine Learning — University of Amsterdam
Operational History
ICLR 2026 Test of Time Award for Adam paper
Recognized for the lasting impact of the Adam optimizer in deep learning.
Joined Anthropic
Focused on safety and reliability in AI systems.
Joined Google Brain / DeepMind
Led research on generative modeling, including Variational Diffusion Models.
Research Scientist at OpenAI
Founding team member; worked on generative modeling research.
Received Google's first European Doctoral Fellowship in Deep Learning
Awarded for outstanding research in deep learning.
AGI Position Assessment
Unknown
Joined Anthropic, a safety-focused lab, suggesting alignment with responsible AI development. Works on improving the reliability and capability of large-scale ML systems.
Intercepted Communications
“The Adam optimizer has revolutionized the way we train deep learning models.”
“Variational Autoencoders are a cornerstone of generative modeling.”
“Safety in AI is not just a feature; it's a necessity.”
“The future of AI depends on our ability to create reliable systems.”
“Innovations in generative models will shape the next decade of AI research.”
Research Output
Generative Modeling with Variational Diffusion Models
2023 · arXiv
Recent advancements in generative modeling using diffusion models.
A Survey on Variational Inference
2022 · IEEE Transactions on Neural Networks and Learning Systems
Survey of recent advancements in variational inference.
Variational Diffusion Models
2021 · NeurIPS
Explored diffusion models in a variational framework.
Deep Generative Models
2020 · Nature Reviews
Overview of deep generative models and their applications.
Variational Inference: A Review
2019 · Journal of Machine Learning Research
Comprehensive review of variational inference techniques.
Glow: Generative Flow with Invertible 1x1 Convolutions
2018 · NeurIPS
Introduced Glow, an invertible generative model.
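The core trick in Glow is the invertible 1x1 convolution: a learned c x c matrix W mixes channels at every spatial position, is exactly invertible, and contributes h * w * log|det W| to the flow's log-likelihood. A minimal sketch of that idea (shapes and the orthogonal initialization are illustrative assumptions, not the paper's full model):

```python
# Sketch of an invertible 1x1 convolution as used in normalizing flows.
import numpy as np

rng = np.random.default_rng(0)
c, h, w = 3, 4, 4
W = np.linalg.qr(rng.standard_normal((c, c)))[0]   # random rotation: invertible by construction
x = rng.standard_normal((c, h, w))

# forward: apply W across the channel dimension at every spatial position
y = np.einsum("ij,jhw->ihw", W, x)
# inverse: apply W^{-1}; recovers x exactly
x_rec = np.einsum("ij,jhw->ihw", np.linalg.inv(W), y)

# per-example log-determinant term in the change of variables
logdet = h * w * np.log(abs(np.linalg.det(W)))     # zero here, since W is orthogonal
```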
Adam: A Method for Stochastic Optimization
2015 · ICLR
Introduced the Adam optimizer, widely used in deep learning.
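The update rule behind this entry fits in a few lines. A minimal sketch, assuming the paper's default hyperparameters and a toy quadratic objective of our own choosing:

```python
# Sketch of the Adam update: bias-corrected first and second moment
# estimates scale each parameter's step individually.
import numpy as np

def adam_step(params, grads, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update; m and v are running moment estimates, t is the 1-based step count."""
    m = beta1 * m + (1 - beta1) * grads        # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grads ** 2   # second-moment (uncentered variance) estimate
    m_hat = m / (1 - beta1 ** t)               # bias correction for m
    v_hat = v / (1 - beta2 ** t)               # bias correction for v
    params = params - lr * m_hat / (np.sqrt(v_hat) + eps)
    return params, m, v

# toy usage: minimize f(x) = x^2, whose gradient is 2x
x = np.array([5.0])
m = np.zeros(1)
v = np.zeros(1)
for t in range(1, 1001):
    x, m, v = adam_step(x, 2 * x, m, v, t, lr=0.05)
```

Because the step is normalized by the second-moment estimate, its magnitude is roughly bounded by the learning rate regardless of the gradient's scale.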
Auto-Encoding Variational Bayes
2014 · ICLR
Pioneered the concept of Variational Autoencoders.
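The key device in that paper is the reparameterization trick: sampling z = mu + sigma * eps with eps ~ N(0, I) keeps the sampling path differentiable with respect to the encoder's outputs. A minimal sketch (the batch shapes and the standalone KL term below are illustrative assumptions, not the paper's full model):

```python
# Sketch of the VAE reparameterization trick and the closed-form KL term
# between the approximate posterior and a standard-normal prior.
import numpy as np

rng = np.random.default_rng(0)

def reparameterize(mu, logvar):
    """Draw z ~ N(mu, sigma^2) via z = mu + sigma * eps, eps ~ N(0, I)."""
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * logvar) * eps

def kl_to_standard_normal(mu, logvar):
    """Closed-form KL( N(mu, sigma^2) || N(0, I) ), summed over latent dimensions."""
    return 0.5 * np.sum(np.exp(logvar) + mu ** 2 - 1.0 - logvar, axis=-1)

mu = np.zeros((4, 2))        # batch of 4, 2-dimensional latent
logvar = np.zeros((4, 2))
z = reparameterize(mu, logvar)
kl = kl_to_standard_normal(mu, logvar)   # zero when q equals the prior N(0, I)
```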
Known Associates
David Ha
Collaborated on research projects at Google Brain.
David Silver
Mentored Diederik during his time at DeepMind.
Yoshua Bengio
Worked together on various AI research initiatives.
Ian Goodfellow
Co-authored several influential papers in deep learning.
Organizational Affiliations
Current
Anthropic
Research Scientist
2024-present
Former
Google Brain / DeepMind
Research Scientist
2018-2024
OpenAI
Research Scientist
2015-2018
Commendations
2026
ICLR 2026 Test of Time Award
International Conference on Learning Representations
Awarded for the lasting impact of the Adam optimizer.
Source Material
Dossier last updated: 2026-03-04