
Intelligence Briefing
Co-author of "Attention Is All You Need" (2017), the paper that introduced the Transformer architecture; he is credited with coining the name "Transformer." He was the last of the original eight authors to leave Google, departing in 2023 after more than a decade to co-found Sakana AI in Tokyo. He is now actively exploring post-Transformer architectures, saying he is "absolutely sick" of transformers and that the concentration of resources in a single paradigm has paradoxically narrowed research.
BSc, Computer Science — University of Birmingham
MSc, Advanced Computer Science — University of Birmingham
Operational History
Departure from Google
Left Google after more than a decade to co-found Sakana AI.
Co-founding Sakana AI
Co-founded Sakana AI in Tokyo, focusing on innovative AI research.
Tensor2Tensor Release
Contributed to the development and release of Tensor2Tensor, a library for training neural networks.
Publication of 'Attention Is All You Need'
Co-authored the seminal paper introducing the Transformer architecture.
AGI Position Assessment
Unknown
Concerned about monoculture in AI research. Advocates for exploring diverse architectures beyond transformers to avoid concentrating risk in a single paradigm.
Intercepted Communications
“I'm absolutely sick of transformers. We need to explore diverse architectures to avoid a monoculture in AI research.”
“The concentration of resources in transformer research has paradoxically narrowed the scope of innovation.”
“Attention mechanisms have revolutionized NLP, but we must not stop there.”
“AI should be diverse, and we need to ensure that we are not putting all our eggs in one basket.”
“The future of AI lies in exploring new paradigms beyond what we currently know.”
Research Output
Improving Google Translate with Neural Networks
2019 · Google AI Blog
Discussed advancements in Google Translate using neural network techniques.
Tensor2Tensor: A Library for Neural Network Training
2018 · GitHub
Provided a framework for training various neural network architectures.
Attention Is All You Need
2017 · NeurIPS
Introduced the Transformer architecture, which has become foundational in NLP.
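For reference, the core operation of the cited paper is scaled dot-product attention: Attention(Q, K, V) = softmax(QKᵀ/√d_k)V. A minimal NumPy sketch of that formula follows; this is an illustration written for this dossier, not code from the paper or from Tensor2Tensor, and the function name is our own:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V for 2-D arrays.

    Q: (n_queries, d_k), K: (n_keys, d_k), V: (n_keys, d_v).
    Returns an (n_queries, d_v) array of attention-weighted values.
    """
    d_k = Q.shape[-1]
    # Similarity scores between each query and each key, scaled by sqrt(d_k)
    scores = Q @ K.T / np.sqrt(d_k)
    # Numerically stable softmax over the key dimension
    scores = scores - scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V

# A query aligned with the first key attends almost entirely to the first value
Q = np.array([[10.0, 0.0]])
K = np.array([[10.0, 0.0], [0.0, 10.0]])
V = np.array([[1.0, 0.0], [0.0, 1.0]])
out = scaled_dot_product_attention(Q, K, V)  # close to [[1.0, 0.0]]
```

The √d_k scaling keeps the dot products from growing with the key dimension, which would otherwise push the softmax into regions with vanishing gradients.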
Field Intelligence
The Future of AI Beyond Transformers
AI Safety and Diversity in Research
Known Associates
Ashish Vaswani
Co-authored the Transformer paper and worked together at Google.
Noam Shazeer
Co-authored the Transformer paper and contributed to various AI projects.
Niki Parmar
Co-authored the Transformer paper and worked on related AI research.
Aidan N. Gomez
Co-authored the Transformer paper and engaged in AI research at Google.
Organizational Affiliations
Current
Sakana AI
Co-founder & CTO
2023-present
Former
Google
Software Engineer & Research Scientist
2012-2023
Source Material
Dossier last updated: 2026-03-04