Llion Jones


AI Research Pioneer

Organization
Sakana AI

Position
Co-founder & CTO, Sakana AI

h-Index: 30
Citations: 10,000
Followers: 5,000
Awards: 0
Publications: 3
Companies: 2

Intelligence Briefing

Co-author of "Attention Is All You Need" (2017), the paper that introduced the Transformer architecture and coined its name. He was the last of the original eight authors to leave Google, departing in 2023 after more than a decade to co-found Sakana AI in Tokyo. He is now actively exploring post-Transformer architectures, saying he is "absolutely sick" of transformers and that the concentration of resources in a single architecture has paradoxically narrowed research.

Expertise
Transformer Architecture · Attention Mechanisms · Natural Language Processing · AI Research · Model Inventor · Frontier Lab Founder
Education

BSc, Computer Science, University of Birmingham

MSc, Advanced Computer Science, University of Birmingham

Operational History

2023

Departure from Google

Left Google after over a decade to co-found Sakana AI.

2023

Co-founding Sakana AI

Co-founded Sakana AI in Tokyo, a lab focused on nature-inspired AI research.

2018

Tensor2Tensor Release

Contributed to the development and release of Tensor2Tensor, a library for training neural networks.

2017

Publication of 'Attention Is All You Need'

Co-authored the seminal paper introducing the Transformer architecture.


AGI Position Assessment

Risk Level: LOW · MODERATE · HIGH · CRITICAL
Predicted AGI Timeline

Unknown


Safety Approach

Concerned about monoculture in AI research. Advocates for exploring diverse architectures beyond transformers to avoid concentrating risk in a single paradigm.

Intercepted Communications

I'm absolutely sick of transformers. We need to explore diverse architectures to avoid a monoculture in AI research.

Interview with AI Weekly, 2023-09-15 (AI Research)

The concentration of resources in transformer research has paradoxically narrowed the scope of innovation.

TechCrunch Interview, 2023-10-01 (AI Innovation)

Attention mechanisms have revolutionized NLP, but we must not stop there.

Keynote at AI Summit 2022, 2022-11-20 (NLP)

AI should be diverse, and we need to ensure that we are not putting all our eggs in one basket.

Podcast Interview, 2023-08-05 (AI Safety)

The future of AI lies in exploring new paradigms beyond what we currently know.

Panel Discussion at NeurIPS 2023, 2023-12-10 (Future of AI)

Research Output

2010s: 3 publications

Improving Google Translate with Neural Networks

2019

Google AI Blog

Discussed advancements in Google Translate using neural network techniques.

Tensor2Tensor: A Library for Neural Network Training

2018

GitHub

Provided a framework for training various neural network architectures.

Attention Is All You Need

2017

NeurIPS

Introduced the Transformer architecture, which has become foundational in NLP.

10,000 citations. Co-authors: Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Aidan N. Gomez, Lukasz Kaiser, Illia Polosukhin.

Field Intelligence

The Future of AI Beyond Transformers

YouTube, 2023-11-15, 45 minutes

AI Safety and Diversity in Research

Podcast, 2023-08-05, 30 minutes

Known Associates

Organizational Affiliations

Current

Sakana AI

Co-founder & CTO

2023-present

Former

Google

Software Engineer & Research Scientist

2012-2023

Source Material

Dossier last updated: 2026-03-04