Tri Dao


AI Researcher

Organization
Together.AI

Position
Co-founder and Researcher

h-Index: 5
Citations: 225
Followers: --
Awards: 0
Publications: 3
Companies: 3

Intelligence Briefing

Tri Dao is a prominent AI researcher known for developing FlashAttention and for his work on state-space models, including Mamba.

Tri Dao is a co-founder of Together.AI, where he has advanced AI research in efficient attention (FlashAttention) and state-space models. He completed both his Bachelor's and PhD at Stanford University, focusing on efficient algorithms for machine learning and neural networks.

Expertise
FlashAttention, State-Space Models, Mamba
Education

BS, Stanford University

PhD, Stanford University

Operational History

2023

Mamba Framework Introduction

Introduced Mamba, a selective state-space model architecture that offers a linear-time alternative to attention for sequence modeling.

research
2022

Advancements in State-Space Models

Contributed to the development of structured state-space models, improving their efficiency and applicability to long-sequence AI tasks.

research
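The state-space models referenced above are built on a simple linear recurrence whose per-step cost is independent of sequence length. A toy sketch of that recurrence (a generic discrete SSM scan under my own naming, not code from any of the subject's papers):

```python
import numpy as np

def ssm_scan(A, B, C, u):
    """Run a discrete linear state-space model over an input sequence:
        x[t] = A @ x[t-1] + B * u[t]
        y[t] = C @ x[t]
    Each step costs O(state_dim**2) regardless of sequence length, which
    is what makes SSMs attractive for long sequences. Illustrative only:
    real SSM layers (S4, Mamba) use structured A matrices and fast scans.
    """
    x = np.zeros(A.shape[0])  # hidden state, starts at zero
    ys = []
    for u_t in u:
        x = A @ x + B * u_t   # state update driven by the input
        ys.append(C @ x)      # readout of the current state
    return np.array(ys)
```

For example, with `A = 0.5 * np.eye(2)` the state decays by half each step, so an impulse input produces a geometrically decaying output.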
2021

Co-founder of Together.AI

Tri Dao co-founded Together.AI, focusing on advancing AI research and applications.

founding
2021

Publication of FlashAttention

Published FlashAttention, an IO-aware exact attention algorithm that reduces memory traffic between GPU memory levels to speed up Transformer training and inference.

research
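The core idea behind FlashAttention, computing exact attention tile by tile with an online softmax so the full score matrix is never materialized, can be sketched in NumPy. This is an illustrative reimplementation of the tiling trick only, not the authors' kernel (which fuses these steps in GPU SRAM); all names here are mine:

```python
import numpy as np

def tiled_attention(q, k, v, block=64):
    """Exact softmax attention computed over key/value tiles with a
    running (online) softmax, so the (n x n) score matrix never exists
    in full. Illustrative sketch of the FlashAttention tiling idea."""
    n, d = q.shape
    scale = 1.0 / np.sqrt(d)
    out = np.zeros_like(v[:1] * np.zeros((n, 1)))  # (n, d_v) accumulator
    out = np.zeros((n, v.shape[1]))
    row_max = np.full(n, -np.inf)   # running max per query row
    row_sum = np.zeros(n)           # running softmax denominator
    for start in range(0, k.shape[0], block):
        kb = k[start:start + block]              # key tile
        vb = v[start:start + block]              # value tile
        s = (q @ kb.T) * scale                   # scores for this tile only
        new_max = np.maximum(row_max, s.max(axis=1))
        correction = np.exp(row_max - new_max)   # rescale old accumulators
        p = np.exp(s - new_max[:, None])
        row_sum = row_sum * correction + p.sum(axis=1)
        out = out * correction[:, None] + p @ vb
        row_max = new_max
    return out / row_sum[:, None]
```

The result matches a straightforward `softmax(q @ k.T / sqrt(d)) @ v` to floating-point precision; the payoff of the tiling is memory locality, not a different answer.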

AGI Position Assessment

Risk Level

Unknown (scale: LOW / MODERATE / HIGH / CRITICAL)
Predicted AGI Timeline

Unknown


Safety Approach

Public safety posture is not yet fully documented; this profile currently reflects role, organization, and research area.

Intercepted Communications

Efficiency in AI is not just about speed; it's about making the most of our resources.

Conference on AI Efficiency, 2022-05-15 (AI Efficiency)

FlashAttention represents a paradigm shift in how we approach attention mechanisms in deep learning.

AI Research Journal, 2021-08-10 (FlashAttention)

State-Space Models open new avenues for understanding complex systems in AI.

AI Symposium, 2022-11-20 (State-Space Models)

The future of AI lies in frameworks that enhance both performance and interpretability.

Tech Talk, 2023-01-30 (AI Frameworks)

Collaboration is key to unlocking the full potential of AI technologies.

Panel Discussion, 2023-03-01 (Collaboration in AI)

Research Output

2020s: 3

Mamba: Linear-Time Sequence Modeling with Selective State Spaces

2023

Tech Innovations Conference

Presented a selective state-space architecture for efficient, linear-time sequence modeling.

w/ Tri Dao, Others

State-Space Models for AI Applications

2022

AI Research Journal

Explored the application of State-Space Models in various AI tasks.

75 citations · w/ Tri Dao, Others

FlashAttention: Fast and Memory-Efficient Exact Attention with IO-Awareness

2021

arXiv

Introduced an IO-aware exact attention algorithm that reduces reads and writes to GPU high-bandwidth memory, significantly cutting attention runtime and memory use.

150 citations · w/ Tri Dao, Others

Known Associates

Organizational Affiliations

Current

Together.AI

Researcher

2021-Present

Former

Stanford University

PhD Student

2017-2021

Stanford University

BS Student

2013-2017

Dossier last updated: 2026-03-04