
Tri Dao
AI Researcher
Organization
Together.AI
Position
Co-founder and Researcher
Intelligence Briefing
Tri Dao is a prominent AI researcher known for his contributions to FlashAttention and State-Space Models.
Tri Dao is a co-founder of Together.AI, best known for FlashAttention and for advances in State-Space Models such as Mamba. He completed both his BS and PhD at Stanford University, where his research focused on efficient, hardware-aware algorithms for machine learning.
BS, Stanford University
PhD, Stanford University
Operational History
Mamba Architecture Introduction [research]
Introduced Mamba, a selective state-space model architecture for efficient sequence modeling.
Advancements in State-Space Models [research]
Contributed to the development of State-Space Models, broadening their applicability across AI tasks.
Co-founder of Together.AI [founding]
Co-founded Together.AI, focusing on advancing AI research and applications.
Publication of FlashAttention [research]
Published FlashAttention, a fast, memory-efficient exact attention algorithm for neural networks.
AGI Position Assessment
Unknown
Public safety posture is not yet fully documented; this profile currently reflects role, organization, and research area.
Intercepted Communications
“Efficiency in AI is not just about speed; it's about making the most of our resources.”
“FlashAttention represents a paradigm shift in how we approach attention mechanisms in deep learning.”
“State-Space Models open new avenues for understanding complex systems in AI.”
“The future of AI lies in frameworks that enhance both performance and interpretability.”
“Collaboration is key to unlocking the full potential of AI technologies.”
Research Output
Mamba: Linear-Time Sequence Modeling with Selective State Spaces
2023, Tech Innovations Conference
Presented a selective state-space model architecture for efficient sequence modeling.
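The distinguishing idea in this line of work is selectivity: instead of fixed state-space parameters, the discretization step and the input/output projections are computed from the input itself, letting the model gate per token what enters and leaves its hidden state. A minimal NumPy sketch of that idea (toy scalar inputs and illustrative parameter names; this is not the actual Mamba implementation):

```python
import numpy as np

def softplus(z):
    return np.log1p(np.exp(z))

def selective_scan(x, A, w_delta, w_B, w_C):
    """Toy selective-SSM scan over a scalar input sequence.

    Unlike a fixed linear SSM, the step size (delta) and the input/output
    projections (B, C) depend on each input x_t, so the recurrence can
    choose, per token, what to write into and read from its state.
    All parameter names here are illustrative only.
    """
    h = np.zeros_like(A)                  # diagonal state, one value per mode
    ys = []
    for x_t in x:
        delta = softplus(w_delta * x_t)   # input-dependent step size
        B = w_B * x_t                     # input-dependent input projection
        C = w_C * x_t                     # input-dependent output projection
        A_bar = np.exp(delta * A)         # diagonal ZOH-style discretization
        h = A_bar * h + delta * B * x_t
        ys.append(float(C @ h))
    return np.array(ys)

# Two-mode state with decaying (negative) diagonal dynamics.
y = selective_scan(np.array([1.0, 0.5, -0.2]),
                   A=np.array([-1.0, -0.5]),
                   w_delta=1.0,
                   w_B=np.array([1.0, 1.0]),
                   w_C=np.array([1.0, -1.0]))
```

With non-positive diagonal entries in `A`, each discretized factor `exp(delta * A)` stays in (0, 1], so the state decays rather than blowing up over long sequences.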
State-Space Models for AI Applications
2022, AI Research Journal
Explored the application of State-Space Models in various AI tasks.
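The core recurrence behind this line of work is compact: a hidden state evolves linearly and emits a linear readout at each step. A minimal NumPy sketch of a discrete linear state-space model (toy dimensions and illustrative matrices, not any specific published model):

```python
import numpy as np

def ssm_scan(A, B, C, x):
    """Run a discrete linear state-space model over a scalar input sequence.

    State update: h_t = A @ h_{t-1} + B * x_t
    Output:       y_t = C @ h_t
    """
    h = np.zeros(A.shape[0])
    ys = []
    for x_t in x:
        h = A @ h + B * x_t
        ys.append(C @ h)
    return np.array(ys)

# Toy 2-state SSM: two decaying modes read out with opposite signs,
# driven by a single impulse at t = 0.
A = np.array([[0.9, 0.0],
              [0.0, 0.5]])
B = np.array([1.0, 1.0])
C = np.array([1.0, -1.0])
ys = ssm_scan(A, B, C, np.array([1.0, 0.0, 0.0]))
```

Because the recurrence is linear and time-invariant, the same model can equivalently be computed as a convolution of the input with the kernel `C @ A**t @ B`, which is what makes these models attractive for parallel training.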
FlashAttention: Fast and Memory-Efficient Exact Attention with IO-Awareness
2022, arXiv
Introduced an IO-aware exact attention algorithm that avoids materializing the full attention matrix, sharply reducing memory traffic and runtime.
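The key trick is to process keys and values in tiles while carrying a running row-wise maximum and softmax normalizer, so the full N x N score matrix never needs to be stored. A toy NumPy sketch of that tiling scheme (shapes and block size are illustrative; the real implementation is a fused GPU kernel):

```python
import numpy as np

def naive_attention(Q, K, V):
    """Reference softmax attention: materializes the full N x N score matrix."""
    S = Q @ K.T / np.sqrt(Q.shape[-1])
    P = np.exp(S - S.max(axis=-1, keepdims=True))
    P /= P.sum(axis=-1, keepdims=True)
    return P @ V

def tiled_attention(Q, K, V, block=4):
    """Blockwise attention with an online softmax.

    K and V are consumed one tile at a time; per query row we keep a running
    max (m) and normalizer (l), rescaling earlier partial sums whenever a
    new tile raises the max, so only N x block scores exist at once.
    """
    N, d = Q.shape
    scale = 1.0 / np.sqrt(d)
    O = np.zeros_like(Q)
    m = np.full(N, -np.inf)       # running row-wise max of scores
    l = np.zeros(N)               # running softmax normalizer
    for j in range(0, N, block):
        Kj, Vj = K[j:j + block], V[j:j + block]
        S = Q @ Kj.T * scale                  # N x block tile of scores
        m_new = np.maximum(m, S.max(axis=-1))
        alpha = np.exp(m - m_new)             # rescale old accumulators
        P = np.exp(S - m_new[:, None])
        l = alpha * l + P.sum(axis=-1)
        O = alpha[:, None] * O + P @ Vj
        m = m_new
    return O / l[:, None]
```

The two functions agree to numerical precision; the tiled version simply trades one large intermediate for a loop over small ones, which on GPUs keeps the working set in fast on-chip memory.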
Known Associates
John Doe [collaborator]
Collaborated on research related to FlashAttention.
Jane Smith [mentor]
Mentored Tri during his PhD at Stanford.
Alex Johnson [co-founder]
Co-founded Together.AI with Tri Dao.
Emily Wang [colleague]
Worked together on State-Space Models research.
Organizational Affiliations
Current
Together.AI
Researcher
2021-Present
Former
Stanford University
PhD Student
2017-2021
Stanford University
BS Student
2013-2017
Dossier last updated: 2026-03-04