Ashish Vaswani

From Wikipedia, the free encyclopedia

Ashish Vaswani (born 1986)[1] is an Indian computer scientist. Vaswani conducted research at Google Brain and, earlier in his career, was affiliated with the Information Sciences Institute at the University of Southern California.

Ashish Vaswani
Born: 1987 (age 38–39), India
Alma mater: University of Southern California; Birla Institute of Technology, Mesra
Known for: Transformer (deep learning architecture)
Fields: Computer science
Institutions: Google Brain; Adept AI; Essential AI
Thesis: Smaller, Faster, and Accurate Models for Statistical Machine Translation (2014)
Doctoral advisors:
  • David Chiang
  • Liang Huang

Vaswani is a co-author of the 2017 paper "Attention Is All You Need", which introduced the Transformer neural network architecture.[2] The Transformer model has since been used in the development of subsequent NLP systems such as BERT, ChatGPT, and their successors.

Career

Vaswani completed his undergraduate engineering degree in computer science at the Birla Institute of Technology, Mesra (BIT Mesra) in 2002. In 2004, he enrolled at the University of Southern California for graduate studies,[3] where he earned his PhD in computer science under the supervision of David Chiang.[4][5] During his research career at Google,[6] Vaswani was part of the Google Brain team, where he conducted the work leading to the "Attention Is All You Need" publication. Before joining Google, he was affiliated with the Information Sciences Institute at the University of Southern California.

After leaving Google, Vaswani co-founded Adept AI, a machine learning startup that developed AI agents and tools for software automation; he has since left the company.[7][8] He is currently co-founder and CEO of Essential AI.

Notable works

Vaswani's most notable paper, "Attention Is All You Need", was published in 2017.[9] The paper introduced the Transformer model, which uses self-attention mechanisms instead of recurrence for sequence-to-sequence tasks.
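The core operation the paper proposes, scaled dot-product self-attention, lets every position in a sequence directly attend to every other position, with no recurrent steps. A minimal single-head NumPy sketch (variable names are illustrative; the paper additionally uses learned query/key/value projections and multiple heads):

```python
import numpy as np

def self_attention(X):
    """Single-head self-attention over a sequence X of shape (seq_len, d).

    Each output position is a softmax-weighted average of all positions,
    so information flows between any two tokens in one step rather than
    through a recurrent chain.
    """
    d = X.shape[-1]
    # In the full Transformer, Q, K, V are learned linear projections of X;
    # here they are X itself to keep the sketch minimal.
    Q, K, V = X, X, X
    scores = Q @ K.T / np.sqrt(d)  # pairwise similarities, scaled by sqrt(d)
    # Numerically stable softmax over the key dimension.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # weighted sum of value vectors

# Example: 4 tokens with 8-dimensional embeddings.
X = np.random.default_rng(0).normal(size=(4, 8))
out = self_attention(X)
print(out.shape)  # (4, 8): one attended vector per input position
```

Because the attention weights are computed for all position pairs at once, the operation parallelizes across the sequence, which is a key practical advantage over recurrence.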

The Transformer architecture has become foundational to modern language models and NLP systems, including BERT (2018),[10] GPT-2 and GPT-3 (2019–2020), and many more recent models. "Attention Is All You Need" is among the most cited papers in machine learning.

References
