thomasahle on July 19, 2024 | on: What happened to BERT and T5?
If you look at the classical [transformer architecture picture](https://en.wikipedia.org/wiki/Transformer_(deep_learning_arc...) there is an "encoder" tower on the left and a "decoder" tower on the right.
- BERT is encoder-only.
- GPT is decoder-only.
- T5 uses both the encoder and the decoder.
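In practice, the main difference between the two towers is the attention mask: the encoder attends bidirectionally, the decoder attends causally, and an encoder-decoder model adds cross-attention from decoder positions to all encoder outputs. A minimal numpy sketch of those three mask shapes (function names are mine, for illustration — not from any library):

```python
import numpy as np

def encoder_mask(n):
    # Encoder (BERT-style): every token attends to every token, bidirectionally.
    return np.ones((n, n), dtype=bool)

def decoder_mask(n):
    # Decoder (GPT-style): causal mask, token i attends only to positions <= i.
    return np.tril(np.ones((n, n), dtype=bool))

def cross_attention_mask(n_dec, n_enc):
    # Encoder-decoder (T5-style) cross-attention: every decoder token
    # attends to every encoder output.
    return np.ones((n_dec, n_enc), dtype=bool)

# A 3-token causal mask is lower-triangular:
print(decoder_mask(3).astype(int))
```

Everything else (layers, heads, feed-forward blocks) is largely shared; the mask is what makes BERT good at understanding whole inputs and GPT good at left-to-right generation.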