The language model is GPT-2 (small).

The underlying library is called "pytorch-pretrained-BERT" because it initially contained only an implementation of BERT. Now that it contains implementations of several models, they backronymed the name to "Big-&-Extending-Repository-of-Transformers". :)


