The underlying library is called "pytorch-pretrained-BERT" because it initially contained only an implementation of BERT. Now that it includes implementations of several models, the name has been backronymed to "Big-&-Extending-Repository-of-Transformers". :)