It is a soft-block text diffusion model. It loads one super-block of fixed size and then only allows the model to unmask tokens by working through the soft-blocks in order. Since the source code is available, I was able to change it into an actual block diffusion, but because the model was trained only on super-blocks, it kept trying to generate EOS tokens at the end of each block before I extended it. I've tried a few workarounds that half worked, but I suspect a small-scale finetune is needed to resolve it fully.
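To illustrate the difference, here is a toy sketch of the soft-block decoding loop described above: one fixed-size super-block of masked positions is allocated up front, and tokens may only be unmasked soft-block by soft-block, left to right. The denoiser here is a random stand-in, and all names, sizes, and the unmasking schedule are illustrative assumptions, not the actual model's code.

```python
import random

MASK = -1  # sentinel for a still-masked position (toy convention)

def toy_denoiser(seq):
    """Stand-in for the diffusion model: proposes a token for every masked slot."""
    return [random.randint(0, 9) if t == MASK else t for t in seq]

def soft_block_decode(super_block_len=12, soft_block_len=4, steps_per_block=2):
    """Soft-block decoding: the whole super-block is loaded at once, but
    unmasking is only permitted inside the current soft-block."""
    seq = [MASK] * super_block_len
    for start in range(0, super_block_len, soft_block_len):
        end = start + soft_block_len
        for _ in range(steps_per_block):
            proposal = toy_denoiser(seq)
            # commit a subset per step; here, half of the remaining masked
            # slots in the current soft-block (a real model would pick the
            # highest-confidence positions instead)
            masked = [i for i in range(start, end) if seq[i] == MASK]
            for i in masked[: max(1, len(masked) // 2)]:
                seq[i] = proposal[i]
        # finish any leftover masked slots before advancing to the next block
        proposal = toy_denoiser(seq)
        for i in range(start, end):
            if seq[i] == MASK:
                seq[i] = proposal[i]
    return seq

print(soft_block_decode())
```

An "actual" block diffusion, by contrast, would append and denoise new blocks past the trained super-block boundary, which is where a model trained only on fixed super-blocks starts emitting EOS tokens.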

