The RLHF data — users giving a thumbs up or down, regenerating a response, or even getting detectably angry in their subsequent messages — is secret.
PageRank isn't a secret.
Google's version of RLHF data — which link a user clicks on, whether they ignore the results without clicking any, whether they issue several related queries before clicking a link, whether they return to the search results soon after clicking a link — is also secret.
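The implicit signals listed above can be turned into crude feedback labels. A minimal sketch, assuming hypothetical event fields and an illustrative 10-second "bounce" threshold — not Google's actual pipeline:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SearchEvent:
    query: str
    clicked_rank: Optional[int]  # None = user clicked no result
    dwell_seconds: float         # time before returning to results (0 if no click)
    reformulated: bool           # did the user issue a related follow-up query?

def implicit_label(ev: SearchEvent) -> int:
    """Map one search interaction to a feedback label:
    +1 satisfied, 0 ambiguous, -1 dissatisfied."""
    if ev.clicked_rank is None:
        return -1  # ignored all results
    if ev.dwell_seconds < 10:
        return -1  # bounced straight back to the results page
    if ev.reformulated:
        return 0   # clicked, but kept searching
    return 1       # clicked and stayed

print(implicit_label(SearchEvent("llm moats", 1, 120.0, False)))  # → 1
```

The point isn't the heuristic itself but that every query generates a labeled example like this, at a scale competitors can't replicate.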
That the Transformer model is a breakthrough doesn't make it a moat; that the Transformer model is public doesn't mean people using it don't have a moat.
Hence my criticism of the use of "moat" as mere parroting.