
I am 100% convinced that "traditional" Google search moved to some kind of embedding-distance, LLM-based algorithm at some point in the last 12 months.

I've noticed a distinct change in the results: what I get back now seems MUCH more related to "semantic similarity" with the query I actually entered.

Great examples of this come up when you search for a highly common topic, but with a highly niche sub-variant or focus area.

"Old" Google would have latched onto the edge case as a critical part of the query and zeroed in on that specificity (which is what you wanted), whereas now it just goes "these results were semantically similar to 95% of your search query, therefore they must be the most relevant", totally ignoring that the 5% it dropped was the critical differentiator.

Another easy way to trigger this behaviour is to search for a contrarian view on something widely discussed by adding "not" or "doesn't" to the query. Google will straight up ignore that input, returning a tonne of results that are 95% semantic matches for the words you used, while missing the ONE tiny point: you were searching for literally the opposite of what it returned.
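The negation failure mode is easy to demonstrate with a toy model. The sketch below uses a simple bag-of-words cosine similarity rather than Google's actual (unknown) embeddings, but it illustrates the same mechanism: adding "not" changes one word out of many, so the two queries still score as near-identical even though they mean the opposite.

```python
from collections import Counter
from math import sqrt

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words term-count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    norm_a = sqrt(sum(v * v for v in a.values()))
    norm_b = sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b)

# Two queries with literally opposite meanings, differing by one token.
query = Counter("why python garbage collection does not use reference counting".split())
doc   = Counter("why python garbage collection does use reference counting".split())

# ~0.94: the single negation word barely moves the similarity score,
# so a pure similarity ranker treats both as near-perfect matches.
print(round(cosine(query, doc), 3))
```

Real embedding models are far more sophisticated than word counts, but they exhibit a milder version of the same effect: negation contributes only a small fraction of the overall vector, so "semantically 95% similar" results can crowd out the exact-opposite match you were looking for.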


