Hacker News

> This notion of machine bad, human good just is not realistic

Glad I found this quote. It is quite helpful to have an AI search the web on my behalf... even if it's just finding where I can buy, locally, the same or similar peanuts I got abroad.



Content providers will not agree with this decision, because machine browsing = no ads. Until that gets resolved, I don't see incentives aligning, since any free search requires ads to sustain the business.


Ads could still work if they could persuade the machines to make the purchase.

In fact, even ads already ingested into the training data could be useful at this very moment. Go to Gemini and tell it you want to buy a jacket or whatever, and it will recommend products it picked up from its training data.


This notion isn't just unrealistic, but extremely dangerous. If we accept the "machine bad, human good" line of thinking, the only logical conclusion is that we'll have to verify our biometrics every time we'd like to access the internet. Like the UK age verification, but 100x worse.


As much as I dislike gatekeeping measures like the UK's age verification, you can't deny that a genuine problem exists in this case. But it isn't "machine bad". There is no good technology or bad technology. It's the intention of those who wield it that is good or bad. In other words, it's good people vs. bad people with technology.

The issue in this particular case is that that content, and the web servers behind it, are set up for human traffic. In the worst case, a human consumes a few megabytes of data from the server and then leaves. A few of those visits convert into a job or business opportunity - a fair bargain. LLM scrapers are not like that. They're greedy resource hogs. Not only do they want everything you have; a whole bunch of them hit your server repeatedly and endlessly. There's no way to justify the cost of such massive bandwidth consumption for a bunch of parasites that never give anything in return. And what do we get? A crappy user experience from all those sites putting up protection measures. This is the tragedy of the commons.

So who is the culprit? The greedy bunch who created the technology that behaves like this, and who benefit immensely from it. Are those bad people? Absolutely! Naturally, we want them and their ill-intentioned creations out of our shared spaces. This isn't anything new. This game has been playing out in different forms since time immemorial.



