
The DuckDuckGo approach of just blacklisting a few of the larger and more egregious content farms seems like a decent band-aid. It's not a real fix, but it cuts out a large number of the bad cases, since at least for me, a handful of large content farms account for most of the times when I've accidentally clicked on a Google result and then realized "oh ugh, it's one of these sites". Spammy blogs are the other big problem, but there's no easy band-aid for that one.
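For what it's worth, the mechanics are simple enough that a meta-search frontend or browser extension could do the same thing. A minimal sketch of a hard-coded blocklist filter; the domain names here are made up, not DuckDuckGo's actual list:

    from urllib.parse import urlparse

    # Hand-curated list of known content-farm domains (placeholders).
    CONTENT_FARM_BLOCKLIST = {
        "examplefarm.com",
        "another-mill.example",
    }

    def domain_of(url):
        """Return the hostname of a URL, without any leading 'www.'."""
        host = urlparse(url).hostname or ""
        return host[4:] if host.startswith("www.") else host

    def is_blocklisted(url):
        """True if the URL's domain, or any parent domain, is on the blocklist."""
        parts = domain_of(url).split(".")
        return any(".".join(parts[i:]) in CONTENT_FARM_BLOCKLIST
                   for i in range(len(parts) - 1))

    def filter_results(urls):
        """Drop blocklisted URLs from a list of search-result URLs."""
        return [url for url in urls if not is_blocklisted(url)]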


> The DuckDuckGo approach of just blacklisting a few of the larger and more egregious content farms seems like a decent band-aid.

I really like that they take a proactive approach to issues. It was one of the main reasons I switched.

> Spammy blogs are the other big problem, but there's no easy band-aid for that one.

A lot of "spammy blogs" are owned and operated by the same people, and the whois data is usually enough to tip you off. I don't see why more spiders don't do a whois query and blacklist the owner/company, at least for a set period of time, with a way to appeal.
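A rough sketch of what that check could look like, just shelling out to the whois command. Registrant field names vary a lot between registries and many records are privacy-protected, so the parsing below is illustrative only:

    import re
    import subprocess
    from collections import defaultdict

    def registrant_of(domain):
        """Pull a registrant email/org out of `whois` output, if one is exposed."""
        try:
            out = subprocess.run(["whois", domain], capture_output=True,
                                 text=True, timeout=10).stdout
        except (subprocess.TimeoutExpired, FileNotFoundError):
            return None
        m = re.search(r"Registrant (?:Email|Organization):\s*(.+)", out, re.I)
        return m.group(1).strip().lower() if m else None

    def group_by_owner(domains):
        """Group flagged domains under a common registrant identifier."""
        owners = defaultdict(list)
        for domain in domains:
            owner = registrant_of(domain)
            if owner:
                owners[owner].append(domain)
        return owners

    # Owners that show up behind many flagged domains would be candidates
    # for a temporary, appealable blacklist.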


> just blacklisting a few of the larger and more egregious content farms seems like a decent band-aid.

The real band-aid would be to let users blacklist sites they don't want to see. I often search for things and run into the same unwanted spam sites over and over.

After that, a distributed trust model should be built so that users can share their blacklists of spam sites.
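Something like this could work for the sharing part: each user publishes a personal blacklist, you weight contributors by how much you trust them, and a site is hidden once the combined weight crosses a threshold. The users, weights, and threshold below are made up for illustration:

    from collections import defaultdict

    def merge_blacklists(shared, trust, threshold=1.0):
        """Combine per-user blacklists, weighted by trust in each contributor."""
        score = defaultdict(float)
        for user, domains in shared.items():
            for domain in domains:
                score[domain] += trust.get(user, 0.0)
        return {domain for domain, s in score.items() if s >= threshold}

    # Example: two moderately trusted users agreeing is enough to block a site.
    shared = {
        "alice": {"spamfarm.example", "seo-mill.example"},
        "bob":   {"spamfarm.example"},
    }
    trust = {"alice": 0.6, "bob": 0.6}
    print(merge_blacklists(shared, trust))  # {'spamfarm.example'}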


Click Settings in Google News and choose the sites you want to see less content from.

Blacklists are awesome...but no one but "us" knows how to use them.


It is not Google News but sites that aggregate search terms. A good example is www.eudict.com, which repeatedly puts up other people's search terms as real words.

(If you search for certain words, you are flooded with EUdict's search spam.)



