
If you don't have an "officially recognized" UA string, some websites will assume you're a scraper/crawler/bot, and rightfully so.


If I were ever trying to scrape a website that blocked my user agent, I would just set the user agent to Chrome on Windows. Every library I have used supports setting whatever UA you want.
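
For example, a minimal sketch using Python's requests library; the Chrome version string and the URL are just placeholders, not specific values from the comment:

```python
import requests

# Illustrative Chrome-on-Windows User-Agent string (version number is an example).
CHROME_UA = (
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 "
    "(KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36"
)

# Override the default "python-requests/x.y.z" UA so the request looks like a
# normal desktop browser to servers that filter on the User-Agent header alone.
response = requests.get(
    "https://example.com",
    headers={"User-Agent": CHROME_UA},
    timeout=10,
)
print(response.status_code)
```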


That's just a heuristic of the times. Soon bots will be much smarter than that.


They already are.



