Rate limit according to what? It was 35k residential IPs. Rate limiting would end up keeping real users out.


Rate limit according to destination URL (the expensive ones), not source IP.

If you have expensive URLs that you can't serve more than, say, 3 of at a time or 100 of per minute, NOT rate limiting them will end up keeping real users out anyway, simply for lack of resources.
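A minimal sketch of what that could look like (Python, numbers purely illustrative): a token bucket keyed on the request path rather than the client IP, so spreading the load across 35k residential addresses doesn't buy the crawler anything. A hard cap on concurrent in-flight requests ("3 at a time") would be a separate counter, but the idea is the same.

    import time
    from collections import defaultdict

    RATE = 100 / 60.0   # refill ~100 tokens per minute per expensive path
    BURST = 3           # small burst allowance per path

    # One bucket per request path, created lazily on first access.
    _buckets = defaultdict(lambda: {"tokens": float(BURST), "last": time.monotonic()})

    def allow(path: str) -> bool:
        """True if a request for `path` may proceed, False if it should get a 429."""
        b = _buckets[path]
        now = time.monotonic()
        # Refill tokens for the time elapsed, capped at the burst size.
        b["tokens"] = min(BURST, b["tokens"] + (now - b["last"]) * RATE)
        b["last"] = now
        if b["tokens"] >= 1:
            b["tokens"] -= 1
            return True
        return False

    # In a request handler, applied to the expensive endpoints only:
    #   if not allow(request.path):
    #       return Response(status=429)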


Right - but if you have, say, 1000 real user requests for those endpoints daily, and thirty million bot requests for them, the practical upshot of this approach is that none of the real users get to access those endpoints.


Yeah, at that point you might as well just turn off the servers. It's even cheaper at cutting off requests, and it'll serve just as many legitimate users.


No, it's not equal. These URLs might not be critical for users — they can still browse other parts of the site. If rate limiting is implemented for, let’s say, 3% of URLs, then 97% of the website will still be usable during a DoS attack.


Right, but in terms of users' ability to access those 3%, you might as well disable those endpoints entirely instead of rate limiting - much easier to implement, and essentially the same effect on the availability of the endpoints to users.


This feels like something /you can do on your servers/, while other folks with resource constraints (like time, budget, or the hardware they have) find Anubis valuable.


Sure, I didn't mean to imply Anubis wasn't an alternative. I was just clearing up that there are options beyond source-IP rate limiting, which several people seemed to think was the only option, given the comments about rate limiting not working because the traffic came from 35K IP addresses.



