Hacker News

This is also irrelevant to the original comment, which complains about bot checks when viewing the root of the repository. That is probably the most-requested resource, and it should be served entirely from cache at a cost far lower than running the bot checks.

It's simply bad, inefficient software and we shouldn't keep making excuses for it.
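For what it's worth, the "should be served from cache" part is mechanical: a shared cache (CDN or reverse proxy) will only store the root page if the response's Cache-Control header permits it. A rough sketch of that decision in Python (simplified; real caches also consider Vary, Set-Cookie, heuristic freshness, and the subtlety that `no-cache` allows storage but forces revalidation):

```python
def is_publicly_cacheable(cache_control: str) -> bool:
    """Rough heuristic: may a shared cache store a response with this
    Cache-Control header? Simplified for illustration."""
    directives = {
        part.strip().split("=", 1)[0].lower()
        for part in cache_control.split(",")
    }
    # Treat no-cache as uncacheable here for simplicity.
    if {"no-store", "private", "no-cache"} & directives:
        return False
    return bool({"public", "max-age", "s-maxage"} & directives)

# The Rails-style default header many dynamic backends send:
print(is_publicly_cacheable("max-age=0, private, must-revalidate"))  # False
# What a cache-friendly repo root could send instead:
print(is_publicly_cacheable("public, max-age=60"))  # True
```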




Agree. I did some basic searching, and it looks like GitLab is particularly bad: it ships with built-in rate limiting, but the backend marks all pages as uncacheable on top of their being somewhat dynamically generated (I guess it caches "page fragments").
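Rate limiting on its own is cheap compared to rendering a dynamic page; the usual shape is a token bucket per client. A minimal sketch (illustrative only, not GitLab's actual implementation; the rate and capacity numbers are made up):

```python
import time

class TokenBucket:
    """Minimal token-bucket rate limiter (illustrative sketch)."""

    def __init__(self, rate: float, capacity: float):
        self.rate = rate          # tokens refilled per second
        self.capacity = capacity  # maximum burst size
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self) -> bool:
        # Refill based on elapsed time, then try to spend one token.
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

bucket = TokenBucket(rate=1, capacity=2)
print(bucket.allow(), bucket.allow(), bucket.allow())  # True True False
```

The third call fails because the burst of two tokens is spent and refill at one token per second hasn't caught up yet.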

The only issues I found amounted to "here's how to use Anubis to block everything".

There are also some new but poorly supported standards around agents sending `Accept: text/markdown`, and https://github.com/cloudflare/web-bot-auth.
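On the `Accept: text/markdown` idea: the client side is just ordinary HTTP content negotiation. A sketch with Python's stdlib (the URL is a placeholder; the request is built but never sent, and whether the server honors the header depends entirely on the server):

```python
import urllib.request

# Prefer markdown, fall back to HTML. URL is a placeholder, not a real endpoint.
req = urllib.request.Request(
    "https://git.example.com/some/repo",
    headers={"Accept": "text/markdown, text/html;q=0.8"},
)
# A markdown-aware server would branch on this header.
print(req.get_header("Accept"))  # text/markdown, text/html;q=0.8
```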



