POLL, please comment if you have an opinion:
Codeberg.org is being spammed by users signing up with one-time/disposable email services and TOR connections. These accounts spam projects with thousands of bogus issue comments, cause pain for project owners, and flood their notification email inboxes. Codeberg's SMTP reputation is also harmed.
We are considering disabling access via TOR and one-time email providers to maintain smooth operation for all users.
What do you think? Is there a better approach?
Please have your say.
@codeberg technically, an API could require a proof-of-work. I'm not aware of existing libraries or implementations.
The client (including the web version) would then need to 'mine' some hashes before submitting a request (i.e. hashcash), acting as a captcha for both the API and the web. It costs a normal user a tiny amount of electricity and delay, but costs bots large amounts of resources.
And if those hashes then bring in some micropayments, it's a win-win.
@codeberg to clarify: each request needs such a PoW in a header or as part of the payload.
But this sounds like a big project on its own. Maybe others have built this already? Could be in the form of a HTTP proxy even.
@berkes It would be really interesting to build hashcash using a more modern PoW like Cuckoo Cycle and actually implement it in Gitea as DoS prevention.
For APIs it's a bit tricky to require your users to implement it themselves. If Gitea has client libs, it could be done.
@codeberg how about plain rate limiting? Like one request per 5 seconds.
Where Z > X, Z > Y. And Y > X.
For Rails, I always use https://github.com/jeremy/rack-ratelimit. There might be something in Go that can be integrated into Gitea, or an agnostic proxy that is just as flexible and tunable.
Though a proxy has no knowledge of concepts like 'user' or 'customer'.
@codeberg also note that rate limiting for GET requests can typically be an order of magnitude more lenient than for PUT/PATCH/POST/DELETE.
E.g. per IP we allow 100,000 read (GET) requests per hour, but only 100 writes (POST etc.) per hour.
@stevenroose I'd say you could implement it as a generic HTTP proxy, making it a language- and application-agnostic API protection. It would probably even be possible as SaaS.
Clients would need to implement it, though. But it could be configured progressively, to remain backwards compatible. As in: '5 requests per minute allowed without a PoW header, unlimited with one'.
Now, if only someone with more time liked this concept as much as I do...
@berkes I know the feeling. The biggest issue with something like hashcash is working out a spec, because it's a very standards-sensitive thing. You don't want every website to adopt an entirely different PoW, so that you have to program a new miner every time you want to use a new service's API.
@berkes "Costing a normal user tiny amounts of electricity and delay, but bots large amounts of resources." A swarm of bots is based on hijacked machines; they don't care about electricity because they don't pay for it. @codeberg I suggest asking project owners to moderate issues before they are made public, and flagging projects that allow too many issues for manual inspection.