• @MonkderVierte@lemmy.ml
    2 • 15 days ago

    Btw, how about limiting clicks per second/minute, as a defense against distributed scraping? A user who clicks more than 3 links per second is not a person, and neither is one who clicks 50 in a minute. And if they get blocked and switch to the next IP, the bandwidth they can occupy is still limited.
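
The limit described above could be a sliding-window counter keyed per client. A minimal sketch (names and the `allow` helper are hypothetical; the 3-per-second and 50-per-minute thresholds are the ones from the comment):

```python
import time
from collections import defaultdict, deque

# (window in seconds, max requests in that window) -- thresholds from the comment
LIMITS = [(1.0, 3), (60.0, 50)]

history = defaultdict(deque)  # client key (e.g. IP) -> recent request timestamps

def allow(client, now=None):
    """Return True if this request is under both rate limits, else False."""
    now = time.monotonic() if now is None else now
    q = history[client]
    # Drop timestamps older than the largest window we track (60 s here).
    oldest_needed = max(w for w, _ in LIMITS)
    while q and now - q[0] > oldest_needed:
        q.popleft()
    for window, max_req in LIMITS:
        if sum(1 for t in q if now - t <= window) >= max_req:
            return False  # over one of the limits: block or throttle
    q.append(now)
    return True
```

A real deployment would evict idle clients and share the counters across workers, but the windowing logic is the same.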

    • @letsgo@lemm.ee
      1 • 15 days ago

      I click links frequently and I’m not a web crawler. Example: get search results, open several likely-looking possibilities (it only takes a few seconds), then look through each one for a reasonable understanding of the subject that isn’t limited to one person’s bias and/or mistakes. It’s not just search results; I do this on Lemmy too, and when I’m shopping.

      • @MonkderVierte@lemmy.ml
        9 • 5 days ago

        Ok, same, make it 5 or 10. Since I use Tree Style Tabs and Auto Tab Discard, I do get a temporary block in some webshops if I load (not just open) too many tabs in too short a time. Probably a CDN thing.

    • Jade
      7 • 5 days ago

      They make one request per IP. Rate limit per IP does nothing.

      • @MonkderVierte@lemmy.ml
        3 • edited • 5 days ago

        Ah, so one request, then the next IP makes one, and so on, rotating? I mean, they don’t have unlimited addresses. Is there no way to group them together into an observable group and set quotas? I mean for the purpose of defending against AI DDoS, not just for hurting them.
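
Grouping addresses like this is usually done by aggregating counters per network prefix rather than per address. A hypothetical sketch using Python’s `ipaddress` module (the `/24` and `/48` prefix choices and the `record` helper are assumptions, not anyone’s actual setup):

```python
import ipaddress
from collections import Counter

IPV4_PREFIX = 24  # assumption: treat each IPv4 /24 as one observable group
IPV6_PREFIX = 48  # assumption: /48 is a common per-site IPv6 allocation

counts = Counter()  # group key -> total requests from that whole prefix

def group_key(ip_str):
    """Map an address to its containing prefix, so rotating IPs share a key."""
    ip = ipaddress.ip_address(ip_str)
    prefix = IPV6_PREFIX if ip.version == 6 else IPV4_PREFIX
    return str(ipaddress.ip_network(f"{ip_str}/{prefix}", strict=False))

def record(ip_str, quota=100):
    """Count a request against the group's shared quota; False once exceeded."""
    key = group_key(ip_str)
    counts[key] += 1
    return counts[key] <= quota
```

With this keying, a scraper rotating through one provider’s block burns a single shared quota instead of getting a fresh limit per address.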

        • @edinbruh@feddit.it
          6 • 5 days ago

          There’s always Anubis 🤷

          Anyway, what if they’re backed by some big Chinese corporation with a /32 of IPv6 and a /16 of IPv4? It’s not that unreasonable.