• @daniskarma@lemmy.dbzer0.com
    20
    3 days ago

    Open source repositories should rely on p2p. Torrenting repos is the way, I think.

    Not only for this. At any point m$ could take down your repo if they or their investors don’t like it.

    I wonder if something like that already exists, and whether it could work with git?

    • @thenextguy@lemmy.world
      16
      3 days ago

      Git is p2p and distributed from day 1. GitHub is just a convenient website. If Microsoft takes down your repo, just upload it to another system. Nothing but convenience will be lost.
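
      A minimal sketch of what that migration looks like (the remote name and URL are placeholders, not a real project), driving plain git from Python:

      ```python
      # Mirror an existing local clone to a new host.
      import subprocess

      NEW_REMOTE = "https://codeberg.org/you/project.git"  # hypothetical destination

      # Register the new remote, then push every branch, tag, and ref to it.
      subprocess.run(["git", "remote", "add", "mirror", NEW_REMOTE], check=True)
      subprocess.run(["git", "push", "--mirror", "mirror"], check=True)
      ```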

    • @samc@feddit.uk
      9
      3 days ago

      The project’s official repo should probably exist in a single location so that there is an authoritative version. At that point p2p is only necessary if traffic for the source code is getting too expensive for the project.

      Personally, I think the SourceHut model is closest to the ideal setup for OSS projects, though I use Codeberg for my personal stuff because I’m cheap and lazy.

      • I’m wary of external dependencies. They are cool now, but will they be cool in the future? Will they even exist?

        One thing I think p2p excels at is resilience. People are still using eDonkey even though it’s abandoned.

        A repo signature should deal with “fake copies”. It’s true we have the problem that the BitTorrent protocol wasn’t designed for updating files, so a different protocol would be needed; I don’t even know how possible/practical that is. It’s true that any big project should probably host its own remote repo and copy it to other platforms as needed. GitHub-only repos were always a dangerous practice.
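
        In case it’s useful, a minimal sketch of the kind of signature check I mean, assuming the maintainer signs release tags (the tag name is made up):

        ```python
        # Verify the maintainer's signature on a release tag, so a copy pulled
        # from any mirror can be checked against the maintainer's key.
        import subprocess

        result = subprocess.run(
            ["git", "verify-tag", "v1.0.0"],  # checks the signature attached to the tag
            capture_output=True, text=True,
        )
        print("signature OK" if result.returncode == 0 else result.stderr.strip())
        ```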

        • @Revan343@lemmy.ca
          3
          3 days ago

          It’s true we have the problem that the BitTorrent protocol wasn’t designed for updating files

          BitTorrent v2 has updatable torrents.

        • @samc@feddit.uk
          2
          3 days ago

          If you’re able to easily migrate issues etc. to a new instance, then you don’t need to worry about a particular service provider getting shitty. At which point your main concern is temporary outages.

          Perhaps this is more of a concern for some projects (e.g. anything that angers Nintendo’s lawyers). But for most, I imagine that the added complexity of distributed p2p hosting would outweigh the upsides.

          Not saying it’s a bad idea (in fact I like it a lot), but I can see why it’s not a high priority for most OSS devs.

    • @Kuinox@lemmy.world
      3
      3 days ago

      Torrenting doesn’t deal well with updating files.
      And you have another problem: how do you handle bad actors spamming the download?
      That’s probably why GitHub does it that way.

      • @daniskarma@lemmy.dbzer0.com
        3
        edit-2
        3 days ago

        That’s true. I didn’t think of that.

        IPFS supposedly works fine with updating shares. But I don’t want to get closer to that project, as it has fallen into cryptoscam territory.
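
        From what I’ve read, updates there work by re-pointing a mutable IPNS name at each new snapshot’s content ID. A rough sketch of that flow (paths are placeholders, and it assumes the ipfs CLI is installed):

        ```python
        # Publish a new snapshot and re-point the node's IPNS name at it.
        import subprocess

        def publish_snapshot(path: str) -> None:
            # Add the snapshot recursively and capture its content ID (CID).
            cid = subprocess.run(
                ["ipfs", "add", "-Q", "-r", path],
                capture_output=True, text=True, check=True,
            ).stdout.strip()
            # Re-publish the IPNS name so it now resolves to the fresh CID.
            subprocess.run(["ipfs", "name", "publish", f"/ipfs/{cid}"], check=True)

        publish_snapshot("./repo-export")  # hypothetical export of the repo
        ```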

        I’m currently reading about “Radicle”; let’s see what they propose.

        I don’t get the part about bad actors spamming the download. Like downloading too much? Torrent leechers?

        EDIT: Just finished my search about Radicle. Of course, they have ties to a cryptoscam. Obviously… ;_; Why does this keep happening?