• onlinepersona@programming.dev · +21/−3 · 6 hours ago

    I see the “just create an account” and “just login” crowd have joined the discussion. Some people will defend a monopolist no matter what. If GitHub introduced ID checks à la Google or required a Microsoft account to log in, they’d just shrug and go “create a Microsoft account then, stop bitching”. They don’t realise they’re being boiled, and don’t care. Consoomer behaviour.

    Anti Commercial-AI license

  • daniskarma@lemmy.dbzer0.com · +7/−2 · 5 hours ago

    Open source repositories should rely on p2p. Torrenting repos is the way, I think.

    Not only because of this. At any point m$ could take down your repo if they or their investors don’t like it.

    I wonder if something like that already exists, and whether it could work with git?

    • thenextguy@lemmy.world · +3 · 2 hours ago

      Git is p2p and distributed from day 1. GitHub is just a convenient website. If Microsoft takes down your repo, just upload it to another host. Nothing but convenience will be lost.
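      Concretely, moving a repo is a single push (a minimal sketch; the local paths below are placeholders standing in for a real new host’s URL, e.g. on Codeberg or a self-hosted Forgejo):

```shell
#!/bin/sh
# Sketch: every clone is a full copy, so "uploading to another system"
# is one mirror push. Local paths stand in for real remote URLs.
set -e
rm -rf /tmp/git-move-demo && mkdir -p /tmp/git-move-demo && cd /tmp/git-move-demo
git init -q -b main original && cd original
git -c user.email=demo@example.com -c user.name=demo \
    commit -q --allow-empty -m "initial"
git init -q --bare ../new-host.git        # stands in for the new remote
git remote add backup ../new-host.git
git push -q --mirror backup               # branches, tags, everything
```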

    • samc@feddit.uk · +5 · 5 hours ago

      The project’s official repo should probably exist in a single location so that there is an authoritative version. At that point p2p is only necessary if traffic for the source code is getting too expensive for the project.

      Personally I think the SourceHut model is closest to the ideal setup for OSS projects. Though I use Codeberg for my personal stuff because I’m cheap and lazy.

      • daniskarma@lemmy.dbzer0.com · +2 · 5 hours ago

        I’m wary of external dependencies. They are cool now, but will they be cool in the future? Will they even exist?

        One thing p2p excels at is resilience. People are still using eDonkey even though it’s abandoned.

        A repo signature should deal with “fake copies”. It’s true that the BitTorrent protocol wasn’t designed for updating files, so a different protocol would be needed; I don’t even know how possible/practical that is. Any big project should probably host its own remote repo and copy it to other platforms as needed. GitHub-only repos were always a dangerous practice.

        • samc@feddit.uk · +2 · 3 hours ago

          If you’re able to easily migrate issues etc. to a new instance, then you don’t need to worry about a particular service provider getting shitty. At that point your main concern is temporary outages.

          Perhaps this is more of a concern for some projects (e.g. anything that angers Nintendo’s lawyers). But for most, I imagine that the added complexity of distributed p2p hosting would outweigh the upsides.

          Not saying it’s a bad idea, in fact I like it a lot, but I can see why it’s not a high priority for most OSS devs.

      • daniskarma@lemmy.dbzer0.com · +1/−1 · 4 hours ago

        I’ve been reading about it. But at some point I found that the parent organization runs a crypto scam. Supposedly it’s not embedded into the protocol, but they also said the token is used to give rewards within the protocol. That just made me wary of them.

        The protocol itself did seem interesting, though. It’s MIT licensed, I think, so I suppose it could just be forked into something crypto-free.

        • onlinepersona@programming.dev · +1 · edited · 40 minutes ago

          There’s nothing crypto in the Radicle protocol. What I think you’re referring to is “Drips”, which uses crypto to fund open source development (I know, how terrible). It’s its own protocol built on top of Ethereum and is not built into the Radicle protocol.

          This comes up every time someone mentions Radicle, and I think it happens because there’s a RAD crypto token and a Radicle protocol. Beyond the similar names, it’s like mistaking bees for wasps because they look similar and not bothering to take a closer look.

          Drips is funding the development of gitoxide, BTW, which is a Rust reimplementation of git. I wouldn’t start suspecting gitoxide of sneaking in a crypto protocol just because it’s funded by crypto. If we attacked everything funded by things we consider evil, everything open source made by GAFAM would have to go: modern video streaming (HLS, by Apple), Android (bought by Google), LSPs (popularised and developed by Microsoft), OBS (sponsored by Google through YouTube and by Amazon through Twitch), and much, much more.

          Anti Commercial-AI license

          • daniskarma@lemmy.dbzer0.com · +1 · edited · 34 minutes ago

            The thing is that the purpose of such a system is to run away from enshittification.

            Being so crypto-adjacent is like an enshittification speedrun.

            If I’m going to stay on a platform that just cares about money, I might as well stay on the corpo platforms. I’m not going to the trouble of changing platforms and learning new systems just to keep being used so others can get rich.

            Git itself doesn’t have crypto around it. This shouldn’t either.

            And this is not even against crypto as a concept, which is fine by me. It’s against using crypto as a scam to make a quick buck off people who don’t know better.

    • Kuinox@lemmy.world · +1 · 5 hours ago

      Torrenting doesn’t deal well with updating files.
      And you have another problem: how do you handle bad actors spamming the download?
      That’s probably why GitHub does this.

      • daniskarma@lemmy.dbzer0.com · +2/−1 · edited · 5 hours ago

        That’s true. I didn’t think of that.

        IPFS supposedly works fine with updating shares. But I don’t want to get close to that project, as it has fallen into crypto-scam territory.

        I’m currently reading about “radicle”; let’s see what they propose.

        I don’t get the bad actors spamming the download. Like downloading too much? Torrent leechers?

        EDIT: Just finished my search about radicle. Of course, they have ties to a crypto scam. Obviously… ;_; Why does this keep happening?

  • John Richard@lemmy.world · +25/−4 · 14 hours ago

    Crazy how many people think this is okay, yet left Reddit because of their API shenanigans. GitHub is already halfway to requiring sign-in to view anything, like Twitter (X).

  • tal@lemmy.today · +45/−1 · 16 hours ago

    60 req/hour for unauthenticated users

    That’s low enough that it may cause problems for a lot of infrastructure. Like, I’m pretty sure that the MELPA Emacs package repository builds out of git, and a lot of that is on GitHub.

    • Xanza@lemm.ee · +26 · edited · 14 hours ago

      That’s low enough that it may cause problems for a lot of infrastructure.

      Likely the point. If you need more, get an API key.
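      For plain git traffic, one way to put such a token to work everywhere is git’s URL rewriting, so every clone and fetch authenticates (a sketch; GH_TOKEN is a placeholder for your own token, and whether git-over-HTTPS falls under the same quota as API calls is an assumption worth checking):

```shell
#!/bin/sh
# Sketch: rewrite anonymous GitHub URLs so every clone/fetch sends a token.
# GH_TOKEN is a placeholder; GIT_CONFIG_GLOBAL keeps the demo out of ~/.gitconfig.
set -e
rm -f /tmp/demo-gitconfig
export GIT_CONFIG_GLOBAL=/tmp/demo-gitconfig
GH_TOKEN="<your-token>"
git config --global \
    url."https://${GH_TOKEN}@github.com/".insteadOf "https://github.com/"
# From now on, "git clone https://github.com/owner/repo" authenticates.
git config --global --get "url.https://${GH_TOKEN}@github.com/.insteadOf"
```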

    • NotSteve_@lemmy.ca · +13/−3 · 15 hours ago

      Do you think any infrastructure is pulling that often while unauthenticated? It seems like an easy fix either way (in my admittedly non devops opinion)

      • Ephera@lemmy.ml · +10 · 11 hours ago

        It’s gonna be problematic in particular for organisations with larger offices. If you’ve got hundreds of devs/sysadmins behind the same public IP address, those 60 requests/hour are shared between them.

        Basically, I expect unauthenticated pulls to no longer be possible at my day job, which means repos hosted on GitHub become a pain.

        • timbuck2themoon@sh.itjust.works · +1 · 3 hours ago

          Quite frankly, companies shouldn’t be pulling willy-nilly from GitHub or npm anyway. It’s trivial to set up something to cache repos or artifacts, and it guards against your builds breaking when GitHub is down.
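          A minimal version of such a cache is just a mirror clone refreshed on a timer (sketch with local placeholder paths; in practice “upstream” would be the GitHub URL and the mirror would be served internally):

```shell
#!/bin/sh
# Sketch of a pull-through cache: mirror upstream once, refresh it on a
# schedule, and have developers/CI clone from the mirror, not GitHub.
set -e
rm -rf /tmp/cache-demo && mkdir -p /tmp/cache-demo && cd /tmp/cache-demo
git init -q -b main upstream && cd upstream      # stands in for github.com/...
git -c user.email=demo@example.com -c user.name=demo \
    commit -q --allow-empty -m "v1"
cd ..
git clone -q --mirror upstream mirror.git        # one upstream fetch...
git -C mirror.git remote update >/dev/null       # ...rerun this from cron
git clone -q mirror.git dev-checkout             # everyone clones the cache
```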

        • NotSteve_@lemmy.ca · +1 · 3 hours ago

          Ah yeah that’s right, I didn’t consider large offices. I can definitely see how that’d be a problem

      • Boomer Humor Doomergod@lemmy.world · +5 · 14 hours ago

        If I’m using Ansible or something to pull images it might get that high.

        Of course the fix is to pull it once and copy the files over, but I could see this breaking prod for folks who didn’t write it that way in the first place
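        The “pull once, copy over” fix might look like this sketch (local paths and the web1..web3 names are made-up placeholders for a real repo URL and target hosts):

```shell
#!/bin/sh
# Sketch: one clone, then fan files out locally so only a single fetch
# ever hits the remote, no matter how many machines get the files.
set -e
rm -rf /tmp/fanout-demo && mkdir -p /tmp/fanout-demo && cd /tmp/fanout-demo
git init -q -b main upstream && cd upstream      # placeholder for the remote
echo "hello" > app.txt
git add app.txt
git -c user.email=demo@example.com -c user.name=demo commit -q -m "release"
cd ..
git clone -q upstream once                       # the single pull
for host in web1 web2 web3; do                   # stand-ins for real targets
  mkdir -p "$host"
  git -C once archive HEAD | tar -x -C "$host"   # copy files, not .git
done
```

In practice the copy step would be rsync or an Ansible `copy`/`synchronize` task rather than a local `tar` pipe.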

    • Xanza@lemm.ee · +22/−7 · 14 hours ago

      Until there is.

      I think people are grossly underestimating the sheer size and significance of the issue at hand. Forgejo will very likely reach the same point GitHub is at right now, and will have to employ some of the same safeguards.

      • FlexibleToast@lemmy.world · +25/−5 · 14 hours ago

        Except Forgejo is open source and you can run your own instance of it. I do, and it’s great.

        • Xanza@lemm.ee · +9/−5 · 11 hours ago

          That’s a very accurate statement which has absolutely nothing to do with what I said. The fact of the matter is that those who seek a GitHub alternative generally do so because they dislike Microsoft or closed-source platforms. Which is great, but the platforms with hosted instances see an overwhelming portion of users who visit because they choose not to selfhost. It’s a lifecycle.

          1. Create cool software for free
          2. Cool software gets popular
          3. Release new features and improve free software
          4. Lots of users use your cool software
          5. Running software becomes expensive, monetize
          6. Software becomes even more popular, single stream monetization no longer possible
          7. Monetize more
          8. Get more popular
          9. Monetize more

          By step 30 you’re selling everyone’s data and pushing resource restrictions because it’s expensive to run a popular service that’s generally free. That doesn’t change simply because people can selfhost if they want.

          • FlexibleToast@lemmy.world · +2 · 5 hours ago

            To me, this reads strongly like someone who is confidently incorrect. Your starting premise is incorrect. You are claiming Forgejo will do this. Forgejo is nothing but an open source project designed to self host. If you were making this claim about Codeberg, the project’s hosted version, then your starting premise would be correct. Obviously, they monetize Codeberg because they’re providing a service. That monetization feeds Forgejo development. They could also sell official support for people hosting their own instances of Forgejo. This is a very common thing that open source companies do…

  • theunknownmuncher@lemmy.world · +29/−4 · edited · 15 hours ago

    LOL!!! RIP GitHub

    EDIT: trying to compile any projects from source that use git submodules will be interesting. E.g. ROCm has more than 60 submodules to pull in 💀
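    A rough way to see that cost up front is to count the entries in a project’s .gitmodules, since a recursive clone fetches each one separately (sketch; the libfoo/libbar entries below are made-up stand-ins for ROCm’s 60+):

```shell
#!/bin/sh
# Sketch: each submodule in .gitmodules means at least one extra fetch on a
# recursive clone. The sample file built here is illustrative, not real.
set -e
rm -f /tmp/demo-gitmodules
git config -f /tmp/demo-gitmodules submodule.libfoo.path vendor/libfoo
git config -f /tmp/demo-gitmodules submodule.libfoo.url https://github.com/example/libfoo
git config -f /tmp/demo-gitmodules submodule.libbar.path vendor/libbar
git config -f /tmp/demo-gitmodules submodule.libbar.url https://github.com/example/libbar
# Count declared submodules = rough count of extra fetches per clone:
git config -f /tmp/demo-gitmodules --get-regexp '^submodule\..*\.path$' | wc -l
```

Run against a real checkout, replace /tmp/demo-gitmodules with the repo’s own .gitmodules.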

    • The Go module system pulls dependencies from their sources. This should be interesting.

      Even if you host your project on a different provider, many libraries are on GitHub. Think of all the unauthenticated Arch users trying to install Go-based software that pulls dependencies from GitHub.

      How does the Rust module system work? How does pip?

      • UnityDevice@lemmy.zip · +2 · edited · 6 hours ago

        Compiling any larger Go application would hit this limit almost immediately. For example, podman is written in Go and has around 70 direct dependencies, or about 200 including transitive ones. Not all of the dependencies are hosted on GitHub, but the vast majority are. With a limit of 60 requests per hour, it would take you 3 hours to build podman on a new machine.
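        One hedged caveat: since Go 1.13 the toolchain downloads published modules through the Go module proxy by default rather than cloning GitHub directly, so the limit mainly bites when the proxy is bypassed (GOPROXY=direct, private repos, or replace directives pointing straight at a VCS):

```shell
#!/bin/sh
# Sketch: the default GOPROXY setting routes module downloads through
# proxy.golang.org, falling back to direct VCS access only when needed.
export GOPROXY="https://proxy.golang.org,direct"
echo "$GOPROXY"
```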

      • Ephera@lemmy.ml · +6 · 11 hours ago

        For Rust, as I understand, crates.io hosts a copy of the source code. It is possible to specify a Git repository directly as a dependency, but apparently, you cannot do that if you publish to crates.io.

        So, it will cause pain for some devs, but the ecosystem at large shouldn’t implode.

      • adarza@lemmy.ca · +12 · 14 hours ago

        already not looking forward to the next updates on a few systems.