• @lepinkainen@lemmy.world
    33 days ago

    They don’t use ChatGPT. You can use local models, and they cost next to nothing at Meta’s or Google’s scale. Both run their own servers for it.

    • stebo
      22 days ago

      Of course they don’t use ChatGPT, and whatever they use isn’t comparable to ChatGPT, because that would be unsustainable.

      • @lepinkainen@lemmy.world
        12 days ago

        “Isn’t comparable”? For generic tasks, that’s true.

        Figuring out shittily censored words from pictures and subtitles? The custom models are even better.
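
        For the subtitle case, here’s a minimal sketch of the idea: if the censored word is replaced with a mask token, a small local masked-language model can rank likely fills. The model name and the example line are placeholders for illustration, not anything Meta or Google actually runs.

        ```python
        # Minimal sketch: guess a bleeped/censored word in a subtitle line with a
        # small local masked-language model. Model choice and example line are
        # assumptions for illustration only.
        from transformers import pipeline

        fill = pipeline("fill-mask", model="distilbert-base-uncased")

        # The censored word is replaced with the model's mask token.
        subtitle = "I can't believe you said that, you [MASK]!"

        # Print the top candidate words with their scores.
        for guess in fill(subtitle, top_k=5):
            print(f"{guess['token_str']}\t{guess['score']:.3f}")
        ```

        A vision model would be needed for the picture case, but the principle is the same: a small task-specific local model, run on your own servers, handles this kind of fill-in-the-blank far more cheaply than a general-purpose chatbot.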