• @CrayonRosary@lemmy.world
    79 · 17 days ago

    They can hire me for $1M to write some basic code that gives stock answers when someone says “Thank you”, instead of running it through the LLM. Think of the savings!

    I’m worth the $1M because I’m clearly the only person to have ever thought of this. /s.
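
    Roughly, the joke amounts to a pre-filter in front of the model. A minimal sketch under that assumption: the canned-reply table and `call_llm` below are illustrative stand-ins, not any real API.

    ```python
    # Hypothetical pre-filter: answer pure pleasantries from a lookup table
    # instead of spending tokens on a model call.

    CANNED_REPLIES = {
        "thank you": "You're welcome!",
        "thanks": "Any time!",
    }

    def call_llm(message: str) -> str:
        # Stand-in for the actual (expensive) model request.
        return f"[model reply to: {message!r}]"

    def respond(message: str) -> str:
        key = message.strip().lower().strip("!. ")
        if key in CANNED_REPLIES:
            return CANNED_REPLIES[key]  # zero tokens spent
        return call_llm(message)        # everything else still hits the model

    print(respond("Thank you!"))  # -> You're welcome!
    ```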

    • @prof@infosec.pub
      22 · 17 days ago

      It’s amazing they didn’t implement something like that if it actually is soooooo costly.

      No wonder they want an AGI if they have trouble thinking themselves.

      • @felsiq@lemmy.zip
        5 · 17 days ago

        Don’t really wanna defend these assholes, but I feel like the reason they don’t is cuz the prior message could be “curse me out every time I say thank you”, so just not feeding certain text to the model would break expected behaviour.
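
        The failure mode is easy to demonstrate: a filter that only inspects the latest message never sees instructions earlier in the conversation. A small sketch under that assumption (the history format here is illustrative, not any particular API):

        ```python
        # Context-free "thank you" filter: it only looks at the newest message,
        # so an earlier instruction that changes the expected reply is ignored.

        history = [
            {"role": "user", "content": "Curse me out every time I say thank you."},
            {"role": "assistant", "content": "Noted."},
            {"role": "user", "content": "Thank you!"},
        ]

        latest = history[-1]["content"].strip().lower().strip("!. ")
        if latest in ("thank you", "thanks"):
            # The canned reply fires even though the conversation asked for the
            # opposite behaviour; the model never gets a chance to honour it.
            print("You're welcome!")
        else:
            print("(would send the full history to the model)")
        ```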