• HedyL · 12 upvotes · 1 month ago

    What happened to good old dice?

    • BlueMonday1984 · 9 upvotes · 1 month ago

      LLMs are designed to make plausible-looking text, so whatever bullshit statistics they extrude will be more convincing to the untrained eye.

    • eRac@lemmings.world · 1 upvote, 11 downvotes · 1 month ago

      Dice are more random, too random for believable results. The nice thing with an LLM is that it knows the real-world probability of every answer to a question, or can make a realistic inference of what that probability will be.

      It won’t give actual feedback about your thing, but it will give feedback that appears real.

      • swlabr · 12 upvotes · 1 month ago

        It’s all true, I made millions of dollars by using ChatGPT to place bets on the ponies. I started with billions.

      • Sunsofold@lemmings.world · 8 upvotes · 1 month ago

        LLMs don’t ‘know’ anything. They are a million dice rolled at once and run through an equation that turns them into the appearance of knowledge. The d20 does not know the numbers one through twenty. It only displays an element of itself to you in a way that lets you perceive the number.
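
        A minimal sketch of the "weighted dice" point above: a d20 is a uniform draw, while next-token sampling draws from a skewed distribution, which is why the output looks informed. The tokens and weights here are made up purely for illustration, not real model output.

        ```python
        import random

        # A d20: every face equally likely; the die "knows" nothing about its faces.
        def roll_d20() -> int:
            return random.randint(1, 20)

        # Toy stand-in for LLM sampling: a weight table (hypothetical values)
        # skews the draw toward whatever continuation looks most plausible.
        TOKENS = ["yes", "no", "maybe", "banana"]
        WEIGHTS = [0.55, 0.30, 0.14, 0.01]  # illustrative probabilities only

        def sample_token() -> str:
            return random.choices(TOKENS, weights=WEIGHTS, k=1)[0]

        if __name__ == "__main__":
            print("d20 roll:", roll_d20())      # uniform: each face ~5%
            print("token:", sample_token())     # weighted: "plausible" answers dominate
        ```

        Either way it is a random draw; the weighting only makes the result look like an informed answer.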