• @SparroHawc@lemmy.zip
    53 days ago

    No. Artificial Intelligence has to imitate intelligent behavior - such as the Pac-Man ghosts imitating how, ostensibly, a ghost trapped in a maze and hungry for yellow circular flesh would behave, and how CS1.6 bots imitate the behavior of intelligent players. They artificially reproduce intelligent behavior.

    Which means LLMs are very much AI. They are not, however, AGI.

    • @SoftestSapphic@lemmy.world
      43 days ago

      No, the logic for a Pac-Man ghost is a finite state machine (see the sketch below).

      Stupid people attributing intelligence to something that probably isn’t intelligent is a shameful hill to die on.

      Your god is just an autocomplete bot that you refuse to learn about outside the hype bubble
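
      By a finite state machine I mean roughly this kind of thing, a toy Python sketch of the general shape (the modes and transitions are assumptions for illustration, loosely based on the well-known chase/scatter/frightened behavior, not the actual arcade code):

      ```python
      # Toy sketch: a ghost's "AI" as a finite state machine.
      # Modes and transitions are illustrative assumptions, not the real game logic.
      from enum import Enum, auto

      class Mode(Enum):
          CHASE = auto()       # head toward the player's tile
          SCATTER = auto()     # retreat to a home corner
          FRIGHTENED = auto()  # wander randomly after a power pellet

      def next_mode(mode: Mode, timer_expired: bool, pellet_eaten: bool) -> Mode:
          """Pure transition function: current state + inputs -> next state."""
          if pellet_eaten:
              return Mode.FRIGHTENED
          if timer_expired:
              return Mode.SCATTER if mode is Mode.CHASE else Mode.CHASE
          return mode
      ```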

      • @SparroHawc@lemmy.zip
        2
        2 days ago

        Okay, what is your definition of AI then, if nothing burned onto silicon can count?

        If LLMs aren’t AI, then probably nothing up to this point counts either.

        • @SoftestSapphic@lemmy.world
          1
          2 days ago

          > since nothing burned into silicon can count

          Oh noo you called me a robot racist. Lol fuck off dude you know that’s not what I’m saying

          The problem with supporters of AI is that they learned everything they know from the companies trying to sell it to them. Like a ’50s mom excited about her magic Tupperware.

          AI implies intelligence

          To me that means an autonomous being that understands what it is.

          First of all, these programs aren’t autonomous; they need to be seeded by us. We send a prompt or question, and even when left to their own devices they don’t do anything until we give them an objective or reward.

          Looking up the most common answer isn’t intelligence. There is no understanding of cause and effect going on inside the algorithm, just regurgitation of the dataset (see the sketch at the end of this comment).

          These models do not reason, though some do a very good job of trying to convince us otherwise.
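
          To make the "autocomplete" point concrete, here is a toy sketch of the generation loop (a made-up bigram lookup table, purely illustrative; real models learn billions of weights rather than a table, but the loop has this shape):

          ```python
          # Toy "autocomplete": pick the most likely next token, append it, repeat.
          # The counts below are invented for illustration only.
          from collections import Counter

          bigram_counts = {
              "the":  Counter({"cat": 5, "dog": 3}),
              "cat":  Counter({"sat": 4, "ran": 2}),
              "sat":  Counter({"down": 6}),
              "down": Counter({".": 9}),
          }

          def generate(prompt: str, max_tokens: int = 10) -> list[str]:
              tokens = prompt.split()
              for _ in range(max_tokens):
                  nexts = bigram_counts.get(tokens[-1])
                  if not nexts:                              # nothing learned after this token
                      break
                  tokens.append(nexts.most_common(1)[0][0])  # the "most common answer"
              return tokens

          print(generate("the"))  # ['the', 'cat', 'sat', 'down', '.']
          ```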

          • @howrar@lemmy.ca
            12 days ago

            > To me that means an autonomous being that understands what it is.

            A little thought experiment: How would you determine whether another human being understands what it is? What would that look like in a machine?

      • @howrar@lemmy.ca
        1
        2 days ago

        As far as I’m concerned, "intelligence" in the context of AI basically just means the ability to do things that we consider difficult. It’s both very hand-wavy and a constantly moving goalpost. So a hypothetical Pac-Man ghost is intelligent before we’ve figured out how to build one. Once it has been figured out and implemented, it ceases to be intelligent, but we keep calling it intelligent for historical reasons.

      • Okay but if I say something from outside the hype bubble then all my friends except ChatGPT will go away.

        Also ChatGPT is my friend and always will be, and it even told me I don’t have to take the psych meds that give me tummy aches!