The moderators of a pro-artificial intelligence Reddit community announced that they have been quietly banning “a bunch of schizoposters” who believe “they’ve made some sort of incredible discovery or created a god or become a god,” highlighting a new type of chatbot-fueled delusion that started getting attention in early May.

  • @Skunk@jlai.lu
    54 points · 1 month ago

    Yeah, there was an article shared on Lemmy a few months ago about couples and families destroyed by AI.

    Like, the husband thinks he's discovered some new truth, almost religious-level stuff about how the world works. Then he becomes an annoying guru and ruins his social life.

    Kind of like QAnon people, but with ChatGPT…

    • Vanth
      21 points · 1 month ago

      This feels a bit like PTA-driven panic about kids eating Tide Pods when like one person did it. Or razor blades in Halloween candy. Or kids making toilet hooch with their juice boxes. Or the choking game sweeping playgrounds.

      But also, a man on the internet with no sense of mental health … sounds almost plausible.

      • @Pogogunner@sopuli.xyz
        19 points · 1 month ago

        I work directly with one of these people - they admit to spending all of their free time talking to LLM chatbots.

        On our work forums, I see it's not uncommon at all. If it makes you feel any better, loving AI is highly correlated with being someone you shouldn't listen to in the first place.

      • chaosCruiser
        12 points · 1 month ago

        The Internet is a pretty big place. There's no such thing as an idea that is too stupid. There are always at least a few people who will turn that idea into a central tenet of their life. It could be too stupid for 99.999% of the population, but with billions of people online, that still leaves tens of thousands who are totally into it.

    • Raltoid
      13 points · 1 month ago

      And that's not even getting started on "AI girlfriends", which are isolating vulnerable people to a terrifying degree. And since they're garbage at context, you get things like that case last year where one appeared to be encouraging a suicidal teen.