Title says it all

  • @chatokun@lemmy.dbzer0.com
    10
    6 days ago

    My sister caught her 8-year-old son talking to AI chat bots on software like this and blocked it. She went through the history and said the bots were often trying to flirt with him, but he didn’t seem interested, and seemed more to just be looking for someone to talk to.

    This may be an attempt to hook young kids too, though I’m definitely not saying the pedo vibes aren’t intentional. I just think they’re going for more than one audience group.

  • @jsomae@lemmy.ml
    47
    8 days ago

    there’s plausible denia… nah i got nothin. That’s messed up. Even for the most mundane, non-gross use case imaginable, why the fuck would anybody need a creepy digital facsimile of a child?

    • @ckmnstr@lemmy.worldOP
      27
      8 days ago

      I mean, maaaybe if you wanted children and couldn’t have them. But why would it need to be “beautiful and up for anything”?

      • @jsomae@lemmy.ml
        8
        8 days ago

        “beautiful and up for anything” is incredibly suggestive phrasing. It’s an exercise in mental creativity to make it sound not creepy. But I can imagine a pleasant grandma (always the peak of moral virtue in any thought experiment) saying this about her granddaughter. I don’t mean to say I have heard this, only that I can imagine it. Barely.

  • @Ceedoestrees@lemmy.world
    41
    8 days ago

    Yep. I dick around on a similar platform because a friend built it.

    The amount of shit I’ve reported is insane. Pedos just keep coming back with new accounts. Even with warnings and banned words, they find a way.

  • @TheFriar@lemm.eeM
    33
    8 days ago

    If I were you I’d send this to some media outlets. Tank some AI stock and create some more negative news around it.

  • And the bot has 882.9k chats.

    I’m not surprised, and I don’t think you or anyone else is either. But that doesn’t make this any less disturbing.

    I’m sure the app devs are not interested in cutting off a huge chunk of their loyal users by doing the right thing and getting rid of those types of bots.

    Yes, it’s messed up. In my experience, it is difficult to report chat bots and see any real action taken as a result.

    • Shin
      English
      11
      8 days ago

      Ehhh, nah. As someone who used character.ai before, there are plenty of horrible bots that do get taken down, and the bots have been impossible to have sex with unless you get really creative. The most horrendous ones got removed pretty regularly and were consistently reposted. I’m not here to shield a big company or anything, but the “no sex” thing was a huge deal in the community, and users constantly fought with the devs about it.

      They’re probably trying to hide behind the veil of more normal bots now, but I struggle to imagine how they’d get it to do sexual acts, when some lightly violent RPs I tried to do got censored. It’s pretty difficult, and got worse over time. Idk though, I stopped using it a while ago.

    • snooggums
      English
      35
      8 days ago

      Just a friendly, childlike free spirit ready to talk about girl stuff!

      /s For real though, it’s totally the evil thing.

  • @ZDL@lazysoci.al
    17
    8 days ago

    Yes it’s what you think it is. I don’t think, however, that there is anywhere to report it that will care enough to do something about it.

  • @Lyra_Lycan@lemmy.blahaj.zone
    English
    16
    8 days ago

    I’ve gotten a couple of ads for an AI chat app on Android. I can’t remember the name, but it has an onscreen disclaimer that reads something like “All characters shown are in their grown-up form”, implying that there are teen or child forms you can communicate with.

    • @ckmnstr@lemmy.worldOP
      7
      8 days ago

      I saw something similar! Reported it to Google Ads, and of course they “couldn’t find any ToS violations”.

      • 𝕸𝖔𝖘𝖘
        6
        edit-2
        8 days ago

        As long as they get paid, there’s no TOS violation. Bloody wankers

        Edit: changed a word to make it less vile

    • dil
      6
      8 days ago

      Nah, that more likely implies the child form was rejected by censors, so it’s an adult version now.

  • you_are_dust
    15
    8 days ago

    I’ve messed around with some of these apps out of curiosity of where the technology is. There’s typically a report function in the app. You can probably report that particular bot from within the app to try and get that bot deleted. Reporting the app itself probably won’t do much.

    • They definitely knew who they were targeting when they made this. I only hope that, if those predators simply must text with a child, they keep talking to an AI bot rather than a real child.

    • why is that unfortunate, though? who would you be protecting by making that chatbot illegal? would you “protect” the chatbot? would you “protect” the good-think of the users? or do you think it’s about preventing the “normalization” of these thoughts?

      in case of the latter: we had the very same discussion about shooter video games, and the evidence shows that shooter games do not make people more violent or more likely to kill with guns and other weapons.

      • @zalgotext@sh.itjust.works
        2
        8 days ago

        I don’t think it’s the same discussion; video games and AI chatbots are two very different things that you engage with in very different ways.