Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

  • @wewbull@feddit.uk · 72 points · 7 days ago

    Honestly I think we need to understand that this is no different to sticking a photo of someone’s head on a porn magazine photo. It’s not real. It’s just less janky.

    I would categorise it as sexual harassment, not abuse. Still serious, but a different level.

    • @LadyAutumn@lemmy.blahaj.zone · 37 points · 7 days ago (edited)

      Yes, finding out that your peers have been sharing deepfake pornography of you is absolutely fine and a normal thing for young girls to go through in school. No girls have ever killed themselves because of this exact sort of thing, surely. This definitely will not add in any way to how women and girls are made to feel entirely, disgustingly dehumanized by every man or boy in their lives: groups of men and boys reducing them and their bodies to vivid sexual fantasies that they can quickly turn into photorealistic images.

      If the person in the image is underage then it should be classified as child pornography. If the woman whose photo is being used hasn't consented to this then it should be classified as sexual exploitation.

      Women and girls have faced degrees of this kind of sexual exploitation by men and boys since the latter half of the 20th century. But this is a severe escalation in that behavior. It should be illegal to do this and it should be prosecuted when and where it is found to occur.

      • @FishFace@lemmy.world · 18 points · 7 days ago

        It’s bullying with a sexual element. The fact that it uses AI or deepfakes is secondary, just as it was secondary when it was Photoshop, just as it was secondary when it was cutting out photos. It’s always about using it to bully someone.

        This is different because it’s easier. It’s not really different because it (can be) more realistic, because it was never about being realistic, otherwise blatantly unrealistic images wouldn’t have been used to do it. Indeed, the fact that it can be realistic will help blunt the impact of the leaking of real nudes.

        • @LadyAutumn@lemmy.blahaj.zone · 14 points · 7 days ago

          It’s sexually objectifying the bodies of girls and turning them into shared sexual fantasies their male peers are engaging in. It is ABSOLUTELY different because it is more realistic. We are talking about entire deepfake pornography production and distribution groups IN THEIR OWN SCHOOLS. Teenage boys cutting pictures out and photoshopping them was nowhere near as common as this is fast becoming, and it was NOT the same as seeing a naked body algorithmically derived to appear as realistic as possible.

          Can you stop trying to find a silver lining in the sexual exploitation of teenage girls? You clearly don’t understand the kinds of long-term psychological harm that are caused by being exploited in this way. It was also exploitative and also fucked up when it was done in Photoshop, but this is many orders of magnitude more sophisticated and accessible.

          You’re also wrong that this is about bullying. It’s an introduction to girls being tools for male sexual gratification. It’s LITERALLY commodifying teenage girls as sexual experiences and then sharing them in groups together. It’s criminal. The consent of the individual has been entirely erased. Dehumanization in its most direct form. It should be against the law and it should be prosecuted very seriously wherever it is found to occur.

          • @FishFace@lemmy.world · 8 points · 7 days ago

            If a boy fantasises sexually about a girl, is that harmful to her? If he tells his friends about it? No, this is not harmful - these actions do not affect her in any way. What affects the girl is how the boys might then treat her differently than they would someone they don’t find sexually attractive.

            The solution, in both cases, has to be to address the harmful behaviour. The only arguments for criminalising deepfakes themselves are also arguments for criminalising sexual fantasies. That is why people are talking about thought crime: once you criminalise things that are harmless on their own, but which might down the line lead to directly harmful behaviour, there is no other distinction.

            The consent of the individual has been entirely erased. Dehumanization in its most direct form.

            Both of these, for example, apply just as readily to discussing a shared sexual fantasy about someone who didn’t agree to it.

            No distinction, that is, other than this is new and icky. I don’t want government policy to be dictated by fear of the new and by what people find icky, though. I do lots of stuff people find icky.

            • @LadyAutumn@lemmy.blahaj.zone · 15 points · 7 days ago (edited)

              No, an image that is shared and distributed is not the same as a fantasy in someone’s head. That is deranged. Should CSAM also be legal because making it illegal is like criminalizing the fantasies of pedophiles? Absolutely insane logical framework you have there.

              This isn’t fantasy. It is content. It is media. It is material. It is produced without the consent of the girls and women being sexualized, and it commodifies their existence, literally transforming the idea of them into sexual media consumed for the gratification of boys and men.

              It is genuinely incredible to me that you could be so unempathetic, so impassive, so detached from the real world and the consequences of this, that you could even make this comparison. You seemingly have no idea what you’re talking about if you believe that pornography is the same thing as mental fantasies.

              And even in the case of mental fantasies, are those all good? Is it really a good thing that boys see the mere existence of the girls around them as inherently some kind of sexual availability?

              • @FishFace@lemmy.world · 9 points · 7 days ago

                When someone makes child porn they put a child in a sexual situation - something we have amassed a pile of evidence showing is extremely harmful to the child.

                For all you have said - “without the consent” - “being sexualised” - “commodifies their existence” - you haven’t told us what the harm is. If you think those things are in and of themselves harmful then I need to know more about what you mean because:

                1. if someone thinks of me sexually without my consent I am not harmed
                2. if someone sexualises me in their mind I am not harmed
                3. I don’t know what the “commodification of one’s existence” can actually mean - I can’t buy or sell “the existence of women” (does buying something’s existence mean the same as buying the thing, or something else?) the same way I can aluminium - and I don’t see how being able to (easily) make (realistic) nude images of someone changes this in any way

                It is genuinely incredible to me that you could be so unempathetic,

                I am not unempathetic, but I attribute what makes me feel bad about this situation to the fact that girls are being made to feel bad and ashamed, not to the fact that a particular technology is now being used in one step of that.

                • @LadyAutumn@lemmy.blahaj.zone · 8 points · 6 days ago (edited)

                  I am just genuinely speechless that you seemingly do not understand how sickening and invasive it is for your peers to create and share sexual content of you without your consent. Yes, it’s extremely harmful. It’s not a matter of feeling ashamed; it’s a matter of literally feeling like your value to the world is dictated by your role in the sexualities of heterosexual boys and men. It is feeling like your own body doesn’t belong to you, but can be freely claimed by others. It is losing trust in all your male friends and peers, because it feels like, without you knowing, they’ve already decided that you’re a sexual experience for them.

                  We do know the harm of this kind of sexualization. Women and girls have been talking about it for generations. This isn’t new; it’s just a new, streamlined way to spread it. It should be illegal. It should be against the law to turn someone’s images into AI-generated pornography. It should also be illegal to share those images with others.

                  • @FishFace@lemmy.world · 4 points · 6 days ago

                    It’s not a matter of feeling ashamed; it’s a matter of literally feeling like your value to the world is dictated by your role in the sexualities of heterosexual boys and men. It is feeling like your own body doesn’t belong to you, but can be freely claimed by others. It is losing trust in all your male friends and peers, because it feels like, without you knowing, they’ve already decided that you’re a sexual experience for them.

                    Why is it these things? Why does someone doing something with something which is not your body make it feel like your body doesn’t belong to you? Why does it not instead make it feel like images of your body don’t belong to you? Several of these things could equally be used to describe the situation when someone is fantasised about without their knowledge - why is that different? In Germany there’s a legal concept called “right to one’s own image” but there isn’t in many other countries, and besides, what you’re describing goes beyond this.

                    My thinking behind these questions is that I cannot see anything inherent, anything necessary, about the creation of fake sexual images of someone that leads to these harms. Instead, there is an aspect of our society which very explicitly punishes and shames people - women far more so than men - for being in this situation, and without that, we would be having a very different conversation.

                    Starting from the position that the harm is in the creation of the images is like starting from the position that the harm of rape is in “defiling” the person raped. Rape isn’t wrong because it makes you worthless to society - society is wrong for devaluing rape victims. Society is wrong for devaluing and shaming those who have fake images made of them.

                    We do know the harm of this kind of sexualization. Women and girls have been talking about it for generations. This isn’t new; it’s just a new, streamlined way to spread it. It should be illegal.

                    Can you be more explicit about what it’s the same as?

                • @atomicorange@lemmy.world · 6 points · 6 days ago

                  Are you OK with sexually explicit photos of children taken without their knowledge? They’re not being actively put in a sexual situation if you’re snapping photos with a hidden camera in a locker room, for example. You OK with that?

                  The harm is:

                  • Those photos now exist in the world and can lead to direct harm to the victim by their exposure
                  • It normalizes pedophilia and creates a culture of trading images, leading to more abuse to meet demand for more images
                  • The people sharing those photos learn to treat people like objects for their sexual gratification, ignoring their consent and agency. They are more likely to mistreat people they have learned to objectify.
                  • Your body should not be used for the profit or gratification of others without your consent. In my mind this includes taking or using your picture without your consent.

                  • @FishFace@lemmy.world · 2 points · 6 days ago

                    Are you OK with sexually explicit photos of children taken without their knowledge? They’re not being actively put in a sexual situation if you’re snapping photos with a hidden camera in a locker room, for example. You OK with that?

                    No, but the harm certainly is not the same as CSAM and it should not be treated the same.

                    • It normalizes pedophilia and creates a culture of trading images, leading to more abuse to meet demand for more images
                    • The people sharing those photos learn to treat people like objects for their sexual gratification, ignoring their consent and agency. They are more likely to mistreat people they have learned to objectify.

                    As far as I know there is no good evidence that this is the case, and it is a big controversy in the topic of fake child porn: whether it leads to more child abuse (by encouraging paedophiles), less (by giving them a safe outlet), or no change.

                    Your body should not be used for the profit or gratification of others without your consent. In my mind this includes taking or using your picture without your consent.

                    If someone fantasises about me without my consent I do not give a shit, and I don’t think there’s any justification for it. I would give a shit if it affected me somehow (this is your first bullet point, but for a different situation, to be clear) but that’s different.

      • @General_Effort@lemmy.world · 13 points · 7 days ago

        Historically, the respectability of a woman depended on her sexuality. In many conservative cultures and communities, that is still true. Spreading the message that deepfakes are some particularly horrible form of harassment reinforces that view.

        If having your head put on the body of a nude model is a terrible crime, then what does that say about the nude model? What does it say about women who simply happen to develop a larger bosom or lips? What does it say about sex before marriage?

        The implicit message here is simply harmful to girls and women.

        That doesn’t mean that we should tolerate harassment. But it needs to be understood that we can do no more to stop this kind of harassment than we can do to stop any other kind.

        • @LadyAutumn@lemmy.blahaj.zone · 16 points · 7 days ago

          This is just apologia for the sexual commodification and exploitation of girls and women. There is literally no girl being sexually liberated here; she has literally had the choice taken from her. Sexual liberation does NOT mean “boys and men can turn all women into personal masturbation aids”. This ENFORCES patriarchy and the subjugation of women. It literally teaches girls that their bodies do not belong to them, that it’s totally understandable for boys to strip them of humanity itself and turn them into sex dolls.

          • @General_Effort@lemmy.world · 8 points · 7 days ago

            The most deepfaked women are certainly actresses or musicians: attractive people who appear on screens and are known by much of the population.

            In some countries, they do not allow people to appear on-screen exactly because of that. Or at least, that’s one justification. If the honor or humanity of a woman depends on sexual feelings that she might or might not arouse in men, then women cannot be free. And men probably can’t be free either.

            At no point have I claimed that anyone is being liberated here. I do not know what will happen. I’m just pointing out how your message is harmful.

        • Lka1988 · 3 points · 6 days ago

          Spoken like someone who hasn’t been around women.

      • @atomicorange@lemmy.world · 10 points · 7 days ago

        Thank you. Focusing on the harm to the victims is the right way to understand this issue. Too many people in here are hunting for a semantic loophole.

    • Hemingways_Shotgun · 9 points · 7 days ago

      I’m not even going to begin describing all the ways that what you just said is fucked up.

      I’ll just point out that online deepfake technology is FAR more accessible for the average 13-year-old to use on their peers than “porno mags” were in our day.

      You want to compare taking your 13-year-old classmate’s photo off Facebook, running it through an AI, and creating photorealistic adult content featuring them in five seconds, to getting your dad’s skin mag from under his mattress when he’s not home, cutting your classmate’s face out of a yearbook, taping it on, sneaking THAT into the computer lab at school so you can photocopy it and pass it around in homeroom, and then putting the skin mag BACK under the mattress before your dad finds out.

      Is that right… is THAT what you’re trying to say? Are those the two things that you’re trying to say are equivalent?

      • @SheeEttin@lemmy.zip · 14 points · 7 days ago

        Yes, we all know it’s fucked up. The point is that we don’t need a new class of laws just because it’s harassment and bullying ✨with AI✨.

    • lurch (he/him) · 9 points · 7 days ago (edited)

      I hope it might lead to a situation where dirty pics/vids are no longer a problem for the people in them, since they could be deepfakes. There have been cases where a surfacing dirty pic was used for blackmail, ruined someone’s career, or got them kicked out of some committee; since it could be a fabrication now, I hope this will be a thing of the past, soon.

      • @wewbull@feddit.uk · 12 points · 7 days ago

        That could be a socially healthy place to end up at. I don’t see it anytime soon though. Just look at the other response I got.

        • Hemingways_Shotgun · 8 points · 7 days ago

          Sure. That might end up being a socially healthy place for adults.

          But it will never work that way for young teens. Their brains aren’t done baking yet. They don’t have the emotional maturity to understand that enough to be “okay with it because it’s just a fake”.

          That’s why we protect kids rather than just telling them “hey it’s okay…it’s only a fake.”

        • BombOmOm · 4 points · 7 days ago

          Anyone with half a brain will certainly claim as much. Even if people don’t fully believe it, it will blunt the most serious of social consequences.

    • SharkAttak · 3 points · 7 days ago

      Furthermore, we generally assume malicious intent, but I wouldn’t be surprised if teenagers were using the app to ‘get’ big boobs etc.; we’ve all seen those shopped pictures with deformed backgrounds 😁