A mother and her 14-year-old daughter are advocating for better protections for victims after AI-generated nude images of the teen and other female classmates were circulated at a high school in New Jersey.

Meanwhile, on the other side of the country, officials are investigating an incident involving a teenage boy who allegedly used artificial intelligence to create and distribute similar images of other students – also teen girls – who attend a high school in suburban Seattle, Washington.

The disturbing cases have put a spotlight yet again on explicit AI-generated material that overwhelmingly harms women and children and is booming online at an unprecedented rate. According to an analysis by independent researcher Genevieve Oh that was shared with The Associated Press, more than 143,000 new deepfake videos were posted online this year, which surpasses every other year combined.

  • Marxism-Fennekinism@lemmy.ml

    Maybe I’m just naive about how many protections we’re actually granted, but shouldn’t this already fall under CP/CSAM legislation in nearly every country?

          • Basil@lemmings.world

            What? But they literally do exist, and they’re hurting from it. Did you even read the post?

          • Nyanix@lemmy.ca

            While you’re correct, many of these generators retain the source image and only generate the masked sections, so the person in the image is still themselves, with effectively photoshopped nudity – which would still qualify as child pornography. That is an interesting point you make, though.

          • drislands@lemmy.world

            The article is about real children being used as the basis for AI-generated porn. This isn’t about entirely fabricated images.

          • DogMuffins@discuss.tchncs.de

            Of course they exist. If the AI generated image “depicts” a person, a victim in this case, that person “by definition” exists.

            Your argument evaporates when you consider that all digital images are interpreted and encoded by complex mathematical algorithms. All digital images are “fake” by that definition and therefore the people depicted do not exist. Try explaining that to your 9-year-old daughter.

          • DogMuffins@discuss.tchncs.de

            IDK why this dumb thought experiment makes me so grumpy every time someone invokes it, but you’re going to have to explain how it’s relevant here.

            • Lemming6969@lemmy.world

              How many pieces do you have to change before it’s not closely enough related? If every piece is modified, is it the same base image? If it’s not the same image, when does it cease to represent the original and have to be reassessed? If it’s no longer the image of a real person, given the extreme variety in both real and imagined people, how can an AI image ever be illegal? If you morph between an image of a horse and an illegal image, at what exact point does it become illegal? What about between a person and an illegal image? What about between an AI-generated borderline image and an illegal image? At some point a legal image changes into an illegal image, and that point is nearly impossible to define. The same goes for the transition between a real and an imagined person, or the likeness between two similar-looking people – real but different, or imagined.

              • DogMuffins@discuss.tchncs.de

                that point is nearly impossible to define

                As with any law, there will undoubtedly be cases in which it is difficult to discern whether or not a law has been broken, but courts decide on innocence or guilt in such cases every day. A jury would be asked to decide whether a reasonable third party is likely to conclude, on the balance of probabilities, that the image depicts a person who is under 18.

                Whether or not the depicted person is real or imagined is not relevant in many / most jurisdictions.

        • rchive@lemm.ee

          If you make a picture today of someone based on how they looked 10 years ago, we say it’s depicting that person as the age they were 10 years ago. How is what age they are today relevant?

          • GeneralVincent@lemmy.world

            I’m unsure of the point you’re trying to make?

            It’s relevant in this case because the age they are today is underage. A picture of them 10 years ago is underage. And a picture of anyone made by AI to deepfake them nude is unethical regardless of age. But it’s especially concerning when the goal is to depict underage girls as nude. The age thing specifically could get a little complicated in certain situations, I guess, but the intent is obvious most of the time.

            • rchive@lemm.ee

              I’m obviously not advocating or defending any particular behavior.

              Legally speaking, why is what age they are today relevant rather than the age they are depicted as in the picture? Like, imagine we have a picture 20 years from now of someone at age 37. It’s legally fine until it’s revealed it was generated in 2023 when the person in question was 17? If the exact same picture was generated a year later it’s fine again?

              • DogMuffins@discuss.tchncs.de

                Basically, yes.

                Is the person under-age at the time the image was generated? and … Is the image sexual in nature?

                If yes, then generating or possessing such an image ought to be a crime.

      • Fal@yiffit.net

        Won’t somebody think of the make believe computer generated cartoon children?!

        • legios@aussie.zone

          Australia too. Hentai showing underage people is illegal here. From my understanding it’s all a little grey depending on the state and whether the laws are enforced, but if it’s about victimisation the law will be pretty clear.

          • Fal@yiffit.net

            Absolutely absurd. Criminalizing drawings is the stupidest thing in the world.

            This case should already be illegal under harassment or similar laws. There’s no reason to make drawings illegal

            • Metz@lemmy.world

              In Germany even a written story about it is illegal. It is considered “textual CSAM” then.

            • Wilibus@lemmy.world

              Nah dude, I am perfectly cool with animated depictions of child sexual exploitation being in the same category as regular child exploitation, regardless of the fact that she’s actually a 10,000-year-old midget elf or whatever paper-thin explanation they provide not to be considered paedos.

              • Fal@yiffit.net

                Well that’s just absurd and you should rethink your position using logic rather than emotion.

                • zbyte64@lemmy.blahaj.zone

                  “that’s just absurd”

                  Well that’s an emotional response that includes no specifics or appeals to logic.

                  “rethink your position using logic rather than emotion”

                  Lol.

  • Aceticon@lemmy.world

    There might be an upside to all this, though maybe not for these girls: with enough of this, people will eventually just stop believing that any “leaked” nude pictures are real. That would be a great thing for people who had real nude pictures leaked – which, once on the Internet, are pretty hard to stop spreading – because other people will just presume they’re deepfakes.

    Mind you, it would be a lot better if people in general culturally evolved beyond being preachy monkeys who pass judgment on others for being photographed in their birthday suit. But that’s clearly asking too much, so I guess people simply assuming all such things are deepfakes until proven otherwise is at least better than the status quo.

  • gandalf_der_12te@feddit.de

    Honest opinion:

    We should normalize nudity.

    That’s the only healthy relationship that we can have with our bodies in the long term.

    • SuddenDownpour@sh.itjust.works

      There’s a pretty big fucking difference between normalizing nudity and people putting the faces of 14-year-olds into porn videos through deepfakes.

      • Lemming6969@lemmy.world

        Good luck policing it, or having a society with a healthy relationship with both our biology and AI technology, without some sort of societal perspective change.

    • Basil@lemmings.world

      This isn’t even the problem going on, though? Sure, normalize nudity, whatever, that doesn’t fix deep faked porn of literal children.

    • GiddyGap@lemm.ee

      Having spent many years in both the US and multiple European countries, I can confidently say that the US has the weirdest, most unnatural, and most unhealthy relationship with nudity.

    • ParsnipWitch@feddit.de

      For this to happen, people would probably need to stop judging others by their bodies; I am pretty sure there is a connection there. Given how extremely superficial media and many relationships are, and how we value women in particular, this would require a lot of change in people and society.

      I also think it would be a good thing, but we still have to do something about it until we reach that point.

  • TheEighthDoctor@lemmy.world

    What’s the fundamental difference between a deep fake and a good Photoshop and why do we need more laws to regulate that?

    • UlrikHD@programming.dev

      Lower skill ceiling. One option can be done by pretty much anyone at high volume; the other requires a lot of training and isn’t available to your average basement dweller.

      Good luck trying to regulate it though, Pandora’s box is opened and you won’t be able to stop the FOSS community from working on the tech.

        • UlrikHD@programming.dev

          Photoshop (if it does?) and any other believable face swap apps use some sort of neural networks, which is exactly the problematic tech we are talking about.

  • Treczoks@lemm.ee

    The problem is how to actually prevent this. What could one do? Make AI systems illegal? Make graphics tools illegal? Make the Internet illegal? Make computers illegal?

      • Llewellyn@lemm.ee

        Severity of punishment works poorly. Inevitability, on the other hand…

        • ParsnipWitch@feddit.de

          I think in this case a harsher punishment would send the appropriate signal that this isn’t just a little joke or a small misdemeanor.

          There are still way too many people who believe sexual harassment etc. aren’t that huge of a deal. And I believe the fact that perpetrators so easily get away with it plays into this.

          (I am not sure how it is in the US; in my country the consequences of crimes against bodily autonomy are laughable.)

  • virock@lemmy.world

    I studied Computer Science so I know that the only way to teach an AI agent to stop drawing naked girls is to… give it pictures of naked girls so it can learn what not to draw :(

    • rustydomino@lemmy.world

      Hmmm – I wonder if it makes sense to use generative AI to create negative training data for things like CP. That would essentially be a victimless way to train the AIs. Of course, that creates the conundrum of who actually verifies the AI-generated training data…

      • gohixo9650@discuss.tchncs.de

        This doesn’t work. The AI still needs to know what CP is in order to create CP for negative use, so you need to feed it CP first. A recent example is how OpenAI labelled “bad text”:

        The premise was simple: feed an AI with labeled examples of violence, hate speech, and sexual abuse, and that tool could learn to detect those forms of toxicity in the wild. That detector would be built into ChatGPT to check whether it was echoing the toxicity of its training data, and filter it out before it ever reached the user. It could also help scrub toxic text from the training datasets of future AI models.

        To get those labels, OpenAI sent tens of thousands of snippets of text to an outsourcing firm in Kenya, beginning in November 2021. Much of that text appeared to have been pulled from the darkest recesses of the internet. Some of it described situations in graphic detail like child sexual abuse, bestiality, murder, suicide, torture, self harm, and incest.

        source: https://time.com/6247678/openai-chatgpt-kenya-workers/
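
        For what it’s worth, the detector half of that pipeline is conceptually simple once the human-labelled data exists. Here’s a minimal sketch of the idea in Python (the `texts`/`labels` toy data and the `filter_output` helper are made-up placeholders, not OpenAI’s actual pipeline):

        ```python
        # Minimal sketch: train a toxicity detector on human-labelled text,
        # then use it to screen model output before it reaches a user.
        # The labelled examples are the hard (and human-cost) part; the model is routine.
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline

        # Toy placeholder data: 1 = toxic, 0 = benign.
        texts = ["example of abusive text", "an ordinary sentence",
                 "more abusive text", "harmless small talk"]
        labels = [1, 0, 1, 0]

        clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
        clf.fit(texts, labels)

        def filter_output(candidate: str, threshold: float = 0.8) -> bool:
            """Return True if the candidate text should be blocked."""
            return clf.predict_proba([candidate])[0][1] >= threshold
        ```

        Which is exactly the point of the quoted article: the model part is routine, but somewhere in that loop humans had to read and label the worst material first.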

  • Snot Flickerman@lemmy.blahaj.zone

    Maybe it is just me, but this is why I think this is a bigger issue than just Hollywood.

    The rights to famous people’s “images” are bought and sold all the time.

    I would argue that the entire concept should be made illegal. Others can only use your image with your explicit permission and your image cannot be “owned” by anyone but yourself.

    The fact that making a law like this isn’t a priority means this will get worse because we already have a society and laws that don’t respect our rights to control of our own image.

    A law like this would also remove all the questions about youth and sex and instead make it a case of misuse of someone else’s image. In this case it could even be considered defamation for altering the image to make it seem like it was real. They defamed her by making it seem like she took nude photos of herself to spread around.

    • Dark Arc@social.packetloss.gg

      There are genuine reasons not to give people sole authority over their image though. “Oh that’s a picture of me genuinely doing something bad, you can’t publish that!”

      Like, we still need to be able to have a public conversation about (especially political) public figures and their actions as photographed.

      • Snot Flickerman@lemmy.blahaj.zone

        Yeah I’m not stipulating a law where you can’t be held accountable for actions. Any actions you take as an individual are things you do that impact your image, of which you are in control. People using photographic evidence to prove you have done them is not a misuse of your image.

        Making fake images whole cloth is.

        The question of whether this technology will make such evidence untrustworthy is another conversation that sadly I don’t have enough time for right this moment.

    • Zetta@mander.xyz

      That sounds pretty dystopian to me. Wouldn’t that make filming in public basically illegal?

      • ParsnipWitch@feddit.de

        In Germany it is illegal to take photos or videos of people who are identifiable (faces visible or close-ups) without asking for permission first, with the exception of public events, as long as you do not focus on individuals. It doesn’t feel dystopian at all, to be honest. I’d rather have it that way than end up on someone’s stupid vlog or whatever.

    • CleoTheWizard@lemmy.world

      The tools used to make these images can largely be ignored, as can the vast majority of what AI creates of people. Fake nudes and photos have been possible for a long time now. The biggest way we deal with them is to go after large distributors of that content.

      When it comes to younger people, the penalty should be pretty heavy for doing this. But it’s the same as distributing real images of people. Photos that you don’t own. I don’t see how this is any different or how we treat it any differently than that.

      I agree with your defamation point. People in general and even young people should be able to go after bullies or these image distributors for damages.

      I think this is a giant mess that is going to upturn a lot of what we think about society but the answer isn’t to ban the tools or to make it illegal to use the tools however you want. The solution is the same as the ones we’ve created, just with more sensitivity.

  • leaky_shower_thought@feddit.nl

    Reading this, I don’t really know what is supposed to be protected here, or what would be deemed worthy of protection in the first place.

    The closest reasonable candidate is the girl’s “identity”, so it could be fraud. But it’s not used to fool people; more likely, those getting the pics already know they’re AI generated.

    So it might be defamation?

    The image generation tech is already easily accessible, so the girl’s picture being easily accessible might be the weakest link?

      • DarkGamer@kbin.social

        Thanks for the valuable contribution to this discussion! It does appear this is a question of identity and personality rights, regarding how one wants to be portrayed.

        Reading that article though, it seems like that only applies to commercial purposes. If one is making deep fakes for their own non-commercial private use, it doesn’t appear personality rights apply.

      • Fal@yiffit.net

        Pretty sure it’s illegal to create sexual images of children, photos or not.

        Maybe in your dystopian countries where drawings are illegal. Absolutely absurd you’re promoting that as a good thing.

          • Fal@yiffit.net

            Yes, but this thread is about drawings in general. Deepfaking someone into porn and spreading it around should absolutely be illegal. But it’s not “child porn”. It’s some type of harassment or defamation or something.

  • ZombiFrancis@sh.itjust.works

    In previous generations the kid making fake porn of their classmates was not a well-liked kid. Is that reversed now? On the basis of quality of tech?

    • cannache@slrpnk.net

      Oooh that’s bad. Yeah I would never do that but I did hear about the idea floating around back in the day, though I don’t think the tech is there yet. It’s just generally not cool

  • Gork@lemm.ee

    President Joe Biden signed an executive order in October that, among other things, called for barring the use of generative AI to produce child sexual abuse material or non-consensual “intimate imagery of real individuals.” The order also directs the federal government to issue guidance to label and watermark AI-generated content to help differentiate between authentic and material made by software.

    Step in the right direction, I guess.

    How is the government going to be able to differentiate authentic images/videos from AI generated ones? Some of the AI images are getting super realistic, to the point where it’s difficult for human eyes to tell the difference.
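
    On the watermarking part, the basic idea is to embed a known signal in generated output that a verifier can check later. Here’s a toy sketch of the concept using a least-significant-bit scheme (illustrative only; the order doesn’t specify any scheme, and real systems use provenance metadata like C2PA or transform-domain watermarks that survive recompression):

    ```python
    # Toy illustration of invisible watermarking: hide a known bit pattern
    # in the least-significant bits of an image's red channel, then check
    # for it later. Real schemes (C2PA provenance metadata, DCT/DWT
    # spread-spectrum marks) are far more robust; LSB marks do not survive
    # recompression or resizing.
    import numpy as np

    TAG = np.unpackbits(np.frombuffer(b"AI-GEN", dtype=np.uint8))  # 48 marker bits

    def embed(img: np.ndarray) -> np.ndarray:
        """Write TAG into the LSBs of the first TAG.size red-channel pixels."""
        out = img.copy()
        red = out[:, :, 0]  # view into the copy, so writes land in `out`
        red.flat[:TAG.size] = (red.flat[:TAG.size] & 0xFE) | TAG
        return out

    def detect(img: np.ndarray) -> bool:
        """Report whether the marker bits are present."""
        return bool(np.array_equal(img[:, :, 0].flat[:TAG.size] & 1, TAG))

    rng = np.random.default_rng(0)
    img = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)  # stand-in image
    print(detect(embed(img)))  # True
    print(detect(img))         # False (a chance 48-bit match is ~1 in 2.8e14)
    ```

    Which also hints at the weakness: anything this fragile is destroyed by a screenshot or re-encode, and an open-source model can simply skip the embedding step.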

    • CommanderCloon@lemmy.ml

      I wouldn’t call this a step in the right direction. A call for a step, yeah, but it’s not actually a step until something is actually done.

        • Chakravanti@sh.itjust.works

          According to what logic? Like I’m ever going to trust some lying asshole to hide his instructions for fucking anything that’s MINE. News Alert: “Your” computer ain’t yours.

          • Olgratin_Magmatoe@startrek.website

            People have been trying to circumvent chatGPT’s filters, they’ll do the exact same with open source AI. But it’ll be worse because it’s open source, so any built in feature to prevent abuse could just get removed then recompiled by whoever.

            And that’s all even assuming there ever ends up being open source AI.

            • Chakravanti@sh.itjust.works

              Your logic is bass-ackwards. Open source code being public means the shit gets fixed faster. Closed source just doesn’t get fixed 99% of the time, because there’s only one motherfucker to do the fixing and usually he just doesn’t do it.

              • Olgratin_Magmatoe@startrek.website

                You can’t fix it with open source. All it takes is one guy making a fork and removing the safeguards because they believe in free speech or something. You can’t have safeguards against misuse of a tool in an open source environment.

                I agree that closed source AI is bad. But open source doesn’t magically solve the problem.

                • Chakravanti@sh.itjust.works

                  Forks are productive. You’re just wrong about it. I’ll take FOSS over closed source. I’ll trust the masses reviewing FOSS over the one asshole doing, or rather not doing, exactly that.

  • PhantomAudio@lemm.ee

    Gee, here’s a novel idea: don’t let children have access to social media. That would solve a lot of other problems too.

    • Sandbag@lemmy.world

      While I agree with that in principle, we shouldn’t start blocking people, even young people, from access to a lot of information.

      Twitter, while now a cesspool, still has a lot of academics on it that share new ideas and discoveries.

      Reddit, while shit, also has the value of helping people find niche hobbies and communities.

      YouTube, while turning into shit, allows people access to video tutorials and explanations; hell, while I was in school, half the time teachers assigned homework that required watching a YouTube video.

      While it’s an idea to block the youth from accessing social media, I think the drawbacks are too great.

  • PhlubbaDubba@lemm.ee

    Sounds like an easy fix, treat it as revenge porn and CEM and prosecute it exactly the same.

    Little Timmy’s gonna think twice about distributing stable diffusions of the cheerleaders after he sees Mikey’s life get ruined for that shit.

    • DogMuffins@discuss.tchncs.de

      I don’t think kids think about consequences in this way. Also not sure if charging a 12-year-old as a paedophile is the right move.

        • DogMuffins@discuss.tchncs.de
          link
          fedilink
          English
          arrow-up
          2
          arrow-down
          1
          ·
          8 months ago

          I’m not certain, but I think in my jurisdiction distribution of CSAM is the more likely charge than revenge porn.

          For a 40 year old, spending 10 years in jail and the rest of your life on the sex offenders registry would be a deterrent of some kind. I ought not to do x because I don’t want to bear consequence y.

          For a 12 year old, even if they understand that some behaviors have very deleterious consequences, they have no way to weigh those consequences. How long is 10 years? Would this be a bit like being sent to my room? What is a registry? Making these pictures on the computer is illegal, downloading torrents is illegal, dad downloads torrents all the time.

          I’m just saying that if the objective is to avoid the harm of victims, then heavy punishments are unlikely to achieve that.

    • kent_eh@lemmy.ca

      Little Timmy’s gonna think twice

      I appreciate your optimism, but I don’t share it.

      Timmy is as likely to believe he is smarter than Mikey and that he won’t get caught…