• Kerfuffle@sh.itjust.works · 1 year ago

    Doubled down on the “yea were not gonna credit artist’s our AI stole from”. What a supreme douche

    I don’t think it’s as simple as all that. Artists look at other artists’ work when they’re learning, for ideas, for methods of doing stuff, etc. Good artists probably have looked at a ton of other artwork, they don’t just form their skills in a vacuum. Do they need to credit all the artists they “stole from”?

    In the article, the company made a point of not using AI models specifically trained on a smaller set of works (or on some individual artist’s works). It would be a lot easier to argue that doing something like that is stealing: but the same would be true if a human artist carefully studied another person’s work and tried to emulate their style/ideas. I think there’s a difference between that and training on (or “learning” from) a large body of work without emulating any specific artist, company, individual works, etc.

    Obviously it’s something that needs to be handled fairly carefully, but that can be true with human artists too.

    • [email protected]@sh.itjust.works · 1 year ago

      I swear I’m old enough to remember this exact same fucking debate when digital tools started becoming popular.
      It is, simply put, a new tool.
      It’s also not the one-and-done magic button people who’ve never used it think it is.

      The knee-jerk reaction of hating on every piece of art made with AI is dangerous.
      You’re free to like it or not, but the cat is already out of the bag.
      Big companies will have the resources to train their own models.
      I for one would rather have this in the public domain than available only to big corps.

    • loobkoob@kbin.social · 1 year ago (edited)

      I wouldn’t call myself a “good artist” at all, and I’ve never released anything, I just make music for myself. Most of the music I make starts with my shamelessly lifting a melody, chord progression, rhythm, sound, or something else, from some song I’ve heard. Then I’ll modify it slightly, add my own elements elsewhere, modify the thing I “stole” again, etc, and by the time I’ve finished, you probably wouldn’t even be able to tell where I “stole” from because I’ve iterated on it so much.

      AI models are exactly the same. And, personally, I’m pretty good at separating the creative process from the end result when it comes to consuming/appreciating art. There are songs, paintings, films, etc, where the creative process is fascinating to me but I don’t enjoy the art itself. There are pieces of art made by sex offenders, criminals and generally terrible people - people who I refuse to support financially in any way - but that doesn’t mean my appreciation for the art is lessened. I’ll lose respect for an artist as a person if I find out their work is ghostwritten, but I won’t lose my appreciation for the work. So if AI can create art I find evocative, I’ll appreciate that, too.

      But ultimately, I don’t expect to see much art created solely by AI that I enjoy. AI is a fantastic tool, and it can lead to some amazing results when someone gives it the right prompts and edits/curates its output in the right way. And it can be used for inspiration, and to create a foundation that artists can jump off, much like I do with my “stealing” when I’m writing music. But if someone gives an AI a simple prompt, they tend to get a fairly derivative result - one that’ll feel especially derivative as we see “raw output” from AIs more often and become more accustomed to their artistic voice. I’m not concerned at all about people telling an AI to “write me a song about love” replacing the complex prog musicians I enjoy, and I’m not worried about crappy AI-generated games replacing the lovingly crafted experiences I enjoy either.

    • Franzia@lemmy.blahaj.zone · 1 year ago

      Artists who look at art are processing it in a relatable, human way. An AI doesn’t look at art. A human tells the AI to find art and plug it in, knowing that the work is copyrighted and not available for someone else’s commercial project to develop an AI.

      • Kerfuffle@sh.itjust.works · 1 year ago (edited)

        Artists who look at art are processing it in a relatable, human way.

        Yeah, sure. But there’s nothing that says “it’s not stealing if you do it in a relatable, human way”. Stealing doesn’t have anything to do with that.

        knowing that work is copyrighted and not available for someone else’s commercial project to develop an AI.

        And it is available for someone else’s commercial project to develop a human artist? Basically, the “an AI” part is still irrelevant. If the works are out there where it’s possible to view them, then it’s possible for both humans and AIs to acquire them and use them for training. I don’t think “theft” is a good argument against it.

        But there are probably others. I can think of a few.

        • Franzia@lemmy.blahaj.zone · 1 year ago

          I just want fucking humans paid for their work, why do you tech nerds have to innovate new ways to lick the boots of capital every few years? Let the capitalists make arguments why AI should own all of our work, for free, rights be damned, and then profit off of it, and sell that back to us as a product. Let them do that. They don’t need your help.

          • Kerfuffle@sh.itjust.works · 1 year ago

            I just want fucking humans paid for their work

            That’s a problem whether or not we’re talking about AI.

            why do you tech nerds have to innovate new ways to lick the boots of capital every few years?

            That’s really not how it works. “Tech nerds” aren’t licking the boots of capitalists; capitalists just try to exploit any tech for maximum advantage. What are the tech nerds supposed to do, just stop all scientific and technological progress?

            why AI should own all of our work, for free, rights be damned,

            AI doesn’t “own your work” any more than a human artist who learned from it does. You don’t like the end result, but you also don’t seem to know how to come up with a coherent argument against the process of getting there. Like I mentioned, there are better arguments against it than “it’s stealing” or “it’s violating our rights”, because those have some serious issues.

      • Grumpy@sh.itjust.works · 1 year ago

        That’s not how AI art works. You can’t tell it to find art and plug it in. It doesn’t have the capability to store or copy existing artworks. It only contains matrices of vectors that encode concepts. Concepts cannot be copyrighted.
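
        If you want to see what that means in practice, here’s a rough sketch (assuming a PyTorch-style checkpoint; the file name is just a placeholder) of what a trained model file actually contains: named weight tensors, not stored image files.

        ```python
        # Rough sketch: inspect a (hypothetical) model checkpoint.
        # What's inside is named weight tensors, not any of the training images.
        import torch

        state = torch.load("model_checkpoint.pt", map_location="cpu")  # placeholder path

        total = 0
        for name, tensor in state.items():
            total += tensor.numel()
            print(f"{name}: shape={tuple(tensor.shape)}, dtype={tensor.dtype}")

        print(f"~{total / 1e6:.1f}M parameters, and no embedded image files")
        ```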

        • Kerfuffle@sh.itjust.works · 1 year ago

          You can’t tell it to find art and plug it in.

          Kind of. The AI doesn’t go out and find/do anything; people include images in its training data, though. So it’s the human that’s finding the art and plugging it in — most likely through automated processes that just scrape massive amounts of images and add them to the corpus used for training.
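
          Something roughly like this is what I mean by an automated process (a simplified sketch: the URLs and output folder here are placeholders, and real pipelines are vastly larger and do filtering and deduplication):

          ```python
          # Simplified sketch of bulk image collection feeding a training corpus.
          # The URLs and output folder here are placeholders, not a real dataset.
          import pathlib
          import requests

          image_urls = [
              "https://example.com/artwork-001.jpg",
              "https://example.com/artwork-002.jpg",
          ]
          corpus_dir = pathlib.Path("training_corpus")
          corpus_dir.mkdir(exist_ok=True)

          for i, url in enumerate(image_urls):
              resp = requests.get(url, timeout=10)
              if resp.ok:
                  (corpus_dir / f"{i:08d}.jpg").write_bytes(resp.content)
          ```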

          It doesn’t have the capability to store or copy existing artworks. It only contains the matrix of vectors which contain concepts.

          Sorry, this is wrong. You definitely can train an AI to produce works that are very nearly a direct copy. How “original” the AI’s output is depends on the size of the corpus it was trained on. If you train the AI on (or put a lot of training weight on) just a couple of works from one specific artist or something like that, it’s going to output stuff that’s very similar. If you train the AI on 1,000,000 images from all different artists, the output isn’t really going to resemble any specific artist’s style or work.

          That’s why the company emphasized they weren’t training the AI to replicate a specific artist’s (or design company, etc) works.
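
          To make the small-corpus point concrete, here’s a toy sketch (not a real diffusion model, just a tiny decoder overfit on a few stand-in images) showing how heavy repetition on a handful of works pushes the output toward near-copies:

          ```python
          # Toy sketch: overfit a tiny decoder on a handful of "images" so its
          # output becomes a near-copy. A real case would fine-tune a big model
          # on one artist's portfolio; the memorization effect is the same idea.
          import torch
          import torch.nn as nn
          import torch.nn.functional as F

          torch.manual_seed(0)
          few_images = torch.rand(3, 3 * 64 * 64)   # stand-ins for ~3 artworks
          codes = torch.randn(3, 16)                # one latent code per image

          decoder = nn.Sequential(
              nn.Linear(16, 256), nn.ReLU(),
              nn.Linear(256, 3 * 64 * 64), nn.Sigmoid(),
          )
          opt = torch.optim.Adam(decoder.parameters(), lr=1e-3)

          for _ in range(2000):                     # heavy repetition on just 3 samples
              loss = F.mse_loss(decoder(codes), few_images)
              opt.zero_grad()
              loss.backward()
              opt.step()

          # The lower this gets, the closer the outputs are to straight copies.
          print(f"reconstruction error: {loss.item():.6f}")
          ```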

          • Grumpy@sh.itjust.works · 1 year ago

            Sorry, this is wrong.

            As a general statement: No, I am not. You’re constructing an overly specific scenario to make it true. Sure, if I take 1 image and train a model just on that one image, it’ll make that exact same image. But that’s no different than me just pressing copy and paste on a single image file. The latter does the job a whole lot better, too. This entire counterargument is nothing more than being pedantic.

            Furthermore, if I’m giving such specific instructions to the AI, then I am the one who’s replicating the art. It doesn’t matter whether I use a pencil to trace out the existing art, use Photoshop, or create a specific AI model. I am the one who’s doing that.

            • Kerfuffle@sh.itjust.works · 1 year ago

              As a general statement: No, I am not.

              You didn’t qualify what you said originally. It either has the capability or not: you said it didn’t; it actually does.

              You’re making an over specific scenario to make it true.

              Not really. It isn’t that far-fetched that a company would see an artist they’d like to use but not want to pay that artist’s fees, so they train an AI on the artist’s portfolio and can churn out very similar artwork. Training it on one or two images is obviously contrived, but a situation like the one I just mentioned is very plausible.

              This entire counter argument is nothing more than being pedantic.

              So no, it isn’t. What you said isn’t accurate under the literal interpretation, and it doesn’t work with the more general interpretation either. The person higher in the thread called it stealing: in that case it wasn’t, but AI models do have the capability to do what most people would probably call “stealing” or infringing on the artist’s rights. I think recognizing that distinction is important.

              Furthermore, if I’m making such specific instructions to the AI, then I am the one who’s replicating the art.

              Yes, that’s kind of the point. A lot of people (me included) would be comfortable calling doing that sort of thing stealing or plagiarism. That’s why the company in OP took pains to say they weren’t doing that.