In its submission to the Australian government’s review of the regulatory framework around AI, Google said that copyright law should be altered to allow for generative AI systems to scrape the internet.

  • Pseu@beehaw.org · 1 year ago

    So at the bare minimum, a mechanism needs to be provided for retroactively removing works that would have been opted out of commercial usage if the option had been available and the rights holders had been informed about the commercial intentions of the project.

    If you do this, you limit access to AI tools exclusively to big companies. They already employ enough artists to create a useful AI generator, and they’ll simply add a clause to the employment contract stating that the artist agrees to their work being used in training. After a while, the only people with access to reasonably good AI will be those major corporations, and they’ll leverage that to depress wages and control employees.

    The WGA’s idea that the direct output of an AI is uncopyrightable doesn’t distort things so heavily in favor of Disney and Hasbro. It’s also on firmer legal footing: you don’t name Microsoft Word as the editor of a novel because you used spell check, even if it corrected the spelling and grammar of every word, so naturally you don’t name generative AI as an author or creator.

    Though the above argument only really applies when you have strong unions willing to fight for workers, and with how gutted they are in the US, I don’t think that will be the standard.

    • frog 🐸@beehaw.org · 1 year ago

      The solution to the problem of only big companies having access to AI (because they employ enough artists to create a useful generator) isn’t to deny all artists globally any ability to control their work, though. If all works can be scraped and added to commercial AI models without any payment to artists, you completely obliterate all artists except for the small handful working for Disney, Hasbro, and the likes.

      AI models actually require a constant input of new human-made artworks, because they cannot create anything new or unique themselves, and feeding an AI content produced by AI ends up with very distorted results pretty quickly. So it’s simply not viable to expect the 99% of artists who don’t work for big companies to continuously provide new works for AI models, for free, so that others can profit from them. Therefore, artists need either the ability to opt out or they need to be paid.

      (The word “artist” here is used to refer to everyone in the creative industries. Writing and music are art just like paintings and drawings are.)

      • Pseu@beehaw.org · 1 year ago

        Unfortunately, copyright protection doesn’t extend that far. Artists simply don’t have that much say in what viewers do after seeing their art, even if those viewers decide to copy its style.

        AI training is almost certainly fair use if it is copying at all. Styles and the like cannot be copyrighted, so even if an AI creates a work in the style of someone else, it is extremely unlikely that the output would be so similar as to be in violation of copyright. Though I do feel that it is unethical to intentionally try to reproduce someone’s style, especially if you’re doing it for commercial gain. But that is not illegal unless you try to say that you are that artist.

        https://www.eff.org/deeplinks/2023/04/how-we-think-about-copyright-and-ai-art-0

        • frog 🐸@beehaw.org · 1 year ago

          Copyright law on this varies, actually! In the UK, “fair dealing” has an exclusion for using copyrighted material for the purpose of commercially competing with the creator, and this also covers derivative works. That does extend to style to a certain extent, because works imitating an artist’s style are generally intended to commercially compete with them. From that perspective, taking an artist’s entire portfolio, feeding it into an AI, and producing work in their style at a lower price than the artist can offer (because an AI produces in seconds what takes the artist weeks) is pretty obviously an attempt to compete with the artist commercially.

          While people like to draw comparisons between AIs and humans copying another artist’s style, the big difference here is that a human artist needs to spend hundreds of hours learning to imitate another artist’s style, at the expense of developing their own style, while the original artist is also continually developing their style. It is bloody hard to imitate another human’s art style. But an AI can do it in minutes, and I haven’t yet seen any valid arguments for how that’s not intended to commercially compete with human artists on a massive scale.

          • Pseu@beehaw.org · 1 year ago

            True, I wrote this from a US law perspective, where that kind of behavior is expressly protected. US law is also written specifically to protect things like search engines and aggregators to prevent services like Google from getting sued for their blurbs, but it’s likely also a defense for AI.

            Regardless of whether it should be illegal, I feel that AI training and use is legal under current US law. And since OpenAI is a US company, dragging it into UK courts and extracting payment from it would be difficult for all but the most monied artists.

            • frog 🐸@beehaw.org · 1 year ago

              For the moment, US companies do actually care what the UK courts and regulatory bodies say, because the trifecta of US-UK-EU is what tends to form a base of what the rest of the world decides. It’s why Microsoft have been so unhappy about the UK’s Competition and Markets Authority initially blocking the merger with Blizzard: even with the US and EU antitrust bodies agreeing to it, it did actually matter if the UK didn’t agree (I am so disappointed in the CMA finally capitulating). And some of the lawsuits against the AI companies are taking place in the UK courts, with no indications that the AI companies are refusing to engage. Obviously at this point it’s hard to say what the outcome will be, but the UK legal system does actually have enough clout globally that it won’t be a meaningless result.