AI generators: not so clean cut

Which side will you choose to sit on?

Will Bright

Where do we draw the line between inspiration, plagiarism, and theft?

AI is taking over. Well, not really, but I’m sure we’ve all seen it becoming more and more prevalent in recent times. There is the TikTok trend of turning yourself into an anime picture with AI. There is ChatGPT writing essays and descriptions for people based on prompts. Then, there was that trend of celebrities paying a little bit of money to see AI pictures of themselves and share them online.

How do AI art generators actually make art? The answer is simple: theft. I’ll elaborate more on that later.

Right now, three major companies in the AI art game are being sued: Stability AI, Midjourney, and DeviantArt. They are being sued by three artists: Sarah Andersen, Kelly McKernan, and Karla Ortiz.

Many people know Sarah Andersen from her cartoon Sarah’s Scribbles, wherein she makes jokes about being introverted. Even if you don’t know it by name, you’ve probably seen Andersen’s comic strips online. Andersen holds copyright registrations for several of her comic collections, and despite these registrations, her work has still been used to train Stable Diffusion.

Kelly McKernan is a full-time artist working in traditional painting media, including watercolour and acrylic gouache. They have found that at least 30 pieces of their art have been used to train AI.

Karla Ortiz has worked on large projects for film, video games, tabletop role-playing games (TTRPGs), and television. She creates realistic artwork, has been featured in galleries in Paris, and has won awards for her art. Ortiz has found that at least 12 of her works were used to train AI. When she discovered in 2021 that AI generators could recreate her style, she said “it felt invasive in a way that [she has] never experienced.”

Stability AI is a company aiming to make open AI tools for all to use. They are working in language, audio, video, and biology. In their frequently asked questions section on their website, Stability AI claims it makes money by providing “unparalleled foundation model consulting and contracting services to clients.”

Midjourney states that it is “an independent research lab exploring new mediums of thought and expanding the imaginative powers of the human species.” With Midjourney you can make 25 AI-generated images for free, but then you have to pay $10 a month to have up to 200 images. You can only access Midjourney through the Discord app.

The final group being sued in this lawsuit is DeviantArt. Whether you’re an artist or not, it’s likely you’ve heard of DeviantArt. It’s a website for artists to share their work with other artists. It’s been around since 2000, and by 2017, DeviantArt had over 25 million members. Since 2017, it has been owned by the website-building company Wix. Recently, DeviantArt announced and launched DreamUp, its own AI art generator.

Stability AI, Midjourney, and DeviantArt’s DreamUp all use Stable Diffusion to run their AI art generators. Stable Diffusion was created and developed by Stability AI. According to Stability AI, Stable Diffusion is trained on the underlying dataset LAION 5b, which draws training images from sources like DeviantArt, Flickr, and Pinterest to teach the AI what to create. Using the “diffusion” technique, the AI learns to create new images by first learning to reproduce copies of the original images found on Pinterest and elsewhere. On its website, Stability AI says that artists have no choice in whether or not their art is used for LAION 5b, because it wants the dataset to capture a generalized idea of the art online.

The issue with Stable Diffusion is that it does not consider copyright when it takes images to study. Artists’ styles can easily be recreated within any of the mentioned AI generators. It enters into a legal grey area of whether or not the output of the generator is a copyright violation against the input (the original art).

Here is where we get to the heart of the lawsuit. Ortiz, McKernan, and Andersen are claiming that their copyright has been violated because their art has been used to train AI generators.

The lawsuit claims that billions of copyrighted images were used to train Stable Diffusion, and that the companies had no permission from the artists to use those images in training. As a result, those images have now been stored as compressed copies in Stable Diffusion, without the artists’ consent or any payment for their work. The lawsuit claims “These ‘new’ images are based entirely on the Training Images and are derivative works of the particular images Stable Diffusion draws from when assembling a given output. Ultimately, it is merely a complex collage tool.”

The Plaintiffs are also claiming that they are losing money that they would have gotten from commissions because people are searching for their art styles through AI image generators. They don’t want their work and their livelihoods to be eliminated by AI programs that are built upon their work. Stability AI responded by saying “anyone that believes that this isn’t fair use does not understand the technology and misunderstands the law.”

Now joining in on suing Stability AI is Getty Images. Getty Images states that Stability AI took millions of its copyright-protected images for AI training, without asking for permission or licensing. Many people have spotted a Getty Images watermark, or what appears to be one, on AI-generated images. Similarly, many people have seen garbled artists’ watermarks on AI-generated images.

These artists are asking questions and pursuing lawsuits that could change the future of art. If they win the lawsuit, it will change how AI art generators will function in the future and how they will be trained. If they lose the lawsuit, it will be a major blow to artists everywhere who are losing income and work because of AI image generators.

More information will be revealed in the coming months as the two lawsuits unfold.
