
And which standard is that for AI-generated art?


A lot of AI-generated images retain the watermark of a copyrighted image they were trained on. If you sell something with that image without an agreement from the rights holder, it is not fair use.

It is completely reasonable for Valve to forbid this until it is sorted out. Keep in mind they are a company of IP creators, creating a marketplace for IP creators. The whole reason Steam was created was to establish DRM that fought the piracy of Half-Life. I am on the side of Valve in this.


I believe the AI generates a watermark because so many examples contained it.

Imagine taking a really dumb gig worker, showing him 10,000 images, some of them with watermarks, and then telling him "draw a red car, kinda like the kind of images you saw". There's a decent chance you'll get a red car that looks nothing like any car in the data set (original work), and yet he'll paint a recognizable watermark on top because so many examples contained it, you said "kinda like the kind of images you saw", and he doesn't understand that the watermark isn't meant to be part of the picture. I believe that's what's happening.
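For illustration, here's a toy numpy sketch of that statistical effect (emphatically not how a diffusion model actually works, and the "training set" here is made up): if a fixed mark sits in the same corner of many training images, a generator that just samples pixel values from the empirical training distribution will tend to reproduce the mark, even though the rest of the output matches no single training image.

    # Toy illustration only: a "generator" that samples each pixel from the
    # empirical per-pixel distribution of a fake training set. Not a diffusion
    # model; the point is just that features shared by many training images
    # (here, a bright "watermark" patch in the corner) reappear in samples.
    import numpy as np

    rng = np.random.default_rng(0)

    # Fake training set: 1000 random 32x32 grayscale "images".
    train = rng.random((1000, 32, 32))

    # 60% of them carry the same bright watermark in the top-left corner.
    watermarked = rng.random(1000) < 0.6
    train[watermarked, :4, :8] = 1.0

    # "Generate" an image by picking, for every pixel position, the value that
    # a randomly chosen training image has at that position.
    idx = rng.integers(0, len(train), size=(32, 32))
    rows, cols = np.indices((32, 32))
    sample = train[idx, rows, cols]

    # The corner comes out much brighter than the rest: the shared mark got
    # reproduced, even though the sample matches no single training image.
    print("corner mean:", sample[:4, :8].mean())     # roughly 0.8
    print("elsewhere mean:", sample[4:, 8:].mean())  # roughly 0.5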


They don’t ‘retain’ a watermark. They ‘reproduce’ the watermark.

It’s entirely possible for a diffusion model to produce an original work and yet still hallucinate a ‘shutterstock’ watermark onto it, in much the same way as GPT can hallucinate valid-looking citations for legal cases that never happened.


That's a completely fair point a few people have made. But I think the idea is, if you are a creator, the AI is doing something that you might call copying if it were a person. If it does that, what is your redress as the creator?


To correct the common misconception: Sometimes AI image generators insert a watermark because they have seen a lot of watermarks on certain kinds of images during training. This does not mean that the image itself is a copy of any particular image in the training data.

Producing (distorted) copies of images in the training data takes some real effort, and typically only occurs for images which are heavily repeated in the training data... Most of the complaints along these lines can be compared to complaints that cars cause massive bodily harm if you steer them into lampposts: the problem is easily preventable by not driving into a lamppost.
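To make the "heavily repeated" point concrete, here's a rough sketch of one way a training set could be deduplicated before training, using perceptual hashing (assumes the third-party imagehash and Pillow packages; the folder name and distance threshold are just placeholders):

    # Sketch: drop near-duplicate images from a training folder using perceptual
    # hashing, so no single image is heavily over-represented. Assumes the
    # third-party `imagehash` and `Pillow` packages; paths/threshold are examples.
    from pathlib import Path

    import imagehash
    from PIL import Image

    THRESHOLD = 5  # max Hamming distance between hashes to count as "duplicate"

    kept_hashes = []
    kept_paths = []

    for path in sorted(Path("training_images").glob("*.jpg")):
        h = imagehash.phash(Image.open(path))
        # Compare against everything kept so far (O(n^2); fine for a sketch).
        if any(h - other <= THRESHOLD for other in kept_hashes):
            print(f"dropping near-duplicate: {path.name}")
            continue
        kept_hashes.append(h)
        kept_paths.append(path)

    print(f"kept {len(kept_paths)} unique images")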


I think the "well it's transformative" argument is pretty bad faith and I think a lot of the people making it might know that.

Generative AI cannot exist without pre-existing bodies of work created by human labor. It also displaces that labor and hurts the people whose content was a requirement for AI to exist. From this view, AI is not fair use.


There are multiple jurisdictions where there have been rumblings that an AI-generated work is possibly a derived work from every single work that the AI was trained with. This hasn't been properly tested in court, but I would give very high odds that the standard will be upheld at least somewhere where Steam sells things.

If this is true, then ordinary copyright law means that AI-generated media cannot be used unless you have a release from every bit of training data you used. At least some of the currently existing AIs were trained with datasets for which such releases are impossible, so they should not be used.

Also, for the love of god, do not use any of the AI coding assistants, or if you do, at least never publicly admit you do.


> multiple jurisdictions where there have been rumblings that an AI-generated work is possibly a derived work from every single work that the AI was trained with

This should apply to humans as well, then, because brains ultimately do the exact same thing. Nobody creates art in a vacuum.


Good



