I totally agree with a streamlined identification of images generated from an AI prompt. But to label an image with “made with AI” metadata when the image is original, taken by a human, and merely edited with AI tools is absolutely misleading, and the language can create confusion. It is not fair to the individual who created the original work without the use of generative AI. I simply propose revising the language to create a distinction.
Where I live, it is very difficult to get permits to knock down an old building and build a new one. So builders will “renovate” by knocking down everything but a single wall and then building a new structure around it.
I can imagine people using that to get around the “made with AI” label. I just touched it up!
It’s like they’re ignoring the pixel I captured in the bottom left!
Really interesting analogy.
Also, I imagine almost anybody who gets a photo labeled will find a trick before making their next post. Copy the final image to a new PSD… print and scan for the less technically inclined… heh
The edits are what makes it made with AI. The original work obviously isn’t.
If you’re in-painting areas of an image with generative AI (“context aware” fill), you’ve used AI to create an image.
People are coming up with rather arbitrary distinctions between what is and isn’t AI. Midjourney’s output is clearly AI, and a drawing obviously isn’t, but neither is very post-worthy. Things quickly get muddy when you start editing.
The people upset over this have been using AI for years, and nobody cared. Now photographers are at risk of being replaced by an advanced version of the content-aware fill they’ve been using themselves. This puts them in the difficult spot of wanting not to be replaced by AI (obviously), but also not wanting their AI use to be detectable.
The debate isn’t new; photo editors had this problem years ago when computers started replacing manual editing, artists had this problem when computer aided drawing (drawing tablets and such) started becoming affordable, and this is just the next step of the process.
Personally, I would love it if this feature would also be extended to “manual” editing. Add a nice little “this image has been altered” marker on any edited photographs, and call out any filters used to beautify selfies while we’re at it.
I don’t think the problem is that AI-edited images are being marked; the problem is that AI-generated pictures and manually edited pictures aren’t.
I mean you can just remove the metadata of any image, so that doesn’t really matter.
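The claim above is easy to verify: most image metadata (EXIF, XMP, comments) lives in well-defined marker segments that can simply be dropped. As a minimal, stdlib-only sketch for baseline JPEGs (real tools such as exiftool handle many more segment types and containers), a stripper only needs to walk the segments before Start-of-Scan and skip the APP1 and comment markers:

```python
def strip_jpeg_metadata(data: bytes) -> bytes:
    """Drop APP1 (EXIF/XMP) and comment segments from a JPEG byte stream."""
    if data[:2] != b"\xff\xd8":
        raise ValueError("not a JPEG")
    out = bytearray(b"\xff\xd8")  # keep the Start-of-Image marker
    i = 2
    while i + 1 < len(data):
        marker = data[i + 1]
        if data[i] != 0xFF or marker == 0xDA:
            # Start-of-Scan (or entropy-coded data): copy the rest verbatim
            out += data[i:]
            break
        # segment length field counts itself (2 bytes) but not the marker
        seg_len = int.from_bytes(data[i + 2:i + 4], "big")
        if marker not in (0xE1, 0xFE):  # 0xE1 = APP1 (EXIF/XMP), 0xFE = COM
            out += data[i:i + 2 + seg_len]
        i += 2 + seg_len
    return bytes(out)
```

The pixel data after Start-of-Scan is copied untouched, so the image itself is byte-identical; only the metadata segments disappear. Any provenance scheme that relies purely on metadata is defeated by exactly this kind of trivial rewrite.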
Therefore, made with AI.
Or generated with AI, like Midjourney; therefore, made with AI.
There’s a huge difference between the two, yet no clear distinction when both are lumped under the label “made with AI.”