Adobe, which previously addressed the removal of certain content from Adobe Firefly’s underlying AI training images after an outcry over their use, has now released a new content credential app that gives creators more control over their work. The launch comes as Meta rolls out its own labels to automatically tag images on its platform, a decision that has drawn an intense response from photographers and artists. Developed with input from creators, the app lets users embed credentials in batches of JPG or PNG files, addressing questions of image ownership and credit in an AI-driven world.
Meta’s recent decision to apply “Made with AI” labels to every generated image ruffled the feathers of countless photographers, who felt their edited works were being improperly flagged. The frustration stemmed from a sense that the labels misrepresented the nature of their artistic work. Andy Parsons, Senior Director of Adobe’s Content Authenticity Initiative, stressed the urgent need for a solution that recognizes the complexities of image creation in today’s digital world.
Parsons took special note of the grey area in which an image is created and then edited with the help of AI but not generated entirely by AI. He argued that artists and creators should be able to sign their work and receive appropriate attribution. That signature does not establish that the intellectual property is legitimate or even copyrightable, only that a particular person created it.
The new content credential application lets users attach credentials to their images with little effort, giving creators a way to confidently assert their authorship. Users can link their Instagram or X (formerly Twitter) accounts to a picture, though for now there is no integration with those platforms’ own verification systems. The feature is another example of Adobe’s continuing emphasis on transparency and accountability in the digital content ecosystem.
Parsons noted that small creators and agencies are eager for more control over how their creative work is used to train AI. “Content creators want a simple way to indicate that they don’t want their content to be used for gen AI training,” he remarked. “We have heard from small creators and agencies that they want more control over their creations [in terms of AI training on their content].”
The app furthers the mission of the Coalition for Content Provenance and Authenticity (C2PA), an industry initiative to fight misinformation that parallels the Content Authenticity Initiative led by Parsons. C2PA advocates a framework that allows creators to attach provenance information to their work without dictating what constitutes art, an approach that respects the artistic integrity of creators’ work while giving them a vested interest in how their contributions are used.
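For illustration only, the short Python sketch below shows roughly the kind of provenance record a content credential could carry: the creator’s identity, linked social accounts, and a generative-AI training preference. The field names and structure are assumptions made for this example, not Adobe’s or C2PA’s actual manifest format, and a real credential would be cryptographically signed and embedded in the image file itself.

    import json

    # Illustrative sketch only: the keys below are hypothetical, loosely modeled
    # on the ideas described in the article (authorship, linked accounts, and a
    # creator's opt-out preference for generative-AI training).
    credential = {
        "claim_generator": "example-content-credentials-app",  # hypothetical app name
        "author": {
            "name": "Jane Doe",
            "social_accounts": [
                "https://www.instagram.com/janedoe",
                "https://x.com/janedoe",
            ],
        },
        "ai_training": "not_allowed",  # creator's stated preference, not an enforcement mechanism
    }

    # A real credential is signed and written into the JPG/PNG; here we simply
    # print the record to show what such provenance data might contain.
    print(json.dumps(credential, indent=2))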
Yet as AI rapidly advances, the relationship between technology and art is more complicated than ever. Creators should be driving this conversation: they are the ones on the front lines, working out how to defend their creative assets while navigating the new frontier AI presents. Adobe’s new tool is a proactive step toward addressing these issues and fostering a culture in which creativity can flourish alongside new technology.