Background

Adobe Introduces 'Adobe Content Authenticity' To Help Creators Fight AI-Generated Forgery


AI-generated content is on the rise. Thanks to Large Language Models (LLMs), pretty much everyone has superpowers.

An AI's ability to replicate things is based on the data it has been trained on. Since OpenAI debuted ChatGPT, the company and many others have unleashed their AIs to learn from practically anything they could get their hands on. The models have become so capable that authenticity is becoming a concern.

For many years, websites have used the robots.txt file to instruct web crawlers on which parts of their sites should not be accessed.
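As a point of comparison, here is what such a file typically looks like. The paths and the second user-agent are illustrative; `GPTBot` is the name OpenAI publishes for its web crawler:

```
# Ask all crawlers to stay out of a private directory
User-agent: *
Disallow: /private/

# Ask one specific AI training crawler to skip the whole site
User-agent: GPTBot
Disallow: /
```

Note that robots.txt is purely advisory: it expresses a preference, and compliance is voluntary, which is the same weakness Adobe's image-level equivalent faces.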

Adobe aims to establish a similar standard for images.

The company calls the initiative the 'Content Credentials' system.

With it, Adobe wants creators to be able to embed metadata into their images, signaling a preference that their content not be used for training AI models.

In a blog post, Adobe said:

"Creators have always deserved proper attribution for the powerful digital work they create. Their creations shape culture, drive engagement and fuel industries. But in today’s rapidly evolving digital landscape, especially with the rise of generative AI, creators need a modern solution that’s tailored to the realities of this digital ecosystem. They need a tool that secures attribution, while supporting greater transparency, accountability and protection."

"That’s why today we’re excited to launch the public beta of Adobe Content Authenticity, a free app that allows creators to easily apply Content Credentials to their digital work. Content Credentials are a secure type of metadata that allows creators to share information about themselves and their work — effectively signing their work digitally, much like an artist signing a painting or sculpture. By adopting Content Credentials today, creators are helping build a more transparent, accountable, and creator-friendly digital ecosystem."

Content Credentials are part of Adobe's Content Authenticity Initiative (CAI), a collaborative effort launched in 2019, along with the Coalition for Content Provenance and Authenticity (C2PA), a consortium of over 100 companies working together to create technology combining secure metadata, invisible watermarking, and digital fingerprinting.

The Content Credentials system allows creators to embed secure metadata into their digital work.

This metadata can include information about the creator, the creation process, and any edits made to the content. The goal is to provide a verifiable record of a piece of content's provenance, helping to restore trust in digital media.
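The core idea, signing creator metadata so that it is cryptographically bound to the content itself, can be sketched in a few lines of Python. This is purely conceptual: real Content Credentials follow the C2PA specification and use certificate-based public-key signatures embedded in the file, not the simple HMAC and JSON record used here, and the key and field names below are made up for illustration:

```python
import hashlib
import hmac
import json

SECRET_KEY = b"creator-private-key"  # stand-in for a real signing key


def attach_credentials(image_bytes: bytes, creator: str, edits: list) -> dict:
    """Build a conceptual credential record: metadata plus a signature
    binding it to a hash of the image content."""
    metadata = {
        "creator": creator,
        "edits": edits,
        "content_hash": hashlib.sha256(image_bytes).hexdigest(),
    }
    payload = json.dumps(metadata, sort_keys=True).encode()
    metadata["signature"] = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return metadata


def verify_credentials(image_bytes: bytes, record: dict) -> bool:
    """Check that the image matches the recorded hash and the signature is valid."""
    record = dict(record)
    signature = record.pop("signature")
    if record["content_hash"] != hashlib.sha256(image_bytes).hexdigest():
        return False  # content was altered after signing
    payload = json.dumps(record, sort_keys=True).encode()
    expected = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(signature, expected)


img = b"...raw image bytes..."
cred = attach_credentials(img, creator="Jane Artist", edits=["crop", "color-correct"])
print(verify_credentials(img, cred))                  # original content verifies
print(verify_credentials(img + b"tampered", cred))    # altered content fails
```

The point of the sketch is the binding: because the signature covers both the creator information and a hash of the pixels, any tampering with either invalidates the record.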

Adobe calls this type of content credential "durable," meaning it remains securely connected to the content throughout its life cycle, even if a screenshot is taken.

The initiative focuses on developing open-source tools and standards for verifying the origin and history of digital media, including content created with generative AI.

Adobe has also introduced a free web-based application in public beta, enabling creators to apply these Content Credentials to their work.

This app supports bulk tagging of up to 50 JPEG or PNG files and includes tools to opt out of AI training datasets. It also offers enhanced authentication through LinkedIn verification and integration with Adobe's Behance platform.

The CAI's efforts are part of a broader industry movement to address the challenges posed by AI-generated content and digital misinformation.

By providing tools for content verification and promoting transparency, the initiative seeks to empower creators and audiences alike in the digital age.

While this is a welcome move by Adobe, aimed at helping content creators protect their original artwork, convincing AI companies to adhere to Adobe's standard may be challenging.

For almost as long as robots.txt has existed, plenty of badly behaved crawlers have ignored its requests not to crawl specific files or locations.

And with the LLM trend still rising in popularity and demand, many AI crawlers are known to simply scrape everything they can get their hands on, legally or otherwise.

Published: 24/04/2025