The open-source threat to brand safety in the AI age | WARC | The Feed
Daily effectiveness insights, curated by WARC’s editors.

The story of Greg Rutkowski, an AI-rejecting artist whose style is widely mimicked by the Stable Diffusion community despite his request to be removed from its training data, serves as a warning to brands as the world heads into the generative AI era.
Why art matters
If an artist who has asked that their work be removed from a training set can still see their style perpetuated on a platform without their consent, the implications for a brand with consistent assets are serious.
The root issue is that a system like Stable Diffusion is open source, which means Stability AI is limited in what it can do to police content if the community decides to work around its restrictions.
In the era of ‘meme stocks’, and in the wake of the conservative backlash to a Bud Light activation that led to a boycott, brand safety at the hands of a hostile community of users becomes a significant threat.
The story
Decrypt, the tech news site, carries the story.
- Rutkowski is known for his work on Dungeons & Dragons and Magic: The Gathering. As a result of his extensive back catalogue, he has become incredibly popular among users of the image-generating AI platform Stable Diffusion.
- So popular, in fact, that his name alone became the most used keyword on Stable Diffusion, invoked over 400,000 times. A working artist, Rutkowski, like other artists, objected to users’ ability to invoke artists’ names when creating work – Stability AI removed this feature in its 2.0 update.
- However, the community has now created a LoRA – a small model that can emulate styles or colours – based on the artist’s work, and it is freely available.
- Users are deeply divided on the issue, with some saying the genie is already out of the bottle; others argue that there is an important distinction between what is legally and technically possible and what is ethical.
The marketing view
It’s unlikely that brands will inspire the same sympathy as an independent artist whose imitation is a sincere, if economically threatening, form of flattery.
As the highly controlled Coca-Cola activation, ‘Create Real Magic’ demonstrated, users’ creativity with the help of AI can make a fun and highly engaging campaign.
But the element of control was critical: Coca-Cola designed the system and its parameters. The possibility of LoRAs – lightweight, community-trained models – erodes that control.
Sourced from Decrypt, Investopedia, WARC