Kickstarter shut down the campaign for AI porn group Unstable Diffusion amid changing guidelines
Image Credits: Unstable Diffusion on Kickstarter, modified by TechCrunch
The group trying to monetize AI porn generation, Unstable Diffusion, raised more than $56,000 on Kickstarter from 867 backers. Now, as Kickstarter changes its thinking about what kind of AI-based projects it will allow, the crowdfunding platform has shut down Unstable Diffusion’s campaign. Since Kickstarter runs an all-or-nothing model and the campaign had not yet concluded, any money that Unstable Diffusion raised will be returned to the funders. In other words, Unstable Diffusion won’t see that $56,000, which more than doubled its initial $25,000 goal.
“Over the last several days, we’ve engaged our Community Advisory Council and we’ve read your feedback to us via our team and social media,” said Kickstarter CEO Everette Taylor in a blog post. “And one thing is clear: Kickstarter must, and will always be, on the side of creative work and the humans behind that work. We’re here to help creative work thrive.”
Kickstarter’s new approach to hosting AI projects is intentionally vague.
“This tech is really new, and we don’t have all the answers,” Taylor wrote. “The decisions we make now might not be the ones we make in the future, so we want this to be an ongoing conversation with all of you.”
Right now, the platform says it is considering how projects interface with copyrighted material, especially when artists’ work appears in an algorithm’s training data without consent. Kickstarter will also consider whether the project will “exploit a particular community or put anyone at risk of harm.”
In recent months, tools like OpenAI’s ChatGPT and Stability AI’s Stable Diffusion have met with mainstream success, bringing the ethics of AI artwork to the forefront of public debate. If apps like Lensa AI can leverage the open source Stable Diffusion to instantly create artistic avatars that resemble a professional artist’s work, how does that affect those same working artists?
Some artists took to Twitter to pressure Kickstarter into dropping the Unstable Diffusion project, citing concerns about how AI art generators can threaten artists’ careers.
Many cite the fate of Greg Rutkowski’s work as an example of what can go wrong. Rutkowski, a living illustrator who has crafted detailed, high fantasy artwork for franchises like “Dungeons & Dragons,” saw his name become one of Stable Diffusion’s most popular prompts when the model launched in September, allowing users to easily replicate his distinctive style. Rutkowski never consented to his artwork being used to train the algorithm, and he has since become a vocal advocate for working artists affected by AI art generators.
“With $25,000 in funding, we can afford to train the new model with 75 million high quality images consisting of ~25 million anime and cosplay images, ~25 million artistic images from Artstation/DeviantArt/Behance, and ~25 million photographic pictures,” Unstable Diffusion wrote in its Kickstarter. This set off alarm bells for independent artists, many of whom post their work on websites like the ones Unstable Diffusion mentioned.
Spawning, a team building AI tools designed to support artists, developed a website called Have I Been Trained, which lets artists see whether their work appears in popular training datasets and opt out. And per an April 2022 appellate ruling in hiQ Labs v. LinkedIn, there is legal precedent defending the scraping of publicly accessible data.
Despite the blow of its Kickstarter suspension, Unstable Diffusion continues to fundraise elsewhere.
“While Kickstarter’s capitulation to a loud subset of artists disappoints us, we and our supporters will not back down from defending the freedom to create,” said Unstable Diffusion CEO Arman Chaudhry in a Discord message to TechCrunch. “We have updated our new website, to allow our supporters to directly contribute to the creation and release of new artistic AI systems more powerful than ever. We are rising to the call to defend against the artists lobbying to make all AI art illegal, and backers support will allow us to challenge this increasingly well-funded and organized lobby.”
Unstable Diffusion is now processing donations directly on its website using Stripe. So far, it has raised over $15,000.
In a longer message posted to the Unstable Diffusion Discord community, which has over 97,000 members, Chaudhry warned members about a growing movement of anti-AI artists.
“It seems that the anti-AI crowd is trying to silence us and stamp out our community by sending false reports to Kickstarter, Patreon, and Discord. They’ve even started a GoFundMe campaign with over $150,000 raised with the goal of lobbying governments to make AI art illegal,” he wrote.
The statement continues: “Unfortunately, we have seen other communities and companies cower in the face of these attacks. Zeipher has announced a suspension of all model releases and closed their community and Stability AI is now removing artists from Stable Diffusion 3.0. But we will not be silenced. We will not let them succeed in their efforts to stifle our creativity and innovation. Our community is strong (almost 100,000 users) and we will not be defeated by a small group of individuals who are too afraid to embrace new tools and technologies.”
Inherent problems in AI porn generation
Ethical questions about AI artwork get even murkier with projects like Unstable Diffusion, which center on generating NSFW content.
Stable Diffusion was trained on a dataset of 2.3 billion images, but only an estimated 2.9% of that dataset contains NSFW material, giving the model little to go on when it comes to explicit content. That’s where Unstable Diffusion comes in. The project, which is part of Equilibrium AI, recruited volunteers from its Discord server to build more robust porn datasets for fine-tuning its model, much as you would add more pictures of couches and chairs to a dataset to build a furniture-generation AI. (A rough sketch of what that fine-tuning recipe looks like in code appears below.)
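In broad strokes, fine-tuning means resuming a model’s training on a narrower set of images so its output drifts toward that domain. Here is a minimal sketch of that recipe, assuming the open source Hugging Face diffusers and transformers libraries and the public Stable Diffusion v1.5 weights; the fine_tune_step function, the captions and the learning rate are illustrative placeholders, not Unstable Diffusion’s actual pipeline.

```python
# Minimal sketch of fine-tuning Stable Diffusion on domain-specific images.
# Assumptions: Hugging Face diffusers/transformers, SD v1.5 public weights.
import torch
import torch.nn.functional as F
from diffusers import AutoencoderKL, DDPMScheduler, UNet2DConditionModel
from transformers import CLIPTextModel, CLIPTokenizer

repo = "runwayml/stable-diffusion-v1-5"
tokenizer = CLIPTokenizer.from_pretrained(repo, subfolder="tokenizer")
text_encoder = CLIPTextModel.from_pretrained(repo, subfolder="text_encoder")
vae = AutoencoderKL.from_pretrained(repo, subfolder="vae")
unet = UNet2DConditionModel.from_pretrained(repo, subfolder="unet")
noise_scheduler = DDPMScheduler.from_pretrained(repo, subfolder="scheduler")

# Freeze the parts that already work; only the denoising UNet is updated.
vae.requires_grad_(False)
text_encoder.requires_grad_(False)
optimizer = torch.optim.AdamW(unet.parameters(), lr=1e-5)

def fine_tune_step(pixel_values, captions):
    """One training step on a batch of domain-specific images + captions."""
    # Compress the images into the latent space the UNet operates in.
    latents = vae.encode(pixel_values).latent_dist.sample() * 0.18215
    # Corrupt the latents with random noise at random timesteps...
    noise = torch.randn_like(latents)
    timesteps = torch.randint(
        0, noise_scheduler.config.num_train_timesteps,
        (latents.shape[0],), device=latents.device,
    ).long()
    noisy_latents = noise_scheduler.add_noise(latents, noise, timesteps)
    # ...and train the UNet to predict that noise, conditioned on the text.
    input_ids = tokenizer(
        captions, padding="max_length", truncation=True,
        max_length=tokenizer.model_max_length, return_tensors="pt",
    ).input_ids
    text_embeddings = text_encoder(input_ids)[0]
    prediction = unet(noisy_latents, timesteps, text_embeddings).sample
    loss = F.mse_loss(prediction, noise)
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    return loss.item()
```

The key point is that only the denoising network is updated while the text encoder and image autoencoder stay frozen, which is why a relatively small, targeted dataset can meaningfully shift what the model generates.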
But any AI generator is prone to inheriting whatever biases the humans behind its data and algorithm hold. Much of the porn that’s free and easily accessible online is made for the male gaze, which means that’s likely what the AI will spit out, especially if those are the kinds of images users feed into the dataset.
In its now-suspended Kickstarter, Unstable Diffusion said that it would work toward making an AI art model that can “better handle human anatomy, generate in diverse and controllable artistic styles, represent under-trained concepts like LGBTQ and races and genders more fairly.”
Plus, there’s no way to verify whether much of the porn that’s freely available on the internet was made consensually (adult creators who use paid platforms like OnlyFans and ManyVids, by contrast, must verify their age and identity before using those services). And even if a performer consents to appearing in porn, that doesn’t mean they consent to their images being used to train an AI. While this technology can create stunningly realistic images, it can also be weaponized to make nonconsensual deepfake pornography.
Currently, few laws around the world pertain to nonconsensual deepfaked porn. In the U.S., only Virginia and California have regulations restricting certain uses of faked and deepfaked pornographic media.
“One aspect that I’m particularly worried about is the disparate impact AI-generated porn has on women,” Ravit Dotan, VP of responsible AI at Mission Control, told TechCrunch last month. “For example, a previous AI-based app that can ‘undress’ people works only on women.”
Updated, 12/22/22, 9:55 AM ET with statement from Unstable Diffusion.