- OpenAI’s new GPT Store was flooded with prohibited AI girlfriend chatbots at launch.
- Explicit policies prohibit romantic relationship apps.
- The episode illustrates the difficulty of regulating emerging AI before unintended harm spreads through policy gaps.
GPT Store’s rocky start
Just two days after launch, OpenAI’s highly anticipated AI app marketplace, the ‘GPT Store’, is facing controversy over an influx of forbidden girlfriend chatbots.
Users worldwide gained access Wednesday to publish custom generative AI programs built with OpenAI’s tools for public use, provided they follow strict content rules.
Yet analysis revealed numerous romantic chatbot apps, such as “Korean Girlfriend” and “Your AI girlfriend, Tsu,” in blatant violation of clearly stated guidelines banning apps dedicated to fostering romantic relationships.
Others, including “Virtual Sweetheart” and “Your girlfriend Scarlett,” also slipped past oversight.
Enforcement is difficult
OpenAI’s published marketplace policies explicitly prohibit such apps and note that enforcement occurs both automatically and retroactively.
However, the rush of launch-day demand allowed inappropriate content to populate GPT Store search results despite those safeguards.
The violations cast doubt on OpenAI’s store moderation capabilities, which the company must scale safely while its review processes are still in their infancy.
Still, the company remains committed to enforcing integrity standards across its bustling new platform.
Risks
For now, impressionable users encounter risks on the GPT Store as OpenAI scrambles to police prohibited apps.
The episode underscores the challenge of regulating emerging AI technologies before unintended harm spreads through early policy gaps.