A group of early testers has leaked OpenAI’s Sora API on Hugging Face, granting public access to the company’s experimental video generation tool. Capable of creating high-definition, 10-second clips from text prompts, Sora had been tightly controlled under an early access program.
The testers, however, accuse OpenAI of exploiting their unpaid labor and stifling creative freedom, describing the program as more focused on public relations than genuine collaboration.
“Hundreds of artists provide unpaid labor through bug testing, feedback and experimental work for the program for a $150B valued company,” the group wrote in a post accompanying the leak. “This early access program appears to be less about creative expression and critique, and more about PR and advertisement.” They allege that OpenAI’s program severely restricted participants, requiring company approval for every shared output and limiting opportunities to showcase their work.
“OpenAI’s Sora LEAKED! https://t.co/X3EmAZAJk0” — Kol Tregaskes (@koltregaskes), November 26, 2024
The unauthorized release marks a critical moment for OpenAI, exposing not only its internal challenges but also broader questions about the role of ethics and transparency in AI innovation.
“Try it here: https://t.co/gnnkoj0jc2. If Sora, it looks like an optimised version. Can generate up to 1080 10-second clips. Suggest duplicating the space (if that works – my test didn’t!). One example: pic.twitter.com/npphRJgyrd” — Kol Tregaskes (@koltregaskes), November 26, 2024
Technical and Organizational Setbacks for Sora
Sora’s development has been far from smooth. Early versions of the tool were criticized for requiring over 10 minutes of processing time to generate a single minute of video, underscoring the significant computational demands of AI video creation. A more recent “turbo variant” has improved processing speeds, but the tool remains unavailable to the general public.
Kevin Weil, OpenAI’s chief product officer, acknowledged during a Reddit AMA that delays were due to the need for safety improvements, including addressing impersonation risks and scaling compute infrastructure. These technical hurdles were compounded by organizational changes, including the October departure of Tim Brooks, a key figure in the Sora project, to Google.
Despite these setbacks, Sora has been positioned as a transformative tool for video production. Using advanced AI models, it can generate hyper-realistic video sequences based on simple text prompts. However, the leak raises questions about whether OpenAI’s restrictive approach has undermined the program’s potential and strained relationships with its contributors.
Competitors Outpace OpenAI in AI Video Advancements
The Sora leak comes at a time when competitors are making significant progress in AI video generation, highlighting OpenAI’s difficult position in a rapidly evolving market.
Runway ML continues to push boundaries with its Gen-3 Alpha Turbo model, which delivers video outputs seven times faster than its predecessor. Its collaboration with Lionsgate underscores its growing influence in entertainment.
Adobe’s Firefly Video Model, introduced in October in Premiere Pro, integrates AI tools enabling users to extend footage and generate video content from text and images. Firefly distinguishes itself with its ethical AI stance, embedding Content Credentials to mark AI-generated outputs and training its models exclusively on licensed content.
Meanwhile, open-source models like Pyramid Flow offer flexibility for developers. By employing a pyramidal flow matching technique, Pyramid Flow refines video quality in stages, allowing efficient high-resolution output without licensing fees.
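To make the coarse-to-fine idea concrete: Pyramid Flow’s actual system is a learned flow-matching transformer, but the staged-refinement principle can be sketched with a toy example. The snippet below (an illustrative assumption, not Pyramid Flow’s real code) integrates a hand-written velocity field toward a target image at a low resolution first, then upsamples the partial result and keeps refining at progressively higher resolutions, so most integration steps happen on cheap, small arrays.

```python
import numpy as np

def upsample(frame: np.ndarray, factor: int = 2) -> np.ndarray:
    """Nearest-neighbour upsampling: each pixel becomes a factor x factor block."""
    return frame.repeat(factor, axis=0).repeat(factor, axis=1)

def toy_velocity(x: np.ndarray, target: np.ndarray) -> np.ndarray:
    """Stand-in for a learned flow-matching network: points toward the target."""
    return target - x

def pyramidal_refine(target: np.ndarray, stages: int = 3, steps: int = 8) -> np.ndarray:
    """Coarse-to-fine generation: integrate the flow at the coarsest pyramid
    level first, then upsample and continue refining at higher resolutions."""
    size = target.shape[0] // (2 ** (stages - 1))
    rng = np.random.default_rng(0)
    x = rng.standard_normal((size, size))  # start from noise at the coarsest level
    for stage in range(stages):
        # Subsample the target to match the current pyramid level.
        stride = target.shape[0] // x.shape[0]
        coarse_target = target[::stride, ::stride]
        dt = 1.0 / steps
        for _ in range(steps):  # simple Euler integration of the velocity field
            x = x + dt * toy_velocity(x, coarse_target)
        if stage < stages - 1:
            x = upsample(x)  # move up one pyramid level
    return x

# Toy "image": a smooth 16x16 gradient standing in for a video frame.
target = np.outer(np.linspace(0, 1, 16), np.linspace(0, 1, 16))
result = pyramidal_refine(target)
print(result.shape, float(np.abs(result - target).mean()))
```

The key saving is that early integration steps operate on 4×4 and 8×8 arrays rather than the full 16×16 grid; in the real model the same trick applies to latent video tensors, which is where the efficiency claim comes from.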
And then there is Google’s Veo AI video generator, which, like OpenAI’s Sora, promises impressive capabilities but remains unreleased to the wider public.
Ethical Implications for AI Development
The leak of Sora has intensified scrutiny of OpenAI’s practices, raising ethical concerns about the company’s approach to collaboration and transparency. Critics have highlighted the lack of compensation for artists and testers, who played a critical role in refining the tool while facing significant restrictions.
By contrast, Adobe has set an example by ensuring its Firefly tools are trained on licensed data, addressing copyright concerns and providing clarity around ownership. Similarly, MiniMax, backed by Alibaba and Tencent, has embedded watermarks in its outputs and enforced strict usage policies to prevent misuse.
Even open-source models like Pyramid Flow are not without controversy, as they rely on datasets with ambiguous licensing. This raises broader questions about the ethical boundaries of AI development, especially as companies race to dominate the market.
For OpenAI, the Sora leak underscores the risks of prioritizing control over collaboration. As AI tools become increasingly integrated into creative industries, companies will need to address growing demands for fairness, transparency, and ethical responsibility to maintain trust and innovation.