OpenAI is preparing to release its first open-weight language model since GPT-2, signaling a shift back toward transparency as the company navigates pressure from investors, mounting competition, and structural change. CEO Sam Altman revealed the upcoming release in a post on X, calling the model “pretty capable” at reasoning and asking developers and researchers to share feedback on its capabilities and structure.
This new model will not be fully open-source; developers will be provided with pre-trained weights, but not training data or underlying code. The move reflects OpenAI’s evolving strategy—appeasing a developer community that values openness while delivering returns to stakeholders in a business environment shaped by rapid growth and scrutiny.
TL;DR: we are excited to release a powerful new open-weight language model with reasoning in the coming months, and we want to talk to devs about how to make it maximally useful: https://t.co/XKB4XxjREV
— Sam Altman (@sama) March 31, 2025
A Funding Deal with Strings Attached
The announcement coincides with OpenAI’s largest-ever fundraising initiative: a deal to raise up to $40 billion led by SoftBank. The funding reportedly includes a $20 billion redemption clause to return capital to early investors and employees—a component not publicly disclosed at the deal’s launch.
Under the terms of the agreement, OpenAI must complete its transition to a Public Benefit Corporation (PBC) by the end of 2025 to receive the full investment. Failure to do so could reduce the capital infusion by half. The Financial Times reports that SoftBank is providing 75% of the $40 billion total, with the remainder coming from Microsoft and funds including Altimeter, Coatue, and Thrive Capital.
With the deal pushing OpenAI’s valuation to an estimated $300 billion—comparable to some of the largest companies in the S&P 500—the release of an open-weight model may serve both as a public relations tool and a strategic hedge against growing developer dissatisfaction with closed systems. As Altman stated, “We’d like to hear from developers, researchers, and companies who’d like to build on it.”
In a separate statement on the funding, he remarked, “This investment helps us push the frontier and make AI more useful in everyday life.”
ChatGPT’s Multimodal Capabilities Broaden Its Reach
Even as OpenAI teases more openness, its core product continues to expand. The company recently added enhanced image generation and editing tools powered by GPT-4o to ChatGPT. These tools allow users to generate, transform, and refine visuals—including inpainting and background editing—within the same interface used for text interactions.
According to OpenAI’s GPT-4o system card, “4o image generation is a new, significantly more capable image generation approach than our earlier DALL·E 3 series of models. It can create photorealistic output. It can take images as inputs and transform them.”
The system includes watermarking and moderation filters to prevent misuse. With support for free-tier users, the feature appears designed to drive growth—and it’s working. On launch day, ChatGPT gained one million new users in just one hour, according to Sam Altman.
Breaking from Microsoft and Building a New Backbone
OpenAI is also changing how it handles compute. After Microsoft declined to renew a major cloud contract with GPU provider CoreWeave, OpenAI stepped in and finalized an $11.9 billion compute deal with the company. In addition to securing five years of compute capacity, OpenAI invested $350 million in CoreWeave ahead of its IPO.
The company is also working with TSMC and Broadcom to develop its first in-house AI chip, due later this year, according to Winbuzzer. These moves indicate an effort to reduce dependence on Azure and cut long-term infrastructure costs, especially as model sizes continue to scale.
Further aligning with government initiatives, OpenAI is a key partner in the $500 billion Stargate Project, a Trump-endorsed infrastructure effort focused on building data centers and chip fabs in the U.S. The first phase of the project, worth $100 billion, will be located in Texas and could help OpenAI anchor its compute ambitions domestically.
Legal Tensions and Copyright Pushback
While infrastructure and valuation grow, OpenAI faces escalating legal scrutiny. In late March, a federal judge cleared the way for The New York Times’ copyright lawsuit to proceed. The paper alleges that OpenAI and Microsoft used millions of its articles without authorization to train AI models. OpenAI denies the claims, stating that its systems tokenize inputs and do not replicate content in full.
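OpenAI’s tokenization defense rests on how language models consume text: input is converted into sequences of integer IDs drawn from a fixed vocabulary, and the model trains on those IDs rather than storing articles verbatim. The toy sketch below illustrates the general idea only; it is a deliberately simplified word-level tokenizer, not OpenAI’s actual byte-pair-encoding pipeline, and all function names here are illustrative.

```python
# Toy word-level tokenizer: maps text to integer token IDs.
# A simplified illustration of tokenization in general -- real systems
# (e.g. BPE tokenizers) operate on subword units, not whole words.

def build_vocab(corpus):
    """Assign a unique integer ID to each distinct word in the corpus."""
    vocab = {}
    for word in corpus.split():
        if word not in vocab:
            vocab[word] = len(vocab)
    return vocab

def tokenize(text, vocab):
    """Convert text into a list of token IDs; unknown words map to -1."""
    return [vocab.get(word, -1) for word in text.split()]

corpus = "the model reads the text as tokens"
vocab = build_vocab(corpus)
print(tokenize("the text as tokens", vocab))  # → [0, 3, 4, 5]
```

The point of the illustration: what flows into training is a stream of IDs, which is why the dispute centers on whether models can nonetheless reproduce protected text from what they learned, not on whether articles are stored as files.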
To mitigate these concerns, OpenAI has signed licensing deals with major media outlets including TIME, Vox Media, and the Financial Times. However, its long-promised “Media Manager” tool—which would give publishers a way to opt out of model training—has yet to materialize, missing its original delivery deadline.
Leadership Changes and Competitive Pressure
Internally, OpenAI is repositioning itself. On March 24, COO Brad Lightcap took over day-to-day business operations, freeing Sam Altman to focus on technical direction. The company also appointed Julia Villagra as Chief People Officer and promoted former VP Mark Chen to Chief Research Officer.
This internal restructuring coincides with growing competitive threats. Meta’s LLaMA models and China’s DeepSeek R1—both released with open weights—are challenging OpenAI’s dominance. OpenAI’s move toward open-weight releases is partly a reaction to DeepSeek’s traction among developers in global markets. DeepSeek is expected to soon launch R2, the successor to its high-performing R1 open-weight reasoning model.
OpenAI has not disclosed the licensing terms for its upcoming model, and it remains unclear whether it will allow redistribution or commercial use. Even with weights made available, developers won’t have access to the training dataset, leaving questions about how customizable or extensible the model truly is.