OpenAI is throwing its weight behind the Model Context Protocol (MCP), an open-source specification originally crafted by competitor Anthropic, aiming to create a universal method for AI models to communicate with external data and tools.
OpenAI CEO Sam Altman confirmed the company’s plan via a post on X, stating, “People love MCP and we are excited to add support across our products.”
people love MCP and we are excited to add support across our products.
available today in the agents SDK and support for chatgpt desktop app + responses api coming soon!
— Sam Altman (@sama) March 26, 2025
This support debuted immediately within the OpenAI Agents SDK, with integration planned for the ChatGPT desktop app and the company’s Responses API down the line. Eventually, Altman indicated, the entire OpenAI product lineup will feature MCP support.
MCP emerged from Anthropic’s efforts in November 2024 to streamline AI development. The company identified a common bottleneck: connecting models to varied data sources often required difficult, one-off integrations. “Every new data source requires its own custom implementation, making truly connected systems difficult to scale,” Anthropic explained upon launching the protocol.
MCP introduces a standard client-server architecture in which AI applications (clients) exchange JSON-RPC messages with servers that expose specific tools or data access points. At the time of the protocol’s launch, early adopters included Block, Apollo, Replit, Codeium, and Sourcegraph.
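The client-server exchange can be sketched with plain JSON-RPC 2.0 messages, the framing MCP builds on. The `get_weather` tool and the dispatcher below are invented for illustration; a real MCP server also implements the `initialize` handshake and `tools/list` discovery before any tool is called.

```python
import json

# Hypothetical toy server: maps tool names to Python callables.
TOOLS = {"get_weather": lambda city: f"Sunny in {city}"}

def handle_request(raw: str) -> str:
    """Dispatch a JSON-RPC 2.0 'tools/call' request to a registered tool."""
    req = json.loads(raw)
    tool = TOOLS[req["params"]["name"]]
    result = tool(**req["params"]["arguments"])
    return json.dumps({"jsonrpc": "2.0", "id": req["id"], "result": result})

# A client asks the server to invoke a tool, JSON-RPC style:
request = json.dumps({
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "get_weather", "arguments": {"city": "Berlin"}},
})
response = json.loads(handle_request(request))
print(response["result"])  # Sunny in Berlin
```

Because every server speaks this same request/response shape, a client written once can talk to any MCP server, which is the scaling problem Anthropic set out to solve.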
An Emerging Standard for Agent Communication
OpenAI’s decision follows similar moves by other major technology firms, indicating increasing industry convergence around MCP. Microsoft began incorporating MCP into Azure AI services like Foundry and Agent Service around March 2025, also collaborating with Anthropic on an official C# SDK. In mid-April 2025, Microsoft deepened its commitment by previewing dedicated MCP servers enabling AI agents to interact with Azure resources like Cosmos DB and Storage, as well as a specific server for Azure Database for PostgreSQL.
Similarly, Amazon Web Services (AWS), a key Anthropic partner and investor, released its own suite of open-source MCP servers around the same time, hosted in the awslabs/mcp GitHub repository.
These AWS tools provide standardized access for AI assistants like Amazon Q or Anthropic’s Claude Desktop to interact with services including Bedrock Knowledge Bases (for retrieval-augmented generation, enhancing model responses with external data), Lambda function execution, and infrastructure management via CDK and Terraform. AWS noted the protocol allows AI assistants access to specialized tooling “all while keeping sensitive data local.”
Google also signaled its support for MCP, planning integration for its Gemini models and SDK. Google DeepMind CEO Demis Hassabis endorsed the standard, calling MCP “a good protocol, and it’s rapidly becoming an open standard for the AI agentic era.”
MCP is a good protocol and it's rapidly becoming an open standard for the AI agentic era. We're excited to announce that we'll be supporting it for our Gemini models and SDK. Look forward to developing it further with the MCP team and others in the industry https://t.co/RAJH8J5zbB
— Demis Hassabis (@demishassabis) April 9, 2025
This growing consensus prompted commentary from industry observers. Box CEO Aaron Levie noted, “As AI Agents from multiple platforms coordinate work, AI interoperability is going to be critical,” while Constellation Research analyst Holger Mueller suggested MCP “may become the standard to simplify” how LLMs connect to enterprise systems.
Beyond Cloud: A Diverse Ecosystem Takes Shape
Anthropic Chief Product Officer Mike Krieger welcomed OpenAI’s move on X, stating, “Excited to see the MCP love spread to OpenAI – welcome! MCP has [become a] thriving open standard with thousands of integrations and growing. LLMs are most useful when connecting to the data you already have and software you already use.”
Excited to see the MCP love spread to OpenAI – welcome! MCP has gone from a glimmer in @jspahrsummers and @dsp_’s eyes last year to a thriving open standard with thousands of integrations and growing. LLMs are most useful when connecting to the data you already have and software… https://t.co/pS67BAaFvF
— Mike Krieger (@mikeyk) March 26, 2025
Indeed, the protocol has spurred development across many areas. Resources like the Winbuzzer MCP Servers List catalog numerous implementations for databases (SQL, NoSQL, vector), developer tools (Git, Jira, CI/CD), project management apps, communication platforms (Slack, Discord), finance APIs (Stripe, Xero), web automation tools (Playwright, Puppeteer), and other specialized utilities.
The underlying MCP specification itself is also evolving, with recent updates reportedly adding features like JSON-RPC batching (bundling multiple procedure calls into one network request for efficiency) and upgraded OAuth 2.1 authorization (a modern security standard for delegated access).
Specialized Tools Emerge within MCP Framework
Highlighting MCP’s versatility beyond simple data access, Pydantic recently released mcp-run-python, an MCP server providing secure, sandboxed Python code execution.
This tool uses Pyodide (a Python interpreter compiled to WebAssembly, a format runnable in restricted environments like browsers) within the security-focused Deno runtime, allowing agents to perform calculations or use libraries without direct host system access. Microsoft also just adapted its open-source MarkItDown utility with an MCP interface for file-to-Markdown conversion.
MCP facilitates these interactions by defining how clients (like AI assistants) connect to servers. Communication can occur locally via standard input/output (stdio) or remotely over HTTP using methods like Server-Sent Events (SSE), a standard allowing servers to push data to clients. Anthropic demonstrated an early use case where its Claude Desktop app, configured with MCP, could directly interact with GitHub. Anthropic engineer Alex Albert noted the efficiency gain: “Once MCP was set up in Claude desktop, building this integration took less than an hour.”
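For the local case, the stdio transport frames each JSON-RPC message as one JSON object per line on the child process’s pipes. The sketch below uses an in-memory buffer in place of a real subprocess to show the framing; the message payloads mirror the standard MCP handshake methods.

```python
import io
import json

def write_message(stream, msg: dict) -> None:
    """Frame a message for a stdio transport: one JSON object per line."""
    stream.write(json.dumps(msg) + "\n")

def read_messages(stream):
    """Parse newline-delimited JSON messages from the other side."""
    return [json.loads(line) for line in stream if line.strip()]

# An in-memory buffer stands in for the server process's stdin/stdout.
buf = io.StringIO()
write_message(buf, {"jsonrpc": "2.0", "id": 1, "method": "initialize", "params": {}})
write_message(buf, {"jsonrpc": "2.0", "id": 2, "method": "tools/list", "params": {}})
buf.seek(0)
msgs = read_messages(buf)
print([m["method"] for m in msgs])  # ['initialize', 'tools/list']
```

Remote servers swap this pipe-based framing for HTTP with Server-Sent Events, but the JSON-RPC payloads stay the same, which is what lets one client library support both transports.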
OpenAI’s Integration Path and Outlook
By integrating MCP, starting with its Agents SDK, OpenAI provides developers using its platform a standardized way to connect their agents to this wide range of existing tools and data sources. Known applications that can act as MCP clients include Anthropic’s Claude Desktop, Amazon Q, the Cursor code editor, and VS Code via GitHub Copilot Agent Mode. OpenAI plans to release more information about its specific integration roadmap, including support for the ChatGPT desktop app and APIs, in the coming months.
MCP 🤝 OpenAI Agents SDK
You can now connect your Model Context Protocol servers to Agents: https://t.co/6jvLt10Qh7
We’re also working on MCP support for the OpenAI API and ChatGPT desktop app—we’ll share some more news in the coming months.
— OpenAI Developers (@OpenAIDevs) March 26, 2025
While the standard offers flexibility, using MCP, particularly over HTTP for remote servers, might introduce network latency unsuitable for some high-frequency tasks. Developers also retain the responsibility for implementing appropriate error handling and security practices around server interactions.
Nonetheless, OpenAI’s adoption, following Microsoft, AWS, and Google, strongly suggests MCP is becoming a key interface layer for connecting AI models to the external context they need. There is also speculation that future OpenAI features, like connecting ChatGPT Team accounts to Google Drive and Slack, might leverage this protocol.