AWS Releases Open Source Model Context Protocol Servers to Enhance AI Agents

AWS has launched open-source Model Context Protocol (MCP) servers, enabling AI assistants like Amazon Q and Claude to access real-time AWS data and services.

Amazon Web Services (AWS) has released a collection of open-source servers built on the Model Context Protocol (MCP), aiming to improve how AI-powered coding assistants interact with AWS services and data. Detailed in the awslabs/mcp GitHub repository and released under an Apache-2.0 license, these servers provide a standardized way for AI agents to access accurate, real-time AWS context, potentially speeding up cloud development workflows and improving code quality.

Bridging AI and Cloud Data with an Open Standard

The core technology, the Model Context Protocol, was first introduced by Anthropic in November 2024. It addresses the common issue of AI models lacking access to necessary external information or tools. As the official MCP documentation states, “The Model Context Protocol (MCP) is an open protocol that enables seamless integration between LLM applications and external data sources and tools… MCP provides a standardized way to connect LLMs with the context they need.”

Anthropic continues to steward the open-source protocol project. Instead of building numerous custom integrations, developers can use MCP clients (built into AI assistants) to connect to MCP servers, typically over local standard I/O or HTTP, with each server exposing specific functions or data access points.
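Under the hood, MCP messages use JSON-RPC 2.0. As a rough illustration of the pattern, a client invoking a server-provided tool sends a `tools/call` request along these lines (the tool name and arguments here are hypothetical, chosen to resemble a documentation-search tool):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_documentation",
    "arguments": { "search_phrase": "S3 bucket versioning" }
  }
}
```

The server executes the named tool and returns a JSON-RPC response containing the result, which the AI assistant then folds into its context.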

New AWS Servers Target Specific Cloud Tasks

The initial release from AWS includes several servers focused on distinct areas:

  • Core MCP Server: Acts as a coordinator for managing other AWS MCP servers.
  • AWS Documentation: Provides access to current AWS docs via the official search API.
  • Amazon Bedrock Knowledge Bases Retrieval: Enables querying of private enterprise data hosted in Bedrock for Retrieval-Augmented Generation (RAG). Bedrock is AWS’s managed service for foundation models.
  • AWS CDK & AWS Terraform: Offer tools for Infrastructure as Code (IaC), including Checkov integration in the Terraform server for security analysis.
  • Cost Analysis: Allows natural language queries about AWS spending.
  • Amazon Nova Canvas: Integrates with Amazon’s own image generation model, part of its Nova AI family.
  • AWS Diagram: Aids in creating architecture diagrams via Python code.
  • AWS Lambda: Lets AI agents trigger specific Lambda functions as tools.

The intention, according to an AWS blog post about the launch, is that this protocol allows AI assistants to use specialized tooling and access domain-specific knowledge “all while keeping sensitive data local.”

Setup and Ecosystem Integration

Setting up these servers requires installing the `uv` package manager from Astral, ensuring Python 3.10+ is available, and configuring appropriate AWS credentials. The servers themselves are typically executed using the `uvx` command (which runs packages in temporary environments) via packages hosted on PyPI. Configuration happens within the client tool, using JSON files such as `~/.aws/amazonq/mcp.json` for the Amazon Q CLI, `~/.cursor/mcp.json` for the Cursor editor, or `~/.codeium/windsurf/mcp_config.json` for Windsurf. AWS also mentions support for Anthropic’s Claude Desktop app and Cline. Developers can find specific setup guidance and code samples in the repository.
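A client configuration entry follows the common `mcpServers` pattern used across these tools. The sketch below registers the AWS Documentation server; the package name and fields mirror the examples in the awslabs/mcp repository, but check the repository for the exact, current values:

```json
{
  "mcpServers": {
    "awslabs.aws-documentation-mcp-server": {
      "command": "uvx",
      "args": ["awslabs.aws-documentation-mcp-server@latest"],
      "env": {
        "FASTMCP_LOG_LEVEL": "ERROR"
      }
    }
  }
}
```

With this in place, the client launches the server on demand via `uvx`, which fetches the package from PyPI into a temporary environment and communicates with it over standard I/O.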

Wider Adoption and Considerations

AWS is not the only major cloud provider building on MCP. Microsoft integrated the protocol into Azure AI in March 2025 and developed an official C# SDK. Microsoft has also connected MCP to tools like its Semantic Kernel framework and, just days ago on April 18th, previewed its own MCP servers for Azure services.

This growing support points to MCP potentially becoming a common layer for AI-cloud interaction. While standardizing the interface, practical use still requires attention to potential HTTP latency for some applications and the need for developers to implement robust error handling and security around the server interactions. Amazon’s strategy appears multifaceted, complementing this open standard adoption with continued development of its internal Nova AI models and tools like the Nova Act SDK.

Last Updated on April 20, 2025 7:03 pm CEST

Markus Kasanmascheff
Markus has been covering the tech industry for more than 15 years. He holds a Master’s degree in International Economics and is the founder and managing editor of Winbuzzer.com.
