Build with AI

AI coding assistants work best when they have access to current documentation. Strands Agents provides two ways to give your AI tools the context they need: an MCP server for interactive documentation search, and llms.txt files for bulk documentation access.

The Strands Agents MCP server gives AI coding assistants direct access to the Strands Agents documentation through the Model Context Protocol (MCP). It provides intelligent search with TF-IDF based ranking, section-based browsing for token-efficient retrieval, and on-demand content fetching so your AI tools can find and retrieve exactly the documentation they need.

The MCP server requires uv to be installed on your system. Follow the official installation instructions to set it up.

Choose your AI coding tool below and follow the setup instructions.

You can use the Strands Agents MCP server as a tool within your own Strands agents:

```python
from mcp import stdio_client, StdioServerParameters
from strands import Agent
from strands.tools.mcp import MCPClient

mcp_client = MCPClient(lambda: stdio_client(
    StdioServerParameters(
        command="uvx",
        args=["strands-agents-mcp-server"]
    )
))

agent = Agent(tools=[mcp_client])
agent("How do I create a custom tool in Strands Agents?")
```

See the MCP tools documentation for more details on using MCP tools with Strands agents.

You can test the MCP server using the MCP Inspector:

```shell
npx @modelcontextprotocol/inspector uvx strands-agents-mcp-server
```

The Strands Agents documentation site provides llms.txt files optimized for AI consumption. These are static files containing the full documentation in plain markdown, suitable for feeding directly into an LLM’s context window.

| Endpoint | Description |
| --- | --- |
| `/llms.txt` | Index file with links to all documentation pages in raw markdown format |
| `/llms-full.txt` | Complete documentation content in a single file (excludes the API reference) |

Every documentation page is also available in raw markdown format by appending /index.md to its URL path.

This gives you clean markdown content without HTML markup, navigation, or styling.
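For example, converting a page URL to its raw-markdown form is a one-line transformation (the sample URL below is illustrative, not taken from the documentation):

```python
def raw_markdown_url(page_url: str) -> str:
    """Return the raw-markdown URL for a documentation page
    by appending index.md to its URL path."""
    return page_url.rstrip("/") + "/index.md"

# Illustrative page URL, for demonstration only
print(raw_markdown_url("https://strandsagents.com/latest/documentation/docs/"))
```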

The llms.txt files are useful when:

  • Your AI tool does not support MCP
  • You want to provide full documentation context in a single prompt
  • You are building custom tooling around the documentation

Tips for getting the best results:

  • Use the MCP server over llms.txt when possible: it retrieves only the relevant sections, saving tokens and improving accuracy.
  • Start from examples: point your AI tool at the examples for common patterns like multi-agent systems, structured output, and tool use.
  • Review AI-generated code: always verify that generated code follows the patterns in the official documentation, especially for model provider configuration and tool definitions.
  • Use project rules: many AI coding tools support project-level instructions (e.g., .cursorrules, CLAUDE.md). Add Strands-specific conventions to keep AI output consistent across your project.
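If you are building custom tooling around the documentation, a first step is usually parsing the llms.txt index into page titles and URLs. A minimal sketch, assuming the index uses standard markdown links (the sample content below is illustrative, not the real file):

```python
import re

# Matches markdown links of the form [title](url)
LINK_RE = re.compile(r"\[([^\]]+)\]\((https?://[^)]+)\)")

def parse_llms_txt(text: str) -> list[tuple[str, str]]:
    """Extract (title, url) pairs from an llms.txt-style index."""
    return LINK_RE.findall(text)

# Illustrative index content, for demonstration only
sample = """# Strands Agents
- [Quickstart](https://strandsagents.com/latest/documentation/docs/index.md)
- [Tools](https://strandsagents.com/latest/documentation/docs/tools/index.md)
"""

for title, url in parse_llms_txt(sample):
    print(title, url)
```

From there, each URL can be fetched and fed into an LLM's context window or a local search index.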