# Build with AI
AI coding assistants work best when they have access to current documentation. Strands Agents provides two ways to give your AI tools the context they need: an MCP server for interactive documentation search, and llms.txt files for bulk documentation access.
## Strands Agents MCP Server

The Strands Agents MCP server gives AI coding assistants direct access to the Strands Agents documentation through the Model Context Protocol (MCP). It provides intelligent search with TF-IDF-based ranking, section-based browsing for token-efficient retrieval, and on-demand content fetching, so your AI tools can find and retrieve exactly the documentation they need.
### Prerequisites

The MCP server requires uv to be installed on your system. Follow the official installation instructions to set it up.
Choose your AI coding tool below and follow the setup instructions.
You can use the Strands Agents MCP server as a tool within your own Strands agents:
```python
from mcp import stdio_client, StdioServerParameters
from strands import Agent
from strands.tools.mcp import MCPClient

mcp_client = MCPClient(lambda: stdio_client(
    StdioServerParameters(
        command="uvx",
        args=["strands-agents-mcp-server"]
    )
))

agent = Agent(tools=[mcp_client])
agent("How do I create a custom tool in Strands Agents?")
```

See the MCP tools documentation for more details on using MCP tools with Strands agents.
Add the following to ~/.kiro/settings/mcp.json:
```json
{
  "mcpServers": {
    "strands-agents": {
      "command": "uvx",
      "args": ["strands-agents-mcp-server"],
      "disabled": false,
      "autoApprove": ["search_docs", "fetch_doc"]
    }
  }
}
```

See the Kiro MCP documentation for more details.
Run the following command:
```sh
claude mcp add strands uvx strands-agents-mcp-server
```

See the Claude Code MCP documentation for more details.
Add the following to ~/.aws/amazonq/mcp.json:
```json
{
  "mcpServers": {
    "strands-agents": {
      "command": "uvx",
      "args": ["strands-agents-mcp-server"],
      "disabled": false,
      "autoApprove": ["search_docs", "fetch_doc"]
    }
  }
}
```

See the Q Developer CLI MCP documentation for more details.
Add the following to ~/.cursor/mcp.json:
```json
{
  "mcpServers": {
    "strands-agents": {
      "command": "uvx",
      "args": ["strands-agents-mcp-server"]
    }
  }
}
```

See the Cursor MCP documentation for more details.
Add the following to your mcp.json file:
```json
{
  "servers": {
    "strands-agents": {
      "command": "uvx",
      "args": ["strands-agents-mcp-server"]
    }
  }
}
```

See the VS Code MCP documentation for more details.
The Strands Agents MCP server works with 40+ applications that support MCP. The general configuration is:
- Command: `uvx`
- Args: `["strands-agents-mcp-server"]`
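For a client not listed above, that general configuration typically translates into a JSON entry like the following sketch. The top-level key varies by client (`mcpServers` in most tools, `servers` in VS Code), so check your client's MCP documentation:

```json
{
  "mcpServers": {
    "strands-agents": {
      "command": "uvx",
      "args": ["strands-agents-mcp-server"]
    }
  }
}
```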
### Verify the connection

You can test the MCP server using the MCP Inspector:

```sh
npx @modelcontextprotocol/inspector uvx strands-agents-mcp-server
```

## llms.txt files
The Strands Agents documentation site provides llms.txt files optimized for AI consumption. These are static files containing the full documentation in plain markdown, suitable for feeding directly into an LLM’s context window.
### Available endpoints

| Endpoint | Description |
|---|---|
| `/llms.txt` | Index file with links to all documentation pages in raw markdown format |
| `/llms-full.txt` | Complete documentation content in a single file (excludes API reference) |
### Raw markdown convention

Every documentation page is available in raw markdown format by appending `index.md` to its URL path:

- `/docs/user-guide/quickstart/` → `/docs/user-guide/quickstart/index.md`
- `/docs/user-guide/concepts/tools/` → `/docs/user-guide/concepts/tools/index.md`
This gives you clean markdown content without HTML markup, navigation, or styling.
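As a sketch of this convention, a small helper can build the raw-markdown URL for any docs page. The `strandsagents.com` host below is an assumption; substitute the actual documentation domain:

```python
from urllib.parse import urljoin

# Assumed documentation host -- replace with the real docs domain.
DOCS_BASE = "https://strandsagents.com"

def raw_markdown_url(page_path: str) -> str:
    """Map a docs page path to its raw-markdown URL by appending index.md."""
    if not page_path.endswith("/"):
        page_path += "/"
    return urljoin(DOCS_BASE, page_path + "index.md")

print(raw_markdown_url("/docs/user-guide/quickstart/"))
# https://strandsagents.com/docs/user-guide/quickstart/index.md
```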
### When to use llms.txt

The llms.txt files are useful when:
- Your AI tool does not support MCP
- You want to provide full documentation context in a single prompt
- You are building custom tooling around the documentation
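For the custom-tooling case, the index can be fetched and parsed with the standard library alone. A minimal sketch, assuming the llms.txt index lists pages as markdown-style links (`[title](url)`):

```python
import re
import urllib.request

def fetch_llms_index(base_url: str) -> str:
    """Download the llms.txt index from the documentation site."""
    with urllib.request.urlopen(f"{base_url}/llms.txt") as resp:
        return resp.read().decode("utf-8")

def extract_links(llms_txt: str) -> list[tuple[str, str]]:
    """Extract (title, url) pairs from markdown-style links in the index."""
    return re.findall(r"\[([^\]]+)\]\(([^)]+)\)", llms_txt)

# Example usage (requires network access):
# index = fetch_llms_index("https://strandsagents.com")
# for title, url in extract_links(index):
#     print(title, url)
```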
### Tips for AI-assisted Strands development

- Use the MCP server over llms.txt when possible — it retrieves only the relevant sections, saving tokens and improving accuracy.
- Start from examples — point your AI tool at the examples for common patterns like multi-agent systems, structured output, and tool use.
- Review AI-generated code — always verify that generated code follows the patterns in the official documentation, especially for model provider configuration and tool definitions.
- Use project rules — many AI coding tools support project-level instructions (e.g., `.cursorrules`, `CLAUDE.md`). Add Strands-specific conventions to keep AI output consistent across your project.