# Vercel AI SDK
The Vercel AI SDK is a TypeScript toolkit for building AI-powered applications. It defines a Language Model Specification that standardizes how applications interact with LLMs across providers. The Strands Agents SDK includes a VercelModel adapter that wraps any Language Model Specification v3 (LanguageModelV3) provider for use as a Strands model provider.
This means you can bring models from the entire Vercel AI SDK ecosystem - including @ai-sdk/openai, @ai-sdk/anthropic, @ai-sdk/amazon-bedrock, @ai-sdk/google, and many more - directly into Strands agents.
## Installation

Install the Strands SDK along with the Vercel AI SDK provider package for the model you want to use:
```shell
# OpenAI
npm install @strands-agents/sdk @ai-sdk/openai

# Amazon Bedrock
npm install @strands-agents/sdk @ai-sdk/amazon-bedrock

# Anthropic
npm install @strands-agents/sdk @ai-sdk/anthropic

# Google Generative AI
npm install @strands-agents/sdk @ai-sdk/google
```

The @ai-sdk/provider package (which defines the LanguageModelV3 interface) is listed as an optional peer dependency of @strands-agents/sdk and will be installed automatically with any @ai-sdk/* provider.
## Usage

Create a LanguageModelV3 instance from any Vercel provider and wrap it with VercelModel:
### OpenAI

```typescript
import { Agent } from '@strands-agents/sdk'
import { VercelModel } from '@strands-agents/sdk/vercel'
import { openai } from '@ai-sdk/openai'

const agent = new Agent({
  model: new VercelModel(openai('gpt-4o')),
})

const result = await agent.invoke('Hello!')
console.log(result)
```

### Amazon Bedrock

```typescript
import { Agent } from '@strands-agents/sdk'
import { VercelModel } from '@strands-agents/sdk/vercel'
import { bedrock } from '@ai-sdk/amazon-bedrock'

const agent = new Agent({
  model: new VercelModel(bedrock('us.anthropic.claude-sonnet-4-20250514-v1:0')),
})

const result = await agent.invoke('Hello!')
console.log(result)
```

### Anthropic

```typescript
import { Agent } from '@strands-agents/sdk'
import { VercelModel } from '@strands-agents/sdk/vercel'
import { anthropic } from '@ai-sdk/anthropic'

const agent = new Agent({
  model: new VercelModel(anthropic('claude-sonnet-4-20250514')),
})

const result = await agent.invoke('Hello!')
console.log(result)
```

### Google Generative AI

```typescript
import { Agent } from '@strands-agents/sdk'
import { VercelModel } from '@strands-agents/sdk/vercel'
import { google } from '@ai-sdk/google'

const agent = new Agent({
  model: new VercelModel(google('gemini-2.5-flash')),
})

const result = await agent.invoke('Hello!')
console.log(result)
```

## Configuration
The second argument to VercelModel accepts configuration options. These include all LanguageModelV3CallOptions settings (temperature, topP, topK, penalties, stop sequences, seed, etc.) plus the base Strands model config fields.

```typescript
const model = new VercelModel(openai('gpt-4o'), {
  maxTokens: 1000,
  temperature: 0.7,
  topP: 0.9,
})

const agent = new Agent({ model })
const result = await agent.invoke('Write a short poem')
console.log(result)
```

| Parameter | Description | Example |
|---|---|---|
| modelId | Override the model ID (defaults to the provider’s model ID) | 'gpt-4o' |
| maxTokens | Maximum tokens to generate | 1000 |
| temperature | Controls randomness | 0.7 |
| topP | Nucleus sampling | 0.9 |
| topK | Top-k sampling | 40 |
| presencePenalty | Encourages new topics | 0.5 |
| frequencyPenalty | Reduces repetition | 0.5 |
| stopSequences | Custom stop sequences | ['END'] |
| seed | Deterministic generation | 42 |
When new fields are added to the Language Model Specification, they become available in the config automatically.
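For instance, spec-level options such as stopSequences and seed from the table above pass straight through to the underlying provider; a minimal sketch:

```typescript
import { VercelModel } from '@strands-agents/sdk/vercel'
import { openai } from '@ai-sdk/openai'

// stopSequences and seed are LanguageModelV3CallOptions fields,
// forwarded unchanged to the wrapped provider on each call
const model = new VercelModel(openai('gpt-4o'), {
  stopSequences: ['END'],
  seed: 42,
})
```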
## Streaming

The adapter supports streaming text, reasoning content, and tool use:
```typescript
const agent = new Agent({
  model: new VercelModel(openai('gpt-4o')),
})

for await (const event of agent.stream('Tell me a story')) {
  if (event.type === 'modelContentBlockDeltaEvent' && event.delta.type === 'textDelta') {
    process.stdout.write(event.delta.text)
  }
}
```

## Supported features
The VercelModel adapter handles:
- Streaming text, reasoning, and tool use (both incremental and complete tool call events)
- Message formatting: text, images, documents, video, tool use/results, and reasoning blocks
- Tool specification and tool choice mapping
- Usage and token tracking including cache read/write tokens
- Error classification: maps provider errors to ModelThrottledError, ContextWindowOverflowError, and ModelError
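Because provider errors are classified into these types, callers can branch on them. A sketch of retry-friendly handling, assuming the error classes are exported from the @strands-agents/sdk root (check the package's actual export paths):

```typescript
// Assumption: error classes exported from the SDK root entry point
import { Agent, ModelThrottledError, ContextWindowOverflowError } from '@strands-agents/sdk'
import { VercelModel } from '@strands-agents/sdk/vercel'
import { openai } from '@ai-sdk/openai'

const agent = new Agent({ model: new VercelModel(openai('gpt-4o')) })

try {
  console.log(await agent.invoke('Hello!'))
} catch (err) {
  if (err instanceof ModelThrottledError) {
    // Rate limited: back off and retry the request
  } else if (err instanceof ContextWindowOverflowError) {
    // Prompt too large: trim conversation history and retry
  } else {
    throw err // ModelError or unexpected failure
  }
}
```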
## Compatible providers

Any package that implements the LanguageModelV3 interface works with VercelModel. This includes all official Vercel AI SDK providers and community providers:
| Provider | Package |
|---|---|
| OpenAI | @ai-sdk/openai |
| Amazon Bedrock | @ai-sdk/amazon-bedrock |
| Anthropic | @ai-sdk/anthropic |
| Google Generative AI | @ai-sdk/google |
| Google Vertex | @ai-sdk/google-vertex |
| Azure OpenAI | @ai-sdk/azure |
| Mistral | @ai-sdk/mistral |
| Cohere | @ai-sdk/cohere |
| xAI Grok | @ai-sdk/xai |
| DeepSeek | @ai-sdk/deepseek |
| Groq | @ai-sdk/groq |
See the Vercel AI SDK providers page for the full list.
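Switching providers only changes the wrapped model instance. For example, with the @ai-sdk/mistral package from the table above installed (the model ID is illustrative):

```typescript
import { Agent } from '@strands-agents/sdk'
import { VercelModel } from '@strands-agents/sdk/vercel'
import { mistral } from '@ai-sdk/mistral'

// Any LanguageModelV3 provider drops in the same way
const agent = new Agent({
  model: new VercelModel(mistral('mistral-large-latest')),
})
```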
## Troubleshooting

### Missing peer dependency
Section titled “Missing peer dependency”If you see warnings about @ai-sdk/provider, install it explicitly:
```shell
npm install @ai-sdk/provider
```

### Authentication errors
Authentication is handled by the underlying Vercel provider package. Refer to the specific provider’s documentation for credential setup - for example, @ai-sdk/openai reads OPENAI_API_KEY from the environment, and @ai-sdk/amazon-bedrock uses the standard AWS credential chain.
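If you prefer passing credentials explicitly rather than via environment variables, most Vercel providers expose a factory function. A sketch using @ai-sdk/openai's createOpenAI (the environment variable name here is a placeholder for wherever you store the key):

```typescript
import { createOpenAI } from '@ai-sdk/openai'
import { VercelModel } from '@strands-agents/sdk/vercel'

// createOpenAI accepts the API key directly instead of
// reading OPENAI_API_KEY from the environment
const openai = createOpenAI({ apiKey: process.env.MY_OPENAI_KEY })
const model = new VercelModel(openai('gpt-4o'))
```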