
Vercel AI SDK

The Vercel AI SDK is a TypeScript toolkit for building AI-powered applications. It defines a Language Model Specification that standardizes how applications interact with LLMs across providers. The Strands Agents SDK includes a VercelModel adapter that wraps any Language Model Specification v3 (LanguageModelV3) provider for use as a Strands model provider.

This means you can bring models from the entire Vercel AI SDK ecosystem - including @ai-sdk/openai, @ai-sdk/anthropic, @ai-sdk/amazon-bedrock, @ai-sdk/google, and many more - directly into Strands agents.

Install the Strands SDK along with the Vercel AI SDK provider package for the model you want to use:

# OpenAI
npm install @strands-agents/sdk @ai-sdk/openai
# Amazon Bedrock
npm install @strands-agents/sdk @ai-sdk/amazon-bedrock
# Anthropic
npm install @strands-agents/sdk @ai-sdk/anthropic
# Google Generative AI
npm install @strands-agents/sdk @ai-sdk/google

The @ai-sdk/provider package (which defines the LanguageModelV3 interface) is listed as an optional peer dependency of @strands-agents/sdk and will be installed automatically with any @ai-sdk/* provider.

Create a LanguageModelV3 instance from any Vercel provider and wrap it with VercelModel:

// OpenAI
import { Agent } from '@strands-agents/sdk'
import { VercelModel } from '@strands-agents/sdk/vercel'
import { openai } from '@ai-sdk/openai'

const agent = new Agent({
  model: new VercelModel(openai('gpt-4o')),
})

const result = await agent.invoke('Hello!')
console.log(result)

// Amazon Bedrock
import { Agent } from '@strands-agents/sdk'
import { VercelModel } from '@strands-agents/sdk/vercel'
import { bedrock } from '@ai-sdk/amazon-bedrock'

const agent = new Agent({
  model: new VercelModel(bedrock('us.anthropic.claude-sonnet-4-20250514-v1:0')),
})

const result = await agent.invoke('Hello!')
console.log(result)

// Anthropic
import { Agent } from '@strands-agents/sdk'
import { VercelModel } from '@strands-agents/sdk/vercel'
import { anthropic } from '@ai-sdk/anthropic'

const agent = new Agent({
  model: new VercelModel(anthropic('claude-sonnet-4-20250514')),
})

const result = await agent.invoke('Hello!')
console.log(result)

// Google Generative AI
import { Agent } from '@strands-agents/sdk'
import { VercelModel } from '@strands-agents/sdk/vercel'
import { google } from '@ai-sdk/google'

const agent = new Agent({
  model: new VercelModel(google('gemini-2.5-flash')),
})

const result = await agent.invoke('Hello!')
console.log(result)

The second argument to VercelModel accepts configuration options. These include all LanguageModelV3CallOptions settings (temperature, topP, topK, penalties, stop sequences, seed, etc.) plus the base Strands model config fields.

const model = new VercelModel(openai('gpt-4o'), {
  maxTokens: 1000,
  temperature: 0.7,
  topP: 0.9,
})

const agent = new Agent({ model })
const result = await agent.invoke('Write a short poem')
console.log(result)
Parameter          Description                                                    Example
modelId            Override the model ID (defaults to the provider's model ID)    'gpt-4o'
maxTokens          Maximum tokens to generate                                     1000
temperature        Controls randomness                                            0.7
topP               Nucleus sampling                                               0.9
topK               Top-k sampling                                                 40
presencePenalty    Encourages new topics                                          0.5
frequencyPenalty   Reduces repetition                                             0.5
stopSequences      Custom stop sequences                                          ['END']
seed               Deterministic generation                                       42

When new fields are added to the Language Model Specification, they become available in the config automatically.

The adapter supports streaming text, reasoning content, and tool use:

const agent = new Agent({
  model: new VercelModel(openai('gpt-4o')),
})

for await (const event of agent.stream('Tell me a story')) {
  if (event.type === 'modelContentBlockDeltaEvent' && event.delta.type === 'textDelta') {
    process.stdout.write(event.delta.text)
  }
}

The VercelModel adapter handles:

  • Streaming text, reasoning, and tool use (both incremental and complete tool call events)
  • Message formatting: text, images, documents, video, tool use/results, and reasoning blocks
  • Tool specification and tool choice mapping
  • Usage and token tracking including cache read/write tokens
  • Error classification: maps provider errors to ModelThrottledError, ContextWindowOverflowError, and ModelError
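Error classification lets calling code branch on the failure mode rather than parsing provider-specific error messages. A minimal sketch of a retry on throttling, assuming `ModelThrottledError` is exported from @strands-agents/sdk (check your SDK version for the exact export path):

```typescript
import { Agent, ModelThrottledError } from '@strands-agents/sdk'
import { VercelModel } from '@strands-agents/sdk/vercel'
import { openai } from '@ai-sdk/openai'

const agent = new Agent({
  model: new VercelModel(openai('gpt-4o')),
})

try {
  console.log(await agent.invoke('Hello!'))
} catch (err) {
  if (err instanceof ModelThrottledError) {
    // The provider rate-limited the request: back off once and retry
    await new Promise((resolve) => setTimeout(resolve, 5_000))
    console.log(await agent.invoke('Hello!'))
  } else {
    // Context overflow and other model errors are not retryable here
    throw err
  }
}
```

The same pattern applies to `ContextWindowOverflowError` (e.g. truncate conversation history before retrying) and the generic `ModelError`.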

Any package that implements the LanguageModelV3 interface works with VercelModel. This includes all official Vercel AI SDK providers and community providers:

Provider               Package
OpenAI                 @ai-sdk/openai
Amazon Bedrock         @ai-sdk/amazon-bedrock
Anthropic              @ai-sdk/anthropic
Google Generative AI   @ai-sdk/google
Google Vertex          @ai-sdk/google-vertex
Azure OpenAI           @ai-sdk/azure
Mistral                @ai-sdk/mistral
Cohere                 @ai-sdk/cohere
xAI Grok               @ai-sdk/xai
DeepSeek               @ai-sdk/deepseek
Groq                   @ai-sdk/groq

See the Vercel AI SDK providers page for the full list.
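Because any LanguageModelV3 works, you can also point an official provider at a self-hosted, OpenAI-compatible endpoint. A hedged sketch using `createOpenAI` from @ai-sdk/openai; the base URL, API key, and model name below are placeholders for your own deployment:

```typescript
import { Agent } from '@strands-agents/sdk'
import { VercelModel } from '@strands-agents/sdk/vercel'
import { createOpenAI } from '@ai-sdk/openai'

// Placeholder endpoint: substitute your own OpenAI-compatible server
const local = createOpenAI({
  baseURL: 'http://localhost:11434/v1',
  apiKey: 'unused-for-local-servers',
})

const agent = new Agent({
  model: new VercelModel(local('llama3')),
})

console.log(await agent.invoke('Hello!'))
```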

If you see warnings about @ai-sdk/provider, install it explicitly:

npm install @ai-sdk/provider

Authentication is handled by the underlying Vercel provider package. Refer to the specific provider’s documentation for credential setup - for example, @ai-sdk/openai reads OPENAI_API_KEY from the environment, and @ai-sdk/amazon-bedrock uses the standard AWS credential chain.
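If you prefer passing credentials explicitly rather than relying on the environment, most providers expose a `create*` factory. A sketch using `createAmazonBedrock` (option names follow the Vercel AI SDK Bedrock provider docs; region and credentials are placeholders):

```typescript
import { Agent } from '@strands-agents/sdk'
import { VercelModel } from '@strands-agents/sdk/vercel'
import { createAmazonBedrock } from '@ai-sdk/amazon-bedrock'

// Explicit credentials instead of the default AWS credential chain
const bedrock = createAmazonBedrock({
  region: 'us-west-2',
  accessKeyId: process.env.AWS_ACCESS_KEY_ID,
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
})

const agent = new Agent({
  model: new VercelModel(bedrock('us.anthropic.claude-sonnet-4-20250514-v1:0')),
})

console.log(await agent.invoke('Hello!'))
```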