BedrockModel

Defined in: src/models/bedrock.ts:239

AWS Bedrock model provider implementation.

Implements the `Model` interface for AWS Bedrock using the ConverseStream API. Supports streaming responses, tool use, prompt caching, and comprehensive error handling.

```ts
const provider = new BedrockModel({
  modelConfig: {
    modelId: 'global.anthropic.claude-sonnet-4-5-20250929-v1:0',
    maxTokens: 1024,
    temperature: 0.7
  },
  clientConfig: {
    region: 'us-west-2'
  }
})

const messages: Message[] = [
  { type: 'message', role: 'user', content: [{ type: 'textBlock', text: 'Hello!' }] }
]

for await (const event of provider.stream(messages)) {
  if (event.type === 'modelContentBlockDeltaEvent' && event.delta.type === 'textDelta') {
    process.stdout.write(event.delta.text)
  }
}
```
```ts
new BedrockModel(options?): BedrockModel;
```

Defined in: src/models/bedrock.ts:273

Creates a new BedrockModel instance.

| Parameter | Type | Description |
| --- | --- | --- |
| `options?` | `BedrockModelOptions` | Optional configuration for model and client |

BedrockModel

```ts
// Minimal configuration with defaults
const provider = new BedrockModel({
  region: 'us-west-2'
})

// With model configuration
const provider = new BedrockModel({
  region: 'us-west-2',
  modelId: 'global.anthropic.claude-sonnet-4-5-20250929-v1:0',
  maxTokens: 2048,
  temperature: 0.8,
  cachePrompt: 'ephemeral'
})

// With client configuration
const provider = new BedrockModel({
  region: 'us-east-1',
  clientConfig: {
    credentials: myCredentials
  }
})
```

Model.constructor

```ts
updateConfig(modelConfig): void;
```

Defined in: src/models/bedrock.ts:319

Updates the model configuration. Merges the provided configuration with existing settings.

| Parameter | Type | Description |
| --- | --- | --- |
| `modelConfig` | `BedrockModelConfig` | Configuration object with model-specific settings to update |

void

```ts
// Update temperature and maxTokens
provider.updateConfig({
  temperature: 0.9,
  maxTokens: 2048
})
```
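The merge behavior described above can be sketched as a shallow object spread. This is a self-contained stand-in, not the SDK's implementation; the class and field names below are illustrative only:

```ts
// Stand-in illustrating the documented merge: fields in the patch overwrite
// existing settings, untouched fields are preserved.
// (Assumption: a shallow merge; the real BedrockModel may differ.)
interface Config { modelId?: string; temperature?: number; maxTokens?: number }

class ConfigHolder {
  private config: Config = { modelId: 'example-model', temperature: 0.7 }

  updateConfig(patch: Config): void {
    this.config = { ...this.config, ...patch } // shallow merge
  }

  getConfig(): Config {
    return this.config
  }
}

const holder = new ConfigHolder()
holder.updateConfig({ temperature: 0.9, maxTokens: 2048 })
// modelId survives the update; temperature is overwritten
console.log(holder.getConfig())
```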

Model.updateConfig


```ts
getConfig(): BedrockModelConfig;
```

Defined in: src/models/bedrock.ts:334

Retrieves the current model configuration.

BedrockModelConfig

The current configuration object

```ts
const config = provider.getConfig()
console.log(config.modelId)
```

Model.getConfig


```ts
stream(messages, options?): AsyncIterable<ModelStreamEvent>;
```

Defined in: src/models/bedrock.ts:367

Streams a conversation with the Bedrock model. Returns an async iterable that yields streaming events as they occur.

| Parameter | Type | Description |
| --- | --- | --- |
| `messages` | `Message[]` | Array of conversation messages |
| `options?` | `StreamOptions` | Optional streaming configuration |

AsyncIterable<ModelStreamEvent>

Async iterable of streaming events

ContextWindowOverflowError - When input exceeds the model’s context window

ModelThrottledError - When the Bedrock service throttles requests
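A common way to handle these error types is to retry throttled calls with backoff while letting context-window overflows propagate. A minimal runnable sketch; the error classes below are local stand-ins so the pattern is self-contained (in real code they come from the SDK):

```ts
// Local stand-ins for the documented error classes; in real code,
// import them from the SDK instead.
class ModelThrottledError extends Error {}
class ContextWindowOverflowError extends Error {}

// Retry throttled calls with exponential backoff; overflow is not retryable
// here, since retrying the same oversized input cannot succeed.
async function callWithBackoff<T>(fn: () => Promise<T>, retries = 3): Promise<T> {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn()
    } catch (err) {
      if (err instanceof ModelThrottledError && attempt < retries) {
        // exponential backoff: 100ms, 200ms, 400ms, ...
        await new Promise(resolve => setTimeout(resolve, 2 ** attempt * 100))
        continue
      }
      throw err // ContextWindowOverflowError and other errors propagate
    }
  }
}
```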

```ts
const messages: Message[] = [
  { type: 'message', role: 'user', content: [{ type: 'textBlock', text: 'What is 2+2?' }] }
]

const options: StreamOptions = {
  systemPrompt: 'You are a helpful math assistant.',
  toolSpecs: [calculatorTool]
}

for await (const event of provider.stream(messages, options)) {
  if (event.type === 'modelContentBlockDeltaEvent') {
    console.log(event.delta)
  }
}
```

Model.stream


```ts
streamAggregated(messages, options?): AsyncGenerator<
  | ModelStreamEvent
  | ContentBlock, StreamAggregatedResult, undefined>;
```

Defined in: src/models/model.ts:188

Streams a conversation with aggregated content blocks and messages. Returns an async generator that yields streaming events and content blocks, and returns the final message with stop reason and optional metadata.

This method enhances the basic stream() by collecting streaming events into complete ContentBlock and Message objects, which are needed by the agentic loop for tool execution and conversation management.

The method yields:

  • ModelStreamEvent - Original streaming events (passed through)
  • ContentBlock - Complete content block (emitted when block completes)

The method returns:

  • StreamAggregatedResult containing the complete message, stop reason, and optional metadata

All exceptions thrown from this method are wrapped in ModelError to provide a consistent error type for model-related errors. Specific error subtypes like ContextWindowOverflowError, ModelThrottledError, and MaxTokensError are preserved.

| Parameter | Type | Description |
| --- | --- | --- |
| `messages` | `Message[]` | Array of conversation messages |
| `options?` | `StreamOptions` | Optional streaming configuration |

`AsyncGenerator<ModelStreamEvent | ContentBlock, StreamAggregatedResult, undefined>`

Async generator yielding ModelStreamEvent | ContentBlock and returning a StreamAggregatedResult

ModelError - Base class for all model-related errors

ContextWindowOverflowError - When input exceeds the model’s context window

ModelThrottledError - When the model provider throttles requests

MaxTokensError - When the model reaches its maximum token limit
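Because `for await…of` discards an async generator's return value, the `StreamAggregatedResult` must be read from the final `next()` call. A self-contained sketch of that consumption pattern; the generator below is a stand-in that only mimics the yield/return shape documented above, not the real method:

```ts
type FakeEvent = { type: string }
type FakeResult = { stopReason: string }

// Stand-in mimicking streamAggregated's shape: yields events,
// then *returns* the aggregated result.
async function* fakeStreamAggregated(): AsyncGenerator<FakeEvent, FakeResult, undefined> {
  yield { type: 'modelContentBlockDeltaEvent' }
  yield { type: 'contentBlock' }
  return { stopReason: 'endTurn' }
}

// Drive the generator manually so the return value is not lost.
async function consume(): Promise<FakeResult> {
  const gen = fakeStreamAggregated()
  while (true) {
    const { value, done } = await gen.next()
    if (done) return value as FakeResult // the aggregated result
    console.log((value as FakeEvent).type) // per-event handling goes here
  }
}

consume().then(result => console.log(result.stopReason))
```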

Model.streamAggregated