2 changes: 1 addition & 1 deletion SITE-ARCHITECTURE.md
@@ -510,7 +510,7 @@ Namespaces
└── telemetry
Classes
├── Agent
├── BedrockModel
├── ConverseModel
└── Tool
Interfaces
├── AgentConfig
@@ -215,7 +215,7 @@ const myCustomTool = strands.tool({
})

const agent = new strands.Agent({
model: new strands.BedrockModel({
model: new strands.ConverseModel({
region: 'ap-southeast-2',
}),
tools: [calculatorTool, myCustomTool], // Add your tool here
@@ -29,7 +29,7 @@ const calculatorTool = strands.tool({

// Configure the agent with Amazon Bedrock
const agent = new strands.Agent({
model: new strands.BedrockModel({
model: new strands.ConverseModel({
region: 'ap-southeast-2', // Change to your preferred region
}),
tools: [calculatorTool],
@@ -188,7 +188,7 @@ response = agent("Tell me about Amazon Bedrock.")
</Tab>
<Tab label="TypeScript">

The [`BedrockModel`](@api/typescript/BedrockModel) provider is used by default when creating a basic Agent, and uses the [Claude Sonnet 4.5](https://aws.amazon.com/blogs/aws/introducing-claude-sonnet-4-5-in-amazon-bedrock-anthropics-most-intelligent-model-best-for-coding-and-complex-agents/) model by default. This basic example creates an agent using this default setup:
The [`ConverseModel`](@api/typescript/ConverseModel) provider is used by default when creating a basic Agent, and uses the [Claude Sonnet 4.5](https://aws.amazon.com/blogs/aws/introducing-claude-sonnet-4-5-in-amazon-bedrock-anthropics-most-intelligent-model-best-for-coding-and-complex-agents/) model by default. This basic example creates an agent using this default setup:
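A minimal sketch of what that default setup looks like after this rename (hypothetical code; it assumes the `ConverseModel` export and import path introduced in this PR, and running it requires valid AWS credentials with Amazon Bedrock access):

```typescript
import { Agent } from '@strands-agents/sdk'
import { ConverseModel } from '@strands-agents/sdk/models/bedrock'

// These two agents are equivalent: ConverseModel is the default provider,
// and it falls back to Claude Sonnet 4.5 when no modelId is given.
const implicitAgent = new Agent()
const explicitAgent = new Agent({ model: new ConverseModel() })

const response = await explicitAgent.invoke('Tell me about Amazon Bedrock.')
console.log(response)
```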

```typescript
--8<-- "user-guide/concepts/model-providers/amazon-bedrock_imports.ts:basic_default_imports"
@@ -235,7 +235,7 @@ response = agent("Tell me about Amazon Bedrock.")
</Tab>
<Tab label="TypeScript">

For more control over model configuration, you can create an instance of the [`BedrockModel`](@api/typescript/BedrockModel) class:
For more control over model configuration, you can create an instance of the [`ConverseModel`](@api/typescript/ConverseModel) class:

```typescript
--8<-- "user-guide/concepts/model-providers/amazon-bedrock.ts:basic_model_instance"
@@ -265,7 +265,7 @@ Common configuration parameters include:
</Tab>
<Tab label="TypeScript">

The [`BedrockModel`](@api/typescript/BedrockModelOptions) supports various configuration parameters. For a complete list of available options, see the [BedrockModelOptions API reference](@api/typescript/BedrockModelOptions).
The [`ConverseModel`](@api/typescript/ConverseModelOptions) supports various configuration parameters. For a complete list of available options, see the [ConverseModelOptions API reference](@api/typescript/ConverseModelOptions).

Common configuration parameters include:

@@ -485,7 +485,7 @@ guardrail_agent = Agent(model=bedrock_model)
response = guardrail_agent("Can you tell me about the Strands SDK?")
```

Amazon Bedrock supports guardrails to help ensure model outputs meet your requirements. Strands allows you to configure guardrails with your [`BedrockModel`](@api/typescript/BedrockModel).
Amazon Bedrock supports guardrails to help ensure model outputs meet your requirements. Strands allows you to configure guardrails with your [`ConverseModel`](@api/typescript/ConverseModel).

When a guardrail is triggered:

@@ -499,7 +499,7 @@ When `guardrail_latest_message=True`, only the most recent user message is sent
</Tab>
<Tab label="TypeScript">

Amazon Bedrock supports guardrails to help ensure model outputs meet your requirements. Strands allows you to configure guardrails with your [`BedrockModel`](@api/typescript/BedrockModel):
Amazon Bedrock supports guardrails to help ensure model outputs meet your requirements. Strands allows you to configure guardrails with your [`ConverseModel`](@api/typescript/ConverseModel):
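As a hedged sketch, the guardrail options attach to the model constructor; the `guardrailConfig` shape below follows the example file later in this PR, and the identifier and version are placeholders to replace with your own guardrail's values:

```typescript
import { Agent } from '@strands-agents/sdk'
import { ConverseModel } from '@strands-agents/sdk/models/bedrock'

// Attach an Amazon Bedrock guardrail to the model (placeholder values)
const guardedModel = new ConverseModel({
  modelId: 'anthropic.claude-sonnet-4-20250514-v1:0',
  guardrailConfig: {
    guardrailIdentifier: 'your-guardrail-id', // replace with a real guardrail ID
    guardrailVersion: '1', // replace with your guardrail's version
  },
})

const agent = new Agent({ model: guardedModel })
```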

```typescript
--8<-- "user-guide/concepts/model-providers/amazon-bedrock.ts:guardrails"
@@ -857,7 +857,7 @@ response = agent("If a train travels at 120 km/h and needs to cover 450 km, how
</Tab>
<Tab label="TypeScript">

Strands allows you to enable and configure reasoning capabilities with your [`BedrockModel`](@api/typescript/BedrockModel):
Strands allows you to enable and configure reasoning capabilities with your [`ConverseModel`](@api/typescript/ConverseModel):

```typescript
--8<-- "user-guide/concepts/model-providers/amazon-bedrock.ts:reasoning"
@@ -1,11 +1,11 @@
/**
* TypeScript examples for Amazon Bedrock model provider documentation.
* These examples demonstrate common usage patterns for the BedrockModel.
* These examples demonstrate common usage patterns for the ConverseModel.
*/
// @ts-nocheck
// Imports are in amazon-bedrock_imports.ts

import { Agent, BedrockModel, DocumentBlock, CachePointBlock, Message } from '@strands-agents/sdk'
import { Agent, ConverseModel, DocumentBlock, CachePointBlock, Message } from '@strands-agents/sdk'

// Basic usage examples
async function basicUsageDefault() {
@@ -28,13 +28,13 @@ async function basicUsageModelId() {
async function basicUsageModelInstance() {
// --8<-- [start:basic_model_instance]
// Create a Bedrock model instance
const bedrockModel = new BedrockModel({
const bedrockModel = new ConverseModel({
modelId: 'us.amazon.nova-premier-v1:0',
temperature: 0.3,
topP: 0.8,
})

// Create an agent using the BedrockModel instance
// Create an agent using the ConverseModel instance
const agent = new Agent({ model: bedrockModel })

// Use the agent
@@ -46,7 +46,7 @@ async function basicUsageModelInstance() {
async function configurationExample() {
// --8<-- [start:configuration]
// Create a configured Bedrock model
const bedrockModel = new BedrockModel({
const bedrockModel = new ConverseModel({
modelId: 'anthropic.claude-sonnet-4-20250514-v1:0',
region: 'us-east-1', // Specify a different region than the default
temperature: 0.3,
@@ -70,13 +70,13 @@ async function configurationExample() {
async function streamingExample() {
// --8<-- [start:streaming]
// Streaming model (default)
const streamingModel = new BedrockModel({
const streamingModel = new ConverseModel({
modelId: 'anthropic.claude-sonnet-4-20250514-v1:0',
stream: true, // This is the default
})

// Non-streaming model
const nonStreamingModel = new BedrockModel({
const nonStreamingModel = new ConverseModel({
modelId: 'us.meta.llama3-2-90b-instruct-v1:0',
stream: false, // Disable streaming
})
@@ -87,7 +87,7 @@ async function streamingExample() {
async function updateConfiguration() {
// --8<-- [start:update_config]
// Create the model with initial configuration
const bedrockModel = new BedrockModel({
const bedrockModel = new ConverseModel({
modelId: 'anthropic.claude-sonnet-4-20250514-v1:0',
temperature: 0.7,
})
@@ -120,7 +120,7 @@ async function toolBasedConfigUpdate() {
})

const agent = new Agent({
model: new BedrockModel({ modelId: 'anthropic.claude-sonnet-4-20250514-v1:0' }),
model: new ConverseModel({ modelId: 'anthropic.claude-sonnet-4-20250514-v1:0' }),
tools: [updateTemperature],
})
// --8<-- [end:tool_update_config]
@@ -130,7 +130,7 @@ async function toolBasedConfigUpdate() {
async function reasoningSupport() {
// --8<-- [start:reasoning]
// Create a Bedrock model with reasoning configuration
const bedrockModel = new BedrockModel({
const bedrockModel = new ConverseModel({
modelId: 'anthropic.claude-sonnet-4-20250514-v1:0',
additionalRequestFields: {
thinking: {
@@ -157,7 +157,7 @@ async function customCredentials() {
// See AWS SDK for JavaScript documentation for all credential options:
// https://docs.aws.amazon.com/sdk-for-javascript/v3/developer-guide/setting-credentials-node.html

const bedrockModel = new BedrockModel({
const bedrockModel = new ConverseModel({
modelId: 'anthropic.claude-sonnet-4-20250514-v1:0',
region: 'us-west-2',
clientConfig: {
@@ -174,7 +174,7 @@ async function customCredentials() {
// Multimodal support
async function multimodalSupport() {
// --8<-- [start:multimodal_full]
const bedrockModel = new BedrockModel({
const bedrockModel = new ConverseModel({
modelId: 'anthropic.claude-sonnet-4-20250514-v1:0',
})

@@ -197,7 +197,7 @@ async function multimodalSupport() {
// S3 location support for multimodal content
async function s3LocationSupport() {
// --8<-- [start:s3_location]
const agent = new Agent({ model: new BedrockModel() })
const agent = new Agent({ model: new ConverseModel() })

const response = await agent.invoke([
new DocumentBlock({
@@ -256,7 +256,7 @@ async function systemPromptCachingFull() {
// Tool caching
async function toolCachingFull() {
// --8<-- [start:tool_caching_full]
const bedrockModel = new BedrockModel({
const bedrockModel = new ConverseModel({
modelId: 'anthropic.claude-sonnet-4-20250514-v1:0',
cacheConfig: { strategy: 'auto' },
})
@@ -294,7 +294,7 @@ async function toolCachingFull() {
// Automatic cache strategy for messages
async function automaticCacheStrategy() {
// --8<-- [start:automatic_cache_strategy]
const bedrockModel = new BedrockModel({
const bedrockModel = new ConverseModel({
modelId: 'us.anthropic.claude-sonnet-4-5-20250929-v1:0',
cacheConfig: { strategy: 'auto' },
})
@@ -395,8 +395,8 @@ async function cacheMetrics() {
// Guardrails configuration
async function guardrailsExample() {
// --8<-- [start:guardrails]
// Using guardrails with BedrockModel
const bedrockModel = new BedrockModel({
// Using guardrails with ConverseModel
const bedrockModel = new ConverseModel({
modelId: 'anthropic.claude-sonnet-4-20250514-v1:0',
guardrailConfig: {
guardrailIdentifier: 'your-guardrail-id',
@@ -10,5 +10,5 @@ import { z } from 'zod'
// --8<-- [end:tool_update_config_imports]

// --8<-- [start:custom_credentials_imports]
import { BedrockModel } from '@strands-agents/sdk/bedrock'
import { ConverseModel } from '@strands-agents/sdk/models/bedrock'
// --8<-- [end:custom_credentials_imports]
@@ -3,7 +3,7 @@
* These examples demonstrate how to implement a custom model provider.
*/

import { Agent, BedrockModel, type BedrockModelConfig } from '@strands-agents/sdk'
import { Agent, ConverseModel, type ConverseModelConfig } from '@strands-agents/sdk'
import type {
Model,
BaseModelConfig,
@@ -16,9 +16,9 @@ import type {
ModelMessageStopEventData,
} from '@strands-agents/sdk'

// Example wrapper around BedrockModel for demonstration
class YourCustomModel extends BedrockModel {
constructor(config: BedrockModelConfig = {
// Example wrapper around ConverseModel for demonstration
class YourCustomModel extends ConverseModel {
constructor(config: ConverseModelConfig = {
modelId: 'anthropic.claude-3-5-sonnet-20241022-v2:0'
}) {
super(config)
12 changes: 6 additions & 6 deletions src/content/docs/user-guide/concepts/model-providers/gemini.mdx
@@ -62,9 +62,9 @@ print(response)

```typescript
import { Agent } from '@strands-agents/sdk'
import { GeminiModel } from '@strands-agents/sdk/gemini'
import { GenAIModel } from '@strands-agents/sdk/models/google'

const model = new GeminiModel({
const model = new GenAIModel({
apiKey: '<KEY>',
modelId: 'gemini-2.5-flash',
params: {
@@ -333,11 +333,11 @@ print(response)
```typescript
import { GoogleGenAI } from '@google/genai'
import { Agent } from '@strands-agents/sdk'
import { GeminiModel } from '@strands-agents/sdk/gemini'
import { GenAIModel } from '@strands-agents/sdk/models/google'

const client = new GoogleGenAI({ apiKey: '<KEY>' })

const model = new GeminiModel({
const model = new GenAIModel({
client,
modelId: 'gemini-2.5-flash',
params: {
@@ -396,9 +396,9 @@ response = agent([

```typescript
import { Agent, ImageBlock, TextBlock } from '@strands-agents/sdk'
import { GeminiModel } from '@strands-agents/sdk/gemini'
import { GenAIModel } from '@strands-agents/sdk/models/google'

const model = new GeminiModel({
const model = new GenAIModel({
apiKey: '<KEY>',
modelId: 'gemini-2.5-flash',
})
8 changes: 4 additions & 4 deletions src/content/docs/user-guide/concepts/model-providers/index.ts
@@ -6,20 +6,20 @@
// Imports are in index_imports.ts

import { Agent } from '@strands-agents/sdk'
import { BedrockModel } from '@strands-agents/sdk/models/bedrock'
import { OpenAIModel } from '@strands-agents/sdk/models/openai'
import { ConverseModel } from '@strands-agents/sdk/models/bedrock'
import { ChatModel } from '@strands-agents/sdk/models/openai'

async function basicUsage() {
// --8<-- [start:basic_usage]
// Use Bedrock
const bedrockModel = new BedrockModel({
const bedrockModel = new ConverseModel({
modelId: 'anthropic.claude-sonnet-4-20250514-v1:0',
})
let agent = new Agent({ model: bedrockModel })
let response = await agent.invoke('What can you help me with?')

// Alternatively, use OpenAI by just switching model provider
const openaiModel = new OpenAIModel({
const openaiModel = new ChatModel({
apiKey: process.env.OPENAI_API_KEY,
modelId: 'gpt-4o',
})
@@ -2,6 +2,6 @@

// --8<-- [start:basic_usage_imports]
import { Agent } from '@strands-agents/sdk'
import { BedrockModel } from '@strands-agents/sdk/bedrock'
import { OpenAIModel } from '@strands-agents/sdk/openai'
import { ConverseModel } from '@strands-agents/sdk/models/bedrock'
import { ChatModel } from '@strands-agents/sdk/models/openai'
// --8<-- [end:basic_usage_imports]
12 changes: 6 additions & 6 deletions src/content/docs/user-guide/concepts/model-providers/openai.ts
@@ -1,17 +1,17 @@
/**
* TypeScript examples for OpenAI model provider documentation.
* These examples demonstrate common usage patterns for the OpenAIModel.
* These examples demonstrate common usage patterns for the ChatModel.
*/
// @ts-nocheck
// Imports are in openai_imports.ts

import { Agent } from '@strands-agents/sdk'
import { OpenAIModel } from '@strands-agents/sdk/openai'
import { ChatModel } from '@strands-agents/sdk/models/openai'

// Basic usage
async function basicUsage() {
// --8<-- [start:basic_usage]
const model = new OpenAIModel({
const model = new ChatModel({
apiKey: process.env.OPENAI_API_KEY || '<KEY>',
modelId: 'gpt-4o',
maxTokens: 1000,
@@ -27,7 +27,7 @@ async function basicUsage() {
// Custom server
async function customServer() {
// --8<-- [start:custom_server]
const model = new OpenAIModel({
const model = new ChatModel({
apiKey: '<KEY>',
clientConfig: {
baseURL: '<URL>',
@@ -43,7 +43,7 @@ async function customServer() {
// Configuration
async function customConfig() {
// --8<-- [start:custom_config]
const model = new OpenAIModel({
const model = new ChatModel({
apiKey: process.env.OPENAI_API_KEY || '<KEY>',
modelId: 'gpt-4o',
maxTokens: 1000,
@@ -62,7 +62,7 @@ async function customConfig() {
// Update configuration
async function updateConfig() {
// --8<-- [start:update_config]
const model = new OpenAIModel({
const model = new ChatModel({
apiKey: process.env.OPENAI_API_KEY || '<KEY>',
modelId: 'gpt-4o',
temperature: 0.7,
@@ -2,5 +2,5 @@

// --8<-- [start:basic_usage_imports]
import { Agent } from '@strands-agents/sdk'
import { OpenAIModel } from '@strands-agents/sdk/openai'
import { ChatModel } from '@strands-agents/sdk/models/openai'
// --8<-- [end:basic_usage_imports]
@@ -119,7 +119,7 @@ const calculatorTool = strands.tool({

// Configure the agent with Amazon Bedrock
const agent = new strands.Agent({
model: new strands.BedrockModel({
model: new strands.ConverseModel({
region: 'ap-southeast-2', // Change to your preferred region
}),
tools: [calculatorTool],
@@ -1,6 +1,6 @@
// --8<-- [start: imports]
import { Agent } from '@strands-agents/sdk'
import express, { type Request, type Response } from 'express'
import { OpenAIModel } from '@strands-agents/sdk/openai'
import { ChatModel } from '@strands-agents/sdk/models/openai'

// --8<-- [end: imports]