The Anthropic caller uses the Anthropic Go SDK and streams responses through onStream callbacks. Note that Claude requires max_tokens to be set explicitly. Provider directory: api/integration-api/internal/caller/anthropic/
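To make the callback flow concrete, here is a minimal, self-contained sketch of the onStream pattern. The names (`OnStream`, `streamCompletion`) and the delta slice are illustrative stand-ins, not the caller's actual API; in the real caller the deltas come from the Anthropic Go SDK's streaming events.

```go
package main

import (
	"fmt"
	"strings"
)

// OnStream is invoked once per text delta as tokens stream in.
// (Hypothetical name; the real caller wires this to SDK stream events.)
type OnStream func(delta string)

// streamCompletion stands in for the SDK call: it walks a sequence of
// content deltas, forwards each one to the callback, and returns the
// fully accumulated text.
func streamCompletion(deltas []string, onStream OnStream) string {
	var full strings.Builder
	for _, d := range deltas {
		onStream(d) // deliver the partial chunk to the consumer
		full.WriteString(d)
	}
	return full.String()
}

func main() {
	// Deltas as they might arrive from a streaming Messages response.
	deltas := []string{"Hello", ", ", "world", "!"}
	var received []string
	text := streamCompletion(deltas, func(d string) {
		received = append(received, d)
	})
	fmt.Println(text)          // Hello, world!
	fmt.Println(len(received)) // 4
}
```

The point of the pattern is that consumers see partial output immediately while the caller still returns the complete message at the end.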

Vault Credential

| Key | Description |
| --- | --- |
| key | Anthropic API key from console.anthropic.com |

Setup

1. Get an Anthropic API key

   Sign in at console.anthropic.com → API Keys → Create Key.

2. Add to Rapida vault

   In the Rapida dashboard → Credentials → Create Credential, select provider Anthropic, enter key = your API key.

3. Configure the assistant LLM

   In the assistant settings → LLM Provider, select Anthropic and set model parameters:
{
  "model.name": "claude-sonnet-4-6",
  "model.max_tokens": 1024,
  "model.temperature": 0.7
}
model.max_tokens is required for Anthropic. The API will return an error if it is not set.
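Because the Anthropic API rejects requests without max_tokens, it is worth validating the parameters before the call is made. The sketch below shows one way to do that; the `ModelParams` struct and field names are illustrative, not Rapida's actual types.

```go
package main

import (
	"errors"
	"fmt"
)

// ModelParams mirrors the flattened model.* keys from the assistant config.
// (Field names here are illustrative, not Rapida's actual struct.)
type ModelParams struct {
	Name        string
	MaxTokens   int
	Temperature float64
}

// validate enforces the Anthropic-specific rule: max_tokens must be set,
// since the API has no default and returns an error without it.
func validate(p ModelParams) error {
	if p.Name == "" {
		return errors.New("model.name is required")
	}
	if p.MaxTokens <= 0 {
		return errors.New("model.max_tokens is required for Anthropic (no default)")
	}
	return nil
}

func main() {
	ok := ModelParams{Name: "claude-sonnet-4-6", MaxTokens: 1024, Temperature: 0.7}
	fmt.Println(validate(ok)) // <nil>

	missing := ModelParams{Name: "claude-sonnet-4-6"}
	fmt.Println(validate(missing)) // model.max_tokens is required for Anthropic (no default)
}
```

Failing fast here turns a remote API error into a clear local configuration error.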

Supported Models

| Model | Context | Notes |
| --- | --- | --- |
| claude-opus-4-6 | 200k | Most capable |
| claude-sonnet-4-6 | 200k | Best speed/capability balance (recommended) |
| claude-haiku-4-5-20251001 | 200k | Fastest, lowest cost |

Model Parameters

KeySupportedNotes
model.nameRequired
model.max_tokensRequired — no default
model.temperature0.0–1.0
model.top_p
model.top_kAnthropic-specific
model.stopArray of stop sequences
model.thinkingExtended thinking mode
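For reference, a fuller parameter block using stop sequences and extended thinking might look like the sketch below. The value shape for model.thinking is an assumption mirrored from Anthropic's Messages API thinking parameter (a type plus a budget_tokens field, where budget_tokens must be at least 1024 and less than max_tokens); whether Rapida accepts a nested object under this key is not confirmed by this document.

```json
{
  "model.name": "claude-sonnet-4-6",
  "model.max_tokens": 2048,
  "model.stop": ["\n\nHuman:"],
  "model.thinking": { "type": "enabled", "budget_tokens": 1024 }
}
```

Note that the Anthropic API restricts sampling parameters such as temperature and top_k when extended thinking is enabled, so they are omitted here.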