
Overview

The integration-api is the provider execution layer. It sits between assistant-api and every external AI provider — OpenAI, Anthropic, Deepgram, ElevenLabs, and others. It stores all provider credentials encrypted at rest and is the only service in the platform that ever holds or transmits plaintext API keys.

| | |
|---|---|
| **Port** | 9004 (HTTP and gRPC multiplexed via cmux) |
| **Language** | Go 1.25 · Gin (REST) + gRPC |
| **Storage** | PostgreSQL `integration_db` · Redis (provider cache) |
The integration-api is the only service that decrypts and uses provider API keys. Keys are decrypted in-memory per request and never written to logs, forwarded to other services, or stored in plaintext anywhere on disk.

Components

Each external provider is implemented as a Go package under `api/integration-api/internal/caller/<provider>/`. Every package follows a consistent structure:

| File | Purpose |
|---|---|
| `<provider>.go` | Client initialization, credential binding |
| `llm.go` | LLM caller: streaming token inference |
| `embedding.go` | Embedding model invocation (where supported) |
| `verify-credential.go` | Pre-storage credential validation |

The `caller/callers.go` factory registers all providers and routes execution to the correct implementation based on the credential type stored in `integration_db`.

**Adding a new LLM provider:** create `api/integration-api/internal/caller/<provider>/` with the files above, then register it in `callers.go`. No changes to other services are needed.
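The registration-and-routing pattern described above can be sketched as a registry keyed by credential type. This is an illustrative sketch only; the `Caller` interface, `Register`, and `ForCredential` names are assumptions, not the actual `callers.go` API.

```go
package main

import (
	"errors"
	"fmt"
)

// Caller is a hypothetical provider interface; the real one lives under
// api/integration-api/internal/caller.
type Caller interface {
	Name() string
}

// registry maps a credential type (as stored in integration_db) to a factory.
var registry = map[string]func() Caller{}

// Register is called once per provider package, typically from callers.go.
func Register(credentialType string, factory func() Caller) {
	registry[credentialType] = factory
}

// ForCredential routes execution to the implementation registered for the
// credential type attached to the stored credential.
func ForCredential(credentialType string) (Caller, error) {
	factory, ok := registry[credentialType]
	if !ok {
		return nil, errors.New("unknown provider: " + credentialType)
	}
	return factory(), nil
}

type openAICaller struct{}

func (openAICaller) Name() string { return "openai" }

func main() {
	Register("openai", func() Caller { return openAICaller{} })
	c, err := ForCredential("openai")
	if err != nil {
		panic(err)
	}
	fmt.Println(c.Name())
}
```

Because the registry is the only routing point, a new provider package only needs one `Register` call to become reachable from the rest of the platform.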
Credentials follow a strict encrypt-on-write, decrypt-on-use lifecycle: each credential is encrypted with AES-256-GCM (keyed by `INTEGRATION_CRYPTO_KEY`) before it is written to `integration_db`, and decrypted only in memory at the moment a provider call is made.
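The lifecycle can be sketched with Go's standard `crypto/aes` and `crypto/cipher` packages. This is a minimal illustration under the AES-256-GCM scheme named above, not the service's actual code; the `encrypt`/`decrypt` names are hypothetical.

```go
package main

import (
	"crypto/aes"
	"crypto/cipher"
	"crypto/rand"
	"fmt"
	"io"
)

// encrypt seals a plaintext API key with AES-256-GCM (encrypt-on-write).
// A fresh random nonce is prepended to the ciphertext so decrypt can recover it.
func encrypt(key, plaintext []byte) ([]byte, error) {
	block, err := aes.NewCipher(key) // key must be 32 bytes for AES-256
	if err != nil {
		return nil, err
	}
	gcm, err := cipher.NewGCM(block)
	if err != nil {
		return nil, err
	}
	nonce := make([]byte, gcm.NonceSize())
	if _, err := io.ReadFull(rand.Reader, nonce); err != nil {
		return nil, err
	}
	return gcm.Seal(nonce, nonce, plaintext, nil), nil
}

// decrypt opens ciphertext produced by encrypt (decrypt-on-use). GCM
// authenticates the ciphertext, so tampering or a wrong key yields an
// error rather than garbage plaintext.
func decrypt(key, data []byte) ([]byte, error) {
	block, err := aes.NewCipher(key)
	if err != nil {
		return nil, err
	}
	gcm, err := cipher.NewGCM(block)
	if err != nil {
		return nil, err
	}
	nonce, ciphertext := data[:gcm.NonceSize()], data[gcm.NonceSize():]
	return gcm.Open(nil, nonce, ciphertext, nil)
}

func main() {
	key := make([]byte, 32) // in the real service this comes from INTEGRATION_CRYPTO_KEY
	io.ReadFull(rand.Reader, key)
	sealed, _ := encrypt(key, []byte("sk-example-api-key"))
	opened, _ := decrypt(key, sealed)
	fmt.Println(string(opened))
}
```

The decrypted value lives only in the request's memory; it is never logged or persisted.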
For providers that use OAuth (e.g., Google, GitHub), integration-api manages the full OAuth flow: redirect, callback, token storage, and automatic refresh.
| OAuth setting | Variable |
|---|---|
| Callback URL | `OAUTH_CALLBACK_URL` |
| Google client | `GOOGLE_OAUTH_CLIENT_ID`, `GOOGLE_OAUTH_CLIENT_SECRET` |
| GitHub client | `GITHUB_OAUTH_CLIENT_ID`, `GITHUB_OAUTH_CLIENT_SECRET` |

Supported Providers

| Provider | Notes |
|---|---|
| OpenAI | GPT-4o, GPT-4, GPT-3.5 · Function calling · Streaming |
| Anthropic | Claude 3.5 Sonnet, Claude 3 · Tool use · Streaming |
| Google Gemini | Gemini Pro · Flash · Streaming |
| Google Vertex AI | Enterprise Gemini deployment |
| Azure OpenAI | Enterprise GPT deployment with custom endpoint |
| AWS Bedrock | Llama, Titan, Mistral via AWS |
| Cohere | Command R+ · Streaming |
| Mistral | Mistral Large · Small · Streaming |
| HuggingFace | Inference API |
| Replicate | Model hosting via Replicate API |
| VoyageAI | Embeddings and reranking |

Configuration

Edit docker/integration-api/.integration.env before starting the service.

Required variables

| Variable | Required | Default | Description |
|---|---|---|---|
| `SECRET` | ✅ Yes | `rpd_pks` | JWT signing secret; must match all services |
| `POSTGRES__HOST` | ✅ Yes | `postgres` | PostgreSQL host |
| `POSTGRES__DB_NAME` | ✅ Yes | `integration_db` | Database name |
| `POSTGRES__AUTH__USER` | ✅ Yes | `rapida_user` | Database user |
| `POSTGRES__AUTH__PASSWORD` | ✅ Yes | (none) | Database password |
| `REDIS__HOST` | ✅ Yes | `redis` | Redis host |
| `INTEGRATION_CRYPTO_KEY` | ✅ Yes | (none) | AES-256-GCM key for credential encryption |
| `WEB_HOST` | ✅ Yes | `web-api:9001` | web-api gRPC address |

Optional OAuth variables

| Variable | Required | Description |
|---|---|---|
| `OAUTH_CALLBACK_URL` | No | OAuth redirect URI |
| `GOOGLE_OAUTH_CLIENT_ID` | No | Google OAuth app client ID |
| `GOOGLE_OAUTH_CLIENT_SECRET` | No | Google OAuth app client secret |
| `GITHUB_OAUTH_CLIENT_ID` | No | GitHub OAuth app client ID |
| `GITHUB_OAUTH_CLIENT_SECRET` | No | GitHub OAuth app client secret |

Full environment file

```bash
# ── Service identity ──────────────────────────────────────────────
SERVICE_NAME=integration-api
HOST=0.0.0.0
PORT=9004
LOG_LEVEL=debug
SECRET=rpd_pks
ENV=development

# ── PostgreSQL ────────────────────────────────────────────────────
POSTGRES__HOST=postgres
POSTGRES__PORT=5432
POSTGRES__DB_NAME=integration_db
POSTGRES__AUTH__USER=rapida_user
POSTGRES__AUTH__PASSWORD=rapida_db_password
POSTGRES__MAX_OPEN_CONNECTION=10
POSTGRES__SSL_MODE=disable

# ── Redis ─────────────────────────────────────────────────────────
REDIS__HOST=redis
REDIS__PORT=6379

# ── Credential encryption ─────────────────────────────────────────
# Required: AES-256-GCM key for encrypting stored provider credentials
# Generate: openssl rand -hex 32
INTEGRATION_CRYPTO_KEY=your_32_char_encryption_key_here

# ── Internal service addresses ────────────────────────────────────
WEB_HOST=web-api:9001
```
INTEGRATION_CRYPTO_KEY protects all stored provider credentials. Store it in a secret manager (AWS Secrets Manager, HashiCorp Vault, Kubernetes Secrets) — never commit it to version control. If this key is rotated or lost, all stored credentials must be re-entered, as the ciphertext becomes unreadable.

Running

```shell
# Start integration-api and its dependencies
make up-integration

# Follow logs
make logs-integration

# Rebuild after code changes
make rebuild-integration
```

Health & Observability

| Endpoint | Purpose |
|---|---|
| `GET /readiness/` | Reports whether the service is ready (DB + Redis connected) |
| `GET /healthz/` | Liveness probe |

```shell
curl http://localhost:9004/readiness/
```

Troubleshooting

  • Verify the API key has the correct permissions for your account tier.
  • Check the provider’s status page for outages.
  • Confirm INTEGRATION_CRYPTO_KEY has not changed since the credential was stored.
  • Check make logs-integration for provider-side timeout errors.
  • Increase POSTGRES__MAX_OPEN_CONNECTION if database contention is visible.
  • For Azure OpenAI: confirm the deployment name in the credential matches the actual Azure deployment.
If `INTEGRATION_CRYPTO_KEY` has changed between restarts, credentials encrypted with the old key can no longer be decrypted. Set the key back to its original value, or re-enter all provider credentials through the dashboard.
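This failure mode is deterministic: AES-GCM authenticates its ciphertext, so decrypting with a rotated key returns an authentication error rather than corrupted plaintext. A minimal sketch (the `sealWith`/`openWith` helpers are illustrative, and the zero nonce is for demonstration only):

```go
package main

import (
	"crypto/aes"
	"crypto/cipher"
	"fmt"
)

// sealWith encrypts with AES-256-GCM using a zero nonce.
// Demo only: real code must use a fresh random nonce per encryption.
func sealWith(key, plaintext []byte) []byte {
	block, _ := aes.NewCipher(key)
	gcm, _ := cipher.NewGCM(block)
	return gcm.Seal(nil, make([]byte, gcm.NonceSize()), plaintext, nil)
}

// openWith attempts decryption; a wrong key fails GCM authentication.
func openWith(key, ciphertext []byte) ([]byte, error) {
	block, err := aes.NewCipher(key)
	if err != nil {
		return nil, err
	}
	gcm, err := cipher.NewGCM(block)
	if err != nil {
		return nil, err
	}
	return gcm.Open(nil, make([]byte, gcm.NonceSize()), ciphertext, nil)
}

func main() {
	oldKey := make([]byte, 32)
	newKey := make([]byte, 32)
	newKey[0] = 1 // simulate a rotated INTEGRATION_CRYPTO_KEY

	ciphertext := sealWith(oldKey, []byte("provider-api-key"))
	if _, err := openWith(newKey, ciphertext); err != nil {
		fmt.Println("rotated key:", err) // decryption refused, not garbled
	}
	plaintext, _ := openWith(oldKey, ciphertext)
	fmt.Println("original key:", string(plaintext))
}
```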

Next Steps