Adding a new provider comes down to implementing the `LargeLanguageCaller` interface and registering it in the router.
## Directory Structure
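A sketch of the expected layout, using a hypothetical provider named `acme`; the existing providers in the reference table below follow the same pattern, and the exact root path may differ in your checkout:

```
internal/caller/
├── callers.go        # factory switch -- register new providers here
├── openai/
│   └── llm.go
├── anthropic/
│   └── llm.go
└── acme/             # your new provider
    └── llm.go
```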
## Step 1 — Create the Provider Directory
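Concretely, creating the package might look like this; the provider name `acme` and the `internal/caller` path are assumptions, so match them to your repository:

```shell
# Create the provider package next to the existing ones (path assumed).
mkdir -p internal/caller/acme
touch internal/caller/acme/llm.go
```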
## Step 2 — Implement `LargeLanguageCaller`
Read `credentials["key"]` for the primary API key. Add additional keys (`endpoint`, `region`, etc.) as needed and document them in your vault credential.
## Step 3 — Register in the Router / Factory
Open `api/integration-api/internal/caller/callers.go` and add your provider to the factory switch:
## Step 4 — Rebuild
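A typical rebuild, assuming a standard Go module layout; substitute whatever build or Makefile targets your repository actually uses:

```shell
# From the repository root (paths and targets are assumptions).
go build ./...
go test ./internal/caller/...
```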
## Reference Implementations
| Provider | File | Pattern |
|---|---|---|
| OpenAI | `caller/openai/llm.go` | SSE streaming, tool calls, per-token `onStream` |
| Anthropic | `caller/anthropic/llm.go` | `max_tokens` required, thinking mode |
| Gemini | `caller/gemini/llm.go` | Google Gen AI SDK, content parts |
| Azure | `caller/azure/llm.go` | Same as OpenAI, but with a custom endpoint |