Bring Your Own LLM for AgentFlow™ at a glance
Learn how Bring Your Own LLM (BYO LLM) lets your organization use its own LLM endpoints with Reltio AgentFlow instead of the default Reltio-managed LLM.
Bring Your Own LLM (BYO LLM) allows your organization to use its own large language model endpoints with AgentFlow instead of the default Reltio-managed LLM. When BYO LLM is enabled for your tenant, all agent conversations are routed through your organization's LLM infrastructure, giving you full control over model selection, data residency, and compliance.
BYO LLM is configured by the Reltio team on your behalf: you request the setup and provide the necessary credentials, and the Reltio team handles configuration, validation, and activation.
How it works
When BYO LLM is enabled, all agent LLM calls route exclusively through your organization's endpoints. Reltio-managed models are not used.
Setup is initiated through a Reltio support request. After you provide credentials and endpoint details, the Reltio team configures, validates, and activates the integration. Once active, System Administrators can view the LLM configurations in AgentFlow, and Data Stewards and Data Product Owners can select from the available models when starting a new conversation.
Key behaviors
The following behaviors apply when BYO LLM is enabled for your tenant:
- No data flows to Reltio-managed LLMs: All agent LLM calls are routed exclusively to your organization's endpoints.
- The admin view is read-only: System Administrators can see the current LLM configurations and their status, but cannot modify them.
- Model selection is per-conversation: Data Stewards and Data Product Owners select an LLM at the start of each new conversation. Once the first message is sent, the selection is locked for the lifetime of that conversation.
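The per-conversation selection behavior above can be pictured with a small sketch. This is purely illustrative: the `Conversation` class and its methods are hypothetical and not part of any Reltio API; it only models the rule that a model choice is locked once the first message is sent.

```python
# Hypothetical illustration of per-conversation model selection.
# Not a Reltio API -- class and method names are invented for this sketch.

class Conversation:
    def __init__(self, available_models):
        # Models exposed by the organization's BYO LLM configuration
        self.available_models = available_models
        self.model = None
        self.locked = False

    def select_model(self, model):
        # Selection is only allowed before the first message is sent
        if self.locked:
            raise RuntimeError("Model selection is locked for this conversation")
        if model not in self.available_models:
            raise ValueError(f"Unknown model: {model}")
        self.model = model

    def send_message(self, text):
        if self.model is None:
            raise RuntimeError("Select a model before sending the first message")
        self.locked = True  # first message locks the selection
        return f"[{self.model}] {text}"


conv = Conversation(["org-model-a", "org-model-b"])
conv.select_model("org-model-a")
conv.send_message("Hello")       # locks the selection
# conv.select_model("org-model-b")  # would now raise RuntimeError
```

Starting a new conversation creates a fresh, unlocked selection, which is why a different model can be chosen for each conversation but never mid-conversation.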
Who uses this feature
| Role | What this covers |
|---|---|
| System Administrator | How to request BYO LLM setup, what credentials to provide, and how to view LLM configurations. For more information, see Request BYO LLM setup and view your LLM configuration. |
| Data Steward or Data Product Owner | How to select a model when starting a conversation with an agent. For more information, see Select a model in a conversation. |