Store secrets in Data Pipeline Hub (Azure)
Learn how to store Azure authentication secrets for the Snowflake adapter using Data Pipeline Hub APIs.
To connect the Snowflake adapter to Azure storage securely, you must store the required credentials using the Data Pipeline Hub (DPH) API. This ensures your staging container is accessible to Snowflake without embedding secrets in configuration files. This topic shows how to store Azure authentication secrets for different supported auth methods.
Prerequisites
Before you begin:
- Your tenant must be provisioned with the Reltio Data Pipeline for Snowflake.
- You must have created and obtained the appropriate Azure credentials:
  - Storage account key (for KEYS auth)
  - SAS token (for TOKEN auth)
  - Client secret (for CLIENT_CREDENTIALS auth)
- You must have the following values ready:
- Tenant ID
- Adapter name (shown on the Console > Data Pipelines tile)
- DPH hub URL, in the format <env>-data-pipeline-hub.reltio.com
Store Azure secrets using the DPH API
To store Azure secrets for the Snowflake adapter:
- Choose the authentication method used to access the Azure Blob Storage container.
- Use a PUT request to send the secrets to the DPH Snowflake adapter secrets endpoint:

  PUT https://<hub-url>/api/tenants/<tenantId>/adapters/<adapterName>/secrets

  Replace the placeholders:
  - <hub-url>: Your DPH environment, such as us-east-data-pipeline-hub.reltio.com
  - <tenantId>: Your Reltio tenant ID
  - <adapterName>: The pipeline adapter name, such as snowflake or snowflake-prod
- Construct the JSON payload using the applicable credential fields:

  {
    "Azure": {
      "accountKey": "<Account Key>",
      "sasToken": "<SAS Token>",
      "clientSecret": "<Client Secret>"
    }
  }

  Include only the fields that match your configured authMethod.
- Verify that the request returns a 200 OK response.
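The steps above can be sketched in Python. This is an illustrative helper only, not part of any Reltio SDK: the hub URL, tenant ID, adapter name, and the build_secrets_request function are all placeholder assumptions. It builds the endpoint URL and a payload containing only the field for the chosen authMethod; actually sending the PUT (with your Reltio authorization header) is left to your HTTP client of choice.

```python
import json

# Hypothetical mapping from authMethod to the single payload field it requires.
FIELD_FOR_METHOD = {
    "KEYS": "accountKey",
    "TOKEN": "sasToken",
    "CLIENT_CREDENTIALS": "clientSecret",
}

def build_secrets_request(hub_url, tenant_id, adapter_name, auth_method, secret_value):
    """Return the PUT URL and JSON body for the DPH secrets endpoint."""
    url = f"https://{hub_url}/api/tenants/{tenant_id}/adapters/{adapter_name}/secrets"
    # Include only the credential field that matches the configured authMethod.
    body = json.dumps({"Azure": {FIELD_FOR_METHOD[auth_method]: secret_value}})
    return url, body

# Placeholder values for illustration; substitute your own tenant details.
url, body = build_secrets_request(
    "us-east-data-pipeline-hub.reltio.com",
    "myTenant",
    "snowflake",
    "TOKEN",
    "<SAS Token>",
)
print(url)   # the endpoint to PUT to
print(body)  # the JSON payload, e.g. {"Azure": {"sasToken": "<SAS Token>"}}
```

Keeping the payload to a single field mirrors the guidance above: the adapter reads only the credential matching its configured authMethod, so sending unused fields adds nothing.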
Result
The secrets are securely stored in the Reltio Secret Manager and used by the Snowflake adapter at runtime to authenticate with your Azure storage account.