Unify and manage your data

Store secrets in Data Pipeline Hub (AWS)

Learn how to store the secrets in the Data Pipeline Hub for AWS.

Before you store secrets, you need to have confirmation from Reltio that your tenant is provisioned with the Data Pipeline for Databricks.
The Data Pipeline Hub (DPH) needs to access the Staging container and the Databricks instance.
To store the secrets in DPH:
Send the secrets to the DPH Delta Lake adapter secrets endpoint:
PUT {{hub-url}}/api/tenants/{{tenantID}}/adapters/{{adapterName}}/secrets
Where hub-url is of the form <env>-data-pipeline-hub.reltio.com and adapterName is the name of the pipeline instance (the name on the corresponding tile in Console Data Pipelines). The request body contains the secrets in this format:

{
  "AWS": {
    "awsAccessKey": "<Access Key ID>",
    "awsSecretKey": "<Secret Access Key>"
  },
  "Databricks": {
    "token": "<serviceprincipalstoken>"
  }
}
Note: awsAccessKey and awsSecretKey are not required when you use an IAM role to authenticate access to the Staging bucket.
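The call above can be sketched as follows. This is a minimal illustration, not the official client: the tenant ID, adapter name, and environment prefix are hypothetical placeholders, and you would substitute your own values and credentials before sending the request.

```python
import json

# Hypothetical values -- replace with your environment's details.
hub_url = "https://dev-data-pipeline-hub.reltio.com"  # <env>-data-pipeline-hub.reltio.com
tenant_id = "myTenant"        # your Reltio tenant ID
adapter_name = "myAdapter"    # tile name shown in Console Data Pipelines

# Secrets payload in the format described above. Omit the AWS keys
# if you authenticate to the Staging bucket with an IAM role.
secrets = {
    "AWS": {
        "awsAccessKey": "<Access Key ID>",
        "awsSecretKey": "<Secret Access Key>",
    },
    "Databricks": {
        "token": "<serviceprincipalstoken>",
    },
}

endpoint = f"{hub_url}/api/tenants/{tenant_id}/adapters/{adapter_name}/secrets"
body = json.dumps(secrets)

# To actually send the PUT (requires the `requests` package and a valid
# authorization token for the hub), something like:
# requests.put(endpoint,
#              data=body,
#              headers={"Authorization": "Bearer <token>",
#                       "Content-Type": "application/json"})
print(endpoint)
```

Any HTTP client (curl, Postman, and so on) works equally well; the essential parts are the PUT method, the endpoint path, and the JSON body shown above.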

For more details, see the API reference.

After your tenant has been provisioned, you're ready to Validate and sync Reltio Data Pipeline for Databricks with AWS.