Configure the Reltio Data Pipeline for Databricks for Azure

Learn how to configure Databricks to receive data from your Reltio tenant in the Azure cloud.

Ready to move your data with your Reltio Data Pipeline for Databricks? Configure the pipeline to keep your Delta Lake tables and views in sync with your Reltio data model.

Before you start

Before you start configuring the Reltio Data Pipeline for Databricks, ensure you have the necessary permissions and information at hand. You may find it helpful to keep this page open for easy reference.

Table 1. Prerequisites for configuring Reltio Data Pipeline for Databricks for Azure

| Prerequisite | Required information | Your details |
| --- | --- | --- |
| Configure Azure cloud storage for Databricks. The service requires the object storage to be publicly accessible over the internet. | Azure storage account management permissions | You are an Azure administrator, OR you ask your Azure administrator to perform these tasks. |
| Integrate Azure cloud storage with Databricks | Azure storage account management permissions | You are an Azure administrator, OR you ask your Azure administrator to perform these tasks. |
| | Databricks account administrator permissions | You've been assigned the Workspace admin role (to manage permissions and tokens) or a role that contains it, OR you ask your Databricks administrator to perform these tasks. |
| | Databricks Unity Catalog (when used) | One catalog (to create the schema); the service principal token must have catalog-level permissions to create the schema via the pipeline. |
| Configure the Reltio Data Pipeline for Databricks | Reltio tenant | Tenant Environment Name, Tenant ID |
| | Support request | Reltio Data Pipeline configuration request for Databricks |
| Validate and sync with the Reltio Data Pipeline for Databricks for Azure | Reltio administrator permissions | You have one of these roles: Reltio Customer Administrator, Reltio Tenant Administrator, or Reltio user with the role ROLE_DATA_PIPELINE_ADMIN; OR you ask your Reltio administrator to perform these tasks. |
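
Before moving on, it can help to confirm the Azure side of these permissions from a terminal. A minimal check with the Azure CLI, assuming you're already signed in (the account name below is a placeholder, not a value from this guide):

```shell
# List the Azure roles assigned to you; look for a role that can manage
# storage accounts (for example, Owner or Contributor on the subscription).
az role assignment list --assignee "you@example.com" --output table

# Confirm which subscription you're operating in.
az account show --query "{name:name, subscriptionId:id}" --output table
```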

Take note

As you work through the configuration, make a note of the values you'll need in later steps and stages. You may find it helpful to make a copy of this page and record your information as you go along.

Table 2. Information needed while configuring Reltio Data Pipeline for Databricks for Azure

| Stage/section | Entry field | Your details |
| --- | --- | --- |
| Determine mode for running pipeline | Mode of Delta Live Tables pipeline | |
| Configure Azure cloud storage for Databricks | | |
| Create an Azure resource group | Resource group | |
| Create an Azure storage account | Storage account name | |
| Create an Azure storage account container for Staging with a lifecycle rule | Staging Storage container name | |
| Create an Azure storage account container for Target | Target Storage container name | |
| Get the Azure subscription ID under which all the storage accounts are created | subscriptionId | |
| Configure Azure Event Notification for Staging Container (required only if using File Notification mode) | | |
| Create an Azure storage account queue | Storage queue URL | |
| Create an Azure event grid subscription | Event subscription name | |
| Permission setup for Data Pipeline Hub service | | |
| If auth method is Client Credentials | Application display name, Application (client) ID, Directory (tenant) ID, Client secret | |
| If auth method is SAS Token | SAS Tokens | |
| If auth method is Keys | Storage Account Key | |
| Create Azure access controls for Staging | Staging Custom role name | |
| Permission setup for Databricks | | |
| Databricks host URL | Databricks URL | |
| Create secret scope and add keys to Databricks | Secrets Scope Name, Secrets Key Name for Staging Container, Secrets Key Name for Target Container | |
| Manage service principals (a Databricks guide) | Service Principal Token | |
| Create Azure access controls for Target container | Target Custom role name | |
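
The Azure storage and event-notification stages above can be sketched as Azure CLI commands. This is an outline only, not the official procedure: every name below (rg-reltio, streltio, the container and queue names) is a hypothetical placeholder, and the lifecycle policy JSON is assumed to exist locally as lifecycle.json.

```shell
# Resource group and storage account (placeholder names and region).
az group create --name rg-reltio --location eastus
az storage account create --name streltio --resource-group rg-reltio \
  --location eastus --sku Standard_LRS

# Staging and Target containers.
az storage container create --name staging --account-name streltio
az storage container create --name target --account-name streltio

# Lifecycle rule for the Staging container, from a local policy file.
az storage account management-policy create --account-name streltio \
  --resource-group rg-reltio --policy @lifecycle.json

# Subscription ID under which the storage accounts are created.
az account show --query id --output tsv

# Only for File Notification mode: storage queue plus an Event Grid
# subscription that routes BlobCreated events from Staging to the queue.
az storage queue create --name reltio-staging-events --account-name streltio
az eventgrid event-subscription create \
  --name reltio-staging-sub \
  --source-resource-id "/subscriptions/<subscriptionId>/resourceGroups/rg-reltio/providers/Microsoft.Storage/storageAccounts/streltio" \
  --endpoint-type storagequeue \
  --endpoint "/subscriptions/<subscriptionId>/resourceGroups/rg-reltio/providers/Microsoft.Storage/storageAccounts/streltio/queueservices/default/queues/reltio-staging-events" \
  --included-event-types Microsoft.Storage.BlobCreated
```

Each command produces one of the values Table 2 asks you to record: the resource group, storage account name, container names, subscription ID, queue URL, and event subscription name.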
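
Likewise, the Databricks secret-scope rows can be sketched with the Databricks CLI (legacy syntax shown; the newer unified CLI uses slightly different subcommands). The scope and key names here are placeholders for the Secrets Scope Name and Secrets Key Names you record in Table 2.

```shell
# Create a secret scope, then add one key each for the Staging and
# Target container credentials (the CLI prompts for the secret values).
databricks secrets create-scope --scope reltio-pipeline
databricks secrets put --scope reltio-pipeline --key staging-container-key
databricks secrets put --scope reltio-pipeline --key target-container-key

# Confirm the scope and keys exist.
databricks secrets list-scopes
databricks secrets list --scope reltio-pipeline
```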