
Configure the Reltio Data Pipeline for Databricks for GCP

Learn how to configure your GCP environment to receive data from your Reltio tenant.

Ready to move your data with your Reltio Data Pipeline for Databricks? Configure the pipeline to keep your Delta Lake tables and views in sync with your Reltio data model.


Before you start

Before you start configuring the Reltio Data Pipeline for Databricks, ensure you have the necessary permissions and information at hand. You may find it helpful to keep this page open for easy reference.

Table 1. Prerequisites for configuring the Reltio Data Pipeline for Databricks for GCP

Prerequisite: Configure Google Cloud Storage for Databricks
Note: The service requires the object storage to be publicly accessible over the internet.
Required information: GCP storage account management permissions.
Your details: You are a GCP administrator, or you ask your GCP administrator to perform these tasks (a scripted setup sketch follows this entry).
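If you prefer to script the storage setup rather than use the console, a minimal sketch with the google-cloud-storage Python client is shown below. The project, bucket, region, and service account names are placeholders, not values from this guide; substitute your own.

```python
from google.cloud import storage

# Placeholders: substitute your own project, bucket, and service account.
PROJECT = "my-gcp-project"
BUCKET = "reltio-pipeline-bucket"
SERVICE_ACCOUNT = "databricks-sa@my-gcp-project.iam.gserviceaccount.com"

client = storage.Client(project=PROJECT)

# Create the bucket that will receive data from the pipeline.
bucket = client.create_bucket(BUCKET, location="us-central1")

# Grant the service account object-management rights on the bucket.
policy = bucket.get_iam_policy(requested_policy_version=3)
policy.bindings.append({
    "role": "roles/storage.objectAdmin",
    "members": {f"serviceAccount:{SERVICE_ACCOUNT}"},
})
bucket.set_iam_policy(policy)
```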

Prerequisite: Integrate GCP Cloud Storage with Databricks
Required information and your details:
  • GCP storage account management permissions: you are a GCP administrator, or you ask your GCP administrator to perform these tasks.
  • Databricks account administrator permissions: you've been assigned the Workspace admin role (to manage permissions and tokens), or a role that contains it, or you ask your Databricks administrator to perform these tasks.
  • Databricks Unity Catalog (when used): one catalog (in which to create the schema); the service principal token must have catalog-level permissions to create the schema via the pipeline (see the grant sketch after this entry).
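Where Unity Catalog is used, the catalog-level privileges for the pipeline's service principal can be granted with SQL from a Databricks notebook. A minimal sketch follows; the catalog and principal names are placeholders for your own values.

```python
# Run in a Databricks notebook attached to Unity Catalog-enabled compute.
# Placeholders: substitute your catalog name and the pipeline's service principal.
catalog = "reltio_pipeline"
principal = "reltio-pipeline-sp"

# The service principal token must be able to use the catalog and create the schema.
spark.sql(f"GRANT USE CATALOG ON CATALOG {catalog} TO `{principal}`")
spark.sql(f"GRANT CREATE SCHEMA ON CATALOG {catalog} TO `{principal}`")
```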
Prerequisite: Configure the Reltio Data Pipeline for Databricks
Required information:
  • Reltio tenant: Tenant Environment Name and Tenant ID (see the URL sketch after this entry).
  • Support request: a Reltio Data Pipeline configuration request for Databricks.
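The Tenant Environment Name and Tenant ID you gather here are the same values that typically appear in your Reltio REST API URLs, which is one way to confirm them. A sketch with placeholder values; verify the exact host pattern with your Reltio administrator.

```python
# Placeholders: substitute your environment name and tenant ID.
environment = "your-environment"   # Tenant Environment Name
tenant_id = "yourTenantId"         # Tenant ID

# Reltio REST API URLs commonly follow this pattern (confirm for your tenant):
base_url = f"https://{environment}.reltio.com/reltio/api/{tenant_id}"
print(base_url)
```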
Prerequisite: Validate and sync with the Reltio Data Pipeline for Databricks for GCP
Required information: Reltio administrator permissions.
Your details: You have one of these roles:
  • Reltio Customer Administrator
  • Reltio Tenant Administrator
  • Reltio user with the ROLE_DATA_PIPELINE_ADMIN role
or you ask your Reltio administrator to perform these tasks (a validation sketch follows this entry).
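After the pipeline is configured and a sync has run, you can spot-check the result from a Databricks notebook. A minimal sketch is below; the catalog, schema, and table names are placeholders, so use the names from your own pipeline configuration.

```python
# Placeholders: substitute the catalog and schema your pipeline writes to.
schema = "reltio_pipeline.reltio_data"

# Confirm the pipeline created the expected Delta tables and views.
spark.sql(f"SHOW TABLES IN {schema}").show(truncate=False)

# Compare row counts against your Reltio entity counts as a rough sync check.
# "entities" is a placeholder table name for illustration only.
spark.sql(f"SELECT COUNT(*) AS entity_count FROM {schema}.entities").show()
```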