Configure the Reltio Data Pipeline for Databricks for GCP
Learn how to configure your GCP to receive data from your Reltio tenant.
Ready to move your data with your Reltio Data Pipeline for Databricks? Configure the pipeline to keep your Delta Lake tables and views in sync with your Reltio data model.
- Configure Databricks pipeline for GCP using Console UI - simpler UI-based configuration with automated steps.
- Configure Databricks pipeline for GCP using APIs (check back later for details) - API-based configuration with many manual steps.
Before you start
Before you start configuring the Reltio Data Pipeline for Databricks, ensure you have the necessary permissions and information at hand. You may find it helpful to keep this page open for easy reference.
| Prerequisite | Required information | Your details |
|---|---|---|
| **Configure Google Cloud Storage for Databricks** | | |
| The service requires the object storage to be publicly accessible over the internet. | | |
| GCP storage account management permissions | You are a GCP administrator, OR ask your GCP administrator to perform these tasks | |
| **Integrate GCP Cloud Storage with Databricks** | | |
| GCP storage account management permissions | You are a GCP administrator, OR ask your GCP administrator to perform these tasks | |
| Databricks account administrator permissions | You've been assigned these roles, OR you've been assigned a role that contains these roles, OR ask your Databricks administrator to perform these tasks | |
| Databricks Unity Catalog (when used) | | |
| **Configure the Reltio Data Pipeline for Databricks** | | |
| Reltio tenant | Tenant Environment Name | |
| | Tenant ID | |
| Support request | Reltio Data Pipeline configuration request for Databricks | |
| **Validate and sync with the Reltio Data Pipeline for Databricks for GCP** | | |
| Reltio administrator permissions | You have one of these roles, OR ask your Reltio administrator to perform these tasks | |
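The first prerequisite above notes that the object storage must be publicly accessible over the internet. As a rough sketch of what that setup and check might look like with the standard Google Cloud CLI tools, you could grant public read access and then verify an object over HTTPS. The bucket and object names below are placeholders, not values from this guide; confirm the exact access policy your organization allows before running them.

```shell
# Placeholder bucket name -- replace with your own bucket.
BUCKET=my-reltio-pipeline-bucket

# Grant public read access to objects in the bucket
# (requires GCP storage admin permissions on the project).
gsutil iam ch allUsers:objectViewer "gs://$BUCKET"

# Verify that an object is reachable over the public internet;
# an HTTP 200 response confirms public accessibility.
curl -I "https://storage.googleapis.com/$BUCKET/some-object"
```

If the `curl` check returns 403, the bucket is still private and the pipeline service will not be able to reach it.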