Configure a new GBQ pipeline in the Console

Learn how to use the new UI to configure a pipeline.

Before you navigate to the Console Cloud data warehouse pipeline application, review the prerequisites for configuring the GBQ pipeline in the topic Configure the Reltio GBQ Connector.

Before using the Reltio Data Pipeline on the Google BigQuery (GBQ) cloud platform, you must configure it in Reltio.

This topic describes how to configure this data pipeline using the Console Cloud data warehouse pipeline application. If you prefer to configure the pipeline using the Reltio REST APIs, see the topic Configure the Reltio GBQ Connector.
To configure a new GBQ pipeline:
  1. In the Console, select Cloud data warehouse pipeline. The Cloud data warehouse pipeline page lists any pipelines already configured.
  2. Select + NEW PIPELINE and then, from the displayed options, select Google BigQuery.
    The Configure Google BigQuery Pipeline page is displayed.

  3. In the Name field, enter a name for the pipeline; this name identifies the pipeline in the UI. The name must be alphanumeric, 3 to 20 characters long, and must not contain spaces.
  4. In the Authentication section, select Upload Service Account Key File to browse to and upload the key file for your service account.
    Note: Ensure that the required permissions have been assigned to your service account and that your GCP credentials are configured. For more information, see the topic Grant the roles permissions to Reltio’s or your GCP service account. To optionally verify the key file before you upload it, see the service account sketch after this procedure.
  5. Under the Project and Dataset details section, in the GCP project name field, enter the name of the GCP project that provides the permissions and dataset roles.
    1. In the GCP dataset name field, enter the name of the dataset in that GCP project. The connector delivers events from your Reltio tenant to this dataset. To create or inspect the dataset yourself, see the dataset sketch after this procedure.
    2. In the Location field, enter the location of the dataset.
  6. Under the Data delivery options section, configure the following:
    1. In the View Type field, select the type of view (Standard or Legacy) that you want to create for the data.
    2. In the Storage format field, select the type of storage format.
      Select JSON (recommended) if you want to store values in a single JSON column. Select Columns if you want to store values in separate columns. For an illustration of querying the JSON format, see the JSON query sketch after this procedure.
      Note: If you selected Legacy as the View Type, the Storage format defaults to Columns; the JSON format is not available for selection.
    3. Select Transmit OV values only if you want to sync only operational values (OV) to GBQ.
    4. Select Split Table if you want to split entities, relations, and interactions into separate GBQ tables based on their type.
    5. Select Flatten Attribute if you want to suppress the ID and isOV fields for entities and relations, so that only attribute values are stored in GBQ.
    6. From the Attribute Fields dropdown, select the attributes that you want to store in GBQ. This field is disabled if you selected the Flatten Attribute checkbox.
  7. Select Save & Create Tables/Views to save the GBQ configuration. The new configuration is saved and displayed on the Cloud data warehouse pipeline page.
  8. To manage the pipeline, select its More options (the three dots) menu.
  9. Select Recreate tables/views. In the dialog that is displayed:
    1. From the Types dropdown, select what you want to recreate: tables, views, or both.
    2. From the Specific field dropdown, select the field on which to base the recreation of the tables and views.
    3. Select the Force recreate the tables checkbox to recreate the tables based on your new selection even if they already exist. If you do not select this checkbox, tables and views are created only if they do not already exist.
    4. Select Recreate to recreate the tables and views.
  10. Select Re-sync data. In the dialog that is displayed:
    1. From the Data types dropdown, select the data types that you want to re-sync.
    2. Select Re-sync to re-sync your data. To confirm the resulting tables, views, and row counts, see the verification sketch after this procedure.
  11. Select Delete. A confirmation dialog is displayed:
    1. Select Delete to confirm. The pipeline is removed from the Cloud data warehouse pipeline page.
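
The sketches below are optional, illustrative checks you can run outside the Console. They are minimal Python examples that assume the google-cloud-bigquery package is installed; all project IDs, dataset names, table names, and file paths are placeholders that you must replace with your own values.

Service account sketch (step 4): a minimal check that the key file you plan to upload can authenticate and run a BigQuery job. It does not replace the role assignments described in Grant the roles permissions to Reltio’s or your GCP service account.

  from google.cloud import bigquery

  # Placeholder values; substitute your own GCP project and key file path.
  PROJECT_ID = "my-gcp-project"
  KEY_FILE = "service-account-key.json"

  # Authenticate with the same key file you plan to upload in the Console.
  client = bigquery.Client.from_service_account_json(KEY_FILE, project=PROJECT_ID)

  # A trivial query confirms the service account can run BigQuery jobs.
  print(list(client.query("SELECT 1 AS ok").result()))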
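
Dataset sketch (step 5): create the dataset in the desired location if it does not already exist, or confirm its location. Whether you or the connector creates the dataset depends on your setup; check the prerequisites in Configure the Reltio GBQ Connector.

  from google.cloud import bigquery

  # Placeholder names; substitute your own values.
  PROJECT_ID = "my-gcp-project"
  DATASET_ID = "reltio_events"
  LOCATION = "US"

  client = bigquery.Client.from_service_account_json(
      "service-account-key.json", project=PROJECT_ID)

  # Create the dataset in the desired location if it does not exist yet.
  dataset = bigquery.Dataset(f"{PROJECT_ID}.{DATASET_ID}")
  dataset.location = LOCATION
  dataset = client.create_dataset(dataset, exists_ok=True)

  # The location printed here is the value to enter in the Location field.
  print(dataset.dataset_id, dataset.location)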
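
JSON query sketch (step 6): with the JSON storage format, attribute values land in a single JSON column, so downstream queries typically extract individual values with BigQuery JSON functions such as JSON_VALUE. The table name, column names, and JSON path below are assumptions for illustration only; inspect the tables the connector actually creates in your dataset for the real schema.

  from google.cloud import bigquery

  client = bigquery.Client.from_service_account_json(
      "service-account-key.json", project="my-gcp-project")

  # Hypothetical table and column names; replace with your own schema.
  sql = """
      SELECT uri, JSON_VALUE(attributes, '$.FirstName') AS first_name
      FROM `my-gcp-project.reltio_events.ENTITIES`
      LIMIT 10
  """
  for row in client.query(sql).result():
      print(row["uri"], row["first_name"])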
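
Verification sketch (steps 9 and 10): after recreating tables/views or re-syncing data, list what exists in the dataset and check row counts. Again, the project and dataset names are placeholders.

  from google.cloud import bigquery

  PROJECT_ID = "my-gcp-project"   # placeholder
  DATASET_ID = "reltio_events"    # placeholder

  client = bigquery.Client.from_service_account_json(
      "service-account-key.json", project=PROJECT_ID)

  # List every table and view in the dataset and report its type and row count.
  for item in client.list_tables(f"{PROJECT_ID}.{DATASET_ID}"):
      # num_rows is only populated on a full Table object, so fetch it explicitly.
      table = client.get_table(item.reference)
      print(table.table_id, table.table_type, table.num_rows)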