Unify and manage your data

Load entities into a tenant

Learn how to use the Data Loader to load entity data types into a tenant.

Let's cover how to load entity data types from a selected file into a tenant in the Reltio Data Cloud. For information about loading relationship and interaction data types, see topics Load relationships into a tenant and Load interactions into a tenant.

The Data Loader enables you to load entities from various data source locations. For information about the locations we support, see topic Supported data sources and file types.

To load entities into a tenant:
  1. Open your tenant in Data Loader.
    1. From the Applications menu, select Console.
    2. From the Tenant Management section, select the Data Loader application.
  2. In the Job Definition tab, select LOAD DATA and then Entities to create a new data load job.
  3. On the Entities data load page, in the Upload section, fill in the following details and then select Continue:
    Job definition Name: Enter a name for the job.

    Entity type: Select the entity type that matches the data in the file you're loading. For example, Organizations, Products, Contacts, and so on.

    Select file: Specify the file to upload:

    1. Select the source:
      • My Computer (the default)
      • Amazon S3
      • Google Cloud Storage (GCP)
      • Microsoft Azure Blob Storage
      • SFTP
        Note: When you upload a file from the local file system, the file size must not exceed 50 MB.
    2. Specify the file or account details:

      Specify an upload file from My Computer:

      • Drop your file on the page

        or

        Select the file:
          1. Select the file type.
          2. Select BROWSE FILE.
          3. For a CSV file, specify the delimiter that separates the columns/values:
            • Comma (,)
            • Semicolon (;)
            • Single pipe (|)
            • Double pipe (||)
          4. Select or clear the First row is a header checkbox for CSV or Excel files.
            Note: This determines whether mappings will be displayed with or without a header row. For more information, see Map File Columns to Attributes.
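    Conceptually, the delimiter and header options above map onto standard CSV parsing. A minimal Python sketch of that behavior, using invented file contents (the names and values are illustrative only, not part of the Data Loader product):

```python
import csv
import io

# Hypothetical file contents; the delimiter here is a single pipe (|).
data = "Name|Country\nAcme Corp|US\nGlobex|DE\n"

first_row_is_header = True  # mirrors the "First row is a header" checkbox
rows = list(csv.reader(io.StringIO(data), delimiter="|"))

if first_row_is_header:
    # The first row supplies column names for mapping.
    header, records = rows[0], rows[1:]
else:
    # Without a header, columns can only be identified by position.
    header, records = [f"column_{i}" for i in range(len(rows[0]))], rows

print(header)   # ['Name', 'Country']
print(records)  # [['Acme Corp', 'US'], ['Globex', 'DE']]
```

    This is why the header checkbox matters for mapping: with it selected, mappings are shown against column names; without it, only column positions are available.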

    Specify an upload file from Amazon S3:

    When you select Amazon S3, the credential fields for the Amazon S3 source are displayed:

    Enter all the mandatory fields (*) for Amazon S3 Credentials.
    Table 1. Amazon S3 credentials
    Account Name: The Amazon S3 account name.
    Bucket Name: The name of the bucket or folder where the file is located.
    Authenticate using key/secret: If selected, this checkbox lets you authenticate using stored credentials instead of an AWS IAM role.

    To use this option, select the Authenticate using key/secret checkbox and specify the AWS Key and AWS Secret.

    The key and secret enable you to authenticate with and access the S3 bucket.

    AWS Key: The AWS access key for the account. It is needed to fetch the file from the S3 bucket that the Data Loader job consumes. It is alphanumeric. It defines the AWS key for the input file.
    AWS Secret: The AWS secret key (the AWS password). It is alphanumeric. It defines the AWS secret for the input file.
    Role: To authenticate using an AWS IAM role instead of stored credentials, clear the Authenticate using key/secret checkbox and provide the required values for Role and External ID.

    We recommend that you use an AWS IAM role. It is a more secure way to provide access to files in an S3 bucket.

    The value for Role must be provided in a format similar to arn:aws:iam::*:role/reltio.client.dataloader.*

    For example:
    {
      "role": "arn:aws:iam::634947810771:role/reltio.client.dataloader.sc-test",
      "externalId": "sc-dataloader",
      "region": "us-east-1"
    }

    For more information, see topic Creating a role for an IAM user.

    External ID: The external ID associated with the IAM role. You can specify the External ID after you clear the Authenticate using key/secret checkbox.
    S3 file path: The directory or S3 path of the input file; that is, the S3 bucket location where the data load file resides.
    S3 file mask: The file mask defines part of a filename. When you provide a file mask, only files whose names contain the file mask text are loaded; all other files are ignored. When you use a file mask, the file path must be the directory path (the folder that contains the files), not the full path including the filename.
    Region Information: The AWS region. For example, us-west-1.
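    A quick way to sanity-check a Role value before submitting the job is to compare it against the documented pattern, treating each * as a wildcard (a 12-digit AWS account ID and a role name beginning with reltio.client.dataloader.). This check is illustrative only; it is not something the Data Loader itself performs:

```python
import re

# The documented pattern arn:aws:iam::*:role/reltio.client.dataloader.*
# with the wildcards made explicit: 12-digit account ID, prefixed role name.
ROLE_ARN = re.compile(r"^arn:aws:iam::\d{12}:role/reltio\.client\.dataloader\..+$")

assert ROLE_ARN.match("arn:aws:iam::634947810771:role/reltio.client.dataloader.sc-test")
assert not ROLE_ARN.match("arn:aws:iam::634947810771:role/other-role")
```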

    Specify an upload file from GCS:

    When you select Google Cloud Storage, the credential fields for the GCP source are displayed:

    Enter the mandatory fields (*) for GCS Credentials.
    Table 2. GCS Credentials
    Account Name: The GCS account name.
    Bucket name: The name of the bucket.
    File path: The directory path. It is alphanumeric.
    File mask: The file mask defines part of a filename. When you provide a file mask, only files whose names contain the file mask text are loaded; all other files are ignored. When you use a file mask, the file path must be the directory path (the folder that contains the files), not the full path including the filename.
    Project-ID: The ID of the GCP project. It is alphanumeric and unique.
    Project Key: The project key.
    Project Key ID: The project key ID. It is numeric.
    Client ID: The client ID.
    Client Email: The email ID of the client.
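    The file-mask behavior described for each source amounts to a substring filter on filenames within the directory given by the file path. A short sketch of that matching rule (the filenames and mask are invented for illustration):

```python
# A file is loaded only when its name contains the mask text;
# all other files in the directory are ignored.
def matches_mask(filename: str, mask: str) -> bool:
    return mask in filename

files = ["contacts_2024.csv", "orgs_2024.csv", "contacts_old.csv"]
selected = [f for f in files if matches_mask(f, "contacts")]
print(selected)  # ['contacts_2024.csv', 'contacts_old.csv']
```

    This is also why the file path must point at the containing folder rather than at a single file: the mask selects files from that folder.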

    Specify an upload file from Microsoft Azure Blob Storage:

    When you select Azure Blob Storage, the credential fields for the Azure source are displayed. The highlighted fields are required to pull a data file from Azure Blob Storage and load it into Reltio:

    Table 3. Azure Blob Storage Credentials
    Azure Account Name: The Azure storage account name, shown on your Microsoft Azure account homepage.
    Access Key: The access key, found under Access keys within your Azure storage account. After you select your storage account, you can find it in the left view panel.
    Container Name: The name of the container you created under your Azure storage account.
    File path: The path of the file within your Azure container. If you did not create any additional folders, it is simply the file name; otherwise, specify the path to the file.
    File mask: The file mask defines part of a filename. When you provide a file mask, only files whose names contain the file mask text are loaded; all other files are ignored. When you use a file mask, the file path must be the directory path (the folder that contains the files), not the full path including the filename.

    For more information, see Set up Azure Blob Storage mapping.
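    For orientation, the Azure fields above combine into a standard blob address of the form https://{account}.blob.core.windows.net/{container}/{path}. The Data Loader only asks for the individual fields; this sketch of how they fit together uses invented values:

```python
# Invented example values for the three Azure fields described above.
account_name = "mystorageacct"
container_name = "dataloader"
file_path = "incoming/contacts.csv"

# Standard Azure blob URL layout: account, then container, then blob path.
blob_url = f"https://{account_name}.blob.core.windows.net/{container_name}/{file_path}"
print(blob_url)  # https://mystorageacct.blob.core.windows.net/dataloader/incoming/contacts.csv
```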

    Specify an upload file from SFTP:

    When you select SFTP, the credential fields for the SFTP source are displayed.

    Enter the mandatory fields (*) for SFTP Credentials.
    Table 4. SFTP Credentials
    Account Name: The SFTP account name.
    SFTP username: The username. For example, reltio-dataloader.
    SFTP password: The SFTP password. It is alphanumeric.
    SFTP host: The SFTP host.
    SFTP file path: The file path.
    SFTP file mask: The file mask defines part of a filename. When you provide a file mask, only files whose names contain the file mask text are loaded; all other files are ignored. When you use a file mask, the file path must be the directory path (the folder that contains the files), not the full path including the filename.
  4. Select the Save the source settings checkbox.
    Note: The option to save the source settings is displayed only for remote sources, such as Amazon S3, Google Cloud Storage, Azure Blob Storage, and SFTP. It is not displayed for local files.

  5. Select CONTINUE.

    After you load entities into a tenant, you need to map the file's columns to entity attributes. For more information, see topic Map File Columns to Attributes.