Load entities into a tenant
Learn about how to use the Data Loader to load entity data types into a tenant.
Let's cover how to load entity data types from a selected file into a tenant in the Reltio Connected Data Platform. For information about loading relationship and interaction data types, see topics Load relationships into a tenant and Load interactions into a tenant.
The Data Loader enables you to load entities from various data source locations. For information about the locations we support, see topic Supported data sources and file types.
- Open your tenant in Data Loader.
- From the Applications menu, select Console.
- From the Tenant Management section, select the Data Loader application.
- In the Job Definition tab, select LOAD DATA and then Entities to create a new data load job.
- On the Entities data load page, in the Upload section, fill in the following details and then select Continue:
Job definition Name: Enter a name for the job.
Entity type: Select the entity type that corresponds to the data in the file you're loading, for example, Organizations, Products, or Contacts.
Select file: Specify the file to upload:
- Select the source:
- My Computer (the default)
- Amazon S3
- Google Cloud Storage (GCP)
- Microsoft Azure Blob Storage
- SFTP
Note: When you upload a file from the local file system, the file size must not exceed 50 MB.
- Specify the file or account details:
Specify an upload file from My Computer:
- Drop your file on the page
or
Select the file:
- Select file type:
- CSV:
- Excel (.xlsx): Select or clear the First row as a header checkbox.
- JSON. For more information, see topic JSON file loading example.
- RELTIO_JSON. For more information, see topic Load data from a RELTIO_JSON file.
- Select BROWSE FILE.
- Specify the delimiter that separates columns/values in a CSV file:
- Comma (,)
- Semicolon (;)
- Single pipe (|)
- Double pipe (||)
- Select or clear the First row is a header checkbox for CSV or Excel files. Note: This determines whether mappings are displayed with or without a header row. For more information, see Map File Columns to Attributes. For a sample comma-delimited file, see the sketch after these steps.
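If you're preparing a CSV file to upload from your computer, a minimal comma-delimited file with a header row might look like the example below. The column names and values are hypothetical; the columns you need depend on your data model and on how you map them to entity attributes in the next step.

```
Name,AddressLine1,City,Country,SourceID
Acme Corporation,100 Main St,Boston,US,ACME-001
Globex Inc,22 Market Ave,Chicago,US,GLOBEX-042
```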
Specify an upload file from Amazon S3:
When you select Amazon S3, the credentials that you must specify for the Amazon S3 source are displayed:
Enter all the mandatory fields (*) for the Amazon S3 credentials.
Table 1. Amazon S3 credentials
- Account Name: The Amazon S3 account name.
- Bucket Name: The name of the bucket or folder where the file is located.
- Authenticate using key/secret: If selected, this checkbox lets you authenticate with stored credentials instead of an AWS IAM role. To use this option, select the Authenticate using key/secret checkbox and specify the AWS Key and AWS Secret. The key and secret enable you to authenticate with and access the S3 bucket.
- AWS Key: The access key for the AWS account. It is alphanumeric and is needed to fetch the input file from the S3 bucket consumed by the Data Loader job.
- AWS Secret: The AWS secret key (the AWS password) for the input file. It is alphanumeric.
- Role: To authenticate with an AWS IAM role instead of stored credentials, clear the Authenticate using key/secret checkbox and provide values for Role and External ID. We recommend that you use an AWS IAM role; it is a more secure way to provide access to files in an S3 bucket. Provide the role in a format similar to arn:aws:iam::*:role/reltio.client.dataloader.*, for example:
{ "role": "arn:aws:iam::634947810771:role/reltio.client.dataloader.sc-test", "externalId": "sc-dataloader", "region": "us-east-1" }
For more information, see topic Creating a role for an IAM user.
- External ID: The external ID for the IAM role. You can specify it when the Authenticate using key/secret checkbox is cleared.
- S3 file path: The directory or S3 path in the bucket where the input file is located.
- S3 file mask: Part of a filename. When you provide a file mask, only files whose names contain the file mask text are loaded; other files are ignored. When you use a file mask, the file path must be the directory path (the folder that contains the files), not the full path including the filename.
- Region Information: The AWS region. For example: us-west-1.
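If you plan to authenticate with an IAM role, you can sanity-check the role, external ID, and file path before you create the job. The following Python sketch is only an illustration: it uses boto3 to assume the role and list the input files, and the ARN, external ID, bucket, prefix, and region are placeholder values based on the example above, not values the Data Loader requires you to test this way.

```python
import boto3

# Hypothetical values; replace them with your own role, bucket, and path.
ROLE_ARN = "arn:aws:iam::634947810771:role/reltio.client.dataloader.sc-test"
EXTERNAL_ID = "sc-dataloader"
BUCKET = "my-dataloader-bucket"   # the value you enter as Bucket Name
PREFIX = "inbound/entities/"      # the value you enter as S3 file path

# Assume the IAM role with the external ID, as the Data Loader would.
sts = boto3.client("sts", region_name="us-east-1")
creds = sts.assume_role(
    RoleArn=ROLE_ARN,
    RoleSessionName="dataloader-check",
    ExternalId=EXTERNAL_ID,
)["Credentials"]

# Use the temporary credentials to confirm the input files are visible.
s3 = boto3.client(
    "s3",
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
    region_name="us-east-1",
)
for obj in s3.list_objects_v2(Bucket=BUCKET, Prefix=PREFIX).get("Contents", []):
    print(obj["Key"])
```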
Specify an upload file from GCS:
When you select Google Cloud Storage, the credentials that you must specify for the GCP source are displayed:
Enter the mandatory fields (*) for the GCS credentials.
Table 2. GCS credentials
- Account Name: The GCS account name.
- Bucket name: The name of the bucket.
- File path: The directory path. It is alphanumeric.
- File mask: Part of a filename. When you provide a file mask, only files whose names contain the file mask text are loaded; other files are ignored. When you use a file mask, the file path must be the directory path (the folder that contains the files), not the full path including the filename.
- Project-ID: The Project ID. It is alphanumeric and unique.
- Project Key: The Project Key.
- Project Key ID: The Project Key ID. It is numeric.
- Client ID: The Client ID.
- Client Email: The client's email ID.
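As a quick check that the service account details you enter here (project ID, keys, client email, and so on) can read the bucket, you can list the files with the google-cloud-storage Python client. The service account file, bucket, and prefix below are placeholders, not values supplied by the Data Loader.

```python
from google.cloud import storage

# Hypothetical values; replace them with your own service account file and bucket.
SERVICE_ACCOUNT_JSON = "dataloader-service-account.json"  # contains project_id, private_key_id, client_email, ...
BUCKET = "my-dataloader-bucket"   # the value you enter as Bucket name
PREFIX = "inbound/entities/"      # the directory you enter as File path

# Authenticate with the same service account whose fields you enter in the GCS credentials.
client = storage.Client.from_service_account_json(SERVICE_ACCOUNT_JSON)
for blob in client.list_blobs(BUCKET, prefix=PREFIX):
    print(blob.name)
```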
Specify an upload file from Microsoft Azure Blob Storage:
When you select Azure Blob Storage, the credentials that you must specify for the Azure source are displayed. The highlighted fields are required to pull a data file from Azure Blob Storage and load it into Reltio:
Table 3. Azure Blob Storage credentials
- Azure Account Name: The Azure storage account name, as shown on your Microsoft Azure account homepage.
- Access Key: The access key, found under Access keys in your Azure storage account. After you select the storage account, it appears in the left navigation panel.
- Container Name: The name of the container you created under your Azure storage account.
- File path: The path to the file within your Azure container. If you did not create any additional folders, it is simply the file name; otherwise, specify the folder path.
- File mask: Part of a filename. When you provide a file mask, only files whose names contain the file mask text are loaded; other files are ignored. When you use a file mask, the file path must be the directory path (the folder that contains the files), not the full path including the filename.
For more information, see Set up Azure Blob Storage mapping.
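To confirm that the account name, access key, container, and file path you plan to enter are correct, you can list the blobs with the azure-storage-blob Python client. The values below are placeholders for illustration only.

```python
from azure.storage.blob import BlobServiceClient

# Hypothetical values; replace them with your storage account details.
ACCOUNT_NAME = "mydataloaderstorage"      # Azure Account Name
ACCESS_KEY = "<access-key-from-portal>"   # Access Key
CONTAINER = "dataloader"                  # Container Name
PREFIX = "inbound/entities/"              # File path (a directory when you use a file mask)

# Connect with the account key and list the files the job would pick up.
service = BlobServiceClient(
    account_url=f"https://{ACCOUNT_NAME}.blob.core.windows.net",
    credential=ACCESS_KEY,
)
container = service.get_container_client(CONTAINER)
for blob in container.list_blobs(name_starts_with=PREFIX):
    print(blob.name)
```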
Specify an upload file from SFTP:
When you select SFTP, the credentials that you must specify for the SFTP source are displayed.
Enter the mandatory fields (*) for the SFTP credentials.
Table 4. SFTP credentials
- Account Name: The SFTP account name.
- SFTP username: The username. For example, reltio-dataloader.
- SFTP password: The SFTP password. It is alphanumeric.
- SFTP host: The SFTP host.
- SFTP file path: The file path.
- SFTP file mask: Part of a filename. When you provide a file mask, only files whose names contain the file mask text are loaded; other files are ignored. When you use a file mask, the file path must be the directory path (the folder that contains the files), not the full path including the filename.
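To verify the SFTP host, username, password, and file path before you run the job, you can connect with the paramiko Python library and list the directory. The host, credentials, and path below are placeholders for illustration only.

```python
import paramiko

# Hypothetical values; replace them with your SFTP account details.
HOST = "sftp.example.com"        # SFTP host
USERNAME = "reltio-dataloader"   # SFTP username (as in the example above)
PASSWORD = "<sftp-password>"     # SFTP password
FILE_PATH = "/inbound/entities"  # SFTP file path (a directory when you use a file mask)

# Connect and list the files the Data Loader job would pick up.
transport = paramiko.Transport((HOST, 22))
transport.connect(username=USERNAME, password=PASSWORD)
sftp = paramiko.SFTPClient.from_transport(transport)
try:
    for name in sftp.listdir(FILE_PATH):
        print(name)
finally:
    sftp.close()
    transport.close()
```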
- Select the Save the source settings checkbox. Note: The option to save the source settings is displayed only for remote sources, such as Amazon S3, Google Cloud Storage, Azure Blob Storage, and SFTP. This option is not available for local files.
- Select CONTINUE.
After you load entities into a tenant, you need to map the file's columns to entity attributes. For more information, see topic Map File Columns to Attributes.