Create D&B Batch Job
Learn about the operation to create D&B batch jobs.
The create D&B batch job operation creates a new job that sends data to D&B using the Direct 2.0 or Direct+ API. To log in to Direct 2.0, the DnB-User and DnB-Password must be your FTP credentials; to log in to Direct+, use your Direct+ API credentials. You can create periodic or deferred batch jobs by using the cronExpression header parameter. forceS3 is an optional parameter that loads files directly from Amazon S3.
Request
POST {DnBConnectorUri}/batch
| In | Parameter | Required (Direct 2.0) | Required (Direct+) | Description |
|---|---|---|---|---|
| Headers | Authorization | Yes | Yes | Reltio access token in the form Bearer: <<token>>. See Authentication API. |
| Headers | Content-Type | Yes | Yes | Must be Content-Type: application/json. |
| Headers | EnvironmentUrl | Yes | Yes | Environment URL of the tenant. For example: https://dev.reltio.com |
| Headers | TenantId | Yes | Yes | Tenant ID: data is extracted from this tenant. Value: {{tenantID}}. |
| Headers | DnB-Ftp | No | N/A | D&B FTP host. |
| Headers | DnB-Path-Put | No | N/A | Put directory on the D&B FTP site. The default value is /puts. |
| Headers | DnB-Path-Get | No | N/A | Get directory on the D&B FTP site. The default value is /gets. |
| Headers | DnB-User | Yes | Yes | D&B FTP login. |
| Headers | DnB-Password | Yes | Yes | D&B FTP password. |
| Headers | notificationEmails | No | No | Enables email notifications for FAILED jobs. Add email addresses separated by commas (,) to the list of recipients. For example: smith.joe@company.com,john.snow@company.com |
| Headers | s3Bucket | No | No | S3 bucket if you use S3. For example: s3.testbucket. You must pass all S3 parameters. |
| Headers | s3Path | No | No | S3 path. For example: batch/aaa/bbb. You must pass all S3 parameters. |
| Headers | awsAccessKey | No | No | AWS access key. You must pass all S3 parameters. |
| Headers | awsSecretKey | No | No | AWS secret key. You must pass all S3 parameters. |
| Headers | S3Region | No | No | S3 region. You must pass all S3 parameters or none. The default value is us-east-1. |
| Headers | mergeOrgsWithSameDUNS | No | No | Sometimes the connector receives data for a D&B DUNS number that is already occupied by another entity. If the flag is true, the connector updates and merges the entities. Otherwise, the entity is marked with an error. The default value is |
| Headers | mergeOrgsWithSameDUNSByPotentialMatches | No | No | In the event of a URI mismatch, the operation is marked as successful and the entities are presented as potential matches. To achieve this, they are added to the tenant with the crosswalk value duns/target uri. The default value is |
| Headers | minConfidenceCode | No | No | The D&B Connector returns a confidence code for each processed entity in the range 1 to 10, where 10 means absolutely confident. All entities with a confidence code lower than minConfidenceCode are marked with the error Low Confidence Score. The default value is |
| Headers | waitResponseSeconds | No | N/A | The D&B Connector uploads request files to the D&B FTP site and periodically checks them for responses. If there are no responses after waitResponseSeconds seconds, the request files are marked as Expired. The default value is |
| Headers | cronExpression | No | No | Schedules batch jobs by using standard cron expressions. A cron expression expects time in the UTC time zone, so convert your local time to UTC before you start a job. |
| Headers | productId | N/A | No | Submits the D&B job with the specified product identifier. The available values are cmpelf and cmpelk; the default value is cmpelf. |
| Headers | productVersion | N/A | No | Submits the D&B job with the specified product version. Use the product version only with the cmpelk product ID. The default value is v1. For more information, see the D&B Direct+ documentation. |
| Query | force | No | No | Generally, you can have only one batch job per tenant; simultaneous batch jobs can interfere with each other and cause undefined behavior. For testing purposes, or if you are confident the jobs work on independent entities, you can start multiple batch jobs with force=true. |
| Query | plus | No | No | If this parameter is set to true, the connector starts the batch job using the D&B Direct+ multiprocess API. The default value is false. |
| Body | | Yes | Yes | Context JSON for the batch job. The batch job contains three main tasks: the export task, the put files task, and the get files task. You can start from any task by specifying the corresponding task. |
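For orientation, here is a minimal sketch of a create-job call using Python and the requests library. The connector URI, token, tenant, and credentials are placeholders (not values from this documentation), the header names follow the table above, and the exportTask body anticipates Example 1 below.

```python
import requests

# Placeholders: substitute your own connector URI, token, and credentials.
DNB_CONNECTOR_URI = "https://dnb-connector.example.com"  # hypothetical URI

headers = {
    "Authorization": "Bearer: <token>",          # Reltio access token
    "Content-Type": "application/json",
    "EnvironmentUrl": "https://dev.reltio.com",  # environment URL of the tenant
    "TenantId": "<tenantID>",                    # data is extracted from this tenant
    "DnB-User": "<login>",        # FTP credentials for Direct 2.0,
    "DnB-Password": "<password>", # or Direct+ API credentials for Direct+
}

# Start from the export task; the filter selects the entities to enrich.
body = {"exportTask": {"filter": "(startsWith(attributes.Name,'R'))"}}

# Add params={"plus": "true"} to use the Direct+ multiprocess API,
# or params={"force": "true"} to allow a simultaneous batch job.
resp = requests.post(f"{DNB_CONNECTOR_URI}/batch", headers=headers, json=body)
print(resp.json())  # e.g. {"jobId": ..., "success": "OK", "message": "Scheduled"}
```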
Response
The job identifier and a message about the status or error.
Example 1: Batch job from the beginning - You have only the tenant, and a filter expression describes the entities that need enrichment.
Start a batch job from scratch by exporting data from the tenant. You must specify the export filter in the request.
Request
{
"exportTask": {
"filter": "(startsWith(attributes.Name,'R')))"
}
}
For parallel export in the D&B Connector, specify the following request parameters:
{
"exportTask": {
"filter": "(startsWith(attributes.Name,'T')))",
"distributed": true,
"taskPartsCount": 300
}
}
If taskPartsCount is negative or too large, it is passed to the API as-is.
Example 2: You already have an export started or completed and would like to use it
Request
{
"putFilesTask": {
exportTaskIds": ["41949723-470b-4d4d-9c9f-4fa35de0447d"],
"processedEntities": 1000
}
}
The exportTaskIds are the IDs of your export tasks. You can omit processedEntities if you do not require an offset from the beginning of the export file.
Example 3: You already have files on FTP/S3 and only need to upload them back to Reltio
Request
{
"getFilesTask": {
"fileEntries": [
{
"filename": "5741031244955648.ref",
"status": "UPLOADED",
"forceS3": true
},
{
"filename": "5741031244955648.glb",
"status": "UPLOADED"
},
{
"filename": "5741031244955648_1.glb",
"status": "UPLOADED"
}
]
}
}
These are the file entries to upload to Reltio. Use the status UPLOADED, which means the files were uploaded to FTP and the connector must check them for responses. The first entry has the forceS3 flag set to true, which means the connector downloads that file directly from S3. To use forceS3, you must fill all S3 header parameters with valid values; see the sketch after the response below.
Response
{
"jobId": 5741031244955648,
"success": "OK",
"message": "Scheduled"
}
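As the parameter table notes, forceS3 works only when all S3 parameters are passed together. Here is a minimal sketch of an Example 3-style request with the full S3 header set; the bucket, path, keys, and connector URI are placeholders.

```python
import requests

# All S3 parameters must be passed together (see the header table).
s3_headers = {
    "s3Bucket": "s3.testbucket",        # placeholder bucket
    "s3Path": "batch/aaa/bbb",          # placeholder path
    "awsAccessKey": "<aws-access-key>",
    "awsSecretKey": "<aws-secret-key>",
    "S3Region": "us-east-1",
}

headers = {
    "Authorization": "Bearer: <token>",
    "Content-Type": "application/json",
    "EnvironmentUrl": "https://dev.reltio.com",
    "TenantId": "<tenantID>",
    "DnB-User": "<login>",
    "DnB-Password": "<password>",
    **s3_headers,
}

# The first entry sets forceS3, so the connector reads it directly from S3.
body = {
    "getFilesTask": {
        "fileEntries": [
            {"filename": "5741031244955648.ref", "status": "UPLOADED", "forceS3": True},
            {"filename": "5741031244955648.glb", "status": "UPLOADED"},
        ]
    }
}

resp = requests.post("https://dnb-connector.example.com/batch",  # hypothetical URI
                     headers=headers, json=body)
print(resp.json())
```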
Example 4: You already have D&B multiprocess jobs and only need to upload them back to Reltio
Request
{
"getFilesTask": {
"fileEntries": [
{
"jobId": "some-exists-job-id"
"filename": "arbitrary file name",
"status": "UPLOADED",
"forceS3": false
},
{
"jobId": "some-DnB-API-job-id",
"status": "UPLOADED"
}
]
}
}
Response
{
"jobId": 5741031244955648,
"success": "OK",
"message": "Scheduled"
}
Example 5: You need a periodic job or a job that starts in the future
Use any request from one of the previous examples plus the header parameter cronExpression with a valid cron expression. For example, the expression for every Monday at noon is: 0 0 12 ? * MON.
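For instance, here is a minimal sketch that schedules the Example 1 job for every Monday at noon UTC. Only the cronExpression header differs from an immediate job; all other values are placeholders.

```python
import requests

headers = {
    "Authorization": "Bearer: <token>",
    "Content-Type": "application/json",
    "EnvironmentUrl": "https://dev.reltio.com",
    "TenantId": "<tenantID>",
    "DnB-User": "<login>",
    "DnB-Password": "<password>",
    # Cron expressions are evaluated in UTC: every Monday at 12:00 UTC.
    "cronExpression": "0 0 12 ? * MON",
}

body = {"exportTask": {"filter": "(startsWith(attributes.Name,'R'))"}}

resp = requests.post("https://dnb-connector.example.com/batch",  # hypothetical URI
                     headers=headers, json=body)
print(resp.json())  # "Scheduled": the job runs per the cron expression
```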