
Configure Data Sharing with Databricks using APIs

Learn how to create and configure a Data Share adapter for Databricks using REST APIs.

Prerequisites:

  • Valid tenant ID and environment
  • Authentication token with the ROLE_DATA_PIPELINE_ADMIN role
  • Valid Databricks Delta Sharing identifier
  • An active subscription for Data Sharing with Databricks

  1. Create a new Databricks adapter

    Send a POST request to the DPH service to define a new Databricks data share:

    POST {dphUrl}/api/tenants/{tenantID}/adapters

    Example request body:

    {
      "type": "datashare-databricks",
      "name": "datashare",
      "enabled": true,
      "createdBy": "user@example.com",
      "createdOn": "2025-06-09T10:05:07.864379Z",
      "isOvOnly": true,
      "databricksConfig": {
        "identifier": "identifier1"
      }
    }

    The createdOn timestamp is auto-generated. The isOvOnly field determines whether only operational values (OV) are shared; when set to true, the shared schema is also simplified to a flat format.

    Field reference:

    Parameter                      Type     Description
    type                           String   Adapter type; must be datashare-databricks
    name                           String   Adapter name (alphanumeric, 3–20 characters, no spaces)
    enabled                        Boolean  Enable or disable the adapter
    createdBy                      String   Username of the adapter creator
    createdOn                      String   Adapter creation time in UTC
    isOvOnly                       Boolean  Whether to share only operational values (OV) and use the simplified flat schema
    databricksConfig               Object   Databricks-specific configuration
    databricksConfig.identifier    String   Identifier for the Databricks Delta Share
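
    For reference, the same request can be sent from a short Python script. The sketch below uses the requests library; the DPH URL, tenant ID, and bearer-token header are illustrative placeholders, so substitute whatever endpoint and authentication scheme your environment actually uses.

    import requests

    # Placeholders -- replace with your environment's values. The bearer
    # scheme is an assumption; use whatever auth header your DPH service expects.
    DPH_URL = "https://dph.example.com"
    TENANT_ID = "my-tenant"
    TOKEN = "<token with ROLE_DATA_PIPELINE_ADMIN>"

    # createdOn is omitted here because the service auto-generates it.
    adapter = {
        "type": "datashare-databricks",
        "name": "datashare",
        "enabled": True,
        "createdBy": "user@example.com",
        "isOvOnly": True,
        "databricksConfig": {"identifier": "identifier1"},
    }

    resp = requests.post(
        f"{DPH_URL}/api/tenants/{TENANT_ID}/adapters",
        json=adapter,
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    print(resp.status_code, resp.text)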
  2. Set up the data share

    Provision the cloud resources and enable the share using the adapter name:

    POST {dphUrl}/api/tenants/{tenantID}/adapters/{adapterName}/actions/setup
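    The setup action can be invoked the same way. This sketch reuses the placeholder endpoint, tenant, and token assumptions from the create-adapter example; ADAPTER_NAME must match the name field from step 1.

    import requests

    DPH_URL = "https://dph.example.com"   # placeholder, as in step 1
    TENANT_ID = "my-tenant"               # placeholder tenant ID
    TOKEN = "<token with ROLE_DATA_PIPELINE_ADMIN>"
    ADAPTER_NAME = "datashare"            # must match the adapter's "name" field

    # Trigger provisioning of the cloud resources behind the share.
    resp = requests.post(
        f"{DPH_URL}/api/tenants/{TENANT_ID}/adapters/{ADAPTER_NAME}/actions/setup",
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=60,
    )
    resp.raise_for_status()
    print("setup request accepted:", resp.status_code)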

Your data share adapter is now set up, and Databricks users can access the shared Delta Lake tables through Delta Sharing in their Databricks Unity Catalog.
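
Consumers on Databricks see the share directly in Unity Catalog. If you also want to verify the share from outside Databricks, the open-protocol delta-sharing Python connector (pip install delta-sharing) can read the same tables, assuming the provider issues a profile file; the config.share path and the share.schema.table coordinates below are placeholders.

    import delta_sharing

    # Placeholder profile file issued by the share provider (open sharing
    # protocol); Unity Catalog consumers do not need one.
    profile = "config.share"

    # List every table exposed through the share.
    client = delta_sharing.SharingClient(profile)
    print(client.list_all_tables())

    # Load one shared table as a pandas DataFrame.
    # Table URL format: <profile-path>#<share>.<schema>.<table>
    df = delta_sharing.load_as_pandas(profile + "#share.schema.table")
    print(df.head())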