
Configure LLMs in Unstructured Data Studio

Learn how the LLM Configuration tab connects Unstructured Data Studio to external large language models (LLMs) for AI-driven document processing.

The LLM Configuration tab allows you to connect Unstructured Data Studio to third-party AI providers such as OpenAI and Anthropic. Each configuration defines the model, API key, and operational parameters such as token limits and temperature. These settings enable prompt execution during document processing workflows.
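
The fields in a configuration map onto standard request parameters in the provider's own API. As a rough illustration only (a minimal sketch using the official openai Python client with placeholder values; it is not how Unstructured Data Studio issues requests internally), the model, API key, max tokens, and temperature of an OpenAI configuration correspond to a call like this:

    from openai import OpenAI

    # Placeholder values mirroring the fields of an LLM configuration:
    # API key, model, max tokens, and temperature.
    client = OpenAI(api_key="sk-...")  # the secret key from your OpenAI account

    response = client.chat.completions.create(
        model="gpt-4",    # the Model field
        max_tokens=512,   # the Max tokens slider: caps the size of the response
        temperature=0.2,  # the Temperature slider: lower is more deterministic
        messages=[
            {"role": "user", "content": "Summarize this document in three bullet points."},
        ],
    )
    print(response.choices[0].message.content)

Lower temperature values are generally a safer default for document extraction tasks, where repeatable output matters more than variety.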

Each provider is managed in a separate tab.

You can perform the following actions on a configuration:

  • Edit the configuration

  • Test the API connection

  • Delete the configuration

Select + Add Configuration to create a new LLM configuration for an AI provider. The following section explains how to add an OpenAI configuration.

Add an OpenAI configuration

You can securely connect to OpenAI and configure a model for use in document processing.

To add a new configuration:

  1. On the LLM Configuration tab in Unstructured Data Studio, select the OpenAI tab.

  2. Select + Add Configuration.

  3. In the Configuration Name field, enter a unique label to identify the configuration.

  4. In the OpenAI API key field, enter your secret key and select Validate to confirm that the key is active (an independent way to check a key is sketched after these steps).

  5. In the Model field, select a supported model (for example, gpt-4).

  6. Use the slider in the Max tokens field to limit the response size.

  7. Use the slider in the Temperature field to adjust the randomness of the model's responses. Lower values produce more deterministic output; higher values produce more varied output.

  8. Select the Active checkbox to enable the configuration.

  9. Select Save. The configuration is saved and listed under the selected provider tab.
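
The Validate action in step 4 checks the key with OpenAI before you save the configuration. If you want to confirm a key independently of the Studio, a minimal sketch (again assuming the official openai Python client; this is not how the Studio performs its validation) is to make a cheap, read-only call such as listing the models the key can access:

    from openai import OpenAI, AuthenticationError

    client = OpenAI(api_key="sk-...")  # the key you plan to paste into the configuration

    try:
        models = client.models.list()  # read-only call; fails fast on a bad key
        print("Key is active; first available model:", models.data[0].id)
    except AuthenticationError:
        print("Key was rejected; check for typos or a revoked key.")

If this call succeeds, the same key should pass the Validate check in the configuration form.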