Confluent Kafka as an external streaming destination
Learn how Confluent Kafka enables Reltio to stream events directly to a cluster as part of the external data streaming capability.
Confluent Kafka is supported as an external streaming destination in Reltio Data Streaming. This capability allows Reltio to publish event messages directly to a Kafka cluster, using the same streaming configuration model as other external destinations.
In Reltio, external streaming delivers CRUD and Match events from internal processing queues to external systems in near real time. Confluent Kafka extends this capability by enabling direct delivery to Confluent Kafka topics instead of relying on intermediary queue services.
How Confluent Kafka streaming differs from other destinations
Traditional external queue integrations use cloud messaging services such as Amazon SQS, Google Pub/Sub, or Azure Service Bus. In these integrations, messages are delivered to a queue or topic managed by the cloud provider.
Confluent Kafka uses a different model based on topics and partitions. Instead of consuming and removing messages from a queue, downstream systems read events from Confluent Kafka topics independently. Events can be retained and consumed multiple times depending on the Confluent Kafka configuration.
Although Confluent Kafka uses a different underlying model, it is configured and managed in Reltio as an external streaming destination, similar to other providers.
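The difference between the two models can be illustrated with a short conceptual sketch in plain Python (no Kafka client required). The event names are illustrative only. In a queue, delivery removes the message; in a Kafka topic, the event log is retained and each consumer tracks its own offset, so independent consumers read the same events.

```python
# Conceptual sketch: a retained event log (topic) read by two independent
# consumers, each tracking its own offset. Event names are illustrative.
topic_log = ["ENTITY_CREATED", "ENTITY_CHANGED", "ENTITY_REMOVED"]

class TopicConsumer:
    """Reads from a shared log without removing anything from it."""
    def __init__(self):
        self.offset = 0  # each consumer keeps its own position

    def poll(self, log):
        if self.offset < len(log):
            event = log[self.offset]
            self.offset += 1
            return event
        return None  # no new events yet

analytics = TopicConsumer()
audit = TopicConsumer()

# Both consumers independently read the full log; the log itself is unchanged.
analytics_events = [analytics.poll(topic_log) for _ in range(3)]
audit_events = [audit.poll(topic_log) for _ in range(3)]
```

With a traditional queue destination, only one of the two consumers would have received each event; here both see the complete sequence, which is why multiple downstream systems can subscribe to the same Confluent Kafka topics.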
When to use Confluent Kafka
Use Confluent Kafka as a streaming destination when your organization uses Kafka as a central event streaming platform or data pipeline backbone and you want to:
- Stream events directly into existing Kafka-based data platforms
- Eliminate the need for intermediate queue-to-Kafka connectors
- Integrate with multiple downstream consumers that subscribe to Confluent Kafka topics
- Support high-throughput event processing pipelines
How Confluent Kafka fits into data streaming
The Reltio data streaming process remains unchanged when using Confluent Kafka. Events are generated from platform operations, processed through internal queues, and then delivered to the configured external destination.
When Confluent Kafka is configured as the destination, the message streaming service publishes events directly to Confluent Kafka topics instead of sending them to cloud queue services.
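As an illustration of what a destination entry might contain, the sketch below shows a hypothetical configuration fragment. Apart from typeFilter and objectFilter, which are named in this article, every field name and value here is an assumption; see Confluent Kafka messaging provider configuration for the actual options and authentication types.

```json
{
  "type": "CONFLUENT_KAFKA",
  "bootstrapServers": "pkc-example.us-east-1.aws.confluent.cloud:9092",
  "topic": "reltio-events",
  "typeFilter": "ENTITY_CREATED, ENTITY_CHANGED",
  "objectFilter": "equals(type, 'configuration/entityTypes/Individual')"
}
```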
All existing streaming features apply to Confluent Kafka destinations, including:
- Event type filtering (typeFilter)
- Object filtering (objectFilter)
- Payload types (snapshot, delta, snapshot with delta)
- Message formats (JSON or compressed JSON)
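A consumer reading from the topic needs to handle both message formats. The sketch below assumes, for illustration only, that compressed messages are gzip-encoded JSON (gzip streams start with the magic bytes 0x1f 0x8b); consult Message Streaming Format for the actual envelope and compression details.

```python
import gzip
import json

def decode_event(raw: bytes) -> dict:
    """Decode a streamed event that may be plain or compressed JSON.

    Assumption for this sketch: compressed messages are gzip-encoded,
    detected via the gzip magic bytes 0x1f 0x8b.
    """
    if raw[:2] == b"\x1f\x8b":
        raw = gzip.decompress(raw)
    return json.loads(raw)

# Round-trip a compressed sample event (event name is illustrative).
sample = json.dumps({"type": "ENTITY_CREATED"}).encode()
event = decode_event(gzip.compress(sample))
print(event["type"])  # ENTITY_CREATED
```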
Next steps
To configure Confluent Kafka as a streaming destination and understand how event streaming works, see Add an external queue configuration and Data streaming operation.
To learn how streaming providers are defined and configured, including URI-based configuration, see Message Streaming Provider.
For detailed information about Confluent Kafka configuration options and authentication types, see Confluent Kafka messaging provider configuration.
To understand the structure and content of streamed events, including CRUD and Match events, see Message Streaming Format.