New Pipelines Data Destination: Kafka
A new Pipelines data destination for Kafka is now generally available for Golioth users. Kafka is an open-source distributed event streaming platform used by thousands of companies for high-performance data pipelines, streaming analytics, data integration, and mission-critical applications. There are many cloud-hosted services offering Kafka or Kafka-compatible APIs.
How It Works
Similar to the existing gcp-pubsub and azure-event-hubs destinations, the kafka data destination publishes events to the specified topic. Multiple brokers can be configured, and the PLAIN, SCRAM-SHA-256, and SCRAM-SHA-512 SASL mechanisms are supported for authentication. All data in transit is encrypted with Transport Layer Security (TLS).
Data of any content type can be delivered to the kafka destination. Metadata, such as Golioth device ID and project ID, will be included in the event metadata, and the event timestamp will match that of the data message.
```yaml
filter:
  path: "*"
steps:
  - name: step0
    destination:
      type: kafka
      version: v1
      parameters:
        brokers:
          - my-kafka-cluster.com:9092
        username: kafka-user
        password: $KAFKA_PASSWORD
        topic: my-topic
        sasl_mechanism: SCRAM-SHA-256
```
Click here to use this pipeline in your Golioth project!
For more details on the kafka data destination, go to the documentation.
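To check delivery end to end, a consumer on the receiving side can subscribe to the topic and inspect the event payload and metadata. The following is a minimal sketch using the kafka-python library, reusing the placeholder broker, credentials, and topic from the example pipeline above; substitute your own values. The exact metadata header keys are not specified here, so the sketch simply prints every header it receives.

```python
# Minimal consumer sketch (kafka-python) for inspecting events delivered by
# the Golioth kafka destination. Broker, credentials, and topic are the same
# placeholders used in the example pipeline above.
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "my-topic",
    bootstrap_servers=["my-kafka-cluster.com:9092"],
    security_protocol="SASL_SSL",       # data in transit is TLS-encrypted
    sasl_mechanism="SCRAM-SHA-256",     # must match the pipeline's sasl_mechanism
    sasl_plain_username="kafka-user",
    sasl_plain_password="<KAFKA_PASSWORD>",
    auto_offset_reset="earliest",
)

for msg in consumer:
    # Event metadata (e.g. Golioth device ID and project ID) arrives alongside
    # the payload; the timestamp matches that of the original data message.
    print(msg.timestamp, dict(msg.headers))
    print(msg.value)
```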
What’s Next
Kafka has been one of the most requested data destinations by Golioth users, and we are excited to see all of the new platforms that can be reached through this integration. Keep an eye on the Golioth blog for more upcoming Pipelines features, and reach out on the forum if you have a use case that is not currently well supported!