New Pipelines Data Destination: Kafka

A new Pipelines data destination for Kafka is now generally available for Golioth users. Kafka is an open-source distributed event streaming platform used by thousands of companies for high-performance data pipelines, streaming analytics, data integration, and mission-critical applications. There are many cloud-hosted services offering Kafka or Kafka-compatible APIs.

How It Works

Similar to the existing gcp-pubsub and azure-event-hubs destinations, the kafka data destination publishes events to the specified topic. Multiple brokers can be configured, and the PLAIN, SCRAM-SHA-256, and SCRAM-SHA-512 SASL mechanisms are supported for authentication. All data in transit is encrypted with Transport Layer Security (TLS).
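
If you want to sanity-check the broker address, credentials, and SASL mechanism before wiring them into a pipeline, any standard Kafka client can connect with an equivalent configuration. The sketch below uses the confluent-kafka Python client; the broker, username, and environment variable names are placeholders that mirror the example pipeline further down, and the snippet is illustrative only, not part of the Golioth integration itself.

# Illustrative connectivity check using the confluent-kafka Python client.
# Broker, username, and environment variable names are placeholders.
import os

from confluent_kafka.admin import AdminClient

conf = {
    "bootstrap.servers": "my-kafka-cluster.com:9092",  # same broker as the pipeline example
    "security.protocol": "SASL_SSL",                    # TLS plus SASL, matching the destination
    "sasl.mechanisms": "SCRAM-SHA-256",                 # or PLAIN / SCRAM-SHA-512
    "sasl.username": "kafka-user",
    "sasl.password": os.environ["KAFKA_PASSWORD"],
}

admin = AdminClient(conf)
metadata = admin.list_topics(timeout=10)
print("Reachable topics:", list(metadata.topics))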

Data of any content type can be delivered to the kafka destination. Metadata, such as Golioth device ID and project ID, will be included in the event metadata, and the event timestamp will match that of the data message.

filter:
  path: "*"
steps:
  - name: step0
    destination:
      type: kafka
      version: v1
      parameters:
        brokers:
          - my-kafka-cluster.com:9092
        username: kafka-user
        password: $KAFKA_PASSWORD
        topic: my-topic
        sasl_mechanism: SCRAM-SHA-256
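
To verify events end to end, a small consumer can subscribe to the same topic and print each message's headers and timestamp. This is a minimal sketch assuming the confluent-kafka Python client; the consumer group name is hypothetical, and the exact header keys used for Golioth metadata are not listed here, so consult the documentation for the authoritative field names.

# Minimal consumer sketch (confluent-kafka Python client); values are placeholders.
import os

from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "my-kafka-cluster.com:9092",
    "group.id": "golioth-pipeline-test",   # hypothetical consumer group
    "auto.offset.reset": "earliest",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "SCRAM-SHA-256",
    "sasl.username": "kafka-user",
    "sasl.password": os.environ["KAFKA_PASSWORD"],
})
consumer.subscribe(["my-topic"])

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None:
            continue
        if msg.error():
            print("Consumer error:", msg.error())
            continue
        # Event metadata (device ID, project ID, etc.) arrives as message headers,
        # and the timestamp reflects the original data message.
        headers = {k: (v.decode() if v else v) for k, v in (msg.headers() or [])}
        _, timestamp_ms = msg.timestamp()
        print(timestamp_ms, headers, msg.value())
finally:
    consumer.close()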

Click here to use this pipeline in your Golioth project!

For more details on the kafka data destination, see the documentation.

What’s Next

Kafka has been one of the most requested data destinations among Golioth users, and we are excited to see all of the new platforms that can be reached through this integration. Keep an eye on the Golioth blog for more upcoming Pipelines features, and reach out on the forum if you have a use case that is not currently well supported!

Dan Mangum
Dan is an experienced engineering leader, having built products and teams at both large companies and small startups. He has a history of leadership in open source communities, and has worked across many layers of the technical stack, giving him unique insight into the constraints faced by Golioth’s customers and the requirements of a platform that enables their success.
