New Pipelines Data Destination: Kafka

A new Pipelines data destination for Kafka is now generally available for Golioth users. Kafka is an open-source distributed event streaming platform used by thousands of companies for high-performance data pipelines, streaming analytics, data integration, and mission-critical applications. There are many cloud-hosted services offering Kafka or Kafka-compatible APIs.

How It Works

Similar to the existing gcp-pubsub and azure-event-hubs destinations, the kafka data destination publishes events to the specified topic. Multiple brokers can be configured, and the PLAIN, SCRAM-SHA-256, and SCRAM-SHA-512 SASL mechanisms are supported for authentication. All data in transit is encrypted with Transport Layer Security (TLS).

Data of any content type can be delivered to the kafka destination. Metadata, such as the Golioth device ID and project ID, is included in the event metadata, and the event timestamp matches that of the data message. The following example pipeline delivers all streaming data to a Kafka topic:

filter:
  path: "*"                          # match data streamed to any path
steps:
  - name: step0
    destination:
      type: kafka
      version: v1
      parameters:
        brokers:                     # one or more brokers may be listed
          - my-kafka-cluster.com:9092
        username: kafka-user
        password: $KAFKA_PASSWORD    # references a secret configured in the project
        topic: my-topic
        sasl_mechanism: SCRAM-SHA-256

Click here to use this pipeline in your Golioth project!

For more details on the kafka data destination, go to the documentation.
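Once the pipeline is running, it is straightforward to confirm delivery from the consuming side. The sketch below uses the confluent-kafka Python client, with the broker address, credentials, topic, and SASL mechanism mirroring the example pipeline above; the group.id value and the assumption that Golioth metadata surfaces as Kafka record headers are ours for illustration.

# Minimal consumer sketch for verifying events from the Golioth kafka destination.
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "my-kafka-cluster.com:9092",
    "security.protocol": "SASL_SSL",    # data in transit is encrypted with TLS
    "sasl.mechanism": "SCRAM-SHA-256",  # must match the pipeline's sasl_mechanism
    "sasl.username": "kafka-user",
    "sasl.password": "<your-password>",
    "group.id": "golioth-smoke-test",   # hypothetical consumer group for this test
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["my-topic"])

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None:
            continue
        if msg.error():
            print(f"Consumer error: {msg.error()}")
            continue
        # The event timestamp matches that of the original data message.
        _, timestamp_ms = msg.timestamp()
        # Golioth metadata (e.g., device ID, project ID) is included in the
        # event metadata; here we assume it is exposed as record headers.
        print(f"ts={timestamp_ms} headers={msg.headers()} payload={msg.value()!r}")
finally:
    consumer.close()

Because the destination accepts any content type, the payload bytes may be CBOR, JSON, or a raw binary format depending on what your devices stream, so decode accordingly.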

What’s Next

Kafka has been one of the most requested data destinations among Golioth users, and we are excited to see all of the new platforms that can be leveraged as part of this integration. Keep an eye on the Golioth blog for more upcoming Pipelines features, and reach out on the forum if you have a use case that is not currently well supported!

Dan Mangum
Dan is an experienced engineering leader, having built products and teams at both large companies and small startups. He has a history of leadership in open source communities, and has worked across many layers of the technical stack, giving him unique insight into the constraints faced by Golioth’s customers and the requirements of a platform that enables their success.
