A new Pipelines transformer for Base64 encoding and decoding is now generally available for Golioth users. The base64 transformer is useful for sources and destinations where handling binary data is difficult or unsupported.
How It Works
By default, the base64 transformer encodes the message payload as Base64 data. In the following example, the data is delivered to the recently announced aws-s3 data destination after being encoded. The content type following encoding will be text/plain.
filter:
  path: "*"
steps:
  - name: step0
    transformer:
      type: base64
    destination:
      type: aws-s3
      version: v1
      parameters:
        name: my-bucket
        access_key: $AWS_ACCESS_KEY
        access_secret: $AWS_ACCESS_SECRET
        region: us-east-1
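To illustrate the transformation this pipeline applies, here is a minimal Python sketch of Base64 encoding. The byte values are hypothetical, standing in for an arbitrary binary payload; the point is that the result is printable text suitable for a text/plain body.

```python
import base64

# Hypothetical binary payload, e.g. a packed sensor reading.
payload = bytes([0x01, 0x02, 0xFF, 0x10])

# The base64 transformer applies the equivalent of:
encoded = base64.b64encode(payload).decode("ascii")

print(encoded)  # AQL/EA==
```

Note that Base64 expands data by roughly one third, which is the usual trade-off for making binary payloads safe to store or display as text.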
Click here to use this pipeline in your Golioth project!
Supplying the decode: true parameter causes the base64 transformer to decode data rather than encode it. In the following example, Base64 data is decoded before being delivered to the recently announced kafka data destination. The content type following decoding will be application/octet-stream.
filter:
  path: "*"
steps:
  - name: step0
    transformer:
      type: base64
      parameters:
        decode: true
    destination:
      type: kafka
      version: v1
      parameters:
        brokers:
          - my.kafka.broker.com:9092
        topic: my-topic
        username: pipelines-user
        password: $KAFKA_PASSWORD
        sasl_mechanism: PLAIN
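The decoding direction is the inverse operation. A minimal Python sketch, using a hypothetical Base64 string as a device might upload it:

```python
import base64

# Hypothetical Base64-encoded payload received from a device.
uploaded = "AQL/EA=="

# With decode: true, the transformer applies the equivalent of:
decoded = base64.b64decode(uploaded)

print(decoded)  # b'\x01\x02\xff\x10'
```

The recovered bytes are then forwarded to the destination as raw binary, which is why the resulting content type is application/octet-stream.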
Click here to use this pipeline in your Golioth project!
For more details on the base64 transformer, go to the documentation.
What’s Next
The base64 transformer has already proved useful for presenting binary data in a human-readable format. However, it becomes even more useful when paired with other transformers, some of which we’ll be announcing in the coming days. In the meantime, share how you are using Pipelines on the forum and let us know if you have a use case that is not currently well-supported!