This guest post is contributed by Ben Mawbey, a community member who is active on the Golioth Discord and frequently takes part in Office Hours.

Data wants to be visualized. The impact of showing a customer a slick plot of the information their devices have been collecting is massive compared to pointing at a few hundred lines of text from a log file or database query.

I was looking for some sort of dashboard or charting application for a demo of a sensor system we’ve built. I figured this would appease the clients’ need for pretty pictures over boring reports. I found exactly what I needed, and it only took me about 30 minutes to get it up and running.

Grafana graphing data

This looks great, even if you have no idea what the graphs mean! Let’s dig into how to get from numbers in a database to pretty pictures in a dashboard.

Pieces of the Puzzle: Golioth WebSockets, Node-RED, InfluxDB, and Grafana

Golioth is brilliant at getting your device data from the real world up to the cloud; climbing the IoT beanstalk, some would say. While abstracting away some of the trickiest IoT problems, Golioth can present your time-series data as a convenient cloud resource using the LightDB Stream service. By leveraging the built-in WebSockets support and some open-source tools, we can rapidly store, manipulate, and display this data!

Grafana is the open-source blockbuster for this application and can be easily set up to graph sensor data directly from Golioth LightDB Stream using the REST API. This has two major drawbacks:

  • Grafana must periodically poll for new data
  • While LightDB Stream does provide convenient data retention, I prefer to use my own data storage

Enter InfluxDB, the time-series database powerhouse and an ideal companion to Grafana for real-time IoT applications. This pairing is so popular that the InfluxDB data source integration is baked right into Grafana! By utilizing InfluxDB to store our sensor data, we can perform more complex queries much faster.

The remaining question is how to shuffle data from Golioth into InfluxDB. There are many potential solutions to this hurdle, but my favorite is Node-RED, which describes itself as a low-code programming tool for wiring up event-driven systems.

Node-RED editor window

Node-RED uses graphical flows to connect data sources and destinations, bending protocols and translating data formats in innumerable ways

Node-RED has exploded in popularity and provides all sorts of integrations to connect your systems together. It provides simple blocks to perform actions and a slick graphical interface to wire up your data flows. Conceptually Node-RED acts as our rule engine to process and direct data.

Dashboards: Grafana and InfluxDB

System diagram

Grafana is immensely powerful at providing custom views, data transformations, and alerting. That said, it is only as good as the quality of the data you feed it. A tightly coupled InfluxDB instance, with data carefully curated via Node-RED, allows you to quickly configure complex queries on large datasets with low latency.

Before we can play with Grafana, the first step is setting up InfluxDB. After you’ve installed InfluxDB, open the influx CLI and create a new database on your instance:

> CREATE DATABASE golioth

Configure the InfluxDB data source in your Grafana instance by clicking the gear icon in the left sidebar, choosing Data sources, clicking Add data source, and searching for InfluxDB. Here’s how I’ve set up my data source:

Grafana InfluxDB source configuration

Assuming you have some data in your DB, we can quickly create a new time-series dashboard panel in Grafana and query the dataset using this integration:

Grafana panel configuration

This simple query shows how we structured the data in the DB: we select from a particular measurement, filter on a device identity tag, and aggregate data points into time buckets. Adjusting the time range instantly updates the graph from our local InfluxDB instance.
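For reference, the query behind that panel is roughly equivalent to the following InfluxQL (the measurement, field, and tag names are just the ones from my setup; $timeFilter and $__interval are variables Grafana fills in from the dashboard time picker):

SELECT mean("temp") FROM "environment" WHERE ("device" = 'my-device') AND $timeFilter GROUP BY time($__interval) fill(null)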

Now how do we get the data from Golioth into InfluxDB?

Node-RED

Several networking integrations are provided with Node-RED. Presently, the most relevant to Golioth are the HTTP nodes for REST API requests and the WebSockets node, which is the easiest to configure.

You can see your sensor data collecting in the Golioth LightDB Stream by using the Golioth Console web interface:

Golioth Console showing LightDB stream data

We can use Node-RED flows to connect to Golioth via WebSockets and store the resulting data in our local DB:

NodeRED editor window

The nodes in this flow were set up as follows, taking care to give them appropriate names so their function is obvious at a glance. All of the nodes I’m using should come as part of every Node-RED installation except for the InfluxDB nodes. But don’t worry, these are trivial to install. On Linux it looks something like this:

cd ~/.node-red
npm install node-red-contrib-influxdb
sudo systemctl restart nodered.service

WebSockets Node:

First set up the credentials to your Golioth project using your generated API key and connect to the WebSockets LightDB Stream endpoint.
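For illustration, at the time of writing the LightDB Stream WebSockets endpoint follows this general pattern; substitute your own project ID and API key, and check the Golioth documentation for the current format:

wss://api.golioth.io/v1/ws/projects/{project-id}/stream?x-api-key={api-key}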

Debug Node:

Drop a few of these nodes along the way and click the small green button to turn the debug log on. This is super handy to check data coming through and make sense of it.

JSON Node:

The LightDB Stream endpoint provides us with a JSON object containing our sensor data as well as meta information such as the data timestamp and device identity. This node parses that JSON into a JavaScript object so that we can work with it more easily in subsequent nodes.
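Once parsed, each message is a plain object along these lines (an illustrative sketch only: the temp and humidity fields are from my project, and the exact wrapper structure is best confirmed with a debug node):

{
  "result": {
    "data": {
      "timestamp": "2022-04-19T10:15:02.123Z",
      "deviceId": "<device-id>",
      "data": {
        "temp": 23.4,
        "humidity": 41.7
      }
    }
  }
}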

Change Node:

This node clearly shows the power of Node-RED, as we can craft any sort of data manipulation or transformation.

We could do without this node and jump straight to InfluxDB; however, should any malformed data arrive, we would risk polluting the DB with bad data. By selectively transforming the incoming data and mapping it into a new object, we not only filter out bad packets and arrange the measurement names, but also add tags that build a solid data representation in the DB, making our queries far more powerful.
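I built my transformation in the Change node’s edit dialog, but the same idea is easy to sketch as a Function node in JavaScript. The snippet below is only illustrative: it assumes the message shape shown above and my temp/humidity field names, and it emits the [fields, tags] array format that the InfluxDB out node accepts:

// Illustrative only: validate the parsed LightDB Stream message and map it
// into the [fields, tags] payload format used by node-red-contrib-influxdb.
const body = msg.payload && msg.payload.result && msg.payload.result.data;

// Drop malformed packets so bad data never reaches the database
if (!body || !body.data || typeof body.data.temp !== "number") {
    return null;
}

msg.measurement = "environment";       // InfluxDB measurement name
msg.payload = [
    {                                  // fields: the actual sensor values
        temp: body.data.temp,
        humidity: body.data.humidity,
        time: new Date(body.timestamp) // keep the device timestamp rather than "now"
    },
    {                                  // tags: indexed metadata for fast queries
        device: body.deviceId
    }
];

return msg;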

InfluxDB Out Node:

Node-RED InfluxDB out node and server settings window

Finally, we configure the connection to our InfluxDB instance, set appropriately for your server configuration and the database created earlier.

Assuming your flow is set up correctly, you should be able to see data collecting in your database. We know it works, but as I mentioned before, this visual is not going to impress our customers.
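A quick sanity check from the influx CLI is to pull the latest few points (again, the measurement name is simply the one chosen in the flow above):

> USE golioth
> SELECT * FROM "environment" ORDER BY time DESC LIMIT 5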

InfluxDB data

Revisiting the Grafana panel we previously created, you can see InfluxDB data is now being plotted!

Grafana graphing data

Corner Cases

One of the downsides of WebSockets is their ephemeral nature: should there be a temporary connectivity issue, any data packets sent in the meantime are lost from the point of view of your InfluxDB database. One solution would be to set up another flow that executes periodically and syncs with LightDB Stream using the REST API. Node-RED could then check this data, add any missing values to the InfluxDB instance, and prevent consistency issues.

Another concern with open-source self-hosted systems is security. It can be challenging to secure your server and services should they be public-facing. If you are handling sensitive data then it would be best to consult with an expert in this field. Fortunately, all of the tools discussed have subscription-based cloud services available that sort all of this out in the background.

Conclusion

Being able to set up a simple demo like this in less than 30 minutes demonstrates the power and flexibility of these modern open-source solutions. Coupled with the reliability and maintenance advantages of Docker, it’s a breeze to test locally on your desktop or Raspberry Pi and then deploy to production a moment later on your cloud server of choice. The rules engine and ease of wiring up blocks provided by Node-RED open up a massive pool of possibilities, from countless other integrations to building intelligent processes. One such idea I would like to explore is integrating a device provisioning process into the flow so that we can link a device to a dataset or location during deployment or maintenance.

Team members Lachlan and Chris discuss an implementation of a Rust code sample with Golioth and the Nordic Semiconductor nRF9160. There is a demo of the code working in the video below.

What is Rust?

Rust is a high-level, general-purpose programming language, syntactically similar to C and C++. It is fast and memory-efficient and is specifically designed for performance and safety. It has no runtime or garbage collector, while still offering higher-level concepts and safety guarantees. Rust can run on embedded devices, integrates easily with other languages, and is ideal for performance-critical services. Rust is still young: development started in 2010, and it only came to embedded devices a few years ago. The developer community around Rust is passionate and growing. While Rust and C/C++ share similarities, Rust can match C/C++ performance and in some cases comes out faster. Rust also has the benefit of a package management system, which makes pulling in libraries far easier than in C (a language that predates the internet). Easy package management encourages code reuse by allowing libraries to be readily integrated into applications, not to mention building the overall community around Rust.

Why we’re trying it

Golioth was interested in trying out Rust as part of our Golioth Labs segment on GitHub, where the code is currently hosted. It represents one of many experiments we have taken on, and will continue to take on, to showcase the Golioth platform. We are committed to showing ways to access Golioth services and APIs outside of our normal recommended path: the Golioth SDK based on Zephyr RTOS. Not only does this represent the reality for many users, it also showcases how Golioth services are accessible from many different programming paradigms, in addition to a wide range of hardware. Another good example from Golioth Labs is our Arduino SDK, discussed in our last article.

We chose to try Rust on the nRF9160 because the hardware represents a key product on the platform. This Nordic part has great support from Golioth and Nordic Semiconductor, and is listed as “Verified + Quickstart” on the Golioth Hardware Catalog. While the quickstart uses Zephyr to get users up and running, it is a well-supported part internally at Golioth. The nRF9160 is also primed for Rust integration: Nordic Semiconductor provides a library for nRF devices that exposes the API which communicates with the hardware, and there is a Rust binding for it, which makes it quite easy to implement the Rust code and enables you to connect to Golioth servers to send and receive data. In addition, Rust has some useful code abstraction features and doesn’t require a debugger.

Video explanation and demonstration

In the demo below, Lachlan shows how he is able to use the library and bindings to connect over CoAP to the Golioth servers and replicate many of the functions that exist in Golioth samples.

Note: The arduino-sdk repository showcased in this post is deprecated. GoliothLabs has two experimental repositories that may work as replacements:

In this post and associated video, we’re talking about interfacing hardware to Golioth using popular tools like PlatformIO and Arduino Core (API). Alvaro shows how to use an ESP32 board from Adafruit with the Arduino Core on PlatformIO to talk to the Golioth MQTT endpoint. Golioth has a range of tools and resources for hardware and firmware engineers to get their devices talking to the cloud.

Why PlatformIO?

PlatformIO is a plugin system built on top of VS Code that enables embedded developers to get up and running quickly with different Real Time Operating Systems (RTOS), chipsets, and boards. There is a library manager to help manage dependencies within projects, which removes much of the friction at the beginning of development projects. Currently they do not support Zephyr for ESP32 (a combination previously shown on this blog/channel), but they do support things like the ESP-IDF (based on FreeRTOS) and the Arduino core. Once you pull in the project files from the Golioth Arduino SDK page, PlatformIO will try to download all of the dependencies required to build firmware for a particular board.
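As a rough illustration, the platformio.ini for a project like this ends up looking something like the snippet below; the board and library entries here are placeholders, and the actual project files from the Golioth Arduino SDK define the real ones:

[env:featheresp32]
platform = espressif32
board = featheresp32
framework = arduino
monitor_speed = 115200
lib_deps =
    knolleary/PubSubClient
    bblanchon/ArduinoJson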

What is Arduino Core?

The Arduino Core refers to the APIs that are available on different hardware devices in order to conform to the Arduino ecosystem. In the example below, it refers to Espressif’s Arduino Core port, which means that the ESP32 functions will align with what the Arduino IDE might be asking of that hardware. If you are using platforms outside of the Arduino IDE, you can still interact with many of the underlying functions that were written by hardware and firmware developers to make boards work with Arduino software.

PlatformIO also pulls in other libraries that are useful for processing data coming out of the hardware, such as the Arduino JSON library. So while it’s not the “teal and white IDE” that many people expect, it’s utilizing much of the same work that enabled people to work from within the IDE.

Golioth’s MQTT endpoint

MQTT is now available in preview on Golioth, and should offer a similar user experience to other protocols on the platform. Since the early days of Golioth, we have offered device services over CoAP. We chose CoAP as a first offering because it is meant for low-power devices in potentially lossy environments, like cellular networks. MQTT is a popular transport protocol that many engineers use to initially connect their devices to the internet, only to find they need additional tooling on other platforms. From Golioth’s standpoint, the user doesn’t need to care about which protocol they’re using. Instead, we provide device services like LightDB, which allows the user to push and pull configuration data and state-based sensor readings from a device. LightDB Stream allows for time-series data that needs to be aggregated and displayed on something like a chart. Golioth has other features, like firmware updates and logging, that also operate over these protocols.

Showcase video

In the video below, Alvaro shows how the new MQTT endpoint and services from Golioth enable an experience similar to the one shown using CoAP in other demos on the platform. What is different now is that we can utilize code from other platforms like the Arduino Core, which already has MQTT libraries available. Alvaro showcases how he was able to quickly build an Arduino SDK for the Golioth Labs GitHub.