Docker is one of those tools that have the potential to fundamentally transform how you develop and run software – once you have tried Docker, it is hard to imagine going back.
In previous posts we have seen how Butler, Butler SOS and Butler CW can be run as Docker containers.
But we can do even better – why not control all the Butler tools from a single docker-compose file? Maybe even specify the dependencies on InfluxDB and MQTT in there too?
Setting this up is incredibly easy – a single docker-compose file tells Docker what containers to use, and some config files tell the Butler tools where to find things.
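A minimal sketch of what such a compose file could look like. The image names, tags, ports and volume paths below are assumptions on my part – check the Butler documentation and Docker Hub for the exact images and config locations before using this:

```yaml
# Hypothetical docker-compose sketch – image names, ports and paths
# are assumptions; adjust them to match your own environment.
version: "3"
services:
  butler:
    image: ptarmiganlabs/butler:latest
    restart: always
    volumes:
      - ./config/butler:/nodeapp/config

  butler-sos:
    image: ptarmiganlabs/butler-sos:latest
    restart: always
    volumes:
      - ./config/butler-sos:/nodeapp/config
    depends_on:
      - influxdb
      - mosquitto

  influxdb:
    image: influxdb:1.8
    ports:
      - "8086:8086"

  mosquitto:
    image: eclipse-mosquitto:latest
    ports:
      - "1883:1883"
```

With something like this in place, a single `docker-compose up -d` brings up the whole stack, and `depends_on` ensures InfluxDB and the MQTT broker start before the tools that use them.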
Following up on previous posts (here and here) about the Butler family of tools being Dockerized, here is another one on the same topic:
Butler SOS can now be run in a Docker container.
This is good news as it makes it a lot easier to set up real-time monitoring of a Qlik Sense enterprise environment, compared to the previous (still working, btw) method of installing Node.js and then running Butler SOS on top of Node.
Following up on the previous post about pulling sensor data from sensors in a Verisure alarm system, here is a slightly more hardware focused post.
One use case that the new Verisure alarm system does not handle nicely is machine-to-machine notification when the alarm is disarmed (turned off). Such notifications are sent to the mobile app – so the data exists somewhere. The problem is that it’s all undocumented and thus hard to build something on top of.
Let’s hack a solution instead – be aware, some soldering is required…
I recently upgraded our house alarm system to the latest model from Verisure.
Pretty nice, actually – the new system comes with a good mobile app and remote querying of both alarm status and the status of the various sensors (smoke, movement etc.) included in the system. Oh, and the new system actually costs less per month than the old one too. Go figure.
Being the geek that I am, I immediately wondered if it was possible to get hold of the sensor readings (temperature and humidity) that were shown in the mobile app.
Turns out it is pretty simple. There is a REST API that can be queried, and from it you get back all kinds of status information for the system. With that data nicely structured as JSON, it’s then a breeze to send it to an MQTT pub-sub broker for later use by whatever system might need it. Easy in theory.
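The JSON-to-MQTT step can be sketched roughly like this. Note that the field names (`deviceLabel`, `temperature`, `humidity`) and the topic layout are my assumptions for illustration, not the actual Verisure API schema; the real publishing would go through an MQTT client library instead of the `print` at the end:

```python
import json

def climate_to_mqtt(payload, base_topic="verisure/climate"):
    """Turn a JSON array of climate readings into (topic, message) tuples,
    ready to hand to an MQTT client. Field names are hypothetical."""
    readings = json.loads(payload)
    messages = []
    for reading in readings:
        # Use the sensor's label as part of the topic, e.g.
        # verisure/climate/Living_room/temperature
        device = reading["deviceLabel"].replace(" ", "_")
        for field in ("temperature", "humidity"):
            if field in reading:
                topic = f"{base_topic}/{device}/{field}"
                messages.append((topic, str(reading[field])))
    return messages

# Example payload with assumed structure
sample = '[{"deviceLabel": "Living room", "temperature": 21.5, "humidity": 44}]'
for topic, msg in climate_to_mqtt(sample):
    print(topic, msg)
```

One topic per sensor and metric keeps things simple for downstream subscribers – they can subscribe to exactly the readings they care about using MQTT wildcards.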
Before the summer the Butler tool turned two years old – time flies!
Over those years I have installed, tweaked and upgraded a fair number of Butler instances… Not a problem per se, but maintaining a production-grade Butler instance does assume a certain level of experience with Node.js, Linux, networking etc.
The most recent Butler version (v2.2) attempts to make it easier to deploy and operate Butler. This is achieved by deploying Butler as a Docker container instead of a regular Node.js app.
The Docker image (from which a container is created) contains exactly the same Node.js app that you can run directly on your server or laptop – i.e. there is no functional difference whatsoever between running the Node app natively and running it as a Docker container.
There are some significant benefits of running Butler under Docker:
I worked out the issue that prevented me from adding additional nodes to a Qlik Sense Enterprise system whose central node had been upgraded to June 2018.
The June 2018 version of Qlik Sense Enterprise requires a database called “SenseServices” to be available in Postgres (as in earlier versions, Postgres can run on the central node or on some other server).
When running the June 2018 installer on a new server (which is to be added as a new node in a Sense Enterprise cluster), the installer verifies that the SenseServices database exists. If it does not, you get an error halfway through the install:
Maybe there is info somewhere saying that you need to create a new “SenseServices” db in Postgres before installing additional Sense nodes… but I looked (a lot) without finding anything.
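If you hit the same wall, one workaround is to create the database manually before running the installer on the new node. This is a sketch based on my own debugging, not anything from Qlik’s docs – the owner and any encoding settings below are assumptions, so verify against your environment first:

```sql
-- Hypothetical workaround: create the missing database by hand,
-- e.g. from psql connected as the postgres superuser.
-- Owner and settings are assumptions; adjust for your setup.
CREATE DATABASE "SenseServices" OWNER postgres;
```

Note the double quotes – they preserve the mixed-case name, which matters since Postgres folds unquoted identifiers to lowercase.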
Qlik’s annual Qonnections conference in Orlando wrapped up yesterday.
Three days packed with the latest from both the Qlikosphere in general and Qlik in particular.
I had more session conflicts (where I wanted to attend several interesting sessions at the same time) than in any previous year – a sign of interesting topics, and hopefully also of high-quality sessions.