
Hi, I'm Jonathan Oxer, and this is SuperHouse.

One of the most common things you'll have to do in a home automation system, or any IoT project, is take data from a sensor, store it, and then produce some kind of report or chart. What we're going to be looking at here is output from the particulate matter sensor (the air quality sensor), but it could be temperature, or humidity, or stock prices; it really doesn't matter. To do this we're going to need four different software components.

First, we need something to receive the messages from the sensor. We're going to use a message broker; this gives us a universal way of taking data from all sorts of different sensors. We also need a database to store the data for future reference, so that we're not just looking at the latest value: we can look back in time. Now, typically you won't be able to get messages from the broker directly into the database, so we need some kind of data bridge, which is going to see the data coming in and then store it in the database. Finally, we need to do something with the data, so we're going to use a charting system. We'll take the data from the database and generate charts of the values over time.
Each of these four building blocks has many different alternatives that you could use. For this example I'm going to use the Mosquitto MQTT message broker and Node-RED for the data bridge, we're going to store the data in the InfluxDB database, and we're going to chart it using Grafana. We also need a computer or hardware to run all of this on.

There are many different ways you could set this up: it could be across multiple machines, or hosted in the cloud, but for this example I'm going to keep it simple. We'll just use a Raspberry Pi and install everything on the same device. So to get started, set up a Raspberry Pi with a basic Raspbian installation, and then we'll take it from there and install all the different components we need for the data logging system.

Raspbian has recently been renamed Raspberry Pi OS, and it now has a really handy utility: the Raspberry Pi Imager. All you have to do is put an SD card in your computer, run the Imager, and select which operating system you want. You don't even need to download it first, because the Imager does it for you. Then select the SD card, click Write, and wait 10 or 15 minutes, depending on the size of your card and its speed.

You can then take the SD card out of your computer, put it into the Raspberry Pi, power it up, and you're ready to go. There is excellent documentation on the Raspberry Pi site to get you going, so follow the instructions until you've got your Raspberry Pi connected to the network and you've got yourself a shell. I'm going to rush through these next few instructions fairly quickly.

It's hard to follow on video anyway, but the SuperHouse page for this episode has all of the commands, so you can just copy and paste them; it makes it really easy. The very first thing to do is check what IP address has been assigned to your Raspberry Pi. You'll need this later, so type ifconfig, have a look for the IP address, and make a note of it.
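For example, from a shell on the Pi (which of these tools is available depends on your OS image, so this is just a sketch):

```shell
# Show all network interfaces and their assigned addresses
ifconfig

# On systems without ifconfig installed, use the ip tool instead
ip addr show

# A quick way to list just the assigned IP addresses
hostname -I
```
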

With Raspberry Pi OS set up, we need to install an MQTT broker, and for that we're going to use Mosquitto. Use apt to install both the Mosquitto broker and the Mosquitto clients; the clients will be really useful later when we want to do some testing.
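The installation is a single apt command (package names as they appear in the Raspberry Pi OS repositories):

```shell
# Install the Mosquitto broker plus the command-line clients
sudo apt update
sudo apt install -y mosquitto mosquitto-clients
```
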

You can leave your broker with the default installation, with no username and password, but let's set one up just to make the system a little more secure. We'll start by putting a username and password into a text file, and then we'll run the utility that hashes it. What this does is take the plain-text file and convert it into a protected version. Then we need to move it into the correct place in the configuration directory for Mosquitto.
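A minimal sketch of those steps. The username, password, and file location are examples, not the episode's exact values; `mosquitto_passwd -U` replaces the plain-text entries in the file with hashed ones:

```shell
# Create a plain-text credentials file, one "username:password" per line
echo "mqttuser:mypassword" > passwords.txt

# Convert the plain-text passwords in the file to hashed form, in place
mosquitto_passwd -U passwords.txt

# Move the file into Mosquitto's configuration directory
sudo mv passwords.txt /etc/mosquitto/
```
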

We also have to edit the Mosquitto configuration. We'll disable anonymous access, which forces use of the password file, and then we'll tell Mosquitto where it can find the file. Finally, we need to restart Mosquitto so that it will load our new configuration.
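The two configuration directives look like this (the config file path is the usual Raspberry Pi OS location, but yours may use a file under /etc/mosquitto/conf.d/ instead):

```shell
# Edit the Mosquitto configuration and add these two lines:
#   allow_anonymous false
#   password_file /etc/mosquitto/passwords.txt
sudo nano /etc/mosquitto/mosquitto.conf

# Restart the broker so the new configuration takes effect
sudo systemctl restart mosquitto
```
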

We can check whether this worked by using the Mosquitto client to try to connect to the broker. I've used the -v flag for verbose and the -t flag for the topic; I'm just using the # wildcard as the topic name, so we're listening to everything. If the connection is rejected, it means a password is now required and our configuration changes have worked. Then we can try connecting with the username and password that we set, and the connection should succeed. You won't see anything, because nothing is being published to the broker right now.
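A sketch of that check, using the example credentials from above:

```shell
# This should now be rejected, because anonymous access is disabled
mosquitto_sub -v -t '#'

# This should connect successfully, and then sit silently,
# since nothing is being published yet
mosquitto_sub -v -t '#' -u mqttuser -P mypassword
```
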

When the data comes in, we need somewhere to store it. We're going to put it in a time series database, so we'll use InfluxDB. To install InfluxDB we can use its official repository, where the developers have provided packages specifically for the different operating systems on the Raspberry Pi. Start by fetching the official repository key and adding it to the local keyring. Now you can add the repository; there are a few different versions available, so you need to copy and paste the command that matches your operating system.
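For example, the InfluxDB 1.x repository setup looked like this at the time; the exact key and repository lines change over time, so check the InfluxData site for the line matching your OS release:

```shell
# Fetch the InfluxData repository key and add it to the local keyring
wget -qO- https://repos.influxdata.com/influxdb.key | sudo apt-key add -

# Add the repository (this line is for Debian Buster based releases)
echo "deb https://repos.influxdata.com/debian buster stable" | \
    sudo tee /etc/apt/sources.list.d/influxdb.list
```
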

Now that the repository has been added, we need to update the list of packages that are available: just do sudo apt update, and then sudo apt install influxdb. This will pull down the InfluxDB package and install it with the default configuration. Next we'll tell systemd, the service manager, to enable InfluxDB at startup. Start InfluxDB manually this time; in future it will be started automatically whenever your Raspberry Pi boots up.
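Those steps as commands:

```shell
# Refresh the package list and install InfluxDB
sudo apt update
sudo apt install -y influxdb

# Start InfluxDB at every boot, and start it manually this once
sudo systemctl enable influxdb
sudo systemctl start influxdb
```
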

Let's set up access control for InfluxDB before we do anything else. The default installation of InfluxDB leaves the system wide open, so we'll start by creating an admin user and setting a password. Connect to InfluxDB by running the client; we don't need to use a username or password this time, because nothing has been set yet. Create a user called admin and put in the password you want to use for it. Now you can exit out of InfluxDB: simply type exit and press Enter. The InfluxDB configuration needs to be edited so that it will use authentication; otherwise the admin user that we just created will be ignored.
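The user creation can be done in one step from the shell (the password here is a placeholder; substitute your own):

```shell
# Create an admin user with full privileges via the influx client
influx -execute "CREATE USER admin WITH PASSWORD 'admin_password' WITH ALL PRIVILEGES"
```
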

Use a text editor to open the InfluxDB config file, search for the section called http, and then paste in these four lines; you can copy them from the SuperHouse site. Save your changes and exit.
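The relevant settings in /etc/influxdb/influxdb.conf look like this (a fragment matching the episode notes for InfluxDB 1.x; the key line is auth-enabled):

```toml
[http]
  enabled = true
  bind-address = ":8086"
  auth-enabled = true
```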

The config change won't be applied until InfluxDB has been restarted, so restart it manually. From now on, any time you want to connect to the InfluxDB command line you will need to supply the username and password. We need to do that now, so connect like this, but use the password you just set. If the previous changes worked, you should now be connected to InfluxDB again, authenticated as the admin user that you just created.
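A sketch of the restart and the authenticated connection, with the placeholder password from earlier:

```shell
# Restart InfluxDB so the authentication setting takes effect
sudo systemctl restart influxdb

# From now on, connect to the CLI with credentials
influx -username admin -password admin_password
```
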

Next we need to tell InfluxDB to create a database where we can store sensor data. In this example I've simply called the database "sensors", and that's all there is to it. Because of the way InfluxDB works, there's no need to create a schema with tables and columns like you would with a relational database such as MariaDB, MySQL, Postgres, or SQL Server. All you need to do is create an empty database, and it will be automatically populated when you start sending data to it. Leave the InfluxDB client by typing exit as usual.
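That's a single statement; here it is run non-interactively from the shell (credentials are the placeholders from above):

```shell
# Create the empty "sensors" database; it gets populated as data arrives
influx -username admin -password admin_password -execute "CREATE DATABASE sensors"
```
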

To take the data that arrives via MQTT and put it into the database, we need some kind of bridge; we're going to use Node-RED for that. There are several different ways to install Node-RED, and it's readily available in the Raspberry Pi OS packaging system. However, the Node-RED team recommends that you do not use the packaged version. Instead, they provide a handy script that installs the latest official release of Node-RED and helps you keep it updated. Before running the Node-RED installation script, install the tools needed by npm to build binary modules.

Now you can run the official Node-RED installation script: just paste it on the command line. The script will ask you whether you're sure you want to proceed, and whether to install Pi-specific nodes; just say yes to both questions. Running the installer takes a few minutes, so I'm fast-forwarding through this section. Just be patient and let it work.
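The commands look like this; the script URL is the one published in the Node-RED documentation, so verify it there before pasting, since it may change:

```shell
# Tools npm needs to build binary modules
sudo apt install -y build-essential git

# Official Node-RED install/upgrade script for Debian-based systems
bash <(curl -sL https://raw.githubusercontent.com/node-red/linux-installers/master/deb/update-nodejs-and-nodered)
```
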

When the script is finished, Node-RED will be installed, but just like with InfluxDB, you need to configure it to be automatically started on boot: type sudo systemctl enable nodered.service. You can start the service manually this time, but in future it will happen automatically when your Pi starts up.
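As commands:

```shell
# Start Node-RED at every boot, and start it manually this once
sudo systemctl enable nodered.service
sudo systemctl start nodered.service
```
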

And finally, once we've got data in the database, we need to be able to display it; that's where Grafana comes in. Just like with InfluxDB, we can install Grafana by adding the official repository and installing the package. Start by fetching the public key for the repository and adding it to the local keyring. Now add the repository itself, update the package list again, and install the grafana package. Just like the other packages we've installed, we need to enable the service using systemctl so that it will start automatically. That takes care of starting Grafana at boot; for now, let's start it manually.
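For example, using the key and repository details Grafana published at the time (check the Grafana documentation for the current lines):

```shell
# Add the Grafana repository key and repository
wget -qO- https://packages.grafana.com/gpg.key | sudo apt-key add -
echo "deb https://packages.grafana.com/oss/deb stable main" | \
    sudo tee /etc/apt/sources.list.d/grafana.list

# Install Grafana and set it to start at boot
sudo apt update
sudo apt install -y grafana
sudo systemctl enable grafana-server
sudo systemctl start grafana-server
```
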

We've just spent an awfully long time looking at consoles, copying and pasting commands, but from here on pretty much everything is graphical. We're going to be able to use a web browser to configure those different software components and make them talk to each other. Before we do, though, we need a device that's going to send data to the system, so that we can see what's going on. So what we're going to do is set up the air quality sensor and configure it to publish to MQTT. From that point, we can do everything we want through a web browser.

At the start of this video I said that devices typically use one of two methods to report values to MQTT. They can either report specific values, just as they are, directly to topics, or they can combine values into a big JSON string and report that, so that it can be separated at the other end. The firmware for the air quality sensor project incorporates both methods, so it's a really good one for understanding how this works. Let's have a look.

What you see here in the configuration file are options for separate reporting and JSON reporting; I've got them both turned on so we can see both formats. And if we look in the source code (I've already been through this in a different video, but let's skim through it), we can see that we're setting up topics for the individual values, and then we also set up a topic for the JSON response. What that means is we have the option of either subscribing to each of the topics individually, where each one will just have a number in it, which is the value for that topic, or subscribing to the JSON topic, where we'll get all the values combined into a single JSON object, and then we can decode that.

If we jump over to the terminal on our broker, we can use mosquitto_sub to subscribe, but we need to set a username, because we've configured that in our configuration file and we can't get on without it, and we need to pass the password as well. I'm going to set the -v flag for verbose, so that we can see the topic that is being published to, not just the value, and I'm going to subscribe to the wildcard topic so that we can see everything. This way we'll be able to see every single message that is published to this broker, the topic it's been published to, and the value, and that way we can see what's coming back from the air quality sensor.

So we'll turn on the power and see what happens. First up, we can see that the device has published to the events topic: the first thing here is the topic name, and everything that follows it is the message that's been published. In this particular firmware I've got it reporting startup events to the events topic; that way we can watch the events topic and see when devices join the network. And look, now you can see that it's published some values. You can see "tele" for telemetry, then the unique ID of this particular device, and then AE1P0: that's the atmospheric environment 1.0 micron particle reading, and there is a count of one.

You can see here that there are three topics that have been published to, and they each have just a numeric value in them. Then you can see there is another topic which ends in "sensor". This is the exact same information; it's just that instead of publishing the values directly to individual topics, it's wrapping it all up in a big chunk of JSON and reporting it as a single thing on one topic. So now we know that our air quality sensor is reporting those values to the broker. Anything that needs access to that information can subscribe to those topics, and it will see whenever updates come from the sensor. The same principle applies to all sorts of sensors; it doesn't have to be air quality. It could be temperature sensors, it really doesn't matter: anything that publishes values to the broker, we can now access. So the next step is to look at Node-RED and how we're going to access the information that's coming in on MQTT.

From here on we're doing everything in a web browser, so open a new browser window and go to the IP address of your Raspberry Pi (or whatever device is running Node-RED), and put :1880 on the end; that's the port number it uses. When you load that page it'll start off totally blank, like this. With the default setup there will be a single flow called Flow 1, there are some nodes down the left here, and nothing is set up at all.

The very first thing we need to do is add support for InfluxDB. Come up to the little hamburger menu in the top right, go to Manage palette, then go to Install and search for "influxdb". The very first result here is node-red-contrib-influxdb; that's what we want. Click on Install, confirm, and away we go. This will add support for InfluxDB to your Node-RED installation, and then we can add nodes for storing data into the database, and getting it back out if we want to. It's now added a few nodes to the palette, so we can close this.

We're back at the flow, which is still blank, and we want to start pulling in some data. On the left, where we've got these nodes, scroll down and let's start with "mqtt in". We'll grab this one, because what we want to do is subscribe to the MQTT topic that is publishing some data. Now double-click on that node to set the configuration, and you can see it asks us to add a new MQTT broker; that's because we haven't actually connected this installation to the broker yet. When you're setting up MQTT in Node-RED, there are two different things you need to configure in the node: one is the connection to the broker, and the second is the subscription to the specific topic. Once you've configured a broker you can use it multiple times; you only need to do that once. So let's do that here, and then we'll have it available forever.


click on the little pencil icon next to

add new mqtt

broker and i'm just going to give it

localhost as the server name let's just

call it mqtt broker for now

if you have multiple brokers you can put

those in here and because i'm running

everything on the same device we can

just say localhost

if you're running a separate mqtt broker

somewhere else on your network or using

an external broker

you can put the ip address or the host

name in here and we'll just leave

everything else the same so

auto generate a client id leave

use clean session turned on keep alive

60 seconds that's all fine

click on add and now we've got mqtt

broker in our list

if we go to the drop down you can see

that there is just that single mqtt

broker

or we can add a new one we don't need a

new one we just want to subscribe to our

topic next

If we flip to the terminal we were looking at a moment ago, you can see that there are these topics with the values coming in. So how about we grab this one? This is a topic that we want to subscribe to, so I'm going to put it in here, give this node a name of "PPD 0P3", and hit Done. This will now connect to MQTT and subscribe to that topic. It's not going to do anything yet; we could deploy this, but there's no point, because the output of this subscription is not being used. Let's just add a debug node and see what's going on.

We'll put a debug node in here, connect them together, and click on Deploy; this will take these changes and make them live. You can see here it says "disconnected", and that's because we didn't set up any authentication. So let's go back in, edit this MQTT broker, go into the Security tab, and add the username and password. Of course this should actually be a secure username and password, but in this case I'm just using it as an example. Hit Done, and now it says "connecting", and in a moment we should see that it connects successfully. Let's click on Deploy. Here we go, see, it says "connected", so now we know that it is connected to the broker.

And it's authenticated. Now if we come over to the debug messages: look at this, we've already got a message that's come in. This little debug node, which is outputting information to the console, has said that on tele/3CF032/PPD0P3 the payload was 147. If we switch back to our console, we should see that the last message was 147. So our Node-RED system is now successfully receiving the data that is coming in via MQTT, and we can do something useful with it if we want to. At this point, what you could do is use that to set off some automation.
You could create a node, for example, which looks at the value coming in, and if it's above a certain threshold it sends off a command to start an air purifier, or sends out a notification, whatever you want to do. What this gives us is the ability to take actions based on the data coming in from the air quality sensor, or the temperature sensor, or whatever else you have connected. So instead of just outputting this information to the debug, let's have a look at storing it in InfluxDB.

If we scroll down here on the left, through the list of all the nodes that are available, we should find an InfluxDB node down here now. And here we go: in Storage there's "influxdb in", "influxdb out", and "influxdb batch". Now we can make a connection to our InfluxDB database, so we'll grab this "influxdb out" node and double-click on it. Of course we don't have an InfluxDB connection yet, so we're going to need to create a new one. The database name is already set (it's just called "database"), but we want to rename that to "sensors", because that was the name of the database we created. We set up a username of just admin, and the password was the admin password; of course you would use something a little more secure than that. It's on localhost, because we've got all this running on the same machine, so that doesn't need to change.

I'll just click on Add, and now that we have the connection to our database set up, we can say what measurement we want to store. In this case I'm just going to name it the same as the topic, which should be in my history, so I'll paste that in; you can see I've named it tele/3CF032/PPD0P3. And what will I call this node? "PPD 0P3", for parts per deciliter, 0.3 micron; we'll give it that name and hit Done. What we can do now is just link from the MQTT subscription, click on Deploy, and now any data that comes in here will be stored with a timestamp in InfluxDB. It really is as simple as that.

What it comes down to is really just these two nodes: the subscription using MQTT, where the data comes out, and the connection into InfluxDB. Because we're just dealing with single values, it's really easy: the data is being published as a number, we take the number, send it to the database, and it stores it for us. We don't need to do any other processing; it's just a link from one to the other. Now that we have this one piece of data coming in from this MQTT topic, we can duplicate this for as many topics as we need.

You can just copy those nodes and put them in here; this is the quickest way to do it. Let's edit this one, and we'll pick the 0.5 micron parts-per-deciliter value. I'll change the topic that it's subscribed to, change the name, and, because we want to store it in a different location, change the measurement to match the topic name. Click Deploy, and now any data coming in on those two topics will be stored separately in the database, because we've specified a different measurement in InfluxDB for storing the data.

Right now, though, we're just trusting that this is being stored properly in InfluxDB. You can check if you like: switch over to the console on the machine running InfluxDB, and use the influx command-line client. We need to specify a username, because we set one up when we were creating the config earlier, and the password; I just used the example admin password. Now we're connected to InfluxDB, and we're storing everything in the database called sensors, so just say "use sensors" and that will select the correct database. Now, if we ask the database what measurements it knows about, this will tell us whether any data is going in, and look, we can see those two topics; they are being stored in InfluxDB. If we want to see what data has been stored from this particular sensor, we can say "select * from" and then the measurement name, which is tele/3CF032/PPD0P3 in this case (a bit of a mouthful), and we'll put a limit of, say, 5 records on it. There we can see timestamps with the last five records, and that basically proves we now have the data coming from the sensor published to MQTT, processed by Node-RED, and pushed into InfluxDB, so the data is being stored for future use.
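A sketch of that verification session. The measurement name and credentials follow the examples above; your device ID will differ:

```shell
# Connect to the influx CLI with the admin credentials
influx -username admin -password admin_password

# Then, at the influx prompt:
#   USE sensors
#   SHOW MEASUREMENTS
#   SELECT * FROM "tele/3CF032/PPD0P3" LIMIT 5
#   exit
```
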

But this is still just with those single data points, where we're getting a simple number coming in on each topic. What happens if we get a chunk of JSON with multiple values in it, and we want to split it up and store the values separately? Well, it's not really much harder. Just like with individual values, we want a subscription to MQTT, so let's grab one of these for now, put it in here, and change the settings: we'll open it up and make it subscribe to the sensor topic, and I'll change the label here as well so we can see the difference.

What's going to come out of here is a chunk of JSON, so let's grab a JSON parser, drag it in here, and take the output from that subscription into the parser. If you open it, you'll see that what it does is convert between a JSON string and a Node-RED object, which makes it really easy to deal with. Now I'm going to reuse this debug node up here: I'll take the output from that parser into it, showing the message payload, and Deploy. What will happen now is that the JSON will come in, be converted to a Node-RED object, and then be displayed in the debug window. Because the sensor is only publishing every two minutes, we might have to wait a little while, but we'll get a message popping up here in just a moment.

Well, there we go: here in the debug window you can see the message has come in. We can click this little triangle to expand the object, and if we dig into it we can see all of these values being reported. These are the different elements within the message payload; we can now access these and log them as if they came in on individual topics. That way we have a single topic subscription giving us access to all of that information, because it's coming in as a single JSON string. There are a few different ways to pull these individual values out; I'm going to do it using a change node. For now, let's just move this up here to get it out of the way, move the message payload up here, and then find the change node. Here we go, we'll grab a change node.


We'll take the output into here, and then we're going to set a value: we want to set the message payload to a specific part of this message. If you look across to the right, you'll see three little icons; the first icon is "copy path". So if we wanted, for example, to take out the PPD 0.3 value, just click on "copy path" and it's now in your clipboard. Over here in the change node we can go to "msg." and then paste, so we're setting the message payload to something like msg.payload.PMS5003.PPD0P3. I'm going to give the node a name so we can see what it is; "PPD 0.3" will do. The result is that the output of this change node is a payload that is just the value we want.
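The change node is doing the equivalent of plucking one field out of the JSON object. As a rough illustration of the same idea outside Node-RED, here's a shell sketch that pulls one value out of a sample payload; the payload layout and key names are made-up examples, not the firmware's exact output:

```shell
# A sample JSON payload like the sensor might publish (illustrative only)
payload='{"PMS5003":{"PPD0P3":147,"PPD0P5":43}}'

# Extract the PPD0P3 value with a simple sed pattern
ppd0p3=$(printf '%s' "$payload" | sed -n 's/.*"PPD0P3":\([0-9]*\).*/\1/p')

echo "$ppd0p3"    # prints 147
```
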

So now what we can do is grab another one of these InfluxDB nodes and store that output into InfluxDB. I'm going to change the measurement on this one: I'll make it 0P3 with a "j" added on the end, so that we know which one it is; this is the one from the JSON source. Then we can duplicate that pair. Let's duplicate these two, put them in just below, and this time grab out the 0.5 value: I'll copy that path, put it in, store it in the 0.5 "j" location, and connect it up. What we can do here is duplicate these for whatever values we want: for each of the items coming in here in the message payload, we can store it individually in InfluxDB. If we deploy this now, the next time this message comes in as a JSON message, these values will be split out and stored in the database. Now, using the debug node to find the path to particular parts of the message is a really handy tip, and it's something I use quite a lot, so keep it in mind.

Any time you're trying to find content in a message, you can stick a debug node in there, look at the result, trace down through the tree to find the part you want, and then copy the path to it; you can then reference that part of the message wherever you want. Now let's clean this up so that you can see a summary: I'm going to get rid of these debug nodes, move these ones up a little, and get rid of the 0.3 nodes here. What you have here are examples of the two methods. The first method is for values that come in as a straight value on a topic: you can stick them straight in the database. Or, if the data comes in as JSON, you can convert it, extract the value you want, and then store it. They both achieve the same end result, and these two patterns can be used wherever you want to store data coming in on MQTT.

So now you know how to store the data. It's being put into our InfluxDB database, and we need to report on it; for that we're going to use Grafana. Because we've already installed Grafana, all you need to do to access it is go to the same IP address, but on port 3000 instead of port 1880. It'll come up with a login screen; the default username and password are just admin and admin. You can change this in the Grafana configuration, but for now it's just the default.

The welcome screen gives you access to a tutorial, and it also gives you quick access to the two things you need to do to get started. To use Grafana you need two things: the first is a connection to a data source (we have InfluxDB, so we're going to link to that), and the second is a dashboard. You can have multiple dashboards, and each dashboard can have different widgets on it. So start by clicking on "Add your first data source", and it will take you to this screen.


there are many many different data

sources listed

we can just scroll down and select

influx db or you can search for it

there are a few options you'll have to

put in here we'll just leave name as

influx db

that will do influx ql the url we need

to put in as localhost

port 8086 so i'll just stick that in

there

leave those sections the same don't need

to change anything here unless you want

to customize it

and then we get down to the influx db

details

we named our database sensors so let's

put that in here

and we also gave influx a username and

password

we made it admin and admin password

that should of course be something a bit

more secure

so then scroll down and hit save and

test

and now it says data source is working

it's made a test connection to influx db

and we know that that's all sorted so

now we have a data source defined and

it's just called influx db so we can


reference that

from our dashboard now just click back

up on the icon to go back to the home

screen

You'll see it now says that setting up a data source is complete; we've done our first one. So come across to Dashboards, click on "Create your first dashboard", and we now have a new dashboard. Just click on "Add panel", and this is the first widget we're going to put into our dashboard. Now, I personally find this particular part of Grafana quite confusing: the user interface has lots of configuration options in different locations, so finding what you're looking for can be hard. But you don't really need to put in much to get a basic working chart going, so we'll get started and just use the minimum for now.

The first thing we're going to do is come down to this bottom-left area and scroll; you can see there is a section here called Query. This is where we define where the panel gets its data from. There is a single query, just called "A" by default, and it says "FROM default select measurement". Go to "select measurement", and here you can see it has pre-populated the list: these are the measurements it detected when it connected to our InfluxDB. For our first chart, let's just pick the 0.3 parts per deciliter; I'm going to take the "j" version, the one coming out of the JSON string. You can see that now that I've selected it, the chart has already begun to be populated. It looks like a bit of a mess, and at this resolution it doesn't really show you very much.

But we can customize the appearance, so let's come across to the right, into this panel area. I'm going to call this "PPD", for parts per deciliter, and then we'll scroll down a little bit to some options for how it's displayed. If you expand Visualization you can see it has a few different varieties: Graph is selected by default, and that's what we want, but click around a few of the others and you'll see some interesting options. You can show the data as a stat, which is just the number, as a gauge, a table, text, a heat map; many different options here. We're going to leave it as a graph. I'll collapse that again so we can see what's going on, and then we've got some display options: we'll leave this as lines, leave line width as 1, but there are some other options down here that allow us to change the way it appears.

if you look at this you'll see that it's

lots of little points each one of those

points is a sample

but i want to make this look more like a

smooth chart one of the keys to that

is what to do with null values so if we

come down here to the null value option

and change it to connected you can see

that the chart changes from a series

of points to a continuous line and


that's because what it does is instead

of having no

value in between the samples it links

the samples together

and makes it look like a nice continuous

chart like this
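to see what the connected option is doing, here's a tiny python sketch with made-up sample numbers: grafana simply skips the intervals with no value and joins the adjacent real samples with a line segment.

```python
# made-up samples: None marks intervals where no reading arrived
samples = [10, None, None, 14, None, 20]

# "connected" mode effectively ignores the nulls and draws the line
# straight through the remaining real points
connected = [s for s in samples if s is not None]
print(connected)  # [10, 14, 20]
```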

there are a bunch of other options in

here as well you can change things like

how intensely the area under the

chart is filled in

what the line looks like lots and lots

of options

but this is certainly enough for now and

just to show that we can have multiple

parameters on the same chart let's

create another query if we come down

here we've got this first query called

A which is the 0.3 particles per deciliter

and we'll click + Query and we now have one

called B

we're going to select a measurement and

in this case i'm going to take the 0.5

particles per deciliter once again

from the json version

and leave everything else the same and

so now we have two values being shown on

the chart

and if we come up here you can see that

there is a key it's got the color

and the name so now we have our values


being displayed on the chart

we just click on apply and now we're at

our dashboard

and we have this widget and you can see

the different values that were

being recorded at that particular time

now what you could do is just add more

panels up here

and then you can see different values so

you can create charts which combine

multiple parameters

and you can make individual charts for

different things you can also see some

options up here for the display

so we can say last 6 hours and i'm going

to change this to say

last 24 hours so it's a 24 hour period

and look at this we've got a really big

spike right here

now if you were watching my live stream

this morning as in the day that i'm

actually filming this video right now

it's a few hours after i did a live

stream where i was soldering

and if we look at the chart right here

look at this time this was just after 10

o'clock when i turned on

the soldering iron it was quite near the


start of the live stream

and you can see that the ppd count has

gone through the roof

so the air quality sensor which is

sitting over in the corner a fair way

from my workbench

has immediately picked up the fact that

i turned on my soldering iron

and there are measurable particles in

the air as a result of that

that sort of thing is really interesting

to see in a chart like this

there are also some other options up

here in relation to refreshing you can

just click to manually refresh the

dashboard or you can come

up here and you can specify a refresh

frequency

so if we click 1m this chart will update

automatically every minute

that way you can create a dashboard with

multiple widgets on it

and the widgets will update

automatically as new data comes in you

will see the charts update over time

and you don't have to do it by clicking

the refresh button every time

and you can specify how frequently you

want that to happen so this is an

example of another view let's come up


here and say add a panel

and i'm going to make this one use the

same data source

i just want this as a different

representation of the same information

so i'm going to select the same data

source there at the moment

it looks the same as the other one but

i'm going to come over here to

visualization

and i'm going to change it to being a

gauge just for

a bit of an interesting different way of

seeing it and if we come up to field

we can see some options that change the

way it behaves

so let's set a threshold i'm going to

say that

anything that is over 300

it should go red and up to that it

should be green

and that's just a number that i made up

for the sake of this

but you can use this to customize the

way these charts are displayed

and you can have multiple thresholds so

i will just apply that
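the threshold rule itself is simple enough to express in a few lines of python; this is just an illustration of the logic, using the 300 cutoff i made up for this gauge:

```python
def gauge_color(value, threshold=300):
    # single-threshold rule: anything over the threshold goes red,
    # everything up to it stays green (mirrors the gauge setup above)
    return "red" if value > threshold else "green"

print(gauge_color(120))  # green
print(gauge_color(450))  # red
```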

and now we have a chart and if you come

up to the top here where the title is


i didn't put a title on this one we can

click the side of it drag it across and

we can rearrange

our dashboard so now we can have this

fuel gauge style display which just

shows the current value

right next to a chart showing that same

parameter over

a 24-hour period as you've seen

grafana is very powerful and flexible

it gives you lots of options for

different ways of visualizing the data

what you can do is create different

dashboards for different purposes

for example perhaps you have an older

ipad or an android tablet attached to

the wall

and you want to display environmental

data what you could do is create a

dashboard specifically for that device

with just the data that you want to

display and then create different

dashboards for other purposes

now just before we finish up with

grafana there's one final little thing

we need to change

just to make this look a little bit

nicer now down under the bottom of this

chart it's got this key

and it has this auto-generated label


which isn't very nice

if you want to change any of the

parameters of these existing widgets

come up to the top

click the little spin down and select

edit

now we're back in the window that we

were in when we set this up in the first

place

so if we come down here and we can see

the a parameter there is an option here

for alias by

so i can say ppd 0.3

just set that as an alias and you'll

see that the label under the chart has

changed

so i'll come down to this other one

change this to ppd

0.5 and now the labels

here start to make sense so we'll click

apply

and that will bring us back to the

dashboard now don't click away

this is something that has caught me out

several times this isn't saved yet

so come up here to the top little disk

icon click on save dashboard

i'm going to call this one air quality

and save so that dashboard is now saved


and it will be available whenever we log

into grafana if you don't do that last

step

the dashboard will disappear when you

close the browser and you don't want

that

if you're an experienced grafana user

you may be wondering why i didn't use

telegraf in this project

telegraf is a data collection agent for

influxdb that can acquire data from

different types of sensors

it would take the place of node red and

possibly even mqtt

so what we could do is reduce our

software stack from four items down to

three
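as a rough sketch, a telegraf configuration for this job might look something like this (the topic, database name and broker address here are assumptions for illustration, not values from my setup):

```toml
# subscribe to the sensor's mqtt topic and parse the json payload
[[inputs.mqtt_consumer]]
  servers = ["tcp://localhost:1883"]
  topics = ["sensors/airquality/#"]
  data_format = "json"

# write the parsed values into influxdb
[[outputs.influxdb]]
  urls = ["http://localhost:8086"]
  database = "sensors"
```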

in some situations that might be the

right software stack so

it's definitely worth looking into but

the reason that i used node-red and mqtt

is that they are very useful in other

tasks that you might want to do in part

of your automation system

it's really common to set up

automation logic, rules, and data

transformations in node-red

so having it as part of your software

stack is very useful

this video has been pretty long and it's


been quite painful setting up all of

these different items

but now that they're done you've got a

platform that you can use

to take data from all sorts of different

places and make decisions on it

not just log it and report it so what

you can do is build dashboards for

different purposes

and i really want to see the sorts of

things that you come up with

so if you build some kind of a custom

dashboard or you set up some kind of a

data logging project

please come along to the super house

discord server and share what you've

done

i'd love to see it now go and make

something awesome
