About a year ago I set up an air quality monitoring station based on the Air Quality Egg product by Wicked Device (see also my previous post on AQE). It worked through the winter season, providing real-time data on PM2.5 concentrations near my house. This graph shows some of the data collected.
This worked OK, except that the data from the device was uploaded to Xively and available only from there. Xively provided this service for free, as it was fulfilling an old commitment from one of their acquisitions; it was not high on their priority list and the service was frequently down. So I decided to build my own device, to have full control over the whole process – from collecting the data to displaying it on a public web page. And to have some creative fun. The result works well (at least at the time of writing) and cost less than $80 in materials, including the Raspberry Pi I used. I think building a PM2.5 monitoring station in a way similar to what I describe below would make an excellent high school project.
Initially I was planning to use the Shinyei PPD 42NJ particle sensor as the heart of the device. This is what was used in the Air Quality Egg that I got (they use a different sensor now), and the performance of the PPD 42NJ has been independently evaluated. However, this part has to be individually calibrated, which I tried to do by taking it to Wrocław (I work there) and comparing its readings to those of the official monitoring station. The PM2.5 values in the city at that time oscillated between 20 and 45 µg/m³, which was not a wide enough range to get a good correlation. Or maybe it simply didn’t work for some other reason. After a month of unsuccessful attempts, I decided to buy the SDS011 sensor made by Nova Fitness Co., Ltd. It is more expensive ($25 on AliExpress rather than $10), but it provides actual PM2.5 and PM10 readings in µg/m³ through its serial interface. The two units I ordered came with TTL-to-USB adapters. One of the adapters didn’t work, but I could easily replace it with another one I bought locally for about $2. So, hardware-wise, all one needs to do is put the SDS011 sensor in a box with some holes and connect the USB side of the adapter (through some extension cable) to a standard USB port. Here is how it looks in my case:
I would like to make it clear here that I am showing this picture not to demonstrate an example masterpiece of industrial design, but as evidence that if I can do it, everybody can.
The other end of that USB cable connects to a Raspberry Pi running the software described below. I chose the Raspberry Pi for ease of programming, but I guess an Arduino would be an option as well. The adapter shows up as a /dev/ttyUSB0 device on Linux and transmits a 10-byte message once a second.
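That 10-byte message follows the SDS011 data-frame format: a 0xAA 0xC0 header, PM2.5 and PM10 as little-endian 16-bit values in tenths of µg/m³, two device ID bytes, a checksum over the six data bytes, and a 0xAB tail. A minimal sketch of the decoding in Python (the function name is mine):

```python
def parse_sds011_frame(frame: bytes):
    """Decode one 10-byte SDS011 data frame into (pm25, pm10) in µg/m³.

    Layout: 0xAA 0xC0, PM2.5 low/high, PM10 low/high, two ID bytes,
    checksum (sum of the six data bytes mod 256), 0xAB.
    """
    if len(frame) != 10 or frame[0] != 0xAA or frame[1] != 0xC0 or frame[9] != 0xAB:
        raise ValueError("not an SDS011 data frame")
    if sum(frame[2:8]) & 0xFF != frame[8]:
        raise ValueError("checksum mismatch")
    pm25 = (frame[2] | frame[3] << 8) / 10.0  # little-endian, tenths of µg/m³
    pm10 = (frame[4] | frame[5] << 8) / 10.0
    return pm25, pm10
```

Reading the bytes from /dev/ttyUSB0 and feeding each complete frame to this function is then all the "driver" one really needs.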
The software stack that runs on the Raspberry Pi consists of five KDB+ processes managed by the Enterprise Components middleware. All relevant code and EC configuration is on GitHub. Of that code, the sds.c file is probably the most interesting, as it contains the TTY interface configuration that took me some time to figure out. That file does not contain any KDB+-related code, so it can be compiled into a standalone library for getting data from the SDS011 on Linux. There are a couple of other SDS011-related projects on GitHub, so if you do your own setup, don’t forget to search for the one most useful to you.
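The gist of that TTY configuration – raw mode at 9600 baud, 8 data bits, no parity, one stop bit – can also be sketched with Python's standard termios module. This is my own reconstruction, not a translation of sds.c; the helper name and the exact flag set are assumptions:

```python
import termios

def sds011_termios_attrs(attrs):
    """Return a termios attribute list configured for the SDS011's
    serial link: raw mode, 9600 baud, 8N1.

    `attrs` is the list returned by termios.tcgetattr(fd).
    """
    iflag, oflag, cflag, lflag, ispeed, ospeed, cc = attrs
    cflag &= ~(termios.CSIZE | termios.PARENB | termios.CSTOPB)
    cflag |= termios.CS8 | termios.CREAD | termios.CLOCAL      # 8N1, ignore modem lines
    lflag &= ~(termios.ICANON | termios.ECHO | termios.ISIG)   # raw input, no echo
    iflag &= ~(termios.IXON | termios.IXOFF | termios.ISTRIP)  # no software flow control
    oflag &= ~termios.OPOST                                    # no output post-processing
    cc = list(cc)
    cc[termios.VMIN], cc[termios.VTIME] = 10, 0  # read() blocks for a full 10-byte frame
    return [iflag, oflag, cflag, lflag, termios.B9600, termios.B9600, cc]
```

Applying it is then just `termios.tcsetattr(fd, termios.TCSANOW, sds011_termios_attrs(termios.tcgetattr(fd)))` on the opened device.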
Using an industrial-strength time series database for collecting 1 Hz data may seem like overkill, and that’s because it is. But since KX released a Raspberry Pi build of their product, I wanted to try it out with Enterprise Components, to which I contributed some code. Everything worked pretty much out of the box. The only thing I spent some more time on was building yak (the process manager for the Components). yak is written in Python and is usually used in executable form made by bbfreeze. The build instructions in its GitHub repository worked on the Raspberry Pi, except that I had to install patchelf-wrapper as an additional dependency. The whole KDB+/EC stack can easily be replaced by a simple Python script logging the data to CSV files and uploading the current values to some publicly available place.
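Such a replacement script could be as small as the sketch below: append each reading to a CSV log and rewrite a small JSON file with the latest values. The function and file names here are hypothetical, not taken from my setup:

```python
import csv
import json
import time
from pathlib import Path

def log_reading(pm25, pm10, csv_path="pm_log.csv", json_path="current.json"):
    """Append one timestamped reading to a CSV log and rewrite a small
    JSON file with the current values (a stand-in for the KDB+/EC stack)."""
    ts = time.strftime("%Y-%m-%dT%H:%M:%S")
    write_header = not Path(csv_path).exists()
    with open(csv_path, "a", newline="") as f:
        writer = csv.writer(f)
        if write_header:
            writer.writerow(["timestamp", "pm25", "pm10"])
        writer.writerow([ts, pm25, pm10])
    # The JSON file always holds only the most recent reading.
    Path(json_path).write_text(json.dumps({"timestamp": ts, "pm25": pm25, "pm10": pm10}))
```

The JSON file is then what gets pushed to the public web page, while the CSV accumulates the full history locally.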
In my case the current data in JSON format is rsync’ed to the hosting service every 10 minutes. I used this guide to set up rsync so that providing a password with every connection is not necessary. The result can currently be viewed here. The HTML file there is a simple web app written in Elm, but that’s just because I wanted to keep somewhat current with changes in Elm (they dropped signals – one of their core concepts – with their 0.17 release in May last year). The same thing can be done by generating a static page every 10 minutes, or at whatever frequency one wants to update the values.
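The periodic upload itself needs nothing more than rsync over ssh with key-based authentication (so no password prompt) plus a cron entry. A sketch, with a hypothetical destination path:

```python
import subprocess

def rsync_command(src, dest):
    """Build the rsync invocation: archive mode, compression, ssh transport."""
    return ["rsync", "-az", "-e", "ssh", src, dest]

def upload_current(json_path="current.json",
                   dest="user@host.example:public_html/aq/"):
    # Relies on key-based ssh auth being set up, so rsync never prompts.
    subprocess.run(rsync_command(json_path, dest), check=True)

# A crontab entry like this would run the upload every 10 minutes:
#   */10 * * * * /usr/bin/python3 /home/pi/upload_current.py
```

Everything after the data reaches the web host is then purely client-side, whether that is an Elm app or a pre-generated static page.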