5 May 2008

Since getting my meter online, it has been sending its readings to a server in the sky so charts like these can be produced. To really play with the data, I needed to start logging it locally and producing my own graphs. Here is a rambling rundown of how I got from XML being spat out by the device every 6 seconds to something like this:

Step One – Parsing the data

This was the easy part; AndySC had already put together a perl script for reading the serial port and doing the necessary parsing.
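For the curious, the extraction itself is simple enough to sketch in a few lines. Here is a rough Python equivalent of that parsing step – the `parse_watts` helper name is my own illustration, not Andy's script, though the `<ch1><watts>…</watts>` shape is what the meter emits:

```python
import re

# Each reading arrives as a line of XML; the channel-1 wattage sits
# inside a <ch1><watts>...</watts> element.
WATTS_RE = re.compile(r"<ch1><watts>(\d+)</watts>")

def parse_watts(xml_line):
    """Return the channel-1 reading in watts, or None if the line has none."""
    match = WATTS_RE.search(xml_line)
    return int(match.group(1)) if match else None
```

A line containing `<ch1><watts>00340</watts></ch1>` yields 340 watts – i.e. the 0.34kW values logged below.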

Step Two – Publishing the data

Again, not much for me to do here as Andy’s script already publishes the data to a set of topics over MQTT.

Step Three – Logging the data

Finally, something for me to do. A couple of years ago, I would have joined Andy’s perl script with one of my own, but python is more my thing these days. I already had a piece of python that subscribes to the appropriate topics and posts the temperature values to twitter. It didn’t take much to get the same script to subscribe to the power data and dump it into a MySQL database on the local machine.

+----------------+---------+
| when           | power   |
+----------------+---------+
| 20080502231747 |    0.34 |
| 20080502231753 |    0.33 |
| 20080502231804 |    0.34 |
+----------------+---------+

The table in the database is a very simple one at the moment, logging just the power value along with a timestamp. An entry is added to the table for every reading from the meter; after three days of running, there are just over 11,000 of them. I’ll have to keep an eye on this to make sure it doesn’t run away with my free disk space.
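The logging itself boils down to one INSERT per reading. A minimal self-contained sketch of that step – using the stdlib sqlite3 module in place of MySQL, and a `log_reading` helper name of my own invention – looks something like:

```python
import sqlite3
import time

# Stand-in for the local MySQL database; the table mirrors the one above.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE currentcost (`when` TEXT, power REAL)")

def log_reading(power_kw, when=None):
    """Insert one meter reading, timestamped as YYYYMMDDhhmmss."""
    if when is None:
        when = time.strftime("%Y%m%d%H%M%S")
    conn.execute("INSERT INTO currentcost VALUES (?, ?)", (when, power_kw))
    conn.commit()

log_reading(0.34, "20080502231747")
```

In the real script this sits inside the MQTT subscriber’s message callback, so every publish becomes a row.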

Step Four – Graphing the data

Roo showed me some stuff he’s been playing with using the Google Chart API. Whilst I generally prefer to roll my own (aka, reinvent the wheel), I couldn’t ignore just how easy it is to produce pretty graphs this way.

Before delving into the API, I needed to decide just what I wanted to produce. As there is such a range of chart types available, there are plenty of interesting things that could be done. Initially, however, I decided to stick with the traditional “power-usage-in-the-last-24-hours” chart.

The API has a limit on the amount of data that can be passed to it, so I needed a meaningful way to reduce the 4200 data points generated in 24 hours to around 100 at most. I soon settled on using the average value for each 10-minute period. This loses some resolution in the data, but it still shows the trends.
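The same reduction is easy to sketch in code as well as SQL. Here is a rough Python version of the 10-minute averaging (the function name is mine), assuming the YYYYMMDDhhmmss timestamps used above:

```python
from collections import defaultdict

def ten_minute_averages(readings):
    """readings: iterable of (when, power) pairs, when as 'YYYYMMDDhhmmss'.

    Groups on the first 11 characters (year..hour plus the tens digit of
    the minute, i.e. a 10-minute bucket) and labels each bucket 'hhm0',
    matching the output of the query below."""
    buckets = defaultdict(list)
    for when, power in readings:
        buckets[when[:11]].append(power)
    return [(key[8:] + "0", sum(vals) / len(vals))
            for key, vals in sorted(buckets.items())]
```

Feeding it a day of readings gives roughly the 144 rows that the database query produces.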

Generating the averages is a simple matter of the right query on the database. With some trial and error, I eventually got to:

select concat(substring(substring(`when`,1,11),9,4),'0'),
       truncate(avg(`power`),3)
from currentcost
where `when` > SUBDATE(NOW(), INTERVAL 1 DAY)
group by substring(`when`,1,11)
order by `when`;

Simple huh?

This produces results like this:

+------+-------+
| 2240 | 0.714 |
| 2250 | 0.637 |
| 2300 | 0.406 |
+------+-------+

With that in place, all it took was to throw it together into an appropriate URL for google to generate the image:

http://chart.apis.google.com/chart?chg=100,20,1,7&chxt=y,x&
chxl=0:|0|1|2|3|4|5| 1:|2100|2300|0100|0300|0500|0700|0900|1100|1300|1500|1700|1900& chxp=1,2,11,19,27,36,44,53,61,69,78,86,95&chs=400x200 &cht=lc&chds=0,5&chm=B,f3f3f3,0,0,0&chco=aaaaff& chd=t:0.348,0.348,0.319,0.311,1.285,0.683,0.338,0.349, 0.398,0.39,0.356,0.477,0.274,0.264,0.572,0.637,0.406, 0.382,0.368,0.324,0.325,0.917,0.322,0.358,0.342,0.318, 0.191,0.124,0.209,0.204,0.192,0.213,0.205,0.196,0.374, 1.753,1.537,0.976,0.552,0.556,0.513,0.433,0.363,0.341, 0.575,0.46,0.188,0.123,0.248,0.204,0.195,0.116,0.215, 0.207,0.226,1.574,0.636,0.214,0.206,0.209,0.204,0.648, 0.694,0.628,0.599,0.689,0.743,1.222,0.278,0.394,0.326, 0.394,0.225,0.184,0.13,0.145,0.23,0.226,0.222,0.133, 0.133,0.237,0.231,0.223,0.13,0.234,0.229,0.222,0.203, 0.197,0.193,0.17,0.198,0.176,0.275,0.328,0.287,0.212, 0.159,0.192,0.19,0.212,0.292,0.298,0.418,0.332,0.789,0.894,0.719

Again, simple huh? Well, maybe not so much. Here’s a breakdown of what that does (and, just to confuse matters, this is in a different order to where things appear in the URL above…)

http://chart.apis.google.com/chart?

This is the base URL of the Google Charts API – all the magic comes from here.

chs=400x200

Sets the size of the image.

&cht=lc

Sets the type of chart – a line chart.

&chg=100,20,1,7

Gives the chart a grid in the background.

&chxt=y,x

&chxl=0:|0|1|2|3|4|5| 1:|2100|2300|0100|0300|0500|0700|0900|1100|1300|1500|1700|1900

&chxp=1,2,11,19,27,36,44,53,61,69,78,86,95

Describes the axis labels. 0 to 5 on the y-axis, and the relevant times along the x-axis. The script figures out approximate positions along the axis for the labels.

&chds=0,5

Specifies the minimum and maximum values for the data – although I occasionally go over 5kW, I decided it wasn’t worth squeezing the data for the 99% of the time it is below that.

&chm=B,f3f3f3,0,0,0

Fills the area under the line with a light gray.

&chco=aaaaff

Draws the line with a light blue.

&chd=t:0.348,0.348,0.319,0.311,1.285,0.683,0.338,0.349, 0.398,0.39,0.356,0.477,0.274,0.264,0.572,0.637,0.406, 0.382,0.368,0.324,0.325,0.917,0.322,0.358,0.342,0.318, 0.191,0.124,0.209,0.204,0.192,0.213,0.205,0.196,0.374, 1.753,1.537,0.976,0.552,0.556,0.513,0.433,0.363,0.341, 0.575,0.46,0.188,0.123,0.248,0.204,0.195,0.116,0.215, 0.207,0.226,1.574,0.636,0.214,0.206,0.209,0.204,0.648, 0.694,0.628,0.599,0.689,0.743,1.222,0.278,0.394,0.326, 0.394,0.225,0.184,0.13,0.145,0.23,0.226,0.222,0.133, 0.133,0.237,0.231,0.223,0.13,0.234,0.229,0.222,0.203, 0.197,0.193,0.17,0.198,0.176,0.275,0.328,0.287,0.212, 0.159,0.192,0.19,0.212,0.292,0.298,0.418,0.332,0.789,0.894,0.719

Specifies the data points.
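Once the parameters make sense, assembling the URL is just string formatting. A sketch of how it could be built from a list of averages – the `chart_url` helper is my own name, and the axis labelling parameters are omitted for brevity:

```python
def chart_url(points, size="400x200", y_max=5):
    """Build a Google Chart API line-chart URL from a list of kW averages."""
    params = {
        "chs": size,                 # image size
        "cht": "lc",                 # line chart
        "chds": "0,%d" % y_max,      # data scaled between 0 and y_max kW
        "chco": "aaaaff",            # light blue line
        "chm": "B,f3f3f3,0,0,0",     # light grey fill under the line
        "chd": "t:" + ",".join(str(p) for p in points),
    }
    query = "&".join("%s=%s" % item for item in sorted(params.items()))
    return "http://chart.apis.google.com/chart?" + query
```

Point an img tag at the result and Google does the rest.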

What next?

  • Chris has started doing some interesting data analysis to see if he can automatically spot ‘events’ on the graph. It will be interesting to see what can be achieved here.
  • Andy’s twittering house got some linkage last week from both Earth2Tech and Wired Science. They mention the power orb that was written about last year. I really like the idea of an ambient device for displaying this information – another project for the arduino list.
  1. Chris Hodgins • May 5, 2008

    Really nice solution! I also ran into the URL API limit, but as you so neatly show, you can still get some really nice results using data averaged over 10 minutes.

    My data analysis is still pretty basic and works well for items with distinct characteristics. More thinking is required to separate the indistinct items as well.

    Great job!
    Chris

  2. Steve Crossan • May 6, 2008

    Nice piece. As an FYI the restriction on calls to the Google Chart API has now been lifted. I like your solution for reducing the data points though.
    Steve

  3. pingback from Breakdown of currentcost xml output « Cumbers • May 7, 2008

  4. Andrew Carr • May 12, 2008

    Hi,
    I’ve been following your blog post (and Roo’s) about using the CurrentCost meter, and I’ve managed to snag one from the eco gadget shop.

    I’d really like to do some fancy graphing with the Google Charts API; is it really cheeky to ask you to provide the perl script that AndySC put together to pull the data off the unit?

    Cheers,
    Andy

  5. trackback from Roo Reynolds - What's Next? • May 13, 2008

  6. nick • May 13, 2008

    Hi Andy,

    I’m not able to share the exact perl script I use as it does contain some proprietary code that enables us to share the data we’re producing. I can however give you some simple pointers for rolling your own.

    The basic code is as follows:

    $serial_port = "/dev/ttyUSB0";
    open (SERIAL,"+>$serial_port") || die "can't open $serial_port";
    while (<SERIAL>) {
       $xml = $_;
       if ($xml =~ /<ch1><watts>(.*?)<\/watts>/) {
          # $1 is the wattage value
       }
    }

    The value for $serial_port will depend on your setup. This code assumes you are on a linux box, and are connected via USB. If you are connecting over serial directly, it is more likely to be something like /dev/ttyS0.

    I hope this helps you get started with charting your data.

  7. pingback from dale lane » Blog Archive » CurrentCost - first impressions • May 15, 2008

  8. Chris • May 17, 2008

    Just got a currentcost box, wiring it up now. Has anyone documented the pinout of the RJ45 connection?

    Also any idea whether it uses CTS/RTS or maybe some other flow control like XON/XOFF…?

    thanks

  9. Chris • May 17, 2008

    OK, I managed to connect it to my mac with Roo’s help: http://flickr.com/photos/pixelfrenzy/2499443217/

  10. nick • May 17, 2008

    Hi Chris, glad to hear you managed to get it connected. Look forward to seeing what you do with it next.

  11. pingback from plus six » Counting the Cost • May 18, 2008

  12. pingback from CurrentCost « log.illsley.org • May 26, 2008

  13. Dale Lane • May 27, 2008

    RE: I’ll have to keep an eye on this to make sure it doesn’t run away with my free disk space.

    I’m using a similar MySQL DB structure to you (naturally, since I nicked your scripts to get me started ;-) ), and have been checking the size it takes.

    On my NSLU-2, each update requires 9 bytes.

    A few rough back-of-the-envelope calculations:

    Assume we store an update every 6 seconds – each needing 9 bytes of storage.

    1 day == 24 hours
    == 24 * 60 minutes
    == 24 * 60 * 60 seconds
    == (24 * 60 * 60) / 6 updates

    So that is 14400 updates in a day – requiring 129600 bytes of storage

    Multiply that by 365, and we need approx. 45 MB (47304000 bytes) for a full year’s data.

    Think it’ll be a while before this starts being a problem :-)

  14. pingback from dale lane » Blog Archive » Accessing MySQL from Perl on SlugOS • May 27, 2008

  15. abu21 • June 4, 2008

    hey,
    Check this out: Visifire – I think it can improve the quality of your charts because it is powered by Silverlight.

  16. nick • June 4, 2008

    abu21, thanks for the comment.

    I would question how a completely proprietary technology that only works on Windows would improve the quality of these graphs.

  17. abu21 • June 5, 2008

    hey nick,
    visifire is under GPL, what else can one ask for.
    regarding silverlight, microsoft is dieing to make it reach every one. currently silverlight supports win & mac. linux will follow soon ( some thing called moonlight)

  18. nick • June 5, 2008

    @abu21,

    True, Moonlight is being worked on by the guys at Novell. Once they get something released that I can easily install, then I may take a look.

    Certainly the fact they plan for a proper SDK on linux will be an advantage over Flash.

    In the meantime, I’ll stick to my open standards.

  19. andyp • June 5, 2008

    I have to say that the fact that Google Charts are basically just spitting out PNGs for display makes them far more flexible across a range of devices anyway. Opera Mini on my BlackBerry can cope with that, for instance :-)

  20. pingback from Effing the Ineffable » Interfacing the CurrentCost meter to your PC • June 15, 2008

  21. pingback from dale lane » Blog Archive » CurrentCost - getting the history into Windows • June 15, 2008

  22. trackback from Graham White: My Notes • July 29, 2008

  23. Christy Simmons • August 1, 2008

    Any way we can submit or suggest a great graphing tool to you? Please email me for more info… I have something that you might want to blog about!

  24. nick • August 1, 2008

    @Christy – if you have something to suggest, feel free to post details here. Or you can send me the details directly to nol @ this site.

  25. dollarone • September 6, 2008

    Andrew Carr and others who have looked for a script to read the Current Cost data:

    #!/usr/bin/perl
    # Based on logger.pl by Bruce S. Garlock 2002-09-11
    # (see http://aplawrence.com/Unix/logger.html)
    # Requirements: Device::SerialPort 0.12 (from cpan)

    use Device::SerialPort 0.12;

    $LOGDIR = "~/log";            # path to data file
    $LOGFILE = "currentcost.log"; # file name to output to
    $PORT = "/dev/ttyS0";         # port to watch

    # Serial settings
    $ob = Device::SerialPort->new($PORT) || die "Can't open $PORT: $!";
    $ob->baudrate(2400) || die "failed setting baudrate"; # set to 9600 if you have the newer version
    $ob->parity("none") || die "failed setting parity";
    $ob->databits(8) || die "failed setting databits";
    $ob->handshake("none") || die "failed setting handshake";
    $ob->write_settings || die "no settings";

    # open the logfile, and port
    open(LOG, ">>${LOGDIR}/${LOGFILE}") || die "can't open file $LOGDIR/$LOGFILE for append: $!\n";
    open(DEV, "<$PORT") || die "Cannot open $PORT: $_";

    select(LOG), $| = 1; # set nonbuffered mode

    # Loop forever, logging data to the log file
    while ($_ = <DEV>) { # print input device to file
       print LOG $_;
       print STDOUT $_; # optional logging to STDOUT
    }
    undef $ob;

  26. pingback from knolleary » Blog Archive » CurrentCostuino • November 19, 2008

  27. pingback from dale lane » Blog Archive » HomeCamp • November 30, 2008

  28. pingback from Monitoring Your Power Usage « Isotoma Blog • November 4, 2009

  29. Ian MacDonald • November 19, 2009

    Been playing with this for almost 2 weeks now. I pushed the data into a MySQL table slightly differently to you, though, as I set time as a timestamp so it’s more flexible. I can then convert to unix time and do time calcs that way.

    For example:

    +---------------------+-------+------+
    | time                | power | temp |
    +---------------------+-------+------+
    | 2009-11-06 23:19:24 |  1493 | 22.9 |
    | 2009-11-06 23:19:30 |  1492 | 22.9 |
    | 2009-11-06 23:19:36 |  1512 | 22.9 |
    | 2009-11-06 23:19:42 |  1514 | 22.9 |
    | 2009-11-06 23:19:49 |  1506 | 22.9 |
    | 2009-11-06 23:19:54 |  1498 | 22.9 |
    | 2009-11-06 23:20:00 |  1491 | 22.9 |
    | 2009-11-06 23:20:07 |  1487 | 22.9 |
    | 2009-11-06 23:20:13 |  1486 | 22.9 |
    | 2009-11-06 23:20:19 |  1488 | 22.9 |
    +---------------------+-------+------+
    10 rows in set (0.00 sec)

    Then I can pull out data like this:

    select time, truncate(avg(power),1) as power, truncate(avg(temp),1) as temp
    from power
    where time > SUBDATE(NOW(), INTERVAL 1 WEEK)
    group by floor(unix_timestamp(time)/3600);

    +---------------------+--------+------+
    | time                | power  | temp |
    +---------------------+--------+------+
    | 2009-11-12 10:59:18 |  960.8 | 22.8 |
    | 2009-11-12 11:00:00 |  980.7 | 22.7 |
    | 2009-11-12 12:00:02 |  991.5 | 22.7 |
    .....
    | 2009-11-19 08:00:00 |  996.5 | 23.2 |
    | 2009-11-19 09:00:05 |  973.0 | 22.9 |
    | 2009-11-19 10:00:05 |  974.7 | 22.8 |
    +---------------------+--------+------+

    You can convert your YYYYMMDDhhmmss to mysql time with something like:

    select str_to_date('20080502231747', '%Y%m%d%H%i%s');
    +-----------------------------------------------+
    | str_to_date('20080502231747', '%Y%m%d%H%i%s') |
    +-----------------------------------------------+
    | 2008-05-02 23:17:47                           |
    +-----------------------------------------------+

    so you could convert that to unixtime.

  30. pingback from Power Graphs (again) « knolleary • December 26, 2010
