22 Apr 2010

It all started, as things do, with a tweet.

As part of the Emerging Technologies group at IBM Hursley, Kevin gets to play with new technologies to see how they might be useful to IBM’s customers. One such item is the Emotiv headset (an electroencephalograph if you must), which can read signals in the brain. You can train it to recognise particular thoughts, which has some very interesting applications, from gaming to rehabilitative care. You can find out more about the headset in this piece from The Times. But I digress.

The BBC were interested in finding out more about the headset and what sort of thing IBM had been doing with it. Knowing they were interested in seeing whether a car could be controlled by the headset, Kevin was looking for something to make the demo more relevant, which led to his tweet.

With only a couple of days to put something together, I suggested we go down the route of wiring up an existing radio controlled car to an Arduino. Kevin already had the headset hooked up to MQTT, so it would be trivial to use my Arduino MQTT library to get them all talking.
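
To give a flavour of how simple the Arduino end of that is, here’s a minimal sketch of the sort of thing involved. To be clear, this is illustrative rather than the code we actually used: the pin numbers, topic name, and network addresses are all made up, and it assumes an Ethernet shield and the PubSubClient API.

    #include <SPI.h>
    #include <Ethernet.h>
    #include <PubSubClient.h>

    // Hypothetical wiring: each pin drives a transistor soldered across
    // one of the buttons on the toy car's remote control.
    const int FORWARD_PIN = 5;
    const int REVERSE_PIN = 6;

    byte mac[] = { 0xDE, 0xAD, 0xBE, 0xEF, 0xFE, 0xED };
    IPAddress ip(192, 168, 1, 50);      // made-up addresses
    IPAddress broker(192, 168, 1, 10);

    EthernetClient ethClient;
    PubSubClient client(ethClient);

    // Runs whenever a message arrives on a subscribed topic:
    // 'F' drives forward, 'B' reverses, anything else stops.
    void callback(char* topic, byte* payload, unsigned int length) {
      if (length == 0) return;
      digitalWrite(FORWARD_PIN, payload[0] == 'F' ? HIGH : LOW);
      digitalWrite(REVERSE_PIN, payload[0] == 'B' ? HIGH : LOW);
    }

    void setup() {
      pinMode(FORWARD_PIN, OUTPUT);
      pinMode(REVERSE_PIN, OUTPUT);
      Ethernet.begin(mac, ip);
      client.setServer(broker, 1883);
      client.setCallback(callback);
    }

    void loop() {
      if (!client.connected() && client.connect("headset-car")) {
        client.subscribe("car/drive");  // hypothetical topic name
      }
      client.loop();
    }

Whatever bridges the headset to MQTT just publishes single-character commands to the topic, and they translate directly into pins going high and low.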

A quick trip to Asda and I was the proud owner of a £9 blue Mini Cooper car, which I attacked with my soldering iron. It didn’t take much to get it working – I’ll blog the finer details of that bit soon.

The demo went well and we discussed more about what they wanted to do for the programme itself. Some of their ideas were ambitious to say the least. Someone mentioned the idea of driving a bus through Whitehall… not sure how serious that was. But ultimately, a straight race between two taxis ‘driven’ by the two presenters was decided on.

A couple of weeks later, they were back in Hursley to film with Jem and Dallas. Now, there are some things that are best not left to the last minute. Such as realising they needed two radio controlled cars for filming – when I only had one. Luckily this dawned on me the day before they came down, so I returned to Asda and got a shiny red sports car that would look good alongside the mini. I then discovered that one of the reasons they were so cheap was that both worked on the same frequency… one remote drove both cars. With time running out, I went back and got a gaudy yellow jeep that was a completely different make and thankfully worked on a different frequency.

The cars

A couple of weeks later, Kevin and I headed up to a barn in middle-of-nowhere-Northamptonshire where Jem had been working on the taxis. Now, a few people have said to me “yeah, but he doesn’t really do the work, does he?”, to which I have to reply that he very much does; Jem really knows what he is talking about when it comes to building things, and the enthusiasm he shows on screen is exactly what he’s like in real life.

Jem Stansfield

Over the course of two freezing days, we got the radio units hooked up to MQTT, again via an Arduino. This was probably the piece I was most worried about – it was one thing to hack a toy remote control, but it was going to be quite another to do the same to an industrial radio set that cost considerably more. Not to mention the fact that they were also on loan for the project, so breaking them would not have made me any friends.

In the workshop

We filmed the first test run and the relief was palpable when the car lurched forward thanks to Jem’s brain – not to mention the reaction when he managed to brake within a few inches of an oil drum. Although none of that made it into the final programme.

Mission Control

And then we had the main event – the race itself at the Santa Pod Raceway. 8am on a freezing December morning is not the best time to be wiring up the last few connections and trying to debug why the damn thing wasn’t working. But somehow we got there and eventually the taxis did what they were thought to do – even if one did plough into the crash barrier at some considerable speed.

Dallas & Jem

The plan had been to do two races: a straight race and an obstacle course. Technical hiccups along the way meant it wasn’t until after lunch that we got the straight race filmed, at which point we were running out of light. It was decided to put Dallas in the back of a taxi and have Jem drive him around. This was the first proper test of steering by mind-control. Let’s just say I wouldn’t enter it in a slalom race any time soon.

With all the filming done we packed up and headed home. Almost 5 months later, we got to see the end result on TV. Having spent the best part of 4 days filming, I was fascinated to see how they would edit it down to the 10 minutes or so they had to fill. I have to say I’m really pleased with the result. They may have given Kevin the speaking part out of the two of us, but I think I got more close-ups. Given the target audience, I’m also not that surprised that they didn’t dwell on the finer details of the technology.

That said, I’m a proud geek that managed to get both my Ubuntu lanyard and an Arduino onto prime-time BBC One.

Arduino on BBC1

Me

Update: you can see the bits of the programme that featured the taxis here.

  1. Kevin Brown • April 22, 2010

    Excellent write-up… I’m still amazed how in that shed you managed to transform the few little components you bought into a working circuit board, seemingly making it up as you went – especially in such freezing cold conditions – most impressive :-)

  2. Dave Nice • April 22, 2010

    Cool to read an inside view of what it’s like! 4 days to 10 minutes… Wow!

  3. kybernetikos • April 23, 2010

    I’ve had an OpenEEG based system, and I’ve currently got the OCZ NIA. Both of them had lots of problems around calibration, etc, and neither really convinced me that they would be able to reliably and repeatably pick up control signals from my brain. I’ve played a bit of Nexuiz with the NIA, but I think there’s still too much of the random. What’s your experience with the Emotiv?

  4. Graham White • April 23, 2010

    Finally caught up with the programme last night Nick, looked like great fun (most jealous I didn’t get in on the act too). Good write up as well, thanks!

  5. nick • April 23, 2010

    @kybernetikos I should start by saying, I am not an expert with EEG technologies – so consider the following as potentially ill-informed speculation on my part.

    The Emotiv set makes a point of being able to pick up both brain activity (thoughts) and facial/muscular activity (blinking, smiling, etc.). This makes it well suited to being trained with emotional responses – typically expressed by a combination of both facial and mental activity.

    Now, as I understand it, the detectable level of electrical activity generated by the facial side of things is much larger than the activity generated by the brain – possibly even orders of magnitude different. So it wasn’t clear to me whether an involuntary twitch or blink would swamp out the signal from the brain.

    For this particular experiment, the aim was to test the brain alone. This made it harder for Jem and Dallas to get to grips with the system in the limited time we had – as it was important for them to keep an absolutely straight face. When you’re training the headset, watching the floating cube on the screen, it is surprisingly difficult not to find yourself leaning forwards or backwards as you try to push and pull the cube around.

    The key to it was very much the training. There’s no doubt that it got more reliable the more trained it was – particularly when you wanted to train it to recognise more than a couple of thoughts.

  6. pingback from Bang went the theory… | eightbar • April 23, 2010

  7. pingback from Forget the Rovio, drive a taxi with your mind. - Hack a Day • April 29, 2010

  8. mowcius • April 29, 2010

    Arduino on prime time! Get in!

    Mowcius

  9. David Fletcher • July 28, 2010

    I saw your website through a work colleague here at the Johnson Space Center and thought I would make an inquiry as follows:

    When you interfaced your Emotiv headset to your PC what OS were you using? Emotiv comes with MS Windows-based drivers, but I’d like to know if anyone has developed drivers for a Linux-based OS? We’re planning on interfacing an Emotiv headset to an ISS (International Space Station) simulation and wondered if anyone out there has developed Emotiv headset drivers for the Linux operating system? I contacted Emotiv about this, but they said that they are not actively developing a Linux driver set at this time. So we may use Windows instead with a few more tweaks if we can’t find Linux drivers.

    So, if per chance you’ve developed them would you mind sharing them with us?

    Thanks,

    -David

    David Fletcher
    Human Cognitive Technology PL/PI
    Integrated Test Facilities Branch
    NASA/JSC/EA/EV3
    281-244-5136 (Office)
    713-208-8718 (Cell)

  10. nick • July 28, 2010

    Hi David, that sounds like a fascinating project. I’m afraid to say we used Emotiv’s Windows drivers for this; I’m not aware of any work being done on Linux drivers.

    Cheers, N
