11 December 2013

2lemetry, Arduino, and Raspberry Pi. Oh, my.

The last 24 hours have been a whirlwind of hackery. Starting with nearly no knowledge of Arduino or maker hardware hacking, I've created... The Video Game Physical Excitement Archivist.

In short, this system monitors people playing video games and takes camera snapshots of their faces during moments of peak excitement.

In shorter, I take your picture when you yell.

This project began as an Arduino-based decibel monitor that I planned to leave in my home to capture unusually loud events and report them to the 2lemetry platform. What a strange trip it's been.

Starting with that original idea, I set out to learn the basics: using an Arduino to capture volume. I settled on the Arduino Uno and an electret microphone breakout board (thanks, Dia!). After a crash course on Arduino headers, breadboards, and sketch programming (thanks, John!), I had a light blinking in response to loud events in about ninety minutes. (Thanks to SparkFun user Julian4 for some code assist!)
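If you're curious, the whole sketch boils down to something like this: sample the mic for a short window, take the peak-to-peak swing, and blink the LED when it crosses a threshold. The pin numbers, sample window, and threshold below are rough guesses from my setup, so tune them for your own mic and room.

    // Rough numbers from my setup; tune the threshold for your own mic and room.
    const int MIC_PIN = A0;                    // electret mic breakout's analog output
    const int LED_PIN = 13;                    // onboard LED on the Uno
    const unsigned long SAMPLE_WINDOW = 50;    // ms of samples per reading
    const int LOUD_THRESHOLD = 300;            // peak-to-peak level that counts as "loud"

    void setup() {
      pinMode(LED_PIN, OUTPUT);
    }

    void loop() {
      unsigned long start = millis();
      int signalMax = 0;
      int signalMin = 1023;

      // Track the min and max readings over the sample window.
      while (millis() - start < SAMPLE_WINDOW) {
        int sample = analogRead(MIC_PIN);
        if (sample > signalMax) signalMax = sample;
        if (sample < signalMin) signalMin = sample;
      }

      // Blink the LED when the swing crosses the threshold.
      if (signalMax - signalMin > LOUD_THRESHOLD) {
        digitalWrite(LED_PIN, HIGH);
        delay(100);
        digitalWrite(LED_PIN, LOW);
      }
    }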

The next step was to add an Arduino MQTT client and start sending these events to the 2lemetry platform for archiving and real-time alerts. However, I set that step aside and tried to think of something interesting I could trigger in response to a loud noise. I'm not sure how I arrived where I did, but I was thinking of the previous night's session of Samurai Gunn, which had just been released. Kyle and I had an absolute BLAST with the game, and I remembered we would frequently shout "OH!" when someone died. Lots of quick, exciting moments. I thought it would be amusing to capture those moments in pictures.

How was I to capture these images? There are several camera options for the Arduino, but I wanted to involve another recent 2lemetry project that used a Raspberry Pi with an attached camera to upload photos to the cloud. It also gave me another reason to include MQTT and the 2lemetry platform. Fortunately, we had a Pi sitting around, already configured to use the camera and capture images with a Python script. In almost no time at all, I had the Pi subscribed to an MQTT topic and grabbing an image from the camera whenever a message arrived.
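The actual script on the Pi is Python, but the whole flow fits in a handful of lines: connect to the broker, subscribe to a topic, and shell out to raspistill whenever a message arrives. Here is that idea sketched against libmosquitto instead, with the broker host, topic, and output path as placeholders rather than the real 2lemetry connection details.

    /* Sketch of the subscriber side against libmosquitto (build with -lmosquitto).
       Broker host, topic, and output path are placeholders. */
    #include <mosquitto.h>
    #include <stdio.h>
    #include <stdlib.h>

    static void on_message(struct mosquitto *mosq, void *obj,
                           const struct mosquitto_message *msg) {
      printf("Loud-noise event on %s; taking a picture\n", msg->topic);
      /* Shell out to the Pi camera tool and overwrite the "latest" image. */
      system("raspistill -o /home/pi/latest.jpg -t 500");
    }

    int main(void) {
      mosquitto_lib_init();
      struct mosquitto *mosq = mosquitto_new("pi-camera", true, NULL);
      mosquitto_message_callback_set(mosq, on_message);

      mosquitto_connect(mosq, "broker.example.com", 1883, 60);  /* placeholder broker */
      mosquitto_subscribe(mosq, NULL, "party/loudnoise", 0);    /* placeholder topic  */

      mosquitto_loop_forever(mosq, -1, 1);  /* block and dispatch message callbacks */
      mosquitto_lib_cleanup();
      return 0;
    }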

Next up was adding an Ethernet shield to the Arduino, dropping in the MQTT library, and publishing an MQTT message on that topic to trigger the Pi. Totally easy. After adding Nick's MQTT library to the Arduino IDE, all I had to do was open one of the included examples to get going with the Ethernet and MQTT libraries.
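For reference, the publishing side ends up looking a lot like the library's included example with the mic check bolted on. Everything specific below (the MAC address, broker IP, topic, client ID, and threshold) is a placeholder; the real 2lemetry connection details go where they point.

    #include <SPI.h>
    #include <Ethernet.h>
    #include <PubSubClient.h>

    // Placeholder MAC, broker address, and topic; swap in real connection details.
    byte mac[]    = { 0xDE, 0xAD, 0xBE, 0xEF, 0xFE, 0xED };
    byte server[] = { 192, 168, 1, 10 };

    void callback(char* topic, byte* payload, unsigned int length) {
      // This sketch only publishes, so incoming messages are ignored.
    }

    EthernetClient ethClient;
    PubSubClient client(server, 1883, callback, ethClient);

    const int MIC_PIN = A0;
    const int LOUD_THRESHOLD = 300;   // same rough threshold as the blink sketch

    // Peak-to-peak level over a 50 ms window, as in the blink sketch.
    int samplePeakToPeak() {
      unsigned long start = millis();
      int signalMax = 0;
      int signalMin = 1023;
      while (millis() - start < 50) {
        int s = analogRead(MIC_PIN);
        if (s > signalMax) signalMax = s;
        if (s < signalMin) signalMin = s;
      }
      return signalMax - signalMin;
    }

    void setup() {
      Ethernet.begin(mac);   // grab an address over DHCP
    }

    void loop() {
      if (!client.connected()) {
        client.connect("arduino-mic");   // placeholder client ID
      }

      if (samplePeakToPeak() > LOUD_THRESHOLD) {
        // One message per loud moment; the Pi snaps a photo when it arrives.
        client.publish("party/loudnoise", "OH!");
        delay(2000);   // crude debounce so one shout doesn't fire a burst of photos
      }

      client.loop();   // keep the MQTT connection serviced
    }

The only coordination the Arduino and the Pi need is agreeing on that topic string; the broker handles the rest.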

Lastly, I wrote a two-line HTML page on the Pi to display the most recent image captured to disk and set it to refresh on a one-second timer. Tomorrow, I'll likely add something better than a timed refresh to get new images on screen more directly.

The result is fabulous. The Arduino mic fires an event at loud exclamations, clapping, and the like. The latency between the Arduino firing an event and the Pi taking a picture is imperceptible to the naked eye. It feels like the Pi is getting a local hardware interrupt and NOT a message making an entire round trip through the 2lemetry cloud.

I'll be deploying this system for the first time on Friday at the 2lemetry holiday party. There will be a PC running Samurai Gunn with four controllers to capture those "OH" moments. I can't wait to share the results on Saturday!

Pics of the hardware used: