After months of developing art that encourages others to interact, I decided to build a project that is all about me. The NARCISYSTEM is an attempt to strap to my body as many biometric sensors as I could reasonably acquire, and use their output to drive various visualizations in a venue. Its first and only presentation was at the March Mindshare event in Los Angeles.
At the heart of the belt is a Funnel IO Arduino clone from SparkFun Electronics. This excellent piece of hardware has an XBee socket, USB power, and a LiPo charge circuit built in. It's my board of choice for new designs. This Arduino sucked in all the sensor readings and transmitted them via serial XBee to my PowerBook. I used an XBee Adapter Kit from Ladyada to interface the XBee to the computer. Note: if you're trying this at home, do not follow Ladyada's instructions for programming a Funnel IO over XBee (at least on OS X). Instead, use the configurator tool that comes with the Funnel IO.
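On the laptop side, the job boils down to reading lines off the serial port and splitting out the individual sensor values. Here is a minimal sketch of that parsing step in Python; the comma-separated packet format is invented for illustration and is not the actual NARCISYSTEM wire format:

```python
def parse_packet(line):
    """Parse one telemetry line, e.g. 'HB:1,EMG:512,ALC:87,HDG:143.5'.

    Each field is KEY:VALUE; values with a decimal point become floats,
    everything else becomes an int. (Hypothetical format.)
    """
    readings = {}
    for field in line.strip().split(","):
        key, _, value = field.partition(":")
        readings[key] = float(value) if "." in value else int(value)
    return readings

# Example: one line as it might arrive over the XBee serial link.
sample = "HB:1,EMG:512,ALC:87,HDG:143.5"
print(parse_packet(sample))
```

In practice the lines would come from something like pyserial's `readline()` on the XBee adapter's port rather than a string literal.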
The belt itself is made of latex and was custom-made for me by Devan and Isa of PodBelt and Psymbiote.
What sensors did I use?
- Heart rate monitor. In order to detect heartbeats, I used a Polar heart rate monitor strapped to my chest. I read this data with a Polar Heart Rate Monitor Interface, available from SparkFun Electronics. This board has some fancy software to compute heart rates, but since I wanted a “heartbeat edge trigger,” I soldered a wire to a probe hole near the LED and hooked it up to a digital input.
- EEG. To achieve something that looked like an EEG, I ripped apart a Ramsey Electronics Electrocardiogram Heart Monitor Kit. For EEG probes, I used the business end of my OCZ Neural Impulse Actuator. I then proceeded to add bias resistors and voltage dividers at random to the ECG kit (a process reminiscent of circuit bending) until the output signal looked compatible with my Arduino A/D converter. Technically this functioned as an EMG—not an EEG—because I didn’t get a clean enough signal to do frequency analysis. OCZ: if you are reading this article, take note! Because you have steadfastly refused to provide even the merest sliver of developer support, I have resorted to chopping the custom plug off your probes and using them in a $40.00 kit. Next, I will use your interface hardware to prop up my desk. Your software I have no use for.
- Breathalyzer. I measured my breath alcohol concentration using an MQ-3 Alcohol Gas Sensor from SFE. This cute little sensor was strapped to my arm, so I had to remember to occasionally breathe on it. I was surprised at how well this sensor performed. Over the course of the NARCISYSTEM development, I tried on numerous occasions to quantify this sensor’s accuracy. But I was consistently foiled by a progressive breakdown of my scientific methods, followed by unconsciousness.
- Compass. I measured my bearing using an HMC6352 compass module from SFE (are you detecting the pattern yet?) This I2C sensor is incredibly easy to interface and provides accurate data. This is the same sensor I used to build my Haptic Compass. In fact, it is the exact same sensor, delicately desoldered and haphazardly replaced.
- Accelerometer. The accelerometer is an ADXL202 on an SFE breakout board. This is a tried-and-true analog accelerometer.
How did I visualize the data?
On the day of the event, I found myself with fewer “output modalities” than sensors. I showed up with eight LED Parcans (available from American DJ at Guitar Center). I had a subwoofer. I had a fog machine. I had an iPhone. I had a few missing cables.
I erected a lighting tree with the Parcans illuminating a “ring” around the venue. These lights would all flash red briefly every time my heart beat. To drive them, I used an Enttec DMX USB Pro adapter with my Python DMX modules.
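The heartbeat flash amounts to filling a 512-channel DMX frame and handing it to the adapter. Here is a minimal sketch of the frame-building step, assuming each parcan occupies three consecutive channels starting with red; real fixtures have their own channel maps, and the adapter's serial framing would be handled by the DMX library:

```python
def heartbeat_frame(beat, fixtures=8, channels_per_fixture=3):
    """Build one 512-byte DMX universe: all fixtures red on a beat, dark otherwise.

    Channel layout (RGB per fixture, packed back to back) is an assumption.
    """
    frame = bytearray(512)  # one full DMX universe, all channels at 0
    if beat:
        for f in range(fixtures):
            frame[f * channels_per_fixture] = 255  # red channel of fixture f
    return frame

# On each heartbeat edge from the belt, send a red frame, then fall back to dark.
red = heartbeat_frame(True)
dark = heartbeat_frame(False)
```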
With the bearing information from the compass, I projected my bearing vector on the lights, such that they turned blue only in the direction in which I was facing. As I stood in the middle of the venue and turned in a circle, it appeared to me as though all the lights were blue. But an observer would see that the illumination was in fact tracking my direction.
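Projecting a compass bearing onto a ring of eight lights is a nearest-sector lookup. A sketch, assuming fixture 0 sits at 0° and the fixtures are evenly spaced around the ring (the actual fixture positions at the venue are not documented here):

```python
def facing_fixture(bearing_deg, fixtures=8):
    """Return the index of the fixture nearest to the given compass bearing.

    Each fixture owns a 360/fixtures-degree sector centered on its position.
    """
    sector = 360.0 / fixtures
    return int(((bearing_deg % 360) + sector / 2) // sector) % fixtures

# With 8 fixtures, each sector is 45 degrees wide:
# bearing 0   -> fixture 0
# bearing 90  -> fixture 2
# bearing 350 -> fixture 0 (wraps around)
```

Each frame, the fixture returned here would get its blue channel set while the rest stay off, so the blue spot tracks the wearer's heading.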
I differentiated the accelerometer data (to get jerk) and used that to directly drive a subwoofer. I was actually driving the subwoofer constantly below the resonant frequency. Only when I took a step would the frequency jump into an audible range. The end effect was that the venue shook with my every step, as though a giant had stepped. It was barely perceptible (so I wouldn’t drive anyone crazy) but very effective. To drive the subwoofer, I sent OSC messages to the ChucK audio programming environment.
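Jerk is the time derivative of acceleration, so a finite difference over successive samples is all the math required. The sketch below also maps jerk magnitude to a subwoofer frequency; the 15 Hz idle, gain, and cap are invented numbers for illustration, not the values used at the event:

```python
def jerk(samples, dt=0.01):
    """Finite-difference derivative of an acceleration sample stream."""
    return [(b - a) / dt for a, b in zip(samples, samples[1:])]

def woofer_freq(j, base=15.0, gain=0.05, cap=120.0):
    """Map jerk magnitude to a drive frequency.

    Idles at `base` Hz (below the sub's resonance, so effectively silent);
    a step's jerk spike pushes the frequency into the audible range.
    All constants are hypothetical.
    """
    return min(base + gain * abs(j), cap)
```

In the actual system this frequency would be shipped to ChucK as an OSC message each update, with ChucK running the oscillator that feeds the subwoofer.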
The breathalyzer was set to trigger the fog machine when my intoxication reached a threshold. However, the fog machine I borrowed was, um, excessively volcanic and was deemed unsuitable for indoor use. So at the last minute I set it up outside, and made it triggerable from my iPhone.
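A bare threshold would chatter as the MQ-3 reading hovered near the trigger point, so a hysteresis latch is the natural shape for a trigger like this. A sketch with arbitrary ADC thresholds; the MQ-3 output is uncalibrated, so real values would come from experiment:

```python
class FogTrigger:
    """Latch the fog machine on above one level, off below a lower one.

    The on/off levels are hypothetical raw ADC counts, not calibrated BAC.
    """

    def __init__(self, on_level=200, off_level=150):
        self.on_level = on_level
        self.off_level = off_level
        self.active = False

    def update(self, alcohol_reading):
        if not self.active and alcohol_reading >= self.on_level:
            self.active = True
        elif self.active and alcohol_reading <= self.off_level:
            self.active = False
        return self.active
```

The gap between the two levels keeps the fog machine from pulsing on and off as the sensor drifts around a single threshold.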
I used the EEG/EMG data to alter the tempo of a synthetic music composition (also running in ChucK). This worked unpredictably, even with tremendous filtering, so I eventually turned it off and removed the headband, leaving me looking like a complete dork only from the waist down.
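Taming a noisy EMG signal before mapping it to tempo typically means heavy smoothing. A sketch of one common approach: an exponential moving average feeding a linear level-to-BPM map. The filter constant and tempo range are invented, and this is not necessarily the filtering the project actually used:

```python
def smooth(samples, alpha=0.1):
    """Exponential moving average: small alpha = heavy smoothing."""
    out, y = [], samples[0]
    for x in samples:
        y = alpha * x + (1 - alpha) * y
        out.append(y)
    return out

def tempo_bpm(emg_level, lo=60.0, hi=140.0, adc_max=1023):
    """Map a (smoothed) 10-bit ADC reading linearly onto a BPM range."""
    clamped = min(max(emg_level, 0), adc_max)
    return lo + (hi - lo) * clamped / adc_max
```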
Finally, since no contemporary project is complete without gratuitous use of an iPhone, I wrote a visualizer for all this data using the “custom control” feature of OSCemote and transmitted data to it via OSC. OSCemote is far and away my favorite App Store application, for the simple reason that it has saved me from actually having to learn any Objective-C. I developed my custom visualizer in Dashcode, which was surprisingly fun and easy to use.