BEAM Day

This weekend I went to BEAM Day, a day of workshops for creating music and sound art machines. The day is part of this year's events for the BEAM 2012 festival, taking place 22-24 June at Brunel University. I can't make the festival this year, so BEAM Day was a good chance to meet some people creating and experimenting with electronic music, sound art and performance controllers.

Alex Allmont ran a collaborative workshop to build an electro-mechanical noise machine, the Polytherelegomuino. People were invited to build mechanisms out of LEGO that would in turn operate the controls of electronic synths. Each participant had a LEGO board to build a mechanism on, with a turning shaft providing power, all driven from a central motor. The result was like a robotic synth-knob-twiddling noise factory!

Lego synth knob twiddling machine at BEAM Day 2012

One of the Polytherelegomuino machines at BEAM Day 2012. I built the board to the right with the 'hut' on it, which housed a worm drive mechanism for slowly turning a frequency knob connected to the Arduino based synth.

Here’s a video of the final machine taken by Alex – WARNING: watch your speaker level, this is NOISY!

The synths used in the Polytherelegomuino are based on Mike Blow's Arduino-powered optical theremin instruments, the Theremuino and the Energy Ball Theremuino, which Mike was demonstrating on the day.

Mike Blow's Theremuino

Mike Blow's Theremuino at BEAM Day 2012.


Mike Blow's Energy Ball Theremuino

Mike Blow's Energy Ball Theremuino at BEAM Day 2012.

Codasign ran a workshop on controlling OSC-compatible audio software, such as Max/MSP, with movement gestures via a Kinect. The Kinect-to-OSC interface they were using is built in Processing with the OpenNI libraries, which allow a person's movements to be tracked and mapped to OSC parameters. In the workshop they showed how a synth in Max/MSP could have parameters like pitch, number of harmonics and filter cutoff controlled by hand movements and standing position, all tracked through the Kinect.

Other workshops included Bruno Zamborlin's mobile-phone (accelerometer) based gesture recognition for controlling sound, and Noisy Toys' circuit-bending workshop.

BV Open Studios Weekend 2011

Last weekend, dorkbot's interactive musical installation was shown at the BV Open Studios Weekend, held in the Bristol Hackspace and the kitchen at BV Studios.

The Theremin Style Music Controller was set up in the kitchen along with two installations made by other dorkbot members: Richard's biscuit-tin rhythm-copying drums and John's Tilty music box. Other installations included the Dorkbot Pisano wheels and Aaron's Monome-style ball-bearing controller, which triggered Anton's electro-mechanical glockenspiel, a coconut, and John's octophone and LED level meters.

Thanks to all the people who came and had fun playing with the installation!

Monome style ball-bearing controller and connected instruments

Dorkbot musical Pisano Wheels

Theremin Style Music Controller

I've built a music controller that senses hand movements, in a similar way to a Theremin, for an interactive musical installation that dorkbot bristol is exhibiting at the BV Open Studios 2011 this October. I'm hoping people will have some fun playing with the sounds of music sequences being synthesized on a Mac by moving their hands in front of it. It has two sensors that measure how far your hands are above the box, so you can move your hands up and down to control different aspects of the synthesized sounds. There are also four touch sensors that change the sounds being played when you rest a finger on them.

The aim is for the player to explore the sound with their hand movements; the movements do not create the music itself. This way it should not require any special musical skill, so anyone can have a go and make nice sounds, unlike an instrument such as the Theremin, which takes skill to play a tune.

The front panel. There are two IR distance sensors for the left and right hands and four touch sensors.

I wanted the hardware build to be as quick and simple as possible, so the sensors are mounted in a cardboard box. The unit plugs into a Mac running Reaktor and controls the music being produced.

Inside the box.

The sensors are connected to an Arduino Uno, which runs some code to send the sensor readings as serial data over USB to the Mac. On the Mac, a sketch developed in Processing handles the control and routing, forwarding the sensor data to Reaktor as MIDI and OSC.

The electronics components used in this project are:

  • Arduino Uno
  • Seeed Twig I2C Touch Sensor Controller and 4 Sensors
  • Seeed Stem Base Shield
  • 2 Sharp 2Y0A21 Distance Sensors

Arduino Uno and Stem Base Shield.