What is the logical next step for a musician who creates electronic music? Why, building his own musical robots of course!
In 2008, bored of hearing the same sampled sounds, musician Jason Long began building his own analogue synthesizers, and two years later he embarked on the path to building his first musical robot: an automated Indonesian gamelan instrument. Fast forward to 2014 and Jason has created more than 20 robotic instruments. Whilst this sounds like a lot, it is by no means an easy task to make instruments which, in essence, play themselves.
The musician and Victoria University PhD student is currently part of the Sonic Engineering Lab for Creative Technology (SELCT), which merges the Engineering and Music departments. He is the first to admit that it takes time to produce these instruments, so his satisfaction derives more from writing the compositions and holding concerts performed by his robotic ensemble. Performances and composing aside, in order to realise his vision he has to develop the technology for each new instrument.
According to Jason, some robots are easier to assemble than others. His castanet robots take approximately half an hour, whereas his robotic taishogoto took a year to produce and many weeks of full-time work to see through to completion. As for the question of how many instruments he has made in total, he points to a xylophone he roboticized that consists of 30 individual striking mechanisms, one for each key. “If the 30 mechanisms were instead attached to 30 different drums, that would be 30 instruments. It’s a bit tricky to quantify.”
Driven by an urge to experiment with new sounds, Jason is a perfectionist who finds himself tinkering away until his robotic creations have capabilities that either match or exceed those of a mere human. But striking the right chord is a fine balance, and Jason says that every robotic instrument he has created thus far has exceeded human capabilities in some ways, but not in others. While the robots do what they are designed to do very competently, they ultimately lack flexibility. For instance, they can play with greater speed, reliability and complexity than a human player, but will generally struggle to produce as wide a range of timbre.
“If you ask a xylophone player to change from hard to soft mallets during a piece, that will be easy for them, but for a robot to do that, a relatively complex mechanism would have to be designed to achieve it.”
Jason stresses that part of the appeal of robotic instruments lies in the different techniques that can be applied when composing. For example, every time a string is plucked or a membrane is struck, there is a natural variation in the sound produced.
The makeup and aesthetics of Jason’s robotic ensemble have largely been inspired by instruments sourced throughout Asia, and in particular Japan. He recently spent three years living in Tokyo, where his daily commute took him through the electronics district of Akihabara, a place to source parts and other electrical junk with the potential to be reused and reinterpreted.
Jason’s big-picture dream is to create a live electro-acoustic and robotic show which can tour internationally and be scaled to suit both small spaces and larger concert halls. In the meantime, he is roboticizing some recently acquired instruments sourced from an Indonesian market, whilst also recording one of his compositions and working on an installation which uses his robotic xylophone, glockenspiel and other percussion to perform the sounds and music of a video game in real time.
And when asked whether the perfect musical instrument is entirely machine or technology driven, Jason says that without humans controlling the robots, an authenticity of emotion would simply be lost. “I’m making these instruments as tools to broaden the scope of my musical expression as a composer, rather than to replace composers altogether.”
Also check out this audio story I recorded with Jason and musician Mo Zareei: