
Translating Music Into Motion: The Automated Harp Project



Developing the automated harp project was a unique challenge that combined audio analysis, software development, and motion control into a single interactive installation. What began as an idea just a few weeks earlier quickly evolved into a complex process of translating music into precise mechanical movement.


At the heart of the project was a deceptively difficult problem: how do you take a digital audio file and convert it into physical motor movements that convincingly recreate the visual performance of a harp?


While the hardware side of the system was relatively straightforward, the real challenge lay in understanding and processing the audio itself. Unlike a traditional MIDI file, the source material was a standard audio recording containing multiple notes, harmonics, background noise, and overlapping frequencies. In order to recreate the piece visually, the audio first had to be analysed and simplified without losing the essence of the performance.


To achieve this, the music was processed through audio analysis software to identify the most prominent pitches within the recording. Because the harp system featured only 17 controllable strings, it was necessary to carefully select the notes that contributed most strongly to the recognisable character of the piece. This meant balancing what the software detected digitally against what could actually be perceived by the human ear.
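The core of that selection step can be thought of as spectral peak ranking: find the strongest peaks in the frequency spectrum and keep the top 17. The post does not name the actual analysis software, so the sketch below is purely illustrative; the function name and the synthetic test tones are assumptions.

```python
import numpy as np

def prominent_pitches(samples, sample_rate, n_strings=17):
    """Pick the n_strings strongest spectral peaks (in Hz) from a mono signal.

    A simplified sketch: a real recording would be analysed frame by frame
    with harmonic filtering, but the core idea is FFT magnitude ranking.
    """
    windowed = samples * np.hanning(len(samples))
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    # Keep only local maxima so one loud note cannot occupy several bins.
    peaks = [i for i in range(1, len(spectrum) - 1)
             if spectrum[i - 1] < spectrum[i] > spectrum[i + 1]]
    strongest = sorted(peaks, key=lambda i: spectrum[i], reverse=True)[:n_strings]
    return sorted(freqs[i] for i in strongest)

# Synthetic check: a mix of two sine tones at 220 Hz and 440 Hz.
sr = 22050
t = np.arange(sr) / sr
mix = np.sin(2 * np.pi * 220 * t) + 0.5 * np.sin(2 * np.pi * 440 * t)
```

On the synthetic mix above, `prominent_pitches(mix, sr, n_strings=2)` recovers the two tones; real material is far messier, which is exactly why the post describes repeated listening comparisons.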


During development, several iterations of the audio extraction process were required. Some frequencies appeared highly prominent in the analysis software but were barely noticeable when listening back, while others that felt important musically needed to be prioritised visually. By repeatedly refining the analysis and comparing modified versions of the track against the original recording, the final selection of 17 notes was established.


Once the key pitches had been identified, a custom audio file containing only those selected notes was created. From there, each note was mapped to an individual harp string and corresponding motor channel. The next stage involved converting the analysed musical data into machine-readable commands that the hardware could interpret.
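The note-to-string mapping described above could look something like the following sketch. The event format and the frequencies used are illustrative assumptions, not the project's real data or code:

```python
def notes_to_commands(events, string_freqs):
    """Convert (onset_seconds, frequency_hz) note events into
    (onset_seconds, motor_channel) commands.

    string_freqs: the 17 selected pitches, one per string, in Hz.
    Each detected note is routed to the string whose pitch is closest.
    """
    def nearest(freq):
        return min(range(len(string_freqs)),
                   key=lambda i: abs(string_freqs[i] - freq))
    return [(onset, nearest(freq)) for onset, freq in sorted(events)]
```

For example, with strings tuned to 220, 330 and 440 Hz, slightly sharp notes at 221 Hz and 442 Hz would still be routed to channels 0 and 2 respectively.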


Custom software was then written to translate the extracted note information into precise motor movements. Each string was assigned to its own actuator, allowing the system to recreate the timing and rhythm of the original performance.
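A minimal playback loop for such per-string commands might be sketched as below. The `pluck` callback standing in for the real motor driver interface is an assumption, since the post does not describe the actual hardware API:

```python
import time

def play(commands, pluck, clock=time.monotonic, sleep=time.sleep):
    """Fire pluck(channel) at each scheduled onset.

    commands: (onset_seconds, motor_channel) pairs, sorted by onset.
    pluck: hardware-specific callable that actuates one string's motor
           (assumed here; the real driver interface is not described).
    """
    start = clock()
    for onset, channel in commands:
        # Sleep until this note's onset relative to the start of playback.
        delay = onset - (clock() - start)
        if delay > 0:
            sleep(delay)
        pluck(channel)
```

Injecting the clock and sleep functions keeps the loop testable: passing a list's `append` as `pluck` records the firing order without any hardware attached.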


Although the electronics themselves were relatively simple compared to the audio processing, the project still required the team to work with new hardware components and develop an understanding of unfamiliar motor driver chips. As with many experimental projects, a significant part of the process involved problem-solving and learning in real time.


One of the final and most important stages was timing calibration. Early tests showed that even when the motors triggered correctly, the visual result did not always feel convincing. There is a subtle but important difference between when a note is heard and when movement is visually perceived by the audience. Small timing offsets had to be introduced and repeatedly adjusted so that each string movement appeared perfectly synchronised with the music playback.
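In code, that calibration amounts to shifting each motor command slightly ahead of the audio. A sketch, with the 40 ms lead purely an assumed example value rather than the project's actual tuning:

```python
def offset_commands(commands, lead_s=0.040):
    """Shift every (onset_seconds, channel) command earlier by lead_s so the
    pluck is seen in sync with the note being heard; onsets clamp at zero.

    The 0.040 s default is an illustrative figure only; the post describes
    finding the right offsets by repeated adjustment against playback.
    """
    return [(max(0.0, onset - lead_s), channel) for onset, channel in commands]
```

Running the command list through this pass just before playback keeps the timing tweak in one place, so it can be re-tuned without touching the analysis or mapping stages.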


After extensive refinement, the final result successfully created the illusion of the harp physically performing the piece in real time. With the music playing alongside the motorised strings, the installation achieved a convincing and highly engaging visual performance — transforming digital audio into coordinated automated mechanical movement.
