Exploring Music Visualisation with Processing and Ableton Live
The aim of this project is to develop a fully functioning music/sound visualizer, written in the programming language Processing, that produces satisfying, high-quality, high-resolution video imagery from tracks within Ableton Live 9.
A conventional music visualizer can only interpret a music track as a whole, and cannot reliably distinguish between individual notes and subtle differences without a plethora of false-positive triggers. By contrast, linking a visualizer to Ableton achieves a definite separation between the different elements of a track, which can then be monitored and visualized with minimal error.

This music visualizer takes the form of a window on a screen, running from a Processing script that receives input from Ableton Live, a popular DAW (Digital Audio Workstation). When certain MIDI notes are triggered or MIDI control effects are changed, the visualizer reacts with different patterns, colours and movements. These messages are communicated between the two programs using the Open Sound Control (OSC) protocol.

The overall idea for the visuals produced by the visualizer was to centre them around generative art: the process of generating images from computer algorithms. John Whitney used polar co-ordinate equations to generate some of his visuals, which are among the earliest productions of computer-based generative art. Aside from the traditional polar co-ordinate equations Whitney utilized, this project uses a set of custom equations of a similar form. A basic particle system was implemented, in which particles can be seen emerging from the centre and changing colour, along with a set of fractals drawn by recursive algorithms.

The visualizer's main framework is built on what can be referred to as 'scenes'. These scenes are similar to scenes in a movie, where one set of actions or themes occurs separately from the others. Depending on how many bars of music have elapsed, the scene changes on the 'clap' of every bar. This trigger could be changed to another musical event such as a kick or snare, but in this project it is the clap. A counter in the program keeps track of each time the clap has been hit.
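The clap-driven scene logic could be sketched as follows; this is an illustrative example in plain Java rather than the project's actual Processing code, and the class and method names (SceneCounter, onClap) are hypothetical:

```java
// Illustrative sketch (not the project's actual code) of the bar/scene
// counter: each incoming "clap" advances the scene, and after a fixed
// number of claps the counter wraps so the cycle of scenes repeats.
public class SceneCounter {
    private final int clapsPerCycle;
    private int claps = 0;

    public SceneCounter(int clapsPerCycle) {
        this.clapsPerCycle = clapsPerCycle;
    }

    // Called whenever the clap's MIDI note arrives (e.g. via an OSC message).
    // Returns the index of the scene to display next.
    public int onClap() {
        claps++;
        if (claps == clapsPerCycle) {
            claps = 0; // reset so the cycle of scenes repeats
        }
        return claps;
    }
}
```

In the visualizer itself, a call like this would be made from the OSC message handler, with the returned index selecting which scene to draw on the next frame.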
When this counter reaches 8 claps, it resets and the process repeats. A UI window was implemented to allow the user to change how frequently the scenes change.

The project resulted in a fully functioning music visualizer that takes multiple MIDI inputs from Ableton Live and generates imagery and movement accordingly. The visualizer can be displayed on an external projector or TV without the viewer seeing what is being controlled on the laptop, so it could be used effectively in a live situation with pleasing results.
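As a closing illustration of the Whitney-style polar co-ordinate approach mentioned earlier, the sketch below samples a standard rose curve and converts it to Cartesian screen coordinates; the rose equation r = scale · cos(k·θ) is a generic textbook example, not one of the project's custom equations:

```java
// Illustrative sketch of generating drawable points from a polar equation,
// in the spirit of Whitney's generative curves. Not the project's code.
public class PolarCurve {
    // Sample the rose curve r = scale * cos(k * theta) at angle theta
    // and convert the polar point (r, theta) to Cartesian (x, y).
    public static double[] rosePoint(double theta, double k, double scale) {
        double r = scale * Math.cos(k * theta); // polar radius
        return new double[] { r * Math.cos(theta), r * Math.sin(theta) };
    }
}
```

In a Processing draw loop, points like these would typically be plotted each frame while theta (or k) is animated over time to produce motion.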