Terra Chorus is a digital project that combines 3D modelling, real-time graphics, and sound interaction to create a responsive visual system. The work begins by digitising rocks into point clouds, providing a recognisable visual form that can be animated and manipulated within a virtual environment. The point clouds are controlled through a MIDI controller, allowing sound and music to directly influence the user's behaviour when interacting with the work. Colour, rotation, scale, and particle spread are mapped to MIDI inputs, enabling the rock to appear as if it is dancing in response to rhythm and tone. The rock can pulse in time with beats and disperse into particles during moments of intensity, creating a dynamic connection between visual form and audio signal.
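As a minimal sketch of how such a mapping might work, the Python snippet below listens for MIDI control-change messages and translates them into point-cloud parameters. It assumes the `mido` library and illustrative CC numbers; the actual software, controller layout, and rendering side used in Terra Chorus are not specified in this description.

```python
# Hypothetical sketch: mapping MIDI control-change messages to point-cloud
# parameters. Assumes the 'mido' library and made-up CC assignments.
import mido

# Illustrative CC numbers; the real controller layout is an assumption.
CC_HUE, CC_ROTATION, CC_SCALE, CC_SPREAD = 1, 2, 3, 4

state = {"hue": 0.0, "rotation": 0.0, "scale": 1.0, "spread": 0.0}

def normalise(value):
    """Map a 7-bit MIDI value (0-127) to the 0.0-1.0 range."""
    return value / 127.0

with mido.open_input() as port:  # opens the default MIDI input port
    for msg in port:
        if msg.type == "control_change":
            v = normalise(msg.value)
            if msg.control == CC_HUE:
                state["hue"] = v                 # colour shift
            elif msg.control == CC_ROTATION:
                state["rotation"] = v * 360.0    # rotation in degrees
            elif msg.control == CC_SCALE:
                state["scale"] = 0.5 + v         # pulse between 0.5x and 1.5x
            elif msg.control == CC_SPREAD:
                state["spread"] = v              # particle dispersion amount
        elif msg.type == "note_on":
            # A note-on could trigger a beat-synchronised pulse or burst.
            state["scale"] = 1.5
```

In practice the `state` dictionary would be read each frame by whatever real-time graphics engine renders the point cloud, so that beats and control gestures are reflected immediately in colour, spin, scale, and dispersion.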
By connecting the tactile qualities of stone with the flexibility of point-cloud data, Terra Chorus demonstrates how natural forms can be transformed into responsive, performative entities. It highlights the potential of MIDI as a tool for bridging user, sound, and digital matter, offering a connection in which visual and auditory systems are tightly linked with human behaviour.