OUR PROJECTS
Musical Agent Systems: MACAT and MACataRT
Our research explores the development and application of musical agents: human-in-the-loop generative AI systems designed to support music performance and improvisation within co-creative spaces. We introduce MACAT and MACataRT, two distinct musical agent systems crafted to enhance interactive music-making between human musicians and AI. MACAT is optimized for agent-led performance, employing real-time synthesis and self-listening to shape its output autonomously, while MACataRT provides a flexible environment for collaborative improvisation through audio mosaicing and sequence-based learning. Both systems emphasize training on personalized, small datasets, fostering ethical and transparent AI engagement that respects artistic integrity. This research highlights how interactive, artist-centred generative AI can expand creative possibilities, empowering musicians to explore new forms of artistic expression in real-time, performance-driven music improvisation contexts.
MASOM: Musical Agent based on Self-Organizing Maps
MASOM learns how to play music by listening to some.
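To give a flavour of the algorithm family MASOM's name refers to, here is a minimal sketch of training a self-organizing map. This is not the lab's implementation: the input here is random data standing in for audio feature vectors, and all parameter values are illustrative assumptions.

```python
# Minimal 1-D self-organizing map (SOM) sketch. Each node holds a
# "prototype" vector; training pulls the best-matching node (and its
# grid neighbours) toward each input, so nodes come to summarize the data.
import numpy as np

def train_som(data, n_nodes=8, epochs=20, lr0=0.5, radius0=2.0, seed=0):
    rng = np.random.default_rng(seed)
    weights = rng.standard_normal((n_nodes, data.shape[1]))
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)                  # decaying learning rate
        radius = max(radius0 * (1 - epoch / epochs), 0.5)  # shrinking neighbourhood
        for x in data:
            bmu = np.argmin(np.linalg.norm(weights - x, axis=1))  # best-matching unit
            grid_dist = np.abs(np.arange(n_nodes) - bmu)
            h = np.exp(-(grid_dist ** 2) / (2 * radius ** 2))     # neighbourhood weight
            weights += lr * h[:, None] * (x - weights)
    return weights

rng = np.random.default_rng(1)
features = rng.standard_normal((100, 4))  # stand-in for extracted audio features
som = train_som(features)
print(som.shape)  # (8, 4): 8 nodes, each a prototype feature vector
```

After training, each node can be treated as a cluster of similar sonic moments; a generative agent in this style would then sequence nodes and play back audio associated with them.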
Mova: Using Aesthetic Interaction to Interpret and View Movement Data
The Lab's movement visualisation tool
Walknet: Affective Movement Recognition and Generation
Machine learning models to recognize the valence and arousal of movement information