Choreographer partners with Google Arts & Culture Lab on AI tool AISOMA
When a choreographer with a 25-year body of work teamed up with Google Arts & Culture Lab, the goal wasn’t just to digitize old footage. He wanted the tech to actually talk back to his work, to hear the quirks of his style and suggest new moves for the stage. The idea was to stitch AI into the rehearsal loop, so a static catalog could become a kind of living lab in which each generated phrase could be tried, tweaked, or tossed on the spot.
It feels like a step away from seeing digital tools as mere archives; they start to act like collaborators. The result is a software platform that uses Google’s AI to spark fresh movement that still feels true to the choreographer’s voice.
In 2019, I started a collaboration with Google Arts & Culture Lab to explore how AI could enable a more active dialogue with my 25-year body of work. AISOMA is a Google AI-powered choreography tool that acts as a creative catalyst by generating new, original dance rooted in my choreographic language. I initially used it in the studio to expand, challenge, and interrogate existing movement sequences.
Now, we're bringing a new version of this tool online for anyone to create with, as part of my exhibition: Wayne McGregor: Infinite Bodies at Somerset House.

How You Dance with AISOMA

You are invited to perform a short dance. A custom AI then analyzes your movement and extends your sequence with original choreographic phrases, all rooted in my movement vocabulary.
AISOMA has been trained on almost four million poses, extracted from hundreds of videos from my archive, spanning more than two decades of my work.
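Google has not published AISOMA's internals, but the description above (poses extracted from archive video, then new phrases generated in response to a dancer's input) suggests a familiar pose-sequence pipeline. As a rough sense of scale, four million poses sampled at a typical 25 frames per second would correspond to roughly 44 hours of footage, though the actual extraction rate isn't stated. The sketch below is purely illustrative: every function and parameter name is an assumption, and the "model" is a trivial motion extrapolator standing in for whatever learned system AISOMA actually uses.

```python
# Hypothetical sketch of a pose-sequence pipeline like the one AISOMA is
# described as using. A dance phrase is a sequence of body poses (here,
# 17 joints with x/y coordinates per frame); a sequence model predicts
# plausible next poses to extend the phrase. None of these names or
# values come from the actual AISOMA system.

import numpy as np

NUM_JOINTS = 17   # e.g. a COCO-style skeleton (assumed)
FRAME_RATE = 25   # frames per second of the source video (assumed)

def extract_pose_sequence(num_frames: int) -> np.ndarray:
    """Stand-in for a pose-estimation step (e.g. running a pose detector
    over archive or webcam video). Returns shape (frames, joints, 2)."""
    return np.random.rand(num_frames, NUM_JOINTS, 2)

def predict_next_pose(history: np.ndarray) -> np.ndarray:
    """Placeholder for a learned sequence model. Here it simply
    extrapolates the motion of the last two frames; a real system would
    use a model trained on the choreographer's archive."""
    velocity = history[-1] - history[-2]
    return history[-1] + velocity

def extend_phrase(phrase: np.ndarray, seconds: float) -> np.ndarray:
    """Autoregressively generate new frames that continue the input phrase."""
    frames = list(phrase)
    for _ in range(int(seconds * FRAME_RATE)):
        frames.append(predict_next_pose(np.stack(frames[-2:])))
    return np.stack(frames)

if __name__ == "__main__":
    user_phrase = extract_pose_sequence(num_frames=50)   # ~2 seconds of dance
    extended = extend_phrase(user_phrase, seconds=4.0)
    print(f"Input frames: {len(user_phrase)}, extended to: {len(extended)}")
```

In a real system, the extrapolation step would be replaced by a model trained on the archive, and the generated frames would be rendered back onto a dancer or avatar rather than printed; this sketch only illustrates the overall shape of such a pipeline.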
Can a machine really catch the subtlety of movement? Since 2019, Sir Wayne McGregor has been working with Google Arts & Culture Lab to bring AI into dialogue with his 25-year archive of choreography. The output, AISOMA, is billed as a creative spark: it generates new dance material that aims to stay within McGregor's own movement vocabulary.
How well an algorithm can actually embody that kind of movement knowledge remains unclear. The public launch also raises questions: will dancers and makers actually work with a tool that suggests brand-new steps? It feels like a daring test.
At the same time, it’s a tangible move toward putting AI into the rehearsal room. Whether AISOMA ends up as a regular part of practice or just a gimmick remains to be seen. Some critics will wonder if the generated moves are more than a remix of what’s already there.
Only time and real-world use will show if the system opens fresh creative doors or simply mirrors its source. For now, the collaboration hints at both the potential and the uncertainty of using machine learning in embodied art.
Common Questions Answered
What is AISOMA and how does it function as a creative catalyst for Sir Wayne McGregor's choreography?
AISOMA is a Google AI‑powered choreography tool developed in partnership with Sir Wayne McGregor and Google Arts & Culture Lab. It analyzes McGregor’s 25‑year archive of movement and generates new dance sequences that stay within his established choreographic language, allowing him to expand, challenge, and interrogate existing material.
When did Sir Wayne McGregor begin his collaboration with Google Arts & Culture Lab, and what was the original goal of the partnership?
The collaboration began in 2019 when McGregor approached Google Arts & Culture Lab to explore how artificial intelligence could engage with his quarter‑century body of work. The original goal was to create a system that could “listen” to the nuances of his archive rather than merely catalog it, fostering an active dialogue between dancer and machine.
How is the new version of AISOMA being made available to the public, and what implications does this rollout have for non‑professional users?
The latest iteration of AISOMA is being launched online, allowing anyone to create dance material using the AI‑driven tool. This public rollout democratizes access to a technology previously limited to McGregor’s studio, but it also raises questions about how well the algorithm can convey embodied knowledge to users without a professional choreographic background.
What concerns are raised about an algorithm’s ability to capture the nuance of movement in AISOMA’s output?
Critics note that while AISOMA can generate sequences that mimic McGregor’s style, it remains unclear whether an algorithm can truly reflect the embodied, tactile knowledge that underpins human choreography. The tool’s reliance on pattern recognition may miss subtle physical cues, prompting debate about the limits of AI in preserving artistic nuance.