In the near future, performers may have their costumes projected directly onto their bodies. As they move around the stage, the projections move with them, adjusting to subtle gestures and cues in 3D and real time. An outfit change would be as easy as flipping a switch.
It’s actually not so far off—just watch EXISDANCE, a performance backed by a new real-time tracking and projection-mapping system created by Panasonic. The collaboration between Panasonic and creative studio P.I.C.S. premiered at this year’s InfoComm in Orlando, Florida, in June.
Performed by Kikky, a respected robot dancer who once appeared in a Missy Elliott music video, EXISDANCE features stunning graphics projected directly onto his torso and the white space around him as he enacts karate-inspired moves. The 3D functionality allows the graphics to adjust to his body’s position while staying centered on his chest. The graphics combine traditional cultural references, including the Itsukushima Shrine and Nagoya Castle, with technological ones, such as Tokyo Tower, robotics, and an exo-suit. The movements and graphical changes are practically seamless.
“There is of course a timeline for the projection,” Paul Lacroix, one of the project’s technical directors, tells Creators. “Like in any similar show you have the music and the video that is matched with the music, but besides that everything is motion-tracked. […] It’s really tracking him and the position of his body in real time with the projected image on his body. It’s very fast.”
Whereas other motion-tracking systems suffer a lag—or latency—between an object’s movement and the projector’s response, the Panasonic system used in EXISDANCE responds almost instantaneously; any remaining delay is too small for the eye to detect. As Kikky floats across the stage, the projection follows without any visible lag.
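One common way real-time projection systems hide their residual latency is by predicting where a tracked point will be by the time the light actually lands. The sketch below is purely illustrative (the article does not describe Panasonic's internals): it extrapolates a marker's position forward by the known system delay, assuming constant velocity between samples.

```python
def predict_position(prev, curr, dt, latency):
    """Linearly extrapolate a 2D tracked point `latency` seconds ahead.

    prev, curr: (x, y) positions sampled `dt` seconds apart.
    Assumes the point keeps moving at its most recent velocity.
    """
    vx = (curr[0] - prev[0]) / dt
    vy = (curr[1] - prev[1]) / dt
    return (curr[0] + vx * latency, curr[1] + vy * latency)

# A marker moving right at 2 m/s, sampled every 10 ms, with 15 ms of
# end-to-end latency: draw the graphic where the marker *will* be,
# 0.03 m ahead of its last observed position.
prev, curr = (0.00, 1.0), (0.02, 1.0)   # metres
print(predict_position(prev, curr, dt=0.010, latency=0.015))
```

Even this simple constant-velocity guess shrinks the visible gap between a fast-moving dancer and the projected graphic; a real system would use a more robust motion model.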
“[The system] tracks perfectly so the dancer can add a different movement or element [during the performance],” says Tateha Sakamoto, another technical director for P.I.C.S. The final choreography took several rounds of testing, evolving with Kikky’s movements and suggestions from the director, Hayato Ando.
For the system to track so precisely, the performer needs to wear a special outfit with a few dots as reference points. A less sophisticated version, demonstrated at InfoComm 2017, let anyone hop on stage and test it out, though with a more noticeable delay.
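A handful of reference dots is enough to anchor a graphic to the body. As a hypothetical illustration (the marker layout and coordinates here are invented, not Panasonic's design), once the dots are detected in the camera image, their centroid gives a stable point to keep the graphic centered on, such as the chest:

```python
def graphic_anchor(markers):
    """Return the centroid of detected 2D marker positions (pixels)."""
    n = len(markers)
    return (sum(m[0] for m in markers) / n,
            sum(m[1] for m in markers) / n)

# Four dots roughly outlining the torso in camera coordinates:
torso_dots = [(310, 200), (370, 200), (315, 320), (365, 320)]
print(graphic_anchor(torso_dots))  # → (340.0, 260.0)
```

With more than three non-collinear markers, a production system could also recover the torso's full 3D orientation, which is what lets the graphics stay aligned as the dancer turns.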
One of the team’s biggest challenges has been to convince video viewers that the graphics weren’t added in post-production.
“Many people saw the video and think it’s done [in] post-production because it’s so accurate,” Lacroix says. “The accuracy [of the projection mapping] is so high.”
Moving forward, the team wants to push the system to its limits, adding more dancers in even more complex routines. They also plan to test ideas for convincing viewers that the projection mapping really happens in real time, such as adding more improvised, spontaneous movements.