Filmmakers Sam and Andy needed convincing animations that fit into a sci-fi narrative, and they needed to create those movements efficiently and authentically. The answer: an easy-to-use mocap system that delivered high-quality data, captured the quirks and nuances of the actors, and gave the animation team plenty of reference material to work with.
Set in a future where humanity is long extinct and the Earth is thawing from a nuclear winter, Love Me is a sci-fi arthouse film created by Sam and Andy Zuchero in which two artificial intelligences, a buoy and a satellite, fall in love. Played by Oscar-nominated Kristen Stewart and Steven Yeun, the machines slowly evolve into sentient beings, with the audience able to see the transformation before their very eyes.
Love Me Featurette, featuring Xsens mocap:
In Love Me, a buoy and satellite cross paths and begin communicating. The pair use the Internet to examine what human life was like before the fallout, and end up emulating a popular vlogger duo, first creating avatars and eventually taking on humanlike form. Bringing these three phases to the screen required a complex and detailed filmmaking process.
Sam and Andy Zuchero were responsible for almost all aspects of the film’s creation, from writing to direction to production, with the help of a dedicated team. A task of this size is no mean feat, especially when extensive VFX is involved. The first step in the production process was to shoot the practical footage on location, to be transformed later into a post-apocalyptic landscape through VFX.
The camera crew set out across North America to find the perfect locations. “We shot the buoy and satellite as real props, to give a realistic, gritty feel to our characters,” explains Sam. “And the first scenes with Kristen and Steven were shot using motion capture at the end of 2021.”
Before production, Sam and Andy researched the best type of motion capture for Love Me, prioritizing efficiency without compromising on quality. “We realized that optical motion capture would be too obstructive, so the Xsens system was the best option,” Sam explains.
Xsens motion capture is built on inertial sensors that log joint rotation, velocity, and position across the body. The sensors are simply worn as a suit or as individual trackers, and their data streams instantly to processing software, with no cameras needed. Traditional optical motion capture requires a large setup with good lighting, a dotted suit, and multiple cameras filming the subject, which wasn’t ideal for the intimacy required in Sam and Andy’s production.
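The core idea behind inertial capture can be illustrated with a toy sketch. This is not Xsens’ actual algorithm (their sensor fusion is far more sophisticated, using quaternions and accelerometer/magnetometer correction); it is a minimal, hypothetical example showing how a gyroscope’s angular-velocity readings can be integrated over time to track a joint’s rotation without any cameras:

```python
# Illustrative sketch only: real inertial mocap fuses gyroscope,
# accelerometer, and magnetometer data per body segment. This toy
# shows the basic principle: integrating angular velocity over time
# recovers a joint's rotation, no cameras required.

def integrate_rotation(angular_velocities, dt):
    """Accumulate gyro samples (degrees/second) into a joint angle (degrees)."""
    angle = 0.0
    for omega in angular_velocities:
        angle += omega * dt  # simple Euler integration per sample
    return angle

# A hypothetical elbow bend: 60 deg/s sustained for 1.5 s, sampled at 100 Hz.
samples = [60.0] * 150
print(integrate_rotation(samples, dt=0.01))  # ~90 degrees
```

In practice each worn tracker performs this kind of estimation continuously for its body segment, which is why the suit can be switched on and capturing within minutes.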
“Using this type of motion capture means that the audience can see every quirk and tiny movement of the actor,” Andy explains. Even when the actors aren’t completely recognizable visually, viewers should be able to pick up on their body language.
Small studio, big ideas
The finished film was released by Bleecker Street in 2025 after premiering at the Sundance Film Festival in January 2024. Sam, Andy, and the Love Me production team successfully blended the two genres of romance and sci-fi, thanks in no small part to the unobtrusive Xsens motion capture setup.
“From an actor’s point of view, it was amazing to include the actor’s intentions and have them translate even through animation,” Sam concludes. “You turn on the motion capture equipment and they can perform, without waiting for the correct lighting and camera setup.”
“We really appreciated how simple it was to realize our ideas using Xsens,” Andy says. The technology helped the team deliver a finish that matches the quality of major Hollywood features, in a fraction of the time.
Take a look at Xsens motion capture products.