7 Dec - 2018 (Last edit: 14 Dec - 2018)

Producing realistic animated film figures is a highly complex technical endeavour. Researchers from ETH Zurich and Delft University of Technology have now shown how drones can be used to greatly reduce the effort required in the process.

Drones are going to change the film industry in a major way. About a year ago, researchers from ETH Zurich and Delft University of Technology showed that spectacular, technically demanding film scenes can be shot far more easily using these mini aircraft. A further project, presented at the SIGGRAPH Asia conference in Tokyo in December, demonstrated that drones also have great potential for animated film.

Drones to replace dozens of cameras
“It’s a very time-consuming task to make figures look realistic in an animated film,” explains Tobias Nägeli, PhD candidate at ETH Zurich and co-supervised by Javier Alonso-Mora from the Department of Cognitive Robotics at Delft University of Technology. “For the figures to appear natural, the first step is to film an actor performing the movements. The second step is then to build the animated figure around this.”

In order to reconstruct the actor’s movements for the 3D animation, they must first be recorded with at least two cameras simultaneously. Motion sequences that cover a large area create a particularly large amount of technical work, because no two fixed, well-positioned cameras can cover the entire scene. This requires either installing numerous cameras in different places, of which only a few are actually in use at any one time, or other elaborate setups.
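The geometric reason two views are needed is triangulation: a single camera only constrains a marker to a viewing ray, while a second ray from a different position pins down the 3D point. A minimal sketch of the classic midpoint method in Python (illustrative only, not the authors’ implementation; camera centres and ray directions are hypothetical inputs):

```python
def triangulate_midpoint(c1, d1, c2, d2):
    """Estimate a 3D point from two viewing rays (midpoint method).

    c1, c2: camera centres; d1, d2: ray directions toward the marker.
    All arguments are 3-tuples. Returns the midpoint of the shortest
    segment connecting the two rays. Illustrative sketch only.
    """
    def dot(u, v):
        return sum(a * b for a, b in zip(u, v))

    w = tuple(a - b for a, b in zip(c2, c1))   # baseline between cameras
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    p, q = dot(w, d1), dot(w, d2)
    denom = a * c - b * b                      # zero only for parallel rays
    t = (c * p - b * q) / denom                # parameter along ray 1
    s = (b * p - a * q) / denom                # parameter along ray 2
    p1 = tuple(ci + t * di for ci, di in zip(c1, d1))
    p2 = tuple(ci + s * di for ci, di in zip(c2, d2))
    return tuple((u + v) / 2 for u, v in zip(p1, p2))
```

With a marker at (1, 1, 1) and cameras at the origin and at (2, 0, 0) looking toward it, the two rays intersect exactly and the function returns (1.0, 1.0, 1.0).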

This complicated technique may soon be rendered obsolete. Nägeli and his colleagues at ETH Zurich and Delft University of Technology have developed a system that, in its simplest configuration, consists of two commercially available drones and a laptop.

The drones follow the actor’s every move and automatically adjust their position so that the target can always be filmed from two angles. This reduces the amount of camera equipment required, since the cameras are only ever in the spots where they are actually needed. Impressively, the system anticipates the actor’s movements in real time and calculates where the drones need to fly in order to keep the actor in the frame.
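One simple way to picture this look-ahead is a constant-velocity prediction of the actor’s position plus placing the two drones at a fixed viewing-angle separation around the predicted point. The values and the prediction model below are purely illustrative (the paper’s controller is considerably more sophisticated):

```python
import math

def predict_position(pos, vel, dt):
    """Constant-velocity prediction of the actor's 2D ground position.

    pos, vel: (x, y) tuples; dt: look-ahead time in seconds.
    Illustrative model, not the system's actual predictor.
    """
    return (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt)

def drone_waypoints(actor, radius, separation_deg=90.0):
    """Place two drones on a circle around the predicted actor position.

    Keeping a fixed angular separation between the viewpoints keeps the
    two-view triangulation well conditioned. The radius and 90-degree
    separation are hypothetical example values.
    """
    waypoints = []
    for k in (0, 1):
        ang = math.radians(k * separation_deg)
        waypoints.append((actor[0] + radius * math.cos(ang),
                          actor[1] + radius * math.sin(ang)))
    return waypoints
```

For an actor at (0, 0) moving at 1 m/s along x, a 0.5 s look-ahead predicts (0.5, 0); with a 3 m standoff radius, the drones are sent to roughly (3.5, 0) and (0.5, 3).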

To minimise the volume of data generated, infrared diode markers are fixed to the actor’s joints. The drones, whose cameras are equipped with an infrared filter, record only the light from the markers, which greatly simplifies data processing. The system sees only a few points, from which it then determines the body’s position and direction of movement.
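Because the infrared filter suppresses everything except the markers, detection essentially reduces to finding bright pixels in each frame. A toy sketch (the threshold value and frame format are hypothetical, not the authors’ pipeline):

```python
def bright_points(frame, threshold=200):
    """Return (row, col) coordinates of pixels at or above threshold.

    frame is a 2D list of 0-255 intensities; with an infrared filter in
    front of the camera, essentially only the diode markers stay bright.
    Illustrative sketch only.
    """
    return [(r, c)
            for r, row in enumerate(frame)
            for c, value in enumerate(row)
            if value >= threshold]
```

A frame with a single bright pixel thus collapses to a single 2D point per marker, which is what the pose estimator actually has to process.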

Sport motion analysis
“What makes our system so unique is that it can also reliably capture sudden and fast movements,” explains Nägeli. “Of course, this kind of demo system is not good enough to meet the requirements of the film industry yet. But it does offer a promising approach.”

The team conducted various tests to show how the system can track human movement over longer distances – something that makes the approach interesting for sports motion analysis. Until now, comprehensive motion analysis of runners, for example, has been practically impossible because it is far too complicated. With the new system, it is easy to examine how a runner’s movement patterns change over time.

More information
T. Naegeli, S. Oberholzer, S. Pluess, J. Alonso-Mora and O. Hilliges, "Flycon: Real-time Environment-independent Multi-view Human Pose Estimation with Aerial Vehicles", ACM Transactions on Graphics (SIGGRAPH Asia), Dec. 2018.

Contact information:
Javier Alonso-Mora (TU Delft) J.AlonsoMora@tudelft.nl
Tobias Nägeli (ETH Zurich) tobias.naegeli@gmail.com
