KEEP IT TOGETHER

EXPLORING THE PROCESS OF CONVERTING SOUND TO MOTION, CAPTURED USING THE INTEL REALSENSE D415 AND VISUALIZED IN PROCESSING.

CREATED FOR USC CINEMA COURSE IML - 335

This piece was “filmed” using the Intel RealSense D415. Movement was captured in depth rather than RGB pixels in order to explore motion in a more immersive way. Rather than assembling a motion-capture stage, the sensor was mounted on a gimbal and made mobile. This, in a sense, emulated a traditional film setup while maintaining a sense of motion.

The largest restriction with this approach was framing the shot. Because the RealSense is only a sensor, it must be attached to a computer in order to operate and record data, and the only way to monitor its data stream is on that computer. The gimbal operator therefore has to watch the off-board machine the RealSense is connected to. In our case, this meant following the rig around with a laptop tethered by a four-foot USB-C cord and framing shots from the laptop screen.

The data stream was recorded with the RealSense SDK, and the depth visualization was built in Processing. After importing raw RGB height maps for every frame of the performance, a script iterated over each pixel, offsetting the Z coordinate of a point in a point cloud based on that pixel's color value in the heightmap. Because both the performer and the sensor move through space, different depth thresholds had to be applied to different parts of the sequence. These clips were exported and assembled in Premiere, where color correction was applied to stylize the piece and further define the figure.
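The per-pixel offsetting described above can be sketched roughly as follows. This is plain Java rather than the original Processing sketch, and the names, value ranges, and thresholds are illustrative assumptions, not the project's actual code:

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch: each heightmap pixel's brightness (0-255) becomes
// the Z coordinate of a point in a point cloud, and points falling outside
// a near/far depth threshold (tuned per clip) are discarded.
public class HeightmapToPointCloud {

    // brightness: grayscale heightmap values, row-major, width*height long
    // maxDepth:   world-space depth that a brightness of 255 maps to
    // nearClip/farClip: depth thresholds applied to this part of the sequence
    static float[][] toPointCloud(int[] brightness, int width, int height,
                                  float maxDepth, float nearClip, float farClip) {
        List<float[]> points = new ArrayList<>();
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                // Map brightness linearly onto the Z axis.
                float z = brightness[y * width + x] / 255.0f * maxDepth;
                // Drop points outside the depth window for this clip.
                if (z < nearClip || z > farClip) continue;
                points.add(new float[] { x, y, z });
            }
        }
        return points.toArray(new float[0][]);
    }

    public static void main(String[] args) {
        int[] toyHeightmap = { 0, 128, 255, 64 }; // 2x2 toy frame
        float[][] cloud = toPointCloud(toyHeightmap, 2, 2, 10.0f, 1.0f, 9.0f);
        for (float[] p : cloud) {
            System.out.printf("(%.0f, %.0f, %.2f)%n", p[0], p[1], p[2]);
        }
    }
}
```

In the toy frame above, the fully dark pixel (below the near clip) and the fully bright pixel (beyond the far clip) are discarded, which is the same mechanism used to isolate the figure as the sensor moved.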

The largest challenge of this project was working with the data from the RealSense. While depth scanning and motion capture are growing in popularity, there are still relatively few resources available to consumers. The RealSense is most commonly used in autonomous vehicles, so its default file type is a ROS (Robot Operating System) .bag.

In assembling the video itself, the goal was to preserve the raw feel of the data stream. This project was heavily inspired by abstract data visualization and the idea that a viewer can learn a lot about a process through observation. The piece begins and ends with two-dimensional height maps of the raw depth data, offering a view into the information held in the pixels and grounding the viewer.

CREDITS


FILM - LUKE QUEZADA

MOTION - EILEEN KIM

SOUND - KEEP IT TOGETHER BY FYFE
        BENVOLIO MUSIC LTD.

EXTRA ASSISTANCE - JACK NEWSOME
