The Making of 'Floating'
January 6, 2015

Greg Jardin is a filmmaker best known for his commercial projects with Radical Media. His recently completed solo short film, “Floating,” tells the story of a lonely character made out of balloons, struggling to find some sort of connection in the city.
The evocative animated/live-action film relied heavily on a Maxon Cinema 4D workflow to meet the creative challenges of bringing the character to life. Clearly, the effort has paid off: the production was even recognized with a Vimeo “Staff Pick” award.

We spoke with Jardin about the film, his creative inspiration, and the technical workflow he used to create this moving piece. 



We understand that “Floating” is a personal endeavor. Tell us about the story and the inspiration for the movie.
The genesis for the film came from an idea I had for a music video a few years ago, but I didn’t pursue it at the time, as I did not have the technical confidence to pull it off.

I had done some simple motion graphics with Cinema 4D previously, mainly text extrusion/animations, but nothing anywhere near as demanding as ‘Floating.’  Gradually I became more interested in Cinema 4D and spent a little over a year learning the nuances of the software. The Greyscalegorilla tutorials especially were a great way to acquire new skills and build up the technical vocabulary I needed to finally bring ‘Floating’ to fruition.

Describe the production pipeline and which software tools you relied on to create the balloon character.  
We shot the live-action footage in downtown LA using a Red One camera, and then used [Andersson Technologies’] SynthEyes to get the motion-tracking data. The footage was then imported into Cinema 4D, where all of the character rigging, animation, and balloon physics were accomplished. Once all the animation was completed, I rendered out multiple passes, which were composited and subsequently graded in Adobe After Effects. But the bulk of the character work on the project was done in Cinema 4D, while SynthEyes and After Effects were essentially used to composite the character into the various shots.


The balloons were attached to points on splines (which were parented to invisible joints) using MoGraph.

Getting the balloon character to react realistically must have been a daunting task. Were there any specific tools in Cinema 4D that helped you meet this challenge?
Once the basic balloon model was configured, I turned on the rigid-body dynamics in Cinema 4D, which allowed the balloons to automatically react to one another and to the floor. I also used the XPresso tool to create the ‘magnetism’ of each point using Follow Position and Follow Rotation. The Cloner tool was also invaluable in allowing me to attach balloons to specific points that were then ‘parented’ to the character’s bones. This helped avoid having the character look like a blob of balloons. Being able to adjust the rigidity of the balloons was also helpful in getting the physics of the balloons where I wanted them.
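
For readers curious how a setup like this could be scripted, here is a minimal sketch using Cinema 4D’s Python API: spheres cloned onto a spline’s points, with a Dynamics Body tag whose Follow Position/Follow Rotation strengths supply the ‘magnetism’ described above. The object/tag IDs and parameter symbols are assumptions based on the standard C4D SDK, not details pulled from Jardin’s project files.

```python
import c4d

def build_balloon_cluster(doc, spline):
    # Balloon source geometry (a simple sphere stands in for the balloon model)
    balloon = c4d.BaseObject(c4d.Osphere)
    balloon.SetName("Balloon")

    # MoGraph Cloner in Object mode so the clones land on the spline's points
    cloner = c4d.BaseObject(1018544)   # 1018544 = MoGraph Cloner (assumed ID)
    cloner.SetName("Balloon Cloner")
    cloner[c4d.ID_MG_MOTIONGENERATOR_MODE] = c4d.ID_MG_MOTIONGENERATOR_MODE_OBJECT
    cloner[c4d.MG_OBJECT_LINK] = spline

    # Rigid-body dynamics; Follow Position / Follow Rotation pull each balloon
    # back toward its joint-driven target point (assumed SDK symbol names)
    dyn = c4d.BaseTag(180000102)       # 180000102 = Dynamics Body tag (assumed ID)
    dyn[c4d.RIGID_BODY_DYNAMIC] = c4d.RIGID_BODY_DYNAMIC_ON
    dyn[c4d.RIGID_BODY_LINEAR_FOLLOW_STRENGTH] = 3.0
    dyn[c4d.RIGID_BODY_ANGULAR_FOLLOW_STRENGTH] = 2.0
    cloner.InsertTag(dyn)

    balloon.InsertUnder(cloner)
    doc.InsertObject(cloner)
    c4d.EventAdd()
    return cloner
```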

Compositing animated characters into the live-action sequences is a complex process. Tell us about the workflow.  
I would generally export several TIF sequences from Cinema 4D for each shot – an RGBA pass, a motion vector pass, a depth pass, an AO pass, and a shadow pass (if need be). Everything was composited in After Effects using ReelSmart Motion Blur and Frischluft depth of field. Often, I also had to color-grade the footage exported from Cinema 4D in order to match the grading of the Red footage. If necessary, I would also rotoscope any foreground elements that were meant to be in front of the balloon character.
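
Roughly, the pass math behind this kind of multi-pass composite looks like the sketch below (in NumPy, with imageio for file I/O). The file names, the 16-bit normalization, and the blend order are illustrative assumptions; the actual composite was built interactively in After Effects.

```python
import numpy as np
import imageio.v3 as iio

# Load one frame of each pass (assumed here to be 3- or 4-channel images)
plate  = iio.imread("plate.0001.jpg").astype(np.float32) / 255.0        # Red footage (background)
rgba   = iio.imread("balloons_rgba.0001.tif").astype(np.float32) / 65535.0
ao     = iio.imread("balloons_ao.0001.tif").astype(np.float32) / 65535.0
shadow = iio.imread("balloons_shadow.0001.tif").astype(np.float32) / 65535.0

# Darken the plate with the shadow and AO passes (multiply blends)
ground = plate[..., :3] * shadow[..., :3] * ao[..., :3]

# Composite the balloon RGBA render over the darkened plate (straight-alpha "over")
alpha = rgba[..., 3:4]
comp = rgba[..., :3] * alpha + ground * (1.0 - alpha)

iio.imwrite("comp.0001.png", (np.clip(comp, 0.0, 1.0) * 255).astype(np.uint8))
```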

As far as melding the Cinema 4D animation with the live-action characters, I would use JPG sequences of each shot as background plates in Cinema 4D so that I could see exactly how the live-action characters were moving or reacting, and then base the balloon character’s animations and reactions on that.

There were two instances where we shot separate plates of two of the characters with a fish-eye lens to use purely as reflection data on the balloons themselves. If you look closely in the first scene at the intersection, when the girl walks past the balloon character and waves to her friend, you can see her reflection in the balloons. This was done using a TIF sequence of her that was mapped onto a moving plane in Cinema 4D, so that her reflections would match her movements in the actual footage.
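
The reflection-plate trick could be scripted along these lines: a bitmap shader pointing at the plate sequence is loaded into the luminance channel of a material, which is applied to a plane that is then animated to follow the actress. The symbol names and file path below are assumptions for illustration, not project data.

```python
import c4d

def make_reflection_plane(doc, first_frame_path):
    # Plane that carries the plate; animate it to track the actress in the shot
    plane = c4d.BaseObject(c4d.Oplane)
    plane.SetName("Reflection Plate")

    # Self-illuminated material so the plate reads in reflections without lighting
    mat = c4d.BaseMaterial(c4d.Mmaterial)
    mat.SetName("Plate Material")
    mat[c4d.MATERIAL_USE_COLOR] = False
    mat[c4d.MATERIAL_USE_LUMINANCE] = True

    # Bitmap shader pointed at the first frame of the plate; sequence playback
    # is configured in the shader's animation settings inside the app
    shd = c4d.BaseShader(c4d.Xbitmap)
    shd[c4d.BITMAPSHADER_FILENAME] = first_frame_path
    mat[c4d.MATERIAL_LUMINANCE_SHADER] = shd
    mat.InsertShader(shd)

    # Assign the material to the plane
    tag = plane.MakeTag(c4d.Ttexture)
    tag[c4d.TEXTURETAG_MATERIAL] = mat
    tag[c4d.TEXTURETAG_PROJECTION] = c4d.TEXTURETAG_PROJECTION_UVW

    doc.InsertMaterial(mat)
    doc.InsertObject(plane)
    c4d.EventAdd()
    return plane
```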


A separate plate of the actress was shot with a 7D and a fish-eye lens in order to get her reflection on the balloon character.

The music perfectly captures the emotive mood of the film. How did music figure into the film? 
The Joy Formidable created the original score after the film edit had been completed. I had shown the band about 90 seconds of the film early on to give them an idea of the tone, but they didn’t really get into it until the cut was finished. They started composing with the idea of creating a melancholic piece that echoed the emotions of the main character throughout.  

What so far has surprised you most about the audience reaction to the piece?
It’s been great! Probably the most surprising thing to me so far has been the different reactions to the ending. Some people find it depressing, while others find it uplifting. A friend of mine pointed out that someone’s opinion of the ending is almost like a Rorschach test, in that it can reveal something about the psychological nature of the viewer, which is both funny and probably true.

Watch the movie at https://vimeo.com/103146755