A Flair for Innovation
Issue: October-November-December 2022

Tracking the evolution of motion control robotics in VFX.


The history of motion control reflects the history of VFX itself in many ways. If you look at the hardware, it has evolved over nearly fifty years from jerry-built, individual setups kludged together by VFX departments into sleek, high-performance machines capable of high-speed, supremely accurate movement, take after take. And if you look at the software that controls these machines, you can trace the same sort of path, one that runs from bespoke coding to the kind of elegant user interface design that lets a wide range of non-technical users jump straight in and start working with motion control.

At MRMC [Mark Roberts Motion Control], we started developing our software control system, Flair, in the early 1990s. Motion control had been digital for close to a decade, but the new breed of standardized, turnkey robots that we were building to satisfy growing demand, such as the ground-breaking Milo, needed equally standardized, powerful software.

The first rig running Flair software, a Cyclops, landed at CellAnimation in 1993. It was genuinely revolutionary. There were other motion control vendors active in the market, but they tended to produce either the robotics or the software. MRMC uniquely did both, allowing us to iterate on the software quickly in response to customer requests and to move motion control out of the realm of being a pure engineering solution.


Decades of optimization


That is very much where we are now, after three decades of development. The motion control market has changed substantially over that time, to the point where the rental houses that used to make up all of our customers now account for only about 50%. The other half is a growing number of studios and filmmakers creating their own content using the technology. They haven’t come from an engineering background, or sometimes even from a technical one, and they appreciate the decades of optimization we’ve put into Flair to make it progressively simpler and more powerful to use.

As with a lot of software, from motion control to everyday packages such as word processors and spreadsheets, the raw features were established and optimized a long time ago. The way Flair controls the various rigs in our armory, from the smallest to the largest, is a known quantity. It’s hugely sophisticated, giving users the ability to produce simple arcs, orbits, and linear motions, and scaling up to much more advanced features and control, resulting in pixel-perfect, repeatable motions every time. And it just works.

So why do we keep pumping resources into its continual development? Because it has to reflect the way the industry changes around it. Motion control is part of a workflow, and that means our robots need to work with computer graphics systems and lighting systems. There’s a huge buzz around virtual production at the moment; very few of our customers now haven’t dabbled in it in one way, shape, or form, and that means our rigs also need to be able to interface with Unreal Engine. Development never stops.


New features


That’s why we are still integrating new features into Flair that reflect the way people work. Focus Assist is a good example. It is enabled by a small hardware add-on to the front of the camera that piggybacks on the power and data wiring already in place for the lens motors. This removes the need for manual tape measurements to achieve crisp focus on whatever object or objects are being filmed, knocking out one more job on set and making productions faster and more biosecure for pandemic-era workflows. Users just click on the object in the Flair screen and the software automatically measures the distance to the target, letting them swiftly adjust the focus as needed through the live video view.
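As a rough illustration of that click-to-focus flow, the Python sketch below shows one way the steps could be wired together; the rangefinder, focus motor, and lens-map names are hypothetical stand-ins for this article, not Flair’s actual interfaces.

class FocusAssistSketch:
    def __init__(self, rangefinder, focus_motor, lens_map):
        self.rangefinder = rangefinder   # distance sensor piggybacking on the lens-motor wiring
        self.focus_motor = focus_motor   # motor driving the lens focus ring
        self.lens_map = lens_map         # calibration: distance in metres -> focus ring position

    def on_click(self, screen_x, screen_y):
        # The user clicks the subject in the live video view.
        distance_m = self.rangefinder.measure(screen_x, screen_y)   # measure distance to the target
        self.focus_motor.move_to(self.lens_map(distance_m))         # drive focus, no tape measure needed
        return distance_m                                           # reported back for fine adjustment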

Flair can also be synchronized to the pulse signal of a wide range of video and film cameras, allowing you to run repeatable, frame-perfect moves every time. Timecode and input trigger pulse signals are also supported, which means a move can be configured to start automatically at exactly the right moment, without any guesswork. And because we’ve been working with Flair for so long and have it so closely integrated with our robotic arms, it doesn’t stop there. A feature called Browse Moves lets you scrub through the motion much as you would on an editing timeline. A lot of motion control systems have an industrial heritage; once the move is programmed, the system simply executes it from start to finish. Ours allows it all to be finessed, at maximum speed or frame by frame if you wish; as you move the cursor, the arm moves with you in lockstep, whether that’s a tiny Bolt Mini or a 2.5-ton Cyclops.
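To make the scrubbing idea concrete, here is a minimal Python sketch that treats a programmed move as a list of timed axis keyframes and interpolates the rig’s position at the cursor time; it is an illustration of the concept only, under that assumption, not how Flair actually stores or executes moves.

def axis_positions_at(keyframes, t):
    # keyframes: list of (time_s, {axis_name: position}) pairs in time order.
    if t <= keyframes[0][0]:
        return dict(keyframes[0][1])                     # before the start: hold the first pose
    for (t0, p0), (t1, p1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            f = (t - t0) / (t1 - t0)                     # fraction of the way through this segment
            return {axis: p0[axis] + f * (p1[axis] - p0[axis]) for axis in p0}
    return dict(keyframes[-1][1])                        # past the end: hold the final pose

def scrub(keyframes, cursor_time, send_to_rig):
    # Called as the user drags the timeline cursor; the arm follows the cursor in lockstep.
    send_to_rig(axis_positions_at(keyframes, cursor_time))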



And we are constantly looking for new ways to let people access the technology. We have a new iOS app that effectively lets artists ‘fly’ their phone through the scene; that same motion and those same coordinates are then applied to the robot arm. Where their phone was is now a camera on the end of a Milo or a Bolt arm.
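As a simplified sketch of the underlying idea, the Python below maps sampled phone poses onto arm targets by rescaling the handheld path into the rig’s working volume; the data layout and names are assumptions made for illustration, not the app’s real protocol.

def phone_path_to_arm_targets(phone_poses, scale=1.0, origin=(0.0, 0.0, 0.0)):
    # phone_poses: samples like {'t': seconds, 'pos': (x, y, z), 'pan': degrees, 'tilt': degrees}.
    ox, oy, oz = origin
    targets = []
    for pose in phone_poses:
        px, py, pz = pose["pos"]
        targets.append({
            "t": pose["t"],                                              # keep the original timing
            "pos": (ox + scale * px, oy + scale * py, oz + scale * pz),  # rescale into rig space
            "pan": pose["pan"],                                          # camera orientation carries over
            "tilt": pose["tilt"],
        })
    return targets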

It’s the sort of tool that the early pioneers of motion control in the 1970s could only have dreamed about. But motion control is one of those technologies where experience really counts. And when you have decades of experience on both the hardware and the software side of the business, you can draw on all those conversations with customers about how they wished motion control worked, and make sure that is how it actually behaves in the real (or virtual) world.

Assaff Rawner is the CEO of MRMC.
