To hear many future-of-work theorists tell it, the manufacturing workspace of the future will be increasingly filled with humans and robots working together. These collaborative robots, or cobots, work alongside humans without any physical separation. Moreover, they're easily reprogrammed, making cobots ideal for repetitive or dangerous tasks humans would prefer not to do.
But how do you know whether a cobot is safe to work around humans? How far should the arm move? How fast? Figuring out those details is the job of researchers like Marco Capuzzimati, a research fellow at the University of Bologna, who uses markerless motion-capture software tools, like iPi Motion Tracker, to work out the range-of-motion details in software before trying them out in the actual workplace.
In fact, scientists and medical researchers discovering innovative ways of incorporating motion-capture technology led iPi Soft to create the Biomech Add-On, designed to enable visualization of tracking data for gait analysis and rehabilitation, sports motion analysis, and research in 3D human kinematics.
Here, Capuzzimati discusses his work and how he’s using this technology.
Tell us more about your work.
The project aims to create a management and feeding system for a collaborative cell, represented by a human operator and a cobot, or a collaborative robot. We are developing a system capable of being autonomous in the feeding phase — such as selection, picking, and storage — of parts and components involved in the production process.
How are you using mocap to analyze movements? Is this being done in the assembly cell area of a warehouse?
We stream mocap data from iPi Mocap Studio 4 to Unity 3D, where we can do a biomechanical analysis of the movements both in the warehouse and in the assembly cell.
Can you provide additional detail on the biomechanical analysis?
We have developed C# code to handle the incoming mocap data. The data is then sent to a Microsoft HoloLens, where we can do biomechanical analysis. We are particularly focused on the angles between certain joints so we can identify critical configurations.
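The interview doesn't show the team's C# code, but the core computation it describes — taking joint positions from a mocap stream and deriving the angle at a joint to flag critical configurations — can be sketched briefly. The Python below is illustrative only; the joint names and the comfortable-range thresholds in `is_critical` are hypothetical, not values from the project.

```python
import numpy as np

def joint_angle(a, b, c):
    """Angle in degrees at joint b, formed by the segments b->a and b->c.

    a, b, c are 3D positions, e.g. shoulder, elbow, wrist from a mocap frame.
    """
    u = np.asarray(a, float) - np.asarray(b, float)
    v = np.asarray(c, float) - np.asarray(b, float)
    cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    # Clip to guard against floating-point drift outside [-1, 1]
    return float(np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0))))

def is_critical(angle_deg, lo=60.0, hi=170.0):
    """Flag a configuration whose joint angle falls outside a comfortable
    range (the lo/hi thresholds here are made-up placeholders)."""
    return not (lo <= angle_deg <= hi)

# Example: an elbow bent at a right angle
shoulder, elbow, wrist = (0, 0, 0), (30, 0, 0), (30, -25, 0)
angle = joint_angle(shoulder, elbow, wrist)   # 90 degrees
print(angle, is_critical(angle))
```

In a real pipeline this computation would run per frame on the streamed skeleton data, with per-joint thresholds chosen from ergonomic guidelines.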
How does motion-capture technology help in your research?
Motion capture is helping us in warehouse settings where the cobots and people work together to load components and various parts onto warehouse shelving, and in the assembly phase with support from cobots and people doing tasks that cobots can’t do. The ergonomic design of the assembly cell is very important to maximize operator satisfaction and system performance, so the main objective of motion capture in this project is ergonomic optimization and risk assessment following the execution of certain assembly or load-handling activities. In this way, we can optimize system performance and reduce the risk of injury.
Can you describe the assembly cell in greater detail? Is the assembly cell a specific area of a warehouse where the robot is assembled?
The entire area of the Pilot Line, as we call it, consists of a warehouse setting with inclined shelves. An assembly cell is formed by a fixed cobot and a flexible working table; this is the main area of interest. These elements allow cobots to carry and move materials and components from a warehouse to the assembly cell.
The main objectives are real-time biomechanical analysis of human operators and application of the ergonomic index for risk assessment during work activities. We have to analyze every single movement during assembly and consider the weights of the parts and components handled, the frequency of handling, and the efforts that the operator has to make.
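Capuzzimati doesn't name the specific ergonomic index used. One widely used example for load-handling tasks like those described (weights, frequency, effort) is the revised NIOSH lifting equation, which computes a recommended weight limit (RWL) and a lifting index (LI = load / RWL, where LI > 1 signals elevated injury risk). A simplified sketch, with the frequency and coupling multipliers passed in as assumed values rather than looked up from the published tables:

```python
def niosh_rwl(h_cm, v_cm, d_cm, a_deg, fm=1.0, cm=1.0):
    """Recommended weight limit (kg) per the revised NIOSH lifting equation.

    h_cm: horizontal distance of hands from ankle midpoint
    v_cm: vertical height of hands at lift origin
    d_cm: vertical travel distance of the lift
    a_deg: asymmetry (trunk twist) angle in degrees
    fm, cm: frequency and coupling multipliers (table lookups in the
            full method; assumed values here for simplicity)
    """
    LC = 23.0                                   # load constant, kg
    hm = min(1.0, 25.0 / h_cm)                  # horizontal multiplier
    vm = 1.0 - 0.003 * abs(v_cm - 75.0)         # vertical multiplier
    dm = min(1.0, 0.82 + 4.5 / d_cm)            # distance multiplier
    am = 1.0 - 0.0032 * a_deg                   # asymmetry multiplier
    return LC * hm * vm * dm * am * fm * cm

def lifting_index(load_kg, rwl_kg):
    """LI > 1.0 indicates an elevated risk of lower-back injury."""
    return load_kg / rwl_kg

# Ideal-posture lift: hands close in, at 75 cm, short travel, no twist
rwl = niosh_rwl(h_cm=25, v_cm=75, d_cm=25, a_deg=0)   # 23.0 kg
print(rwl, lifting_index(11.5, rwl))
```

Mocap data feeds naturally into such an index: the horizontal, vertical, and asymmetry terms can all be derived from tracked joint positions for each handling movement.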
Finally, how long have you been conducting the motion-capture research at the University of Bologna? Is there an end date?
My research in the industrial engineering department began in September 2020 and is scheduled for completion in July 2022. I’m looking forward to future projects and opportunities to deploy this motion-capture technology.