Taking Flight
Issue: Volume 33, Issue 1 (Jan. 2010)

On April 25, 2004, at 3:42 pm central daylight time, in the Bayou de View area of Arkansas’ Cache River National Wildlife Refuge, while riding in a canoe, researcher David Luneau videotaped a bird many believed to be extinct: an ivory-billed woodpecker. Or, did he?

Luneau, an associate professor at the University of Arkansas, and scientists at the Cornell Lab of Ornithology made their case in an article published in Science magazine, to much excitement. But, eventually, some experts and academics disagreed, saying that the four seconds of videotape showed a blurry pileated woodpecker, not the rare, possibly extinct, ivorybill.


The ivory-billed woodpecker above is digital, created at Cornell University.

That’s when professor and computer graphics legend Donald Greenberg and several colleagues, notably graduate student Jeff Wang (now a character TD at PDI/DreamWorks), enter the picture, as does the Cornell Ornithology Lab. Because the two woodpeckers in question have black and white wings with opposite patterns, Greenberg and his group suggested that maybe they could animate an ivory-billed and a pileated woodpecker, simulate the camera in the videotape, and create sequences of images for each woodpecker that could be pattern-matched and compared to the video.

This simple suggestion attracted the attention of Kim Bostwick, a research scientist in the Cornell Ornithology Lab, and set in motion a series of projects that continue today. Wang built an accurate 3D representation of the ivorybill that has become a research vehicle for David Kaplan, another graduate student in computer graphics. Brendan Holt, a third CG grad student, collaborated with Bostwick to develop a method for motion-capturing wild birds in free flight that she is now adapting for a field study of manakins. “These are astonishing little birds found in Central and South America that sing with their wings,” she explains. Throughout the process, Bostwick, who uses high-speed video and detailed anatomy to study the functional morphology of birds, worked with the CG students, developing field methods and evaluating the model and the animation.

“[The computer graphics students] traversed a very great distance from beginning to end,” Bostwick says. “In the first one or two weeks we started working together, they had us look at an animation they had put together. To most people, it would have looked good. But we saw birds with woodpecker feathers flying like vultures. From there, they developed a sophisticated model that incorporated all sorts of things about anatomy and how birds fly. And, they were asking questions most people don’t ask ornithologists, like how feathers work and what their motion is like, things we don’t know. We have a lot to learn about how birds fly.”


The ivorybill (below, right), with a white trailing edge on its underwing, differs from the pileated woodpecker (below, left), which has a black trailing edge.

Building the Bird

The first step was to build the model. Soon, the simple idea of matching the patterns on the wings evolved into creating a model that precisely matched an ivorybill. “Once we started getting into the literature from the ornithologists, it became more and more important to get it scientifically correct,” Wang says.

Bostwick, who is the curator of the bird collection at the Cornell University Museum of Vertebrates (CUMV), provided the CG students with access to museum specimens normally available only to biologists. And, she made it possible to scan a one-of-a-kind specimen of an ivorybill, which had been stored in a jar at the Smithsonian for 60 years, with high-resolution computed tomography (CT) at the University of Texas at Austin’s Digital Morphology (DigiMorph) lab. The lab scanned the bird in a natural pose, with the wings tucked, and with the wings slightly open—the specimen, which is the only fully intact one in existence, was too fragile to risk fully opening the wings.

From the volume data that resulted, Wang reconstructed the bird using Template Graphics’ Amira visualization software for medical imaging. The scans had produced approximately 2000 slices, each with a resolution of 1024x1024. Rather than rendering a volume, Amira’s thresholding algorithms generated an outline of the skin surface and contours for the skeleton. The process was far from automatic, however. For example, because the feathers and skin had the same density, the CT scan didn’t distinguish between them, so Wang needed to separate feathers from skin. After editing, Wang had an accurate surface representation of the ivory-billed woodpecker’s complete skin and skeleton in two poses.
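
For readers curious about what that reconstruction step involves, the sketch below shows the general idea of pulling an isosurface out of a stack of CT slices by density thresholding. It is not Amira’s pipeline, just a minimal Python illustration using scikit-image; the file pattern and threshold value are assumptions.

```python
import numpy as np
from skimage import io, measure

# Stack ~2000 slices of 1024x1024 pixels into one (z, y, x) volume.
# The file pattern is hypothetical.
volume = np.stack([io.imread(f"slice_{i:04d}.tif") for i in range(2000)])

# A density threshold that separates skin/feathers from air; in practice this
# needs tuning and hand-editing, since feathers and skin scan at similar density.
SKIN_THRESHOLD = 400  # assumed CT value, purely illustrative

# Marching cubes extracts a triangle mesh of the thresholded surface.
verts, faces, normals, _ = measure.marching_cubes(volume, level=SKIN_THRESHOLD)
print(f"Surface mesh: {len(verts)} vertices, {len(faces)} triangles")
```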

Wang then moved the data into Autodesk’s Maya and used the reconstruction as a reference model to create a lighter-weight model for animation, with joints in correct locations and a subdivision surface of the skin. He also referred to the stuffed ivory-billed woodpeckers from CUMV for external measurements. To best approximate the reconstructed model, Wang “snapped” the majority of the vertices in the base mesh for the skin to points in the reconstructed model, which was dense enough to make this possible.
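
The vertex “snapping” Wang describes amounts to a nearest-neighbor query against the dense reconstruction. Here is a minimal sketch of that idea with SciPy; the array names and sizes are illustrative, and the actual work was done on Maya geometry.

```python
import numpy as np
from scipy.spatial import cKDTree

def snap_to_reference(base_verts: np.ndarray, dense_verts: np.ndarray) -> np.ndarray:
    """Move each base-mesh vertex to its nearest neighbor in the dense scan."""
    tree = cKDTree(dense_verts)           # spatial index over the reconstruction
    _, nearest = tree.query(base_verts)   # index of the closest scan point
    return dense_verts[nearest]

# Stand-in data: a dense reconstructed surface and a lighter animation mesh.
dense = np.random.rand(200_000, 3)
base = np.random.rand(2_000, 3)
snapped = snap_to_reference(base, dense)
print(snapped.shape)  # (2000, 3)
```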

That gave Wang the skin and a rig. Next, he needed to model the bird’s feathers. “Because the people scanning the bird were afraid to unfurl the wings, we didn’t get great information on the width of the feathers from the CT scan, but we got the length,” he says. “However, we were concerned about the pattern the wing creates, not the microgeometry, so I wrote a tool to model the plane of the feathers quickly.”

Animating the Bird
The reconstructed skeleton from the CT scans had provided precise geometrical information about the bones in the woodpecker’s wings, but because the model was one solid object, Wang estimated the center point for animation joints. He would fly the bird using traditional keyframe animation; simulating the laws of aerodynamics was beyond the scope of his thesis. However, Wang leaned on ornithological research for the rotation angles, specifically on work by Ken Dial, who plotted rotation angles for the joints of a European starling. To animate the joints in his CG wing, Holt used the angles provided by Dial as rotations for forward kinematic controls. In addition, by incorporating information from a Science paper by Farish Jenkins that described the furcula, or the wishbone, as a spring, he added secondary motion.
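
Driving a wing with published joint-angle curves is essentially forward kinematics: rotations accumulate down the chain from shoulder to wrist, and each frame samples the angle curves. The sketch below illustrates the idea with a simple 2D three-bone chain; the sinusoidal angle curves and bone lengths are stand-ins, not Dial’s starling data.

```python
import numpy as np

BONE_LENGTHS = [3.0, 2.5, 2.0]       # humerus, forearm, hand (arbitrary units)
frames = np.linspace(0.0, 1.0, 30)   # one wingbeat cycle

def wing_points(t: float) -> np.ndarray:
    """Chain rotations shoulder->elbow->wrist and return the joint positions."""
    # Stand-in angle curves; a real setup would key digitized rotation values.
    angles = np.radians([40 * np.sin(2 * np.pi * t),
                         20 * np.sin(2 * np.pi * t + 0.5),
                         10 * np.sin(2 * np.pi * t + 1.0)])
    pts, pos, heading = [np.zeros(2)], np.zeros(2), 0.0
    for length, angle in zip(BONE_LENGTHS, angles):
        heading += angle                 # rotations accumulate down the chain
        pos = pos + length * np.array([np.cos(heading), np.sin(heading)])
        pts.append(pos)
    return np.array(pts)

for t in frames[:3]:
    print(np.round(wing_points(t), 2))
```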


The digital wing, with red rods representing individual feathers, flaps using IK and data captured from a red-winged blackbird in free flight. Overlaying the flapping wing on a video of the bird showed how closely the digital wing matched the bird’s motion.

“At that point, we had an ivory-billed woodpecker that flew like a starling,” Wang says, “the same cycle over and over.”

In addition, the team shot a high-speed video of a flying pileated woodpecker, the common but very large woodpecker that most resembles an ivorybill. By rotoscoping this bird in flight, the animators could match the rotations of the wing bones during each wing beat. The high-speed video of the pileated woodpecker and still photos also showed that in the transition between a downstroke and upstroke, the feathers oriented themselves in a nearly parabolic, continuous surface, and bent in response to aerodynamic loads during a wing beat—all of which affected the amount of black and white visible to the camera. The Cornell team mimicked these effects using mathematical orientation constraints.

The next steps were tracking Luneau’s camera in the video to discover the path it took, and then matching it to a digital camera. Wang did the match using 2d3’s Boujou. Once he knew the camera’s path, he could calculate the path of the bird.

“The ornithologists at Cornell who searched for the bird in Arkansas knew about how far away the bird was when it first took off from the tree,” Wang says. “So we asked them to go back to the same place and measure how high they thought the bird was off the ground. They went back to Arkansas and measured the tree.”

Knowing the height of the tree and the distance from the camera, Wang could calculate the position of the bird in 3D space when it took off. “For the rest of the flight path, we drew a ray from the camera, starting with the first position of the bird,” Wang explains. Known data provided flight speed for the birds, and from that, they could calculate how far the bird flew within a frame.

“Because we knew the 3D location, how fast the bird flies, and how much time passes between each frame, we could calculate a distance within one frame,” Wang says. “That distance became the radius of a sphere. We knew the bird was somewhere on that sphere, but not exactly where. So to figure that out, we lined up the camera, an image plane with the video, and the sphere in 3D space. We then drew a line starting at the camera through the center of the bird in the video. The intersection of that ray and the sphere was the location of the bird. We did that for the entire video, which was only 100 frames or so, to get a three-dimensional flight path for the bird.”
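
Wang’s per-frame solve combines two constraints: the bird must sit on a sphere of radius speed times frame time around its previous position, and on the ray from the camera through the bird’s pixel. Below is a hedged sketch of that ray-sphere intersection; the camera, ray direction, and speed values are made up, and choosing the far intersection assumes the bird is flying away from the camera.

```python
import numpy as np

def next_position(prev_pos, camera_pos, ray_dir, speed, dt):
    """Intersect the camera ray with the sphere of positions reachable in one frame."""
    ray_dir = ray_dir / np.linalg.norm(ray_dir)
    radius = speed * dt                   # distance flown in one frame
    oc = camera_pos - prev_pos            # camera relative to the sphere center
    b = 2.0 * np.dot(ray_dir, oc)
    c = np.dot(oc, oc) - radius ** 2
    disc = b * b - 4.0 * c
    if disc < 0:
        raise ValueError("Ray misses the sphere; check the speed or the track")
    t = (-b + np.sqrt(disc)) / 2.0        # far intersection: bird flying away (assumed)
    return camera_pos + t * ray_dir

pos = np.array([0.0, 5.0, 20.0])          # first known 3D position of the bird
cam = np.zeros(3)
ray = np.array([0.0, 0.252, 1.0])         # through the bird's pixel (illustrative)
print(next_position(pos, cam, ray, speed=8.0, dt=1 / 30))
```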

Then, Wang animated the ivory-billed woodpecker model on that flight path, scaled the model down to the size of the smaller pileated woodpecker, changed the wing colors to the pileated woodpecker’s pattern, and repeated the flight.

“That gave us three sets of images,” Wang says, “the original video, our animation of the ivory-billed woodpecker, and the animation of the pileated woodpecker.”

Did either animation match the video? “This is where it gets tricky,” Wang says. “We made so many assumptions, if we were to take what we did to the scientific press, it wouldn’t hold water.”

Greenberg agrees. “It’s inconclusive,” he says. “I find that with the data we have, it’s impossible to make a conclusive statement. I could take a subset of frames and say, ‘Unquestionably, it’s an ivorybill.’ And then take another subset of frames and it’s not clear. I wish I could come up with a different answer. My buddies at the Cornell Ornithology Lab were praying that I could come up with a different answer. But the bird in the video was too far away, and it was a bird escaping. At its closest point it was 400 pixels. If you put your thumb on a normal-size TV screen, that was the image of the bird.”

More important, except perhaps to the ornithologists who were hoping for proof, the project has opened new lines of research at Cornell.

“None of us want to get into the debate about whether it’s ivory-billed or pileated,” Wang says. “That was the launching point, but it isn’t the be-all, end-all of our project. It is the various disciplines coming together.”

Motion-Capturing a Wild Bird
If you were to create an animation of a bird, whether a woodpecker or any other bird, how would you know whether that animation was valid? Researchers have studied birds in wind tunnels, but the unnatural airflow in the confined space might have affected the birds’ flight. Holt wanted to capture data from wild birds flying naturally and use that data to drive a wing. Wang, Kaplan, and Bostwick helped make that possible.

“We couldn’t just capture wild songbirds,” Holt says. “You have to have a licensed ornithologist work with you.” Between trees, Bostwick strung specially designed mist nets so fine that the birds couldn’t see them, put birdseed on the ground, and, over time, captured a variety of birds. “We tried robins, chickadees, swallows, grackles, woodpeckers, and red-winged blackbirds,” Holt says. “The little birds are easy to catch, but they fly fast. The big birds flap their wings more slowly, so we could get better data, but they’re really smart. The grackles eventually figured out where the mist net was, and we never caught one again.”


A CT scan of an ivory-billed woodpecker specimen from the Smithsonian helped the Cornell grad students create an accurate skeleton.

Bostwick released the birds in a flight tunnel, a wooden structure a little more than 13 meters long with ports along the length. A viewing pyramid juts out from one side, and another port sits on the top for looking down. When released, the wild birds see the light at the end of the tunnel and fly through to escape. In general, the usable flight data lasted less than one second.

“We thought woodpeckers would be a good choice because of their patterned wings, but they flew into the tunnel, landed on the wooden ceiling, and stayed there,” Holt says.

Instead, they settled on red-winged blackbirds. To capture the bird’s motion, Holt applied markers hole-punched from 3M retroreflective tape that they could easily remove after the test. “Spheres would have been visible from more directions, but they would have interfered with the flight,” he says. Bostwick’s knowledge of bird anatomy was invaluable as Holt determined where to apply the markers.

Because his goal was to capture bone orientation and feather deformation, he attached markers at the shoulder, elbow, and wrist joints, at the tip of the hand, and on the feathers. “Bird wings have basically the same bone structure as mammal arms,” he says. “They have a humerus, radius, and ulna for the forearm, and a simplified hand with kind of a thumb bone and two digits that are a little bit flexible in some birds.”

Although acquiring the best estimate of joint center rotation might have meant adding markers midway on the bones, they decided to minimize the number of markers for this pilot study. “We weren’t sure how well this would work,” Holt says. “And it’s difficult and confusing to digitize lots of markers, so we kept it simple. Perhaps later, people will decide to continue the study using more markers.”

Because the cameras would see both sides of the feathers during flight, the researchers applied markers to the tops and bottoms of a few feather tips, as well as midway along the length. “We figured that the most interesting deformation occurs near the distal part of the wing, near the hand,” Holt explains. “The first nine feathers from the wing tip come off the hand bone. The feathers bend, rotate around the axis of the rachis, which is the quill, and splay like a fan opening and closing. We couldn’t capture feather twisting with our limited set of markers, but we could get bending and a little bit of the splay.”


With help from Cornell ornithologist Kim Bostwick, CG grad student Brendan Holt attached retroreflective motion-capture markers to the joints and feathers of a red-winged blackbird.

Holt and Bostwick positioned two Fastec Imaging TroubleShooter cameras inside the tunnel—one behind the bird and off to the side, and one mostly to the side and low to the ground. Kaplan built a custom light mount for the cameras to provide the bright light necessary for the high frame rate: The digital video cameras shoot 1280x1024-resolution images at 500 frames per second. So that they could locate points in 3D space later, the researchers placed a cube of known size in the region to calibrate the cameras.

To track the markers on the birds in flight captured by the video cameras and triangulate them, Holt used a MathWorks MATLAB script written by Tyson Hedrick. “That gave us dots moving in space,” Holt says. “We fit splines to their trajectories.”
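
The triangulation step boils down to solving for the 3D point that best agrees with both calibrated views. The sketch below shows a standard linear (DLT-style) triangulation in Python rather than the actual MATLAB script; the projection matrices and pixel coordinates are stand-in values.

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """DLT-style triangulation of one marker seen by two calibrated cameras."""
    rows = []
    for P, (u, v) in ((P1, uv1), (P2, uv2)):
        rows.append(u * P[2] - P[0])   # x-constraint from this view
        rows.append(v * P[2] - P[1])   # y-constraint from this view
    A = np.array(rows)
    _, _, vt = np.linalg.svd(A)        # null vector minimizes the residual
    X = vt[-1]
    return X[:3] / X[3]                # back to inhomogeneous coordinates

# Stand-in cameras: identity-like projections offset along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
point = np.array([0.2, 0.1, 2.0, 1.0])
uv1 = (P1 @ point)[:2] / (P1 @ point)[2]
uv2 = (P2 @ point)[:2] / (P2 @ point)[2]
print(triangulate(P1, P2, uv1, uv2))   # recovers ~[0.2, 0.1, 2.0]
```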

And that gave Holt paths in space and time. “It didn’t let us see the wing, though,” he says. “Initially, we treated each marker as a vertex of a mesh and connected the vertices to make a face. The resulting wing mesh gave a crude impression of a flapping motion, but it did not portray the underlying anatomy. So we built a more informative visualization using rods to represent feathers and bones.”

To do that, Holt created a simple polygonal model in Maya and rigged it with inverse kinematics (IK) to drive the animation. Then, he applied the data from the motion capture to the joints. “The model has something like 18 feathers, but we had markers on only a few feathers,” he says. “So we created a NURBS curve from the outermost feather tip to the shoulder using the other feather tips as control points. Then we evenly subdivided that curve by 18, the number of feathers.” As the wing bones fold and open during flight, driven by the motion-captured data, the curve changes shape and size. The wing flaps.
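
A rough analogue of that feather-fitting step appears below: fit a smooth curve through the tracked feather-tip markers and the shoulder, then sample it at 18 evenly spaced parameters to place every feather tip. This uses a SciPy B-spline rather than a Maya NURBS curve, and the marker coordinates are invented.

```python
import numpy as np
from scipy.interpolate import splprep, splev

# Tracked points for one frame: outer feather tip -> ... -> shoulder (3D).
control_points = np.array([
    [10.0, 0.0, 0.0],
    [ 8.0, 2.5, 0.3],
    [ 5.5, 3.5, 0.5],
    [ 2.5, 3.0, 0.4],
    [ 0.0, 0.0, 0.0],   # shoulder
])

# Fit a cubic spline through the markers, then sample 18 feather positions.
tck, _ = splprep(control_points.T, s=0.0, k=3)
params = np.linspace(0.0, 1.0, 18)
feather_tips = np.array(splev(params, tck)).T   # 18 x 3 feather-tip targets
print(feather_tips.shape)
```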

“That’s how we fit the feathers to the data,” Holt says. “The feathers fan in and out and make a smooth surface. So, we have a model of a wing that we’re driving with real motion data. It isn’t an accurate simulation. The results are somewhat interpreted in the way we fit the feathers and bones to the data with IK. We had only two cameras, so we had missing data. And, it’s hard to validate our work because there is very little research showing joint rotations of bones. But, we showed that it’s possible to get free flight data from a bird without relying on a wind tunnel.”

Flapping Wings
Kaplan, who had studied mechanical engineering as an undergraduate and moved into computer graphics as a graduate student, took the work done by Wang and Holt to the next scientific level. First, using a CUMV specimen, he scanned individual feathers of a red-winged blackbird, the same type of bird that Holt had motion-captured in the tunnel, with a 3D laser scanner. Then, working in Maya, he changed Wang’s 3D model to match. “Jeff had made a very, very accurate 3D model,” Kaplan says. “I changed the length and parameters to make it the size and proportions of a blackbird, removed the ivorybill feathers, and attached blackbird feathers.”

The CG feathers in the digital red-winged blackbird (at left), which have hundreds of polygons and are topologically accurate, help researchers look at flapping flight. Motion captured from a blackbird in free flight (above) can validate the digital animation.

The feathers have hundreds of polygons and are topologically accurate. “Jeff’s model was so morphologically accurate that I wanted to stay true to that attention to detail and keep as high a degree of accuracy on the feathers as I could,” Kaplan says.

Kaplan had “printed” Wang’s model of the woodpecker wings using a rapid-prototyping machine, and put two wings in a wind tunnel. “I measured the force data at a couple different air speeds and angles of attack. I also used a strobe light that illuminated helium bubbles to visualize the flow,” he says. The wings were in a quasi-steady state, however. So, to look at flapping wings, he decided to use the digital blackbird feathers and computer simulations.

“I’m doing fairly rudimentary aerodynamics using Brendan’s [Holt] wing beat of a red-winged blackbird and applying blade element analysis to each feather as the bird flaps,” Kaplan says. “No one has done this before with an emphasis on geometry of this detail. Zoran Popovic wrote a paper a few years ago [“Realistic Modeling of Bird Flight Animations,” with Jia-Chi Wu, for SIGGRAPH 2003], but he modeled his feathers as two triangles with a hinge. And, he didn’t validate the animation against motion-captured data like that Brendan [Holt] came up with.”
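
As a hedged illustration of what blade element analysis involves (not Kaplan’s actual code), the sketch below treats each hand feather as one element, combines forward speed with flapping velocity to get a local angle of attack, and sums quasi-steady lift and drag. Every constant in it is a placeholder.

```python
import numpy as np

RHO = 1.225            # air density, kg/m^3
V_FORWARD = 8.0        # forward flight speed, m/s (assumed)
OMEGA = 25.0           # flapping angular rate at this instant, rad/s (assumed)
FEATHER_AREA = 0.0015  # area of one feather element, m^2 (assumed)

def element_forces(span_r: float, pitch: float) -> tuple[float, float]:
    """Quasi-steady lift and drag on one feather element at radius span_r."""
    v_flap = OMEGA * span_r                      # vertical velocity from flapping
    v_local = np.hypot(V_FORWARD, v_flap)        # magnitude of the local airflow
    inflow = np.arctan2(v_flap, V_FORWARD)       # inflow angle
    alpha = pitch - inflow                       # effective angle of attack
    cl = 2.0 * np.pi * alpha                     # thin-airfoil lift slope (crude)
    cd = 0.02 + 1.2 * alpha ** 2                 # simple drag polar (crude)
    q = 0.5 * RHO * v_local ** 2 * FEATHER_AREA  # dynamic pressure times area
    return q * cl, q * cd

# Sum forces over nine hand feathers spread along the outer wing.
lift, drag = np.sum([element_forces(r, pitch=np.radians(8.0))
                     for r in np.linspace(0.08, 0.16, 9)], axis=0)
print(f"lift ~ {lift:.3f} N, drag ~ {drag:.3f} N")
```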

What Kaplan hopes to achieve is a predictive model for flapping flight. “I don’t know if we’ll ever get there,” he says. “We’ve studied non-flapping flight quite thoroughly, but as soon as something starts flapping, it’s hard to predict the forces on that flapping body. There’s an important interaction between the feathers and the air. Air pushes on the feathers, they bend and push back. It’s like a spring-mass relationship.”

“Brendan had markers along the length of the feathers so we can see the way they bend,” Kaplan continues. “But, a feather twists around its axis to some degree, and that affects the aerodynamics enormously. A small feather twist can create a different angle of attack. Even though Brendan’s data was accurate and great to work with, he didn’t capture a number of degrees of freedom. It might not be possible. So, I’m trying to tweak the feathers by hand to see how much they change the lift and drag. I need to look at air speed and angle of attack for each polygon.”

Someday, Kaplan’s research might help aerospace engineers design small aircraft with flapping wings. “This is where it’s all headed,” he says. “We want to learn how to take advantage of the loopholes in the laws of aerodynamics that birds take advantage of by design.”

Into the Wild
Bostwick had proposed a field study using motion capture long before meeting the CG team, so the collaboration was timely and fruitful; it helped her develop a protocol for the grant she received for the manakin study. Recently she led a team that took the system designed with the CG students into the mountains of Ecuador to capture the motion of manakins in free flight.

“I would have had no idea where to begin [designing the motion-capture system] without Don [Greenberg] and his students,” Bostwick says. “The lenses, the sensors, the need to calibrate, the different perspectives, and then the whole world of taking data and putting points on a screen through time…that whole process of motion capture. I had no idea how to do it. These are incredible tools. And, most biologists don’t know about them.”

Bostwick and her crew packed 350 pounds of equipment—two TroubleShooter cameras, which can run on batteries, audio equipment, syncing devices, calibration cubes, and more—to study birds that make sounds with their wings. “The very instant they produce a sound is important,” Bostwick says. “So we have a customized device to sync the audio and the high-speed video to one millisecond.”

The students will capture the manakins and, as did Holt with the red-winged blackbird, apply markers, then release the birds from a perch onto which they’ve attached the two cameras. To calibrate the cameras, the computer graphics students helped invent a special device.

The Tinkertoy calibration “cube” looks something like a model of an atom—plastic balls with stems attached to form a 3D lattice that is firm, transportable, and easy to disassemble.

“We put the bird on the perch, and after it flies away, we place the calibration cube on the perch and record it with our cameras using the same focus and zoom,” Bostwick explains. “By pointing a laser from the cameras to the display perch (often many feet above the ground), we can measure the distance. Then, we can calculate backward to find points on the wings to create a volume.”

With Bostwick on the field trip is one of Greenberg’s students.

“It’s been a really fruitful collaboration,” Bostwick says. “In my world, when people study bird anatomy and how it functions, they’re pretty much limited to pigeons trained to fly in wind tunnels under specific circumstances. I wanted to become independent from the lab. I wanted to find creative methods to get information from wild birds doing their thing.”

Thanks to Don Greenberg, a computer graphics pioneer who has long advocated collaboration between computer graphics and many other departments, and the hard-working students he inspires, Bostwick’s dream has become a reality.