Veepers 3.0
Issue: Volume 26, Issue 3 (March 2003)

By George Maestri

Ever since the Internet took off in the mid-'90s, companies have been looking for ways to create a more personal Web experience. To this end, Pulse has introduced Veepers—low-bandwidth talking characters that can be added to a Web page.

Veepers is about as easy to use as software gets. All you need to begin with is a photograph of a person taken head-on. (At press time, Pulse had added the ability to create full-body Veepers.) The person needs to have a neutral expression on his or her face. You import this image into Veepers, and the process begins.

Veepers are Web-based talking characters—usually, though not necessarily, human. Image courtesy Pulse and George Maestri.

The process of setting up a Veepers character is divided into several simple steps. First, as mentioned, you load the photo. Next, you need to define where the head is in the photograph, a task accomplished by painting out everything in the background except the head. Veepers provides a number of paintbrush and lasso tools for you to use, but if the background has sufficient contrast, the software can automatically extract the head.

After the background is masked out, you'll go through a series of steps to tell Veepers where the major areas of the face are located. The package does its best to find features like the pupils, eyebrows, mouth, nose, and outline of the face, and you can then tweak what Veepers finds for an exact match.

After these steps are completed, finalizing the Veepers character is a matter of selecting a file size and saving the file. Once the file is saved, a Web page pops up where you can view the character as it would appear in your project. The Web page has controls for viewing the different expressions of the character as well as seeing how it lip syncs to dialog.

To accomplish this, Veepers uses basic animation principles as well as a bit of high-tech sleight of hand. The pixels of the face are seamlessly warped to create facial poses, which Veepers then drives to give the illusion of facial motion. As the character talks, for example, the eyes blink and the head rocks back and forth to give it life.

While this is certainly impressive technology, I was not terribly impressed with the results. The human eye is sensitive to how a real face moves and behaves, and Veepers faces do not move realistically enough to fool the eye.

One remedy for this might be to expand the technology so that Veepers can drive other types of characters, such as illustrations or drawings. The human eye is less critical of these types of characters and the results might be far more interesting. Thinking outside the box, I took a photo of a brown bear and ran it through Veepers. The resulting character looked surprisingly good, mostly because a talking bear is so unexpected.

Veepers also includes an audio manager for creating the dialog tracks that a Veepers character can use. This can be done in two ways. The first is to take an existing audio track and have Veepers read it phonetically, typing in the text as a guide. The second is text-to-speech: if your system has text-to-speech software, Veepers can generate the audio dialog for you. These files are then saved in a special streaming file format that a Veepers server can send to a Web page.
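To give a sense of what such a dialog track involves, here is a minimal sketch of how timed lip-sync data might be represented. The structure, field names, and file path are hypothetical illustrations, not Pulse's actual streaming format.

```typescript
// Hypothetical sketch of a lip-sync dialog track: a list of timed
// phonemes that a character maps to mouth shapes during playback.
// Not Pulse's actual file format.
interface PhonemeCue {
  phoneme: string;    // e.g. "AA", "M", "S"
  startMs: number;    // when this mouth shape begins
  durationMs: number; // how long it is held
}

interface DialogTrack {
  audioUrl: string; // the streamed audio the cues are synced to
  text: string;     // the typed transcript used as a guide
  cues: PhonemeCue[];
}

const greeting: DialogTrack = {
  audioUrl: "/audio/welcome.vep", // hypothetical path and extension
  text: "Welcome to our site",
  cues: [
    { phoneme: "W",  startMs: 0,   durationMs: 90 },
    { phoneme: "EH", startMs: 90,  durationMs: 120 },
    { phoneme: "L",  startMs: 210, durationMs: 80 },
    // ...remaining phonemes omitted
  ],
};
```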

Web pages using Veepers characters can be built using Pulse's Structor module, which is also included. Structor takes an existing Web page and adds the hooks needed to publish Veepers-based content. Each Veepers character can display a range of emotions, which can be driven by events that occur on a Web page: one link might make the character smile, another might make it frown, and others might trigger different lines of dialog. All of this is managed through JavaScript, making it fairly easy to author pages. At press time, Pulse had added support for Java output and the ability to create Veepers with improved character behaviors.
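The sketch below shows roughly how that kind of event-driven behavior could be wired up on a page. The getVeeperCharacter function, the setEmotion and speak methods, and the file paths are hypothetical stand-ins for illustration, not the actual hooks Structor generates.

```typescript
// Hypothetical sketch of driving a talking character from page events.
// The VeeperCharacter interface and its methods are illustrative
// stand-ins, not Pulse's actual JavaScript API.
interface VeeperCharacter {
  setEmotion(emotion: "neutral" | "smile" | "frown"): void;
  speak(dialogTrackUrl: string): void;
}

// Stub implementation so the sketch runs on its own; a real page would
// get this object from the code Structor adds to the page.
function getVeeperCharacter(elementId: string): VeeperCharacter {
  return {
    setEmotion: (emotion) => console.log(`${elementId}: emotion -> ${emotion}`),
    speak: (url) => console.log(`${elementId}: streaming dialog ${url}`),
  };
}

const veeper = getVeeperCharacter("veeper-host");

// One link makes the character smile and deliver a line of dialog...
document.getElementById("products-link")?.addEventListener("click", () => {
  veeper.setEmotion("smile");
  veeper.speak("/audio/products-pitch.vep");
});

// ...while another makes it frown and say something different.
document.getElementById("cancel-link")?.addEventListener("click", () => {
  veeper.setEmotion("frown");
  veeper.speak("/audio/are-you-sure.vep");
});
```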

As for applications, the software can be used just about anywhere a talking character might be needed. Advertising, training, and tech support are a few of the applications demonstrated on the Pulse Web site. One excellent example came from Bud Light, which used Veepers on e-mail greeting cards.

Overall, Veepers has promise for some applications, though I'm not sure whether this technology will ever produce truly convincing animation. But it's an easy way for non-animators to quickly produce low-bandwidth characters for a Web page, and I think it might find a home in applications such as entertainment and training.

George Maestri is president of Rubberbug, a Los Angeles-based animation studio specializing in character animation.

Price: Perpetual licenses start at $50,000.
Minimum System Requirements: Windows 2000/XP; 500MHz Pentium III; 128MB of RAM. Server (for text-to-speech): Windows 2000 or Linux RedHat 6.2; 800MHz Pentium III; 1GB of RAM
Pulse www.pulse3d.com




Digital Fusion 4

Eyeon upgrades its compositing program

By Doug King

The new version of Digital Fusion, the compositing program from eyeon Software, is fully resolution independent and can use 8-, 16-, or 32-bit floating-point color processing, all within the same project. It is now also fully multithreaded.

The first noticeable change from previous versions is the interface, which eyeon has given a dark gray tone for a more streamlined working environment. More important, though, is the change from the single small display window of Version 3.1 to two large side-by-side displays. There are actually four basic modes in which you can set your display windows, and you can modify these by dragging the corner of a display to the desired size.

Among Digital Fusion 4's new features is the ability to modify the sizes of display windows. Image courtesy eyeon Software.

I have always enjoyed working with Digital Fusion, partly because the workflow for setting up composites and complex effects shots is intuitive and quick. One of its drawbacks has been the inability to view a compositing project with the same 'real-time' playback you get in an editing project. This has changed in Fusion 4 with the addition of RAM caching, which 'remembers' previously rendered frames and holds them in RAM for immediate playback and processing. This gives any compositor great performance gains, because you can view a project in real time by clicking the play button on the transport controls. You can play your scene backward and forward and even navigate through time using the enhanced playback buttons found in the time ruler.

What makes RAM caching even more significant is that it can be used to accelerate final rendering as well: Fusion will re-render only the frames that have changed since the last time you cached a scene. To set up the RAM cache, you first define a render range of frames in the timeline, then turn on the loop button in the transport controls and push Play. Fusion runs through the selected range of frames and stores them to the cache. A green line appears along the bottom of the timeline, showing which frames have been cached.
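As a rough illustration of the idea (a generic sketch, not eyeon's implementation), a frame cache of this kind can be thought of as a lookup table keyed by frame number plus a signature of the settings that produced the frame, so only frames whose settings have changed need to be re-rendered:

```typescript
// Generic sketch of frame caching: rendered frames are stored keyed by
// frame number plus a signature of the settings that produced them, and
// a frame is re-rendered only when that signature changes.
type Frame = Uint8Array; // rendered pixel data

class FrameCache {
  private cache = new Map<string, Frame>();

  private key(frame: number, settingsSignature: string): string {
    return `${frame}:${settingsSignature}`;
  }

  get(frame: number, settingsSignature: string): Frame | undefined {
    return this.cache.get(this.key(frame, settingsSignature));
  }

  put(frame: number, settingsSignature: string, pixels: Frame): void {
    this.cache.set(this.key(frame, settingsSignature), pixels);
  }
}

// Playback loop: cached frames come back instantly, changed frames are
// re-rendered and stored for the next pass through the range.
function playRange(
  start: number,
  end: number,
  signature: string,
  cache: FrameCache,
  render: (frame: number) => Frame,
  display: (pixels: Frame) => void
): void {
  for (let f = start; f <= end; f++) {
    let pixels = cache.get(f, signature);
    if (!pixels) {
      pixels = render(f);
      cache.put(f, signature, pixels);
    }
    display(pixels);
  }
}
```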

Eyeon has worked hard to make the rendering of effects easier and quicker. Along with the RAM caching enhancement, Fusion 4 includes a completely redesigned Render Manager, which allows the computers on your network, or slaves, to be organized into groups. Flows are sent to one or more groups, and any slave that is a member of a group will immediately start rendering the flow. Slaves can belong to more than one group, and groups are assigned priorities. If a slave is rendering a flow for one group and a higher-priority group receives a flow the slave is eligible for, the slave immediately stops rendering the current flow and begins rendering the new, higher-priority flow. If the higher-priority flow finishes first, the slave returns to the original flow.
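The sketch below captures that preemption logic in simplified form; the Slave class, its methods, and the priority scheme are generic illustrations, not the Render Manager's actual behavior or API.

```typescript
// Simplified sketch of priority-group scheduling on one render slave:
// the slave belongs to one or more groups, each with a priority, and a
// flow from a higher-priority group preempts the flow being rendered.
interface Flow { name: string; group: string; }

class Slave {
  private current: Flow | null = null;
  private preempted: Flow[] = [];

  // Lower number = higher priority; only groups this slave belongs to.
  constructor(private groupPriority: Map<string, number>) {}

  submit(flow: Flow): void {
    const incoming = this.groupPriority.get(flow.group);
    if (incoming === undefined) return; // not a member of that group

    const active = this.current
      ? this.groupPriority.get(this.current.group)!
      : Infinity;

    if (incoming < active) {
      // A higher-priority flow arrives: pause the current one.
      if (this.current) this.preempted.push(this.current);
      this.current = flow;
    } else {
      this.preempted.push(flow);
    }
  }

  finishCurrent(): void {
    // When the active flow completes, go back to the most recently
    // preempted flow, so the original job resumes automatically.
    this.current = this.preempted.pop() ?? null;
  }
}
```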

The end result is that queues will no longer be held up by the speed of the slowest computer in the rendering farm. Both RAM caching and the Render Manager should increase productivity and decrease the amount of work needed to oversee the rendering farm.

With Version 4, scripting has become an integrated part of the workflow. Scripts can run from within the interface or directly from the command line, and can be executed on local or remote copies of Fusion and Nodes. Scripting gives users complete control over Digital Fusion's tools, parameters, and animation features, and eyeon offers pre-built scripts for the more tedious workflow tasks.

Tracking has also been enhanced, with a completely revised tracking interface. Fusion 4 allows the tracking of up to 20 patterns within a single tracker tool, and that data can be used for stabilization, match moving, and corner or perspective positioning. Another important addition is the Grid Warp Tool, a simple 2D grid deformation that gives users control over source and destination grids. With this tool you can produce UV mapping coordinates that simplify complex organic deformation. What is really cool is that the source and destination grids can be animated independently.
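To give a rough sense of the math involved (a generic sketch, not the tool's implementation), a grid warp maps a point from a cell of the source grid to the same relative position inside the corresponding cell of the destination grid by bilinear interpolation; because the two grids can be animated on their own, the warp itself animates:

```typescript
// Simplified 2D grid warp: a point located within a cell of the source
// grid is placed at the same relative (u, v) position inside the
// corresponding cell of the destination grid via bilinear interpolation.
type Point = { x: number; y: number };
type Grid = Point[][]; // grid[row][col] = control point position

// Bilinear interpolation across a quad with corners p00 (top-left),
// p10 (top-right), p01 (bottom-left), p11 (bottom-right).
function bilerp(p00: Point, p10: Point, p01: Point, p11: Point, u: number, v: number): Point {
  const top = { x: p00.x + (p10.x - p00.x) * u, y: p00.y + (p10.y - p00.y) * u };
  const bot = { x: p01.x + (p11.x - p01.x) * u, y: p01.y + (p11.y - p01.y) * u };
  return { x: top.x + (bot.x - top.x) * v, y: top.y + (bot.y - top.y) * v };
}

// Given a point's cell indices (row, col) and local coordinates (u, v)
// in [0, 1] within the source grid, return its position in the
// destination grid. Animating either grid changes the mapping over time.
function warpPoint(dst: Grid, row: number, col: number, u: number, v: number): Point {
  return bilerp(
    dst[row][col], dst[row][col + 1],
    dst[row + 1][col], dst[row + 1][col + 1],
    u, v
  );
}
```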

It's difficult for me to find anything to complain about with this program. One thing I would like to see added is the ability to work in 3D space with cameras, lights, and particles, as many other compositing packages do. All in all, I am still just scratching the surface of what Fusion 4 has to offer. It's a useful tool for film and video compositors who don't want to spend $50,000 or more for a program.

Douglas King is a writer and animator based in Dallas, Texas. He is currently developing animated projects for his company, Day III Productions.

Price: $4995
Minimum System Requirements: Pentium Pro processor; 256MB of RAM
Eyeon Software www.eyeonline.com