This year's Game Developers Conference (GDC) was bigger and better than ever. There were new game development technologies and products, and even new game titles. Some faces were new, yet most were familiar. There were even a few vendors from outside the gaming world whose products and technologies have been adapted for this growing industry. As expected, Microsoft, Nintendo, and Sony had a strong presence. But so did the little guys, independent "garage" developers who never cease to amaze us. And then there were the recruiters, big-name studios scouting new talent, along with the many artists looking for employment.
Both Microsoft VP John Schappert and inventor/futurist Ray Kurzweil enthralled audiences with their keynotes. On the expo floor, the big buzz centered on the unique interfaces, hardly surprising given the huge success of the Wii and Guitar Hero. Speaking of the Wii, Nintendo demonstrated its Wii Fit, a new line of titles designed to increase fitness through balancing and aerobics. Of course, it is all done in a way that is fun. The games are played using the new Wii Balance Board (available in a few months), which is about the size of a bathroom scale. It measures not only the player's center of gravity but also his or her body mass index.
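As background on how a board like this senses balance: a platform with a load sensor in each corner can estimate the center of pressure as the load-weighted average of the sensor positions. The sketch below is purely illustrative, not Nintendo's actual API or firmware; the function name, sensor layout, and coordinate convention are assumptions.

```python
# Illustrative sketch (not Nintendo's API): estimate where a player's
# weight is centered from four corner load-sensor readings.

def center_of_pressure(tl, tr, bl, br, width=1.0, depth=1.0):
    """Estimate the (x, y) center of pressure from four corner loads.

    tl/tr/bl/br are the top-left, top-right, bottom-left, and
    bottom-right readings (e.g., in kg); the board surface spans
    [0, width] x [0, depth], with x increasing to the right and
    y increasing toward the front.
    """
    total = tl + tr + bl + br
    if total == 0:
        return (width / 2, depth / 2)  # nobody on the board
    x = (tr + br) / total * width      # share of load on the right side
    y = (tl + tr) / total * depth      # share of load on the front edge
    return (x, y)

# A player leaning to the right loads the right-hand sensors more,
# so x drifts above the 0.5 midpoint while y stays centered.
print(center_of_pressure(10, 25, 10, 25))
```

Summing the four readings also yields the player's total weight, which, combined with a height the player enters, is all that is needed to compute a body mass index.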
InterSense, for one, believes that the Wii is just the tip of the iceberg in terms of motion sensing and games. InterSense enables real-time tracking of the position, orientation, and movement of people and objects. This technology has been successfully implemented in military, industrial, life-science, research, and engineering applications. InterSense motion-tracking systems are also being used within the entertainment industry by production companies, directors, and CG artists and animators. Soon they may be used by game players. The company was at GDC seeking partnerships to bring its advanced Motion Analysis Engine (MAE) to console games. MAE is based on six degrees-of-freedom (DOF) capture, compared to the three or four DOF used in the Wii, so it considers the amount of force being applied, the angle at which it is applied, and the actual trajectory of an object. Simply put, it not only knows that a ball was thrown, but how fast it was thrown, whether a spin was applied, and where it was thrown.
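To make the DOF distinction concrete: a full 6-DOF sample pairs 3-D position with 3-D orientation, whereas a 3-DOF controller reports orientation (or acceleration) alone, so trajectory and throwing speed must be guessed rather than measured. The sketch below is a generic illustration, not InterSense's MAE; the class and field names are assumptions.

```python
# Illustrative sketch (not InterSense's MAE): a 6-DOF pose carries both
# where an object is and how it is oriented, so consecutive samples
# yield an actual trajectory and speed.
from dataclasses import dataclass

@dataclass
class Pose6DOF:
    x: float; y: float; z: float           # position in meters
    roll: float; pitch: float; yaw: float  # orientation in radians

def linear_velocity(a, b, dt):
    """Approximate velocity (m/s) between two consecutive 6-DOF samples."""
    return ((b.x - a.x) / dt, (b.y - a.y) / dt, (b.z - a.z) / dt)

# Two samples 10 ms apart: a thrown ball moving forward and rising.
p0 = Pose6DOF(0.0, 0.0, 0.0, 0.0, 0.0, 0.0)
p1 = Pose6DOF(0.05, 0.0, 0.02, 0.0, 0.1, 0.0)
vx, vy, vz = linear_velocity(p0, p1, 0.01)
print(vx, vz)  # forward and upward components of the throw
```

With orientation-only data, the position deltas above simply do not exist, which is why a 3-DOF system can tell that a throw happened but not how fast or where it went.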
In a novel approach, neuro-engineering company Emotiv Systems debuted its brain-computer interface. In a nutshell, the technology analyzes facial expressions and body language to factor emotions and feelings into gameplay. Through a player's expressions, an avatar reacts appropriately to the player inside a virtual world. Also, gameplay is adjusted automatically based on the player's emotional responses. In essence, a player can control play with his or her brain. The technology is complex, but Emotiv has packaged it quite nicely within the EPOC headset. Available later this year, it will come bundled with a specially developed game, though the technology will work with existing titles.
Another forward-"thinking" company is NeuroSky, whose MindSet SDK converts brainwave (EEG) signals into digital mental-state output for brain-computer interface applications. Available now, the MindKit-EM SDK consists of a neural headset with an integrated ThinkGear module and a sensing algorithm library.
In his keynote, Kurzweil touched on computer interfaces, predicting that by 2010, keyboards and mice will become obsolete, replaced by "invisible devices" that disappear into clothing, for example, while providing a more immersive, augmented-reality experience. Soon, a mouse click will be a thing of the past.