SmartGeometry 2011: Sensor Data and Social Networks as the Fabric of Digital Design
Kenneth Wong
May 5, 2011

Workshop clusters explore environmental data-driven design
On April 1, a day reserved for practical jokes, participants of SmartGeometry 2011 (SG2011) got ready to present the results of their four-day workshops. The presentations took place at a reception hosted by the Royal Danish Academy of Fine Arts School of Architecture, about a 20-minute walk from the Copenhagen Opera House on the island of Holmen.

For many participants, SG is a little bit like Disneyland, with one exception. At Uncle Walt’s theme park, the guests ride the attractions. At SG—a sandbox for exploring, testing, and prototyping design ideas—the guests build the attractions. The intense group experiments—involving a mix of computer programming, digital modeling, and physical construction—are roller-coaster rides in themselves, often beset by server crashes, technical difficulties, and equipment failures. By the end of the conference, SG participants have once again proven—at least in scale models—what many might consider impractical or impossible.


The SmartGeometry cluster “Interacting with the City” created a digital-physical scale model of
Ofelia Beach, built on a Microsoft Kinect unit.


This year, SG installations included, among others, an interactive city model that displayed real-time wind patterns, digital models that could be reshaped by human touch, and Google Street View scenes augmented with 3D meshes representing carbon readings at the site. These works reflected the artists’ interpretations of the organizers’ challenge for 2011: to incorporate streams of data—such as user data, energy calculations, embedded sensing, and material behaviors—into the digital design process. The artists were academics, architects, and designers from Skidmore, Owings & Merrill, Foster + Partners, MIT, the Institute for Advanced Architecture of Catalonia (Barcelona, Spain), and other international institutions.

Kinect with Ofelia, Digitally and Physically
Last year, SG workshop participants were asked to incorporate physical prototypes into their workflow. (For more, read “SmartGeometry: Madness with a Method” on www.cgw.com.) This year, in a challenge dubbed “Building the Invisible,” organizers asked participants to use the power of computation to help them incorporate data streams into the design process. “Computers help us collect, manage, and analyze the environment and inform us about an abundance of data. Our challenge is to use these inputs in a meaningful way to help us make better informed design decisions,” the organizers announced.


The tabletop prototype of Ofelia Beach, dubbed “Hands-On Ofelia,” allows users to interact
with real-time wind and climate data using blocks of geometry. 


Przemek Jaworski, a freelance computational designer and a tutor at TU Wrocław, Poland, joined the cluster called “Interacting with the City.” The tabletop-style scale model of Ofelia Beach that the cluster built sat on a Microsoft Kinect unit, allowing observers to interact with the digital data displayed on the model using their fingertips. The data streams displayed—wind patterns, weather data, and others—came from online sources such as Google Maps, Yahoo, and Twitter. Users could interact with the data streams by relocating blocks of primitive geometry, turning the tabletop into a real-time simulation environment.

In subsequent experiments, Jaworski and his team members used the Kinect to scan freeform 3D models on the table, then sent the point cloud for normalization and analysis in Processing (the open-source programming language) and GenerativeComponents (Bentley Systems’ computational design software). The resulting human- and thermal-comfort analysis was projected back onto the model. The digital model reverse-engineered from the physical one was tagged with geographic location markers and periodically sent to an iPhone server to support augmented-reality viewing.
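To give a sense of what that normalization stage involves, here is a minimal Python sketch that turns a raw Kinect depth frame into a coarse height field ready for analysis. It is illustrative only; the cluster's actual pipeline ran in Processing and GenerativeComponents, and the table depth, object height, and grid resolution below are assumptions.

```python
# Illustrative sketch only: normalize a Kinect depth frame captured above a
# tabletop into a coarse height field for analysis. The constants are assumed.

TABLE_DEPTH_MM = 1200   # assumed distance from the sensor to the empty tabletop
MAX_HEIGHT_MM = 300.0   # assumed height of the tallest object on the table
GRID = 32               # resolution of the coarse analysis grid


def normalize_depth(frame):
    """Convert raw depth readings (mm) into heights in the 0..1 range."""
    return [
        [max(0.0, min(1.0, (TABLE_DEPTH_MM - d) / MAX_HEIGHT_MM)) if d > 0 else 0.0
         for d in row]
        for row in frame
    ]


def downsample(heights, grid=GRID):
    """Average the dense height field into a grid-by-grid block model."""
    rows, cols = len(heights), len(heights[0])
    cell_r, cell_c = rows // grid, cols // grid
    return [
        [sum(heights[i * cell_r + r][j * cell_c + c]
             for r in range(cell_r) for c in range(cell_c)) / (cell_r * cell_c)
         for j in range(grid)]
        for i in range(grid)
    ]
```

Reducing the dense depth image to a small grid of averaged heights keeps the downstream comfort analysis fast enough to feel interactive.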


The ambient sensor kit in development, to be used by the SmartGeometry cluster “Urban Feed”
to collect environmental data from various sites in the city.


RSS Feed for the Metropolis
Luis E. Fraguada, who holds a master’s degree in architecture and urbanism from the Architectural Association Design Research Laboratory (DRL) in London, returned once again to SG to champion a cluster called “Urban Feed.” Venturing out into Copenhagen, Fraguada and his teammates went to work, each carrying an ambient sensor kit, a box designed to measure, among other things, light levels, temperature, and CO2 readings. (For details on the ambient sensor kit, see the sidebar, “A Breakdown of Urban Feed.”)

The current crop of architectural and design modeling software focuses on geometry; consequently, Fraguada and his team found little or no out-of-the-box functionality for introducing real-time ambient data streams into their modeling exercises. “I think it is imperative that software makers consider how their users incorporate external real-time data into their workflows,” says Fraguada. “We also must consider that not all aspects of a project can respond with such granularity as would be needed by implementing real-time input. For our purposes, we used a combination of tools to make this possible. In all cases, some programming was implemented to bridge data and geometry. The software that was most useful in this respect was a combination of McNeel’s [NURBS modeler] Rhinoceros with the [plug-in] Grasshopper. We developed components for Grasshopper (with .NET), which made the communication between data and the digital model very accessible to all of the participants in our cluster.”
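The essence of that bridge is simple: pull the latest reading from a feed, then remap it into a range a parametric model can consume. The Python sketch below illustrates the pattern under assumed conditions; the feed URL, field name, and value ranges are hypothetical, and the cluster's actual bridge was a set of custom .NET components for Grasshopper.

```python
# Illustrative sketch of the data-to-geometry bridge: poll a live feed and
# remap a reading into a parameter a parametric model can drive.
# The endpoint, field name, and ranges are hypothetical.
import json
import urllib.request

FEED_URL = "http://example.com/urbanfeed/latest.json"  # hypothetical endpoint


def latest_co2_ppm(url=FEED_URL):
    """Fetch the most recent CO2 reading (ppm) from the feed."""
    with urllib.request.urlopen(url, timeout=5) as resp:
        return float(json.load(resp)["co2_ppm"])


def remap(value, src_min, src_max, dst_min, dst_max):
    """Linearly remap a sensor value into a geometry-friendly range, clamped to 0..1."""
    t = (value - src_min) / (src_max - src_min)
    return dst_min + max(0.0, min(1.0, t)) * (dst_max - dst_min)


if __name__ == "__main__":
    # Example: drive an extrusion height (meters) from CO2 concentration,
    # assuming readings of 350-600 ppm map to heights of 0-30 m.
    height = remap(latest_co2_ppm(), 350.0, 600.0, 0.0, 30.0)
    print("extrusion height:", height)
```

Clamping the remapped value keeps a noisy or out-of-range reading from producing geometry the model cannot rebuild.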


The use of the NURBS modeler Rhinoceros and a generative-design plug-in called Grasshopper
allows the Urban Feed team to turn sensor data into 3D shapes.


Environmental data collected by the participants were uploaded for processing via a number of social networks and online portals, including Google Earth, Twitter, and Pachube. Some data, such as CO2 readings, were used as 3D mass-sculpting parameters in Grasshopper, the computational design plug-in for Rhinoceros. Among other challenges, the team confronted converting Grasshopper-generated 3D geometry into location-coded KML for display in Google Earth, and converting GPS data into a format displayable on a Cartesian plane. The result was a real-time Google Earth walk-through of the city, along with environmental data represented by floating meshes at each site.
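Both conversions can be approximated in a few lines of code. The Python sketch below projects GPS coordinates onto a local Cartesian plane using a simple equirectangular approximation and emits a minimal KML placemark; it is a stand-in for the cluster's gHowl-based workflow, and the sample coordinates are only approximate.

```python
# Illustrative sketch: project lat/lon to a local Cartesian plane and emit a
# minimal, location-coded KML placemark for Google Earth.
import math

EARTH_RADIUS_M = 6371000.0


def to_local_xy(lat, lon, ref_lat, ref_lon):
    """Project lat/lon (degrees) to meters on a plane centered at a reference point."""
    x = math.radians(lon - ref_lon) * EARTH_RADIUS_M * math.cos(math.radians(ref_lat))
    y = math.radians(lat - ref_lat) * EARTH_RADIUS_M
    return x, y


def kml_placemark(name, lat, lon, value):
    """Build a minimal KML placemark extruded to a height proportional to the reading."""
    return f"""<Placemark>
  <name>{name}</name>
  <Point>
    <altitudeMode>relativeToGround</altitudeMode>
    <extrude>1</extrude>
    <coordinates>{lon},{lat},{value}</coordinates>
  </Point>
</Placemark>"""


if __name__ == "__main__":
    # Example reading logged near the Copenhagen Opera House (coordinates approximate).
    print(to_local_xy(55.682, 12.601, 55.680, 12.600))
    print(kml_placemark("CO2 412 ppm", 55.682, 12.601, 412))
```

One detail worth noting: KML lists coordinates as longitude, latitude, altitude, a common source of flipped placemarks when converting from GPS logs.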

“One of the key things we are discovering is that designers have a real desire to be able to control and affect the data that drives their work,” observes Fraguada. “I could see this type of personal data collection starting to call into question and provide alternatives to existing building and zoning regulations. There are many opportunities here for material specification, orientation due to air flow (which coincides with CO2 flow), strategic landscaping, etc. Again, I think the real challenge will be in how to negotiate real-time behavior in a process which is anything but real time.”


GPS-coded environmental data collected in the field can be displayed in Google Earth, as shown
in this exercise by a SmartGeometry cluster.


A Breakdown of Urban Feed

The ambient sensor kit used by the cluster is made up of:
•   An Arduino Uno board powered by a rechargeable Li-ion battery pack
•   A micro SD card
•   A light-dependent resistor for sensing light
•   A passive infrared sensor to detect movement, along with sensors for temperature and CO2
•   A GPS antenna to record a position fix with each measurement

To aggregate, process, and stream collected data, the Urban Feed cluster used:
•   McNeel Rhinoceros, a 3D NURBS modeler
•   Grasshopper, a generative modeling plug-in for Rhinoceros
•   gHowl, a set of interoperability components, which (among other things) makes it possible to translate model data between Rhino and Google Earth at near-real-time speed
•   Arduino IDE, used to program the kit’s microcontroller
•   Processing, an open-source programming language (used to communicate data from the kit directly to Pachube and Twitter)
•   Twitter, used to communicate the data to the outside world
•   Pachube, for streaming and sharing sensor data
•   OpenOffice, to combine and clean the collected data streams (see the sketch after this list)
•   Google Earth, to visualize data and cluster proposals in relation to their urban context
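The combine-and-clean step in that toolchain, handled in the workshop with OpenOffice, can also be scripted. The Python sketch below pairs each logged sensor reading with the nearest GPS fix by timestamp and drops incomplete rows; the file names and column layout are assumptions, not the cluster's actual log format.

```python
# Illustrative sketch of the combine-and-clean step: join sensor readings to
# the nearest GPS fix by timestamp and drop incomplete rows.
# File names and column names are assumptions.
import csv


def load_rows(path):
    """Read a CSV log with a 'timestamp' column into a list of dicts."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))


def nearest_fix(timestamp, fixes):
    """Return the GPS fix whose timestamp is closest to the reading's."""
    return min(fixes, key=lambda fix: abs(float(fix["timestamp"]) - timestamp))


def merge_logs(sensor_path="sensors.csv", gps_path="gps.csv", out_path="merged.csv"):
    readings = load_rows(sensor_path)
    fixes = load_rows(gps_path)
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "lat", "lon", "light", "temp_c", "co2_ppm"])
        for row in readings:
            if not row.get("co2_ppm"):  # skip incomplete readings
                continue
            fix = nearest_fix(float(row["timestamp"]), fixes)
            writer.writerow([row["timestamp"], fix["lat"], fix["lon"],
                             row["light"], row["temp_c"], row["co2_ppm"]])
```

The merged file can then feed the KML and Grasshopper steps described earlier in the article.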

Social Construct of the Future
In the work of many SG workshop clusters, public data and open APIs (Google Maps data and Pachube’s streaming features) played important roles in aggregation and transmission. This suggests that easy access to data sets and real-time streaming functions in social networks, along with the new generation’s preference for conducting collaborative experiments publicly, may reshape digital design practices.

The work of SG participants often goes beyond standard practice in the building and construction industries. Shaping digital design with streaming real-time data is, some might argue, a concept not easily adaptable to current design workflows. But let’s not forget that many of the SG participants consult for, and work with, globally recognized design and architecture firms. If anybody is poised to rewrite the rules, they are.

Kenneth Wong is a freelance writer who has covered the digital video, computer gaming, and CAD industries. He can be reached at Kennethwongsf [at] earthlink.net.