PLAN 86: Article
Creating Responsive Environments

Making Buildings More Sensitive to Their Users and Their Usage

With modern buildings increasingly controlled by computerized infrastructure for heating, cooling, and lighting, opportunities to optimize energy consumption and indoor climate are growing as well. Yet many buildings continue to be run inefficiently because their control systems are constrained by limited sensor input and poorly designed user interfaces.

Researchers in the Responsive Environments Group at the Media Lab are exploring an array of approaches to overcome such limitations. One of those approaches addresses the problem of control panels that are difficult to decipher because the controls are poorly grouped and mapped, or too far removed from the user's immediate intention. Relight, an intuitive gesture-based system, uses a wireless handheld device to control indoor lighting without that confusion: the user simply points the device at a light fixture to select it, then rotates his or her hand to set the dimming level.
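
A minimal sketch in Python of the kind of mapping Relight performs, assuming a hypothetical roll-angle input and dimming command; the function names and angle range here are illustrative, not the actual Relight implementation.

def roll_to_dim_level(roll_degrees, min_roll=-90.0, max_roll=90.0):
    """Map a wrist roll angle (degrees) to a dimming level between 0.0 and 1.0."""
    clamped = max(min_roll, min(max_roll, roll_degrees))
    return (clamped - min_roll) / (max_roll - min_roll)

def on_gesture_update(selected_fixture, roll_degrees, send_dim_command):
    """Send the new level to whichever fixture the handheld device is pointed at."""
    send_dim_command(selected_fixture, roll_to_dim_level(roll_degrees))

# Example: a quarter-turn of the wrist sets the selected fixture to 75% brightness.
on_gesture_update("fixture-12", 45.0, lambda fixture, level: print(fixture, round(level, 2)))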

Lighting researchers have also tested subjects' perception of lighting in a virtual environment to assess whether lighting could be controlled along dimensions other than brightness. Their findings suggest that human perception of lighting is indeed shaped by additional variables, and these data are now being used to design polychromatic control systems that adjust in response to changing ambient lighting conditions and to users’ requirements for color and intensity.
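
Purely as an illustration – not the group's actual controller – a polychromatic system of this kind might top up ambient light toward a requested condition on a per-channel basis. The channel names and values in this Python sketch are assumptions.

def channel_outputs(ambient, target):
    """Return per-channel drive levels (0..1) that add to ambient light to approach the target."""
    return {c: max(0.0, min(1.0, target[c] - ambient[c])) for c in target}

# Example: warm, dim ambient daylight with a cooler, brighter requested condition.
ambient = {"red": 0.30, "green": 0.25, "blue": 0.15}
target = {"red": 0.55, "green": 0.55, "blue": 0.60}
print(channel_outputs(ambient, target))  # per-channel commands for the fixture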

To integrate their techniques for lighting and gesture-based control with climate control, researchers have developed a personal sensing and control device in the form of a wristband they call WristQue. With sensors moved off the walls of a room and onto the people in it, automatic controls can react to conditions where people actually are rather than at arbitrary points on the wall. WristQue can also identify and locate individual users wherever they are in the building, so building systems can adjust conditions throughout to suit whoever is currently using a given space.
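
A minimal Python sketch of the idea behind wrist-worn sensing for climate control; the reading format, comfort votes, and gain are illustrative assumptions rather than WristQue's actual algorithm.

from statistics import mean

def zone_setpoint(wrist_readings, default_setpoint=22.0, comfort_gain=0.5):
    """Each reading is (temperature_C_at_the_wearer, comfort_vote), where the vote is
    -1 (too cold), 0 (comfortable), or +1 (too warm)."""
    if not wrist_readings:
        return default_setpoint  # nobody in the zone: fall back to a default
    avg_temp = mean(t for t, _ in wrist_readings)
    avg_vote = mean(v for _, v in wrist_readings)
    # Start from the conditions people actually experience and nudge the setpoint
    # against the average discomfort vote.
    return avg_temp - comfort_gain * avg_vote

print(zone_setpoint([(23.1, +1), (22.8, 0), (23.4, +1)]))  # occupants are a bit warm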

Another building-wide sensing system, called the Virtual Messenger, is designed to pass information between the virtual and physical worlds. Through the Media Lab’s Glass Infrastructure system – thirty touch-screens placed strategically throughout the complex – users who opt into the program can be tracked throughout the building by a multimodal sensor network. When they approach one of the Glass Infrastructure displays, they are met by their Virtual Personal Assistant (VPA), which acts as a mediator between the digital and physical worlds; users can interact with the VPA via hand gestures to read any pending messages, or tell their virtual avatar not to bother them until later.
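
As an illustration only – the gesture names and actions below are assumptions, not the Virtual Messenger's actual vocabulary – the interaction amounts to dispatching a recognized hand gesture to an action at the display.

def handle_gesture(gesture, pending_messages, snooze):
    """Dispatch a recognized hand gesture at a Glass Infrastructure display."""
    if gesture == "open_hand":      # show the user's pending messages
        return list(pending_messages)
    if gesture == "wave_away":      # defer: don't bother me until later
        snooze(minutes=30)
    return []

messages = ["Meeting moved to 3pm", "Demo in the atrium at noon"]
print(handle_gesture("open_hand", messages, snooze=lambda minutes: None))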

To help coordinate all of these sensor systems, the Responsive Environments Group has created DoppelLab, a virtual environment that represents the multimodal sensor data produced by a building and its inhabitants. Using the popular game engine Unity 3D, they have created architectural models of SA+P’s new Media Lab complex that visualize streams of sensor data in real time.
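
DoppelLab itself is built in Unity 3D; the following Python sketch only illustrates the underlying loop – poll live sensor feeds and push each reading to a visual element placed at the sensor's location in the model – using hypothetical function and field names.

import time

def poll_and_render(fetch_readings, update_visual, interval_s=1.0):
    """fetch_readings() returns dicts such as {"node": "atrium-3", "type": "temperature",
    "value": 24.2}; update_visual() maps one reading onto the 3-D building model."""
    while True:
        for reading in fetch_readings():
            update_visual(reading["node"], reading["type"], reading["value"])
        time.sleep(interval_s)  # refresh roughly in real time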

Audio levels are represented by the extension of green-to-red colored ‘snakes’ that undulate faster as more motion is detected nearby. Flames correspond to temperature sensed by built-in thermostats – redder for hotter rooms, bluer for cooler ones, with an overlaid pulsating sphere when a thermostat isn’t regulating – and a grid of temperature-and-humidity sensors in E14’s large atrium is visualized as color-changing fog and cubes. By visualizing all of these data at once in real time, DoppelLab provides a platform for monitoring and controlling multiple sensor systems in a complex environment – a tool that will be needed as data from any and all sensors come to be used for many purposes, agnostic of the particular device or application to which they are primarily attached.

The group’s follow-on project, the Tidmarsh Living Laboratory, is a collaboration with former MAS researcher Glorianna Davenport to instrument a retired cranberry bog in Plymouth with hundreds of sensor nodes, including microphones and cameras, and to bring the data together in an analogous game-engine environment, exploring new frontiers in remote presence and connecting with nature through virtual real-time immersion.
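
As a concrete illustration of the kind of data-to-visual mapping described above – temperature driving flame color, with a pulsating overlay when a thermostat is off its setpoint – here is a minimal Python sketch; the thresholds, colors, and names are assumptions, not DoppelLab's actual values.

def temperature_to_flame_color(temp_c, cool=18.0, warm=26.0):
    """Blend from blue (cool rooms) to red (warm rooms) for the flame visual."""
    t = max(0.0, min(1.0, (temp_c - cool) / (warm - cool)))
    return (t, 0.2, 1.0 - t)  # (red, green, blue), each 0..1

def flame_visual(temp_c, setpoint_c, tolerance=1.5):
    # Overlay a pulsating sphere when the thermostat isn't holding its setpoint.
    pulsate = abs(temp_c - setpoint_c) > tolerance
    return {"color": temperature_to_flame_color(temp_c), "pulsate": pulsate}

print(flame_visual(27.5, setpoint_c=22.0))  # a hot room whose thermostat isn't regulating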

Such data can also provide some fun. The Data-Driven Elevator Music project transmits audio streams and real-time measures of motion, temperature, humidity, and light levels to the building’s glass elevators, composing them into a sound installation. With audio streams positioned to simulate their real locations in the building, visitors can hear the activity around them as they move from floor to floor.
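
A hedged sketch of what such sonification could look like – the parameter choices below are illustrative assumptions, not the installation's actual composition rules: each floor's sensor readings become a handful of synthesis parameters, and streams are weighted toward the floor the elevator is nearest.

def floor_to_sound(sensors, floor, elevator_floor):
    """sensors: {"motion": 0..1, "temp_c": degrees C, "humidity": 0..1, "light": 0..1}."""
    return {
        "tempo_bpm": 60 + 80 * sensors["motion"],            # busier floors play faster
        "brightness": sensors["light"],                       # brighter rooms, brighter timbre
        "reverb": sensors["humidity"],                        # humid spaces sound 'wetter'
        "pitch_offset": (sensors["temp_c"] - 22.0) / 10.0,    # warmer rooms, higher pitch
        "gain": 1.0 / (1.0 + abs(floor - elevator_floor)),    # nearer floors sound louder
    }

print(floor_to_sound({"motion": 0.6, "temp_c": 24.5, "humidity": 0.4, "light": 0.8},
                     floor=3, elevator_floor=4))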

The ongoing work of the Responsive Environments Group extends far beyond the projects mentioned here to include dozens of applications in low-power sensing, printed electronics, smart tools, medical instrumentation, RFID, wearable computing and interactive media.  For more information on any of their work, consult the group’s website (http://resenv.media.mit.edu) or contact group director Joe Paradiso at joep@media.mit.edu.