Our Real Time Ambient Display at OpenWorld

One month before we entered the OAUX Exchange tent at OpenWorld, Jake (@jkuramot) challenged us to come up with a visualization “that would ambiently show data about the people in the space.”


A view of the OAUX Exchange Tent at OpenWorld 2016

Mark (@mvilrokx), Noel (@noelportugal) and I accepted the challenge. Mark put together the Internet of Things ultrasonic sensors, Noel created a cloud database to house the data, and it fell to me to design and create the ambient display.

An ambient display is the opposite of a dashboard. A dashboard displays an array of data in a comprehensive and efficient way so that you can take appropriate actions. Like the dashboard of a car or airplane, it is designed to be closely and continuously monitored.

Ambient displays, in contrast, are designed to sit in the background and become part of the woodwork, only drawing your attention when something unusual happens. They are simple instead of complex, unified instead of diverse, meant for glancing, not for scanning.

This project was not only a chance to design an ambient display, but also a chance to work with master makers like Mark and Noel, get my feet wet in the Internet of Things, and visualize data in real time. I’ve also long wanted to make an art installation, which this sort of is: an attractive and intriguing display for an audience with all the risks of not really knowing what will happen till after the curtain goes up.

My basic concept was to represent the sensors as colored lines positioned on a simplified floor plan and send out ripples of intersecting color whenever someone “broke the beam.” Thao (@thaobnguyen) suggested that it would be even better if we could see patterns emerge over time, so I added proportion bars and a timeline.

Since we only had a few weeks, we had to work in parallel. While Mark and the rest of the team debated what kind of sensor to use, my first task was to come up with some visuals to define and sell the basic concept, and then refine it. Since I didn’t yet have any data, I had to fake some.

So step one was to create a simulation, which I did using a random number generator weighted to create a rising crescendo of events for four colored sensor beams. I first tried showing the ripples against a white background and later switched to black. The following video shows the final concept.

Once Mark built the sensors and we started to get real data, I no longer needed the simulation, but I kept it anyway. That turned out to be a good decision. When it came time to do the final implementation in the Exchange tent, I had to make adjustments before all four sensors were working. The simulation was perfect for this kind of calibration; I made a software switch so that I could easily change between real and simulated data.
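As the next paragraph explains, none of this was actually written as code (it was all NodeBox), but the simulation and the real/simulated switch reduce to a short sketch. The beam colors, event rates, and function names below are placeholder assumptions, not the actual implementation:

```python
import random
import time

BEAMS = ["violet", "blue", "green", "orange"]   # placeholder colors for the four beams
USE_SIMULATION = True                           # the "software switch": flip once real sensors report

def simulated_events(duration_s=60.0, base_rate=0.2, peak_rate=2.0):
    """Weighted random events whose per-second rate ramps up into a crescendo."""
    start = time.time()
    events, t = [], 0.0
    while t < duration_s:
        rate = base_rate + (peak_rate - base_rate) * (t / duration_s)
        if random.random() < rate * 0.1:        # sampled every 0.1 simulated seconds
            events.append({"timestamp": start + t, "beam": random.choice(BEAMS)})
        t += 0.1
    return events

def live_events():
    """Placeholder for pulling real sensor rows from the cloud database."""
    raise NotImplementedError("query the cloud database here")

def get_events():
    return simulated_events() if USE_SIMULATION else live_events()
```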

The software for this display did not require a single line of code. I used NodeBox, an open source visual programming tool designed for artists. It works by connecting a series of nodes. One node receives raw cloud data from a JSON file, the next refines each event time, subtracts it from the current time, uses the difference to define the width of an expanding ellipse, etc. Here is what my NodeBox network looks like:

The NodeBox program that produced the ambient video display

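NodeBox expresses that pipeline as connected nodes rather than lines of code, but the arithmetic behind each ripple is easy to sketch in ordinary Python. The file name, field names, and constants here are assumptions for illustration, not the actual network:

```python
import json
import time

RIPPLE_SPEED = 60.0    # assumed expansion rate, in pixels per second
MAX_AGE = 10.0         # assumed age, in seconds, after which a ripple has faded away

def load_events(path="events.json"):
    """Read the raw cloud data that the first node receives."""
    with open(path) as f:
        return json.load(f)    # e.g. [{"timestamp": 1474320000.0, "beam": "violet"}, ...]

def ripple_radius(event, now=None):
    """Subtract the event time from the current time; the difference sets the ellipse size."""
    now = time.time() if now is None else now
    age = now - event["timestamp"]
    if age < 0 or age > MAX_AGE:
        return None            # not visible yet, or already faded out
    return age * RIPPLE_SPEED
```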

One challenge was working in real time. In a perfect world, my program would instantly detect every event and instantly respond. But in the real world it took about a second for a sensor to upload a new row of data to the cloud, and another second for my program to pull it back down. Also, I could not scan the cloud continuously; I had to do a series of distinct queries once every x seconds. The more often I queried, the slower the animation became.

I finally settled on doing queries once every five seconds. This caused an occasional stutter in the animation, but it was normally not too noticeable. Sometimes, though, there would be a sudden brief flash of color, which happened when an event fired early in that five-second window. By the time I sensed it, the corresponding ripple had already expanded to a large circle like a balloon about to pop, so all I saw was the pop. I solved this problem by adjusting my clock to show events five seconds in the past.
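A minimal sketch of that timing scheme, with hypothetical fetch_events and draw_frame callables standing in for the cloud query and the ripple drawing:

```python
import time

POLL_INTERVAL = 5.0    # seconds between cloud queries
DISPLAY_DELAY = 5.0    # render events this many seconds in the past
FRAME_TIME = 1 / 30    # the animation keeps running between queries

def run_display(fetch_events, draw_frame):
    """Query the cloud occasionally, animate continuously, and draw against a delayed clock."""
    events, last_poll = [], 0.0
    while True:
        now = time.time()
        if now - last_poll >= POLL_INTERVAL:
            events = fetch_events()    # one discrete query, not a continuous scan
            last_poll = now
        # Drawing against a clock five seconds in the past means an event that fired
        # just after the previous query still appears as a small, growing ripple
        # instead of flashing in already fully expanded.
        draw_frame(events, now - DISPLAY_DELAY)
        time.sleep(FRAME_TIME)
```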

Testing was surprisingly easy despite the fact that Mark was located in Redwood Shores and Noel in Austin, while I worked from home or from my Pleasanton office. This is one of the powerful advantages of the Internet of Things. Everyone could see the data as soon as it appeared regardless of where it came from.

We did do one in-person dry run in an Oracle cafeteria. Mark taped some sensors to various doorways while I watched from my nearby laptop. We got our proof of concept and took the sensors down just before Oracle security started getting curious.

On the morning of the big show, we did have a problem with some of the sensors. It turned out to be a poor internet connection, especially in one corner of the tent; Noel redirected the sensors to a hotspot, and from then on they worked fine. Jake pitched in, packaging the sensors with hefty battery packs and using cable ties to place them at strategic spots. Here is what they looked like:

Three of the ultrasonic sensors we used to detect movement in the tent


The ambient display ran for three straight days and was seen by hundreds of visitors. It was one of the more striking displays in the tent and the simple design was immediately understood by most people. Below is a snapshot of the display in action; Jake also shot a video just before we shut it down.

It was fun to watch the patterns change over time. There would be a surge of violet ripples when a new group of visitors flooded in, but after that the other colors would dominate; people entered and exited only once but passed across the other sensors multiple times as they explored the room. The most popular sensor was the one by the food table.

One of our biggest takeaways was that ambient displays work well from a distance. All the other displays had to be seen close up, but we could easily follow the action on the ambient display from across the room. This was especially useful when we were debugging the internet problem: we could adjust a sensor on one side of the room, then look to the far corner to see whether a ripple for that sensor appeared and whether it was the right color.

A snapshot of the ambient display in action


It was a bit of a risk to conduct this experiment in front of our customers, but they seemed to enjoy it and we all learned a lot from it. We are starting to see more applications for this type of display and may set up sensors in the cloud lab at HQ to further explore this idea.

About John Cartan

I am a designer, inventor, and writer currently working as a Senior Design Architect in the Oracle User Experience Emerging Technologies group.

4 comments

  1. @Intelpub/Gustavo: Thanks for the link; super interesting, your perspective on Apps-UX @ OOW.

    @AppsLab team: Do John (using Google Translate) or Noel/Osvaldo/MDC team (using their native-Spanish-speaking brains) have thoughts on the blog post linked to Comment 1 (above)? I am just curious. 🙂

  2. Gustavo, Joyce,

    Thanks to Intelpub for that detailed, first-person account of what it’s like to visit OpenWorld. Those of us putting on our small parts of the show are so busy we don’t have time to see most of the conference. It was fun to see it through your eyes.

    I’m glad you enjoyed the UX Exchange tent and I hope we’ll see you again next year.

    John
