Editor’s note: Today, I present to you a detailed review of the recently released Leap Motion controller from one of my colleagues in Applications User Experience, John Cartan. You can read about John’s design work and background on his personal website, which has been live since 1995.
John’s recent work has focused on design for the iPad, which he uses almost exclusively, even for content creation, including this review. I hope he uses an external keyboard.
I’ve had the pleasure of speaking with (some might say, arguing with) John on a usability panel, and I’m pleased to present his review here. Look for my own Leap Motion review sometime soon. Anthony might chime in with one of his own too. So, stay tuned.
Initial Impressions of Leap Motion Device
John Cartan, Applications User Experience, Emerging Interactions
Summary: A fascinating, well-built device but often frustrating and confusing to use. Leap Motion is worthy of continuing study but is not yet a paradigm-shifting device.
I have now spent one day with the Leap Motion device, both at work (in our San Francisco office on a Windows machine) and at home (using my Mac laptop). I downloaded and tested five apps from the Airspace Store, played with about a dozen apps altogether, and read through reviews of many others. I also went through Leap’s tips and training videos and read their first newsletter.
General Impressions
Leap Motion is fun to try, but often frustrating to use – more of an interesting gimmick than a solid working tool that could replace a mouse or touchpad 24/7. Barriers to adoption include confusing gestures, invisible boundaries around the sensory space, unreliable responses to input, and poor ergonomics.
Gestures and responses
As David Pogue reported in his review, every app employs a completely different set of gestures that cannot be discovered or intuited without training and that are hard to remember from session to session. The variety of gestures was impressive; the New York Times app uses a twirling motion to scroll a carousel reminiscent of old hand-cranked microfilm viewers. A Mac macro app called BetterTouchTool claims to support 25 different gestures.
One recurring problem with many gestures was that they rely on the invisible boundaries (or invisible mid-point) of the volume of space the sensor can detect. The general-purpose Touchless app (which lets you substitute your hands for a mouse to browse the web, etc.) uses half of this invisible volume for hovering or moving the cursor and the other half for clicking and dragging; even with good visual feedback it’s hard to find the dividing line between the two. It’s also easy to inadvertently wander outside the boundary when you change positions, or to accidentally move the sensor.
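To make the problem concrete, here is a minimal sketch (my own illustration, not Touchless’s actual code; the zone names, boundary, and hysteresis values are all assumptions) of how an app might split the interaction volume along one axis, with a dead band so the cursor doesn’t flicker between hovering and clicking right at the boundary:

```python
# Minimal sketch: split the sensor volume into "hover" and "touch" zones
# along the z-axis. The boundary and hysteresis values are illustrative.

HOVER_TOUCH_BOUNDARY_MM = 0.0   # z = 0 marks the invisible dividing plane
HYSTERESIS_MM = 15.0            # dead band to prevent flicker at the boundary

def classify_zone(z_mm: float, previous_zone: str) -> str:
    """Return 'touch' or 'hover' for a fingertip z position in millimeters.

    Without the hysteresis band, a hand hovering right at the boundary
    would rapidly flip between clicking and hovering -- one reason the
    invisible dividing line is so hard to work with.
    """
    if previous_zone == "hover" and z_mm < HOVER_TOUCH_BOUNDARY_MM - HYSTERESIS_MM:
        return "touch"
    if previous_zone == "touch" and z_mm > HOVER_TOUCH_BOUNDARY_MM + HYSTERESIS_MM:
        return "hover"
    return previous_zone
```

Even with a dead band like this, the user still gets no visual cue about where the plane actually sits, which is the deeper usability problem.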
Even when you remember the gestures, an app’s response can be unreliable. Although the device can theoretically detect the positions of all ten fingers, in practice it often confuses two fingers with three, or fails to register some details due to bad lighting, one part of the hand obscuring another, etc. In almost every app, even after repeated practice, my gestures were frequently misinterpreted.
Part of the problem lies not in the device, but in the software written for it. There are countless ways of using the data the device generates, and an art to interpreting them reliably in ways that take into account the things people do with their hands without even realizing it.
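One small example of that art: rather than trusting the raw finger count in every frame, an app could smooth it over a short window. This sketch is an assumption about what app authors could do, not anything from the Leap SDK:

```python
from collections import Counter, deque

class FingerCountFilter:
    """Smooth noisy per-frame finger counts with a sliding majority vote.

    Illustrative sketch only: if the sensor reports 3, 2, 3, 3, 2 fingers
    over five frames, the filter reports 3 instead of flickering between
    two different gestures.
    """

    def __init__(self, window: int = 7):
        self.history = deque(maxlen=window)

    def update(self, raw_count: int) -> int:
        self.history.append(raw_count)
        # Most common count in the window wins; ties resolve in favor of
        # the count encountered earliest in the window.
        return Counter(self.history).most_common(1)[0][0]
```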
A good case in point is Google Earth. At its best, the experience of flying over the earth with your hands is magical – different and more intuitive than using a mouse or even a good tablet app (which simulates moving a large, expandable map, not actually flying over a surface). But Google Earth’s controls for Leap were far too touchy – even when I set the sensitivity to its slowest setting. The tiniest movement of a finger would rocket me a thousand miles up and send the earth spinning like a top. This is a problem that could be fixed with better algorithms. The current app seems to use absolute, not relative, positioning, with little or no attempt to dampen sudden changes or limit the earth’s spin.
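To make that concrete, here is a sketch of what relative positioning with damping might look like. This is my own assumption about how a better binding could behave, not Google Earth’s actual code, and the constants are illustrative:

```python
# Sketch: drive the camera with smoothed *deltas* of hand position rather
# than absolute position, so a tiny offset of a finger no longer rockets
# the camera a thousand miles up. All constants are illustrative.

SMOOTHING = 0.15      # 0..1; lower values mean heavier damping
MAX_STEP_MM = 5.0     # clamp per-frame movement to reject sudden jumps

class DampedHandControl:
    def __init__(self):
        self.last_z = None
        self.velocity = 0.0

    def camera_step(self, hand_z_mm: float) -> float:
        """Return a damped camera movement for one frame of hand data."""
        if self.last_z is None:          # first frame: establish a baseline
            self.last_z = hand_z_mm
            return 0.0
        delta = hand_z_mm - self.last_z  # relative, not absolute, positioning
        delta = max(-MAX_STEP_MM, min(MAX_STEP_MM, delta))  # limit the spin
        self.last_z = hand_z_mm
        self.velocity += SMOOTHING * (delta - self.velocity)  # low-pass filter
        return self.velocity
```

The key design choice is that the camera responds to how the hand *moves*, not where it *is*, so a twitch or a change of resting position produces a small, bounded step instead of a thousand-mile jump.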
Ergonomics
When I sat at a desk using Leap Motion, my arms began to hurt within five minutes; other volunteers reported this as well. This is the same problem we found with our smart board testing: supporting your arms in space is tiring – and soon becomes surprisingly painful.
Recognizing this, Leap Motion’s first newsletter contained a brief tutorial on ergonomics. But it seemed more focused on the needs of the device (to avoid obstructions to the signal) than on the needs of the user. For example, Leap says to avoid the most natural and comfortable positions (e.g. resting on your elbows) in favor of holding your arms straight out into space.
I did experiment with using Leap Motion from a reclining position, as shown in the photo.
Positioning the device on the front part of my laptop worked fine. By projecting my display to a large TV and using a third-party app, I could even close the laptop and position the sensor on its lid, or remove the laptop altogether and position the sensor on a side table so that my hand could rest on a chair arm and always remain below the level of my heart (which reduces pain and effort over time). With a little fiddling this worked well during an extended session (as long as I stayed in one place). You can also use the Leap control panel to adjust the required height of your hand(s) over the device. Even so, ergonomics will remain a significant concern.
Recommendations
The best experience I had was with the free Molecules app. The Leap is well-suited to true three-dimensional input like turning a complex molecule in space. This app used clear, simple gestures like a clenched fist to pause, and tolerated random movements without completely disrupting the display. This gives me some hope that with the right gestural algorithms and a good use case, Leap can actually provide some added benefits.
For the most part, though, using Leap Motion was an exercise in frustration. The initial productivity apps, like photo manipulation or PowerPoint presentations, are more likely to produce rage than improved productivity. David Pogue summed up the problem nicely:
“Word of advice to potential app writers: If your app involves only two-dimensional movements, of the sort you’d make with a mouse, then the Leap adds nothing to it. Rethink your plan. Find something that requires 3-D control — that’s what the Leap is good for.”
Leap Motion is a solution looking for a problem and has yet to find its killer app. For most enterprise applications it would be a gimmick that would quickly grow annoying and would be unsuited for daily usage.
There may, however, be some niche applications that would benefit from Leap Motion or something like it. To succeed, these use cases would require (or at least benefit from) true 3-D control, would not rely solely on the Leap to perform destructive or non-reversible actions, and could be performed from a sitting or standing position near a desk or lectern (not mobile). I could also see using other forms of this technology to supplement rather than replace current sensing systems (e.g. detecting hover positions above a tablet surface).
We should keep our eyes out for niche use cases and continue to monitor future developments. But there is no need to toss the mouse, trackpad, or touch surfaces just yet.
Update: John has an update from Leap’s PR team in the comments pointing to their ergonomic support guide. The net: Tyrannosaurus Rex arms, not zombie arms.
“For example, Leap says to avoid the most natural and comfortable positions (e.g. resting on your elbows) in favor of holding your arms straight out into space.”
Shameless plug here (what can I say, I want to make sure all LEAP users are at least aware of my app), but I specifically designed DexType with this issue in mind. I used a velocity-based system of gesture recognition (rather than position-based, as Touchless uses) to ensure that users can keep their elbows resting on a flat surface at all times; see the sketch below.
The creation of a “virtual touchscreen” should have been a non-starter, because our arms really aren’t built to move straight forward along the Z-axis. And it’s impossible to move straight forward in the Z direction when you keep your elbow resting on a surface: your hand will be dragged down by the lever action of your elbow every single time.
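To illustrate the difference, a minimal velocity-based tap detector might look something like this (an illustrative sketch only, not DexType’s actual code; the threshold and names are assumptions). A tap fires on downward fingertip speed, not on crossing an absolute plane, so the resting height of the hand doesn’t matter:

```python
TAP_VELOCITY_MM_S = -300.0   # downward fingertip speed threshold (assumed)

def detect_tap(prev_y_mm: float, curr_y_mm: float, dt_s: float,
               armed: bool) -> tuple[bool, bool]:
    """Return (tap_fired, still_armed) for one frame of fingertip data."""
    velocity = (curr_y_mm - prev_y_mm) / dt_s
    if armed and velocity < TAP_VELOCITY_MM_S:
        return True, False    # fire once, then disarm until the finger lifts
    if velocity > 0:          # finger moving upward again re-arms the tap
        return False, True
    return False, armed
```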
@Zack: Stay tuned. I used DexType a bit the other day and will include it in my Leap dump next week.
John here. Leap Motion’s PR team asked if I could refer to their ergonomic support doc, which is here:
http://support.leapmotion.com/entries/24916583-Comfortable-use-of-your-Leap-Motion-Controller
I may have been a bit too glib in my characterization of their guidance. They say *not* to hold your arms straight out into space, but rather to hold them out with elbows bent if standing up. If sitting down, they do allow resting on your elbows as long as you keep your wrists and hands in roughly a straight line.
I did try to follow this guidance when testing the device, but found that some of the positions most natural to me were the ones I was asked to avoid, and even when I did sit as recommended I still found my arms hurting after extended use. Other people during our session reported the same.
Leap Motion’s suggestions are helpful, but I stand by my assertion that ergonomics will remain a significant concern for people using this device.
@John: I’ll update the post. The net: Tyrannosaurus Rex arms, not zombie arms.