Our Gesture Controlled 3D Computing Future: Beyond Leap Motion
Today saw the release of Leap Motion, the 3D gestural navigation controller for PCs and Macs. Like its cousin the Xbox Kinect, Leap Motion uses sensors to track physical gestures. Where Kinect tracks your entire body, Leap Motion tracks fine movements of the arms, hands, and fingers, turning those movements into input and enabling touch-free 3D navigation control.
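For developers weighing whether to build for this device, the entry point is the tracking data itself. Below is a minimal sketch of reading hand positions; the class and attribute names (Leap.Listener, Leap.Controller, frame.hands, hand.palm_position) follow the Leap Motion Python SDK as documented around launch, but treat them as assumptions that may differ across SDK versions.

```python
# Minimal sketch: print the palm position of each tracked hand.
# Assumes the launch-era Leap Motion Python SDK (the "Leap" module);
# names may differ in other SDK versions.
import sys
import Leap


class HandListener(Leap.Listener):
    def on_frame(self, controller):
        frame = controller.frame()      # most recent tracking frame
        for hand in frame.hands:
            pos = hand.palm_position    # 3D vector, in millimeters
            print("palm at x=%.1f  y=%.1f  z=%.1f" % (pos.x, pos.y, pos.z))


def main():
    listener = HandListener()
    controller = Leap.Controller()
    controller.add_listener(listener)   # frames arrive on a background thread
    print("Tracking... press Enter to quit.")
    sys.stdin.readline()
    controller.remove_listener(listener)


if __name__ == "__main__":
    main()
```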
Leap Motion can be used to navigate operating systems (Windows, Mac), to cruise through Google Earth, to draw a digital picture, to generate experimental music, or to dissect a virtual frog, as seen in the AirSpace Leap Motion app store. In the future, surgeons could perform surgeries and airline pilots could control their planes with this solution, according to the vendor.
The success or failure of Leap Motion will derive from the strength of the app ecosystem that grows up around it:
- As with touch screens, ground-up applications work best… “Touch-first” applications – those reimagined from the ground up with touch as the primary navigational method – generally appeal to users more than “touch-second” experiences, where touch was bolted onto an existing application. Similarly, gesture-controlled experiences need to be rethought from the ground up. The same is true for voice-controlled apps. Developers will need to change the way they work in the coming years, collaborating with designers and with experts in human anatomy, for all of this to work. Until that happens, the technology will remain marginal.
- …and specialized applications will determine the ultimate business value. Leap Motion’s CEO recently pointed out that the Leap isn’t binary like a mouse, which can only be clicked or not clicked; its range of fluid motions lets designers create entirely new experiences (see the sketch after this list). Some of the long-famous gestural controls popularized in Minority Report may finally become reality, but a possibly greater opportunity involves workers using customized apps to solve company-specific problems (be they architectural design, industrial assembly, surgery, flight of unmanned planes, or control of robots). Not all of these interactions will involve direct mimicry (“the computer navigation exactly matches my physical motion”); some might be metaphorical, more like a shorthand.
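To make the “non-binary” point concrete, here is a hypothetical illustration (not drawn from any vendor SDK): a mouse button reports only two states, while a tracked hand yields a continuous value (here, palm height) that can drive an analog control such as zoom level. The height and zoom ranges below are invented for the example.

```python
# Hypothetical illustration of non-binary gesture input (not from any real SDK):
# a mouse click is either down or up, but a tracked palm height is a continuous
# value that can be mapped onto an analog control such as zoom level.

def zoom_from_palm_height(palm_height_mm, lo=80.0, hi=400.0,
                          min_zoom=1.0, max_zoom=10.0):
    """Map palm height above the sensor (millimeters) onto a zoom factor.

    The tracking range (80-400 mm) and zoom range (1x-10x) are illustrative.
    """
    clamped = min(max(palm_height_mm, lo), hi)   # stay inside the usable range
    t = (clamped - lo) / (hi - lo)               # normalize to 0..1
    return min_zoom + t * (max_zoom - min_zoom)  # linear interpolation


if __name__ == "__main__":
    for height in (50, 150, 250, 400):
        print("palm at %3d mm -> zoom %.2fx" % (height, zoom_from_palm_height(height)))
```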
Beyond Kinect and Leap Motion, Samsung has already integrated gestural controls into a truly mass-market offering, the Galaxy S4 smartphone. The phone’s built-in sensor brings numerous gesture controls to a far larger market, as seen in this video. Startup vendors are trying to reinvent human/computer interaction, as covered in this report.
So what’s next? The designer Jinha Lee gave an amazing TED Talk in February 2013 (note: only 5 minutes long) describing our gestural computing future:
Spacetop – Lee’s gesture-controlled 3D desktop OS – creates a 3D digital space into which a user inserts her hands, allowing the manipulation of digital objects. Lee’s levitating “tangible interface” turns a magnetically suspended ball into a “physical pixel,” allowing a computer to record the paths traced by physical motion and to reproduce them with a physical object.
Also interesting – particularly for retail environments – is the augmented reality (AR) demonstration that combines a smartphone and AR glasses. A user can surf a mobile web site and “see” what the goods look like; in the video, a shopper instantly pictures a watch on his own arm by clicking its photo. One could imagine similar capabilities in a future version of the Google Glass wearable computer. In this scenario, gestures and 3D projection help eliminate the lines between the physical and digital worlds.
Finally, Disney Research's Aireal creates haptic, tactile experiences by generating physical forces with compressed air. In combination with gestural controls, Aireal’s free-air sensations – delivered as vortex rings – give users a deeply immersive computing experience. Aireal reverses the direction of the feedback loop: instead of physical gestures controlling a computer, the computer delivers physical sensations back to the user.
Infrastructure professionals should keep one eye on these longer-term trends. As touchscreens have proved (in just six years), and as today’s Leap Motion launch suggests, such disruptions can go mainstream quickly.
J. P. Gownder is a vice president and principal analyst at Forrester Research serving Infrastructure & Operations Professionals. Follow him on Twitter at @jgownder.