Last week I was once again hustling through a brutal travel week (10,000 miles in the air and two packed red-eyes) when I came across something really interesting. It was ~9 AM and I'd just gotten off AA flight 4389 from Toronto. I was a bit bleary-eyed from a 4 AM call with a Finnish customer and was just trying to schlep my way to the Admirals Club for a cup of coffee when I stumbled across Accenture's Interactive Network display at the juncture of Concourses H and K.
So what, you might ask? It's just a big screen, and we already know our future is Minority Report, right? Yes, those of us in the echo chamber might know that, but what really struck me was watching how my fellow travelers interacted with the display. I stood there for about 10 minutes (forgetting all about the sorely needed cup of joe) and watched people start to walk past, then pause, then go up to the screen and start playing with it. On average, folks would stay for a few minutes and read some of the latest news feeds, then hurry on to their next stop. But what I really found intriguing was how they interacted with the system:
- Travelers were drawn to the user interface. If you haven't read Mike Gualtieri's Best Practices In User Experience (UX) Design, you should; it's a great doc. One of the main points Mike makes is that great user experiences must be useful, usable, and desirable. That last element is what I clearly saw at O'Hare. Anytime you can get a busy, harried, disinterested traveler to stop in his tracks (look at the guy with the backpack) and pay attention, there's some serious desire being created. I saw it happen a half dozen times in the 10 minutes I watched.
- There was no fear when it came to engagement. I'm sure you know people who are afraid of computers, and while that may be less of an issue with the Millennial generation, it was interesting for me to watch folks walk right up and start playing with the screen, with no instructions or familiarity with the interface at all. This really speaks to Mike's second criterion: the natural user interface was highly usable. It provided some basic cues we're used to, like selectable windows (rectangles, of course, because that's how we interact with the Internet, right?). In no time at all, regular folks were playing with the screen, bringing up multiple data feeds and moving them around, and I'm pretty sure they weren't IT professionals.
- Social interaction extended beyond the screen. Because the screen was a massive multi-touch display, more than one user could interact with it at a time. As two or three people worked in their own little sections, I noticed that several times they began looking at what the other folks were doing, and even talking about their experience. In effect, some real-time social computing was taking place, not just the faceless kind with the Internet in the middle.
We're early on in what will be a sea change in UI design, at least as big as the shift from character-mode screens to GUIs and mice. But I'm not sure the average developer realizes what's about to happen to their programming practices. When we surveyed developers in concert with Dr. Dobb's last year, we asked how interested they were in different developer technologies, and NUIs were pretty far down the list. It will be interesting to see if this changes with the rush of tablets (and of course the iPad) that we're about to see hit the market.
We've had a few inquiries this quarter about NUIs, and whether the time is right to start firing up R&D efforts within large application development shops. In general, I think the answer is "Yes" when it comes to multi-touch, not just because of mobile devices like iPhones and Android phones, but also because of the native capabilities built into .NET 4.0. As organizations refresh PCs and move toward Windows 7 and .NET 4.0, the number of multi-touch-ready devices is about to increase dramatically. And if you're looking for some inspiration about the new types of applications that might be possible, check out Project Gustav, a realistic painting system prototype from Microsoft Research. While it's my own opinion that gesture-based computing is a bit further out, and I won't likely have a Surface coffee table in my house anytime soon, I could always be wrong (if these folks from MIT have anything to say about it).
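If you're wondering what multi-touch actually means for your code, here's a minimal, framework-neutral sketch (written in Python purely for illustration; this is not the .NET API) of the arithmetic behind a two-finger pinch-to-zoom gesture. Platforms like Windows 7 and .NET 4.0 deliver the raw touch points; logic like this turns them into a zoom:

```python
import math

def distance(p, q):
    """Straight-line distance between two touch points given as (x, y) tuples."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def pinch_scale(start_touches, current_touches):
    """Zoom factor implied by two fingers moving apart (>1) or together (<1).

    Each argument is a pair of (x, y) touch points: where the two fingers
    landed, and where they are now.
    """
    return distance(*current_touches) / distance(*start_touches)

# Two fingers start 100 px apart and spread to 150 px: a 1.5x zoom.
print(pinch_scale(((0, 0), (100, 0)), ((0, 0), (150, 0))))  # 1.5
```

The point isn't the math, which is trivial; it's that once the OS hands you multiple simultaneous touch points instead of one mouse cursor, gestures like this become a few lines of application logic rather than a hardware-driver project.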
Interested in NUIs? Leave a comment and share your opinions. I'm off to watch "Up in the Air" before the next person tells me I have to see it…