The Humanlike Future Of Interactions With Machines
AI is igniting hopes and fears in 2023 much as the metaverse did in 2022. Zoom out to the bigger picture, though, and it becomes clear that although the two differ in many ways, they're part of (and possibly the culmination of) a major accelerating trend: Interactions between people and machines are becoming more like those between people. I see this happening in two clusters of changes:
- Mind: Generative AI, language AI, and adaptive AI are equipping machines to become more creative, conversational, and perceptive in their interactions with users. Those are attributes of human cognition, so we expect them from each other, but they’re novel in machines.
- Body: Somatic interfaces, 3D environments, and virtual people let people engage with machines in ways that are more embodied, spatial, and personified — like the ways we engage now with the physical world and with human employees, colleagues, friends, and family.
Another way to characterize the contrast between these two clusters: cognitive vs. sensory.
Of course, the discipline of human-machine interaction (HMI) has for decades been nudging machines gradually closer to the way humans like to interact and moving away from requiring humans to think and behave like machines — consider the long march through punch cards, keyboards, trackpads, touch screens, voice interfaces, etc. But today’s emerging interaction modalities will bring about more significant changes than all the others combined. And the changes could be good or bad — or a bit of both. Here are a couple of examples:
- Chatbots that assist doctors and nurses but sometimes fabricate. A chatbot based on a generative AI model derived from a corpus of textual content about medical symptoms, treatments, and outcomes could assist healthcare professionals in treating patients. But that same chatbot could fabricate advice (especially if the corpus contains inaccuracies) that sounds plausible but is actually dangerous. A sketch of one mitigation, grounding answers in a vetted corpus, follows this list.
- Headsets that streamline XR object interactions but spy on users. A headset for augmented, mixed, or virtual reality (AR, MR, VR) that detects where its user is looking can deliver a great user experience, since eye tracking enables innovative interaction mechanics like gaze-and-dwell, which lets users select an item just by looking at it (a sketch of the mechanic also follows this list). But if the headset manufacturer also uses that gaze data to infer the user's interests without transparently informing them, the user could be plagued by unwanted ads for items they merely glanced at.
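To make the chatbot risk concrete, here is a minimal sketch of one common mitigation: retrieval grounding, where the bot answers only from passages in a vetted corpus and declines when nothing supports the question. Everything here is a hypothetical stand-in; the two-passage corpus, the keyword-overlap scoring, and the threshold would all be replaced by a real retrieval stack and an LLM in production.

```python
# Minimal retrieval-grounding sketch: answer only from a vetted corpus and
# decline when no passage supports the question. The corpus, the keyword-
# overlap scoring, and the threshold are hypothetical stand-ins for a
# production retrieval stack plus an LLM.

VETTED_CORPUS = [
    "Ibuprofen is a nonsteroidal anti-inflammatory drug used for pain and fever.",
    "Adults should seek care if a fever exceeds 39.4 C or lasts more than three days.",
]

def retrieve(question: str, min_overlap: int = 2) -> str | None:
    """Return the best-matching passage, or None if support is too weak."""
    q_words = set(question.lower().split())
    best, best_score = None, 0
    for passage in VETTED_CORPUS:
        score = len(q_words & set(passage.lower().split()))
        if score > best_score:
            best, best_score = passage, score
    return best if best_score >= min_overlap else None

def answer(question: str) -> str:
    passage = retrieve(question)
    if passage is None:
        # Refusing beats fabricating: no supporting source, no answer.
        return "I can't answer that from my vetted sources; please consult a clinician."
    return f"According to my sources: {passage}"

print(answer("When should an adult seek care for a fever?"))  # cites a passage
print(answer("What dose of ibuprofen cures strep throat?"))   # declines
```

The design choice worth noting is the refusal path: a grounded bot that says "I don't know" is less impressive in a demo but far safer at the bedside.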
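And here is a minimal sketch of the gaze-and-dwell mechanic from the headset example: a selection fires only after the gaze has rested on the same target past a dwell threshold. The sample format, the target names, and the 0.8-second threshold are illustrative assumptions; a real headset SDK would stream gaze samples from its eye tracker.

```python
# Minimal gaze-and-dwell sketch: fire a "select" once the user's gaze has
# rested on the same target for a dwell threshold. The samples are
# hypothetical (timestamp_seconds, target_id) pairs; a real headset SDK
# would stream these from its eye tracker.

DWELL_THRESHOLD_S = 0.8  # how long gaze must rest on a target to select it

def dwell_select(samples: list[tuple[float, str | None]]) -> list[str]:
    """Return the targets selected by dwelling, in order."""
    selections = []
    current_target, dwell_start = None, 0.0
    for timestamp, target in samples:
        if target != current_target:
            # Gaze moved to a new target (or empty space): restart the timer.
            current_target, dwell_start = target, timestamp
        elif target is not None and timestamp - dwell_start >= DWELL_THRESHOLD_S:
            selections.append(target)
            current_target = None  # require a fresh dwell before re-selecting
    return selections

# 60 ms samples: a quick glance at "menu" (no selection), then a long
# dwell on "buy_button" that crosses the threshold and triggers a select.
samples = [(t * 0.06, "menu") for t in range(5)] + \
          [(0.3 + t * 0.06, "buy_button") for t in range(20)]
print(dwell_select(samples))  # ['buy_button']
```

The threshold is the whole tradeoff: too short and users trigger selections by merely looking around (the "Midas touch" problem in eye-tracking research), too long and the interface feels sluggish.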
What will steer the future of HMI toward delightful experiences or dystopian ones? Most of all, it will be driven by the cumulative effect of your experience design decisions, along with those of your colleagues and of your peers at organizations like yours. That's why anyone in a leadership position involved in creating digital experiences in any role (design, development, product, etc.) should be anticipating and preparing for this future now. Here are four ways that you can get involved:
- I’m conducting research right now for a report on the future of HMI. If your company has expertise to share on this topic, submit a briefing request and point to this blog post so that our briefings team knows to route your request to me. Even if you miss the window for this report, there will be more.
- I’ll be presenting the findings from this research at the CX North America conference in June, so join me there to hear about and discuss the issues and what they will mean for you and your organization.
- If you’re a Forrester client and would like to ask me a question about these issues, you can set up a conversation with me.
- You can also follow or connect with me on LinkedIn if you’d like to get updates when I post there on this and related issues.
After all, although Arthur C. Clarke famously claimed that “any sufficiently advanced technology is indistinguishable from magic,” some sorcerers don’t use their powers for good. But you can be one who does — and influence others to do the same.