Let me start by stating the obvious: Automation has its best home in the contact center. At many brands, agents make up 10% of the workforce. They are the face of your brand to your customers, providing critical empathy and solving important problems. Agents also spend a lot of time performing rote tasks that really should not require a human. An automated interaction generally costs about a tenth of what a conversation with an agent costs. That differential produces gaudy ROI numbers, prompting brands to implement chatbots and interactive voice response (IVR) systems that chase cost savings at the expense of customer loyalty and the overall customer experience.

Enter generative AI (genAI) and large language models (LLMs): technology that delivers natural, genuinely useful conversations to customers and takes a fraction of the time and effort to build compared with traditional conversational AI toolsets.

This new AI technology poses an existential threat to conversational AI vendors. Think about the transition from an ice box to a refrigerator — a massive leap in capability and so much easier to use and maintain. Imagine being an ice harvester realizing that, with refrigeration, your industry had become obsolete (thank you, Guy Kawasaki, for this metaphor). The vendors in this wave are not ice harvesters. While there was no amount of retooling that an ice harvester could do to stay relevant in the era of refrigeration, conversational AI vendors have embraced genAI and LLMs to offer much more than was dreamed possible just a year ago.

In the blink of an eye since ChatGPT was announced, the vendors in this wave have all evolved. They were companies that needed six weeks or so to build a single intent with traditional natural language understanding (NLU) technology; now they provide these important capabilities:

  • Orchestrating various AI resources, including LLMs and genAI, to ensure the right resources are used at the right time for the best possible customer experience.
  • Providing guardrails such as retrieval-augmented generation (RAG), vector search, and purpose-built, fine-tuned LLMs to make genAI safe for customer-facing interactions.
  • Helping their customers find secure spaces, such as RAG-driven FAQ applications, to begin their genAI self-service journey.
  • Managing transaction workflows to deliver positive customer experiences while meeting business requirements.
  • Future-proofing brands’ self-service offerings with flexible technology that will keep up with the exponential advances of genAI.
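To make the RAG-driven FAQ idea above concrete, here is a minimal sketch in Python. Everything in it is illustrative rather than any vendor's actual API: the FAQ entries are invented, and the bag-of-words "embedding" stands in for a real embedding model. In production, a retriever like this would feed the matched text to an LLM, which phrases the final answer while staying grounded in the approved content.

```python
import math
import re
from collections import Counter

# Hypothetical FAQ content the bot is allowed to answer from.
FAQ = {
    "How do I reset my password?": "Use the 'Forgot password' link on the sign-in page.",
    "What is your refund policy?": "Refunds are available within 30 days of purchase.",
    "How do I contact an agent?": "Type 'agent' at any time to reach a human.",
}

def embed(text: str) -> Counter:
    """Toy 'embedding': a bag-of-words count vector.
    A real system would use a trained embedding model here."""
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str) -> str:
    """Return the approved answer whose FAQ question best matches
    the query. Grounding replies in retrieved, vetted text is the
    guardrail: the bot never improvises an answer."""
    best = max(FAQ, key=lambda q: cosine(embed(query), embed(q)))
    return FAQ[best]

print(retrieve("I forgot my password, how do I reset it?"))
```

The guardrail property comes from the last step: because the bot can only reply with text retrieved from the approved FAQ, there is nothing for the model to hallucinate.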

The result of this transformation is improved chatbots and intelligent virtual assistants (IVAs) that can be delivered more cost-effectively than ever before. This isn't just a bolt-on: vendors in this wave have completely revamped their offerings, placing LLMs and genAI at their core. It's remarkable — I've never witnessed such rapid reinvention of an entire market space in less than a year.

If you are a contact center manager, it’s time to start studying up. The vendors had to make this change to survive. You don’t need to worry about genAI making your business obsolete, but you can follow their lead and differentiate with a new generation of likeable, useful chatbots that provide real value to your customers and significant cost savings for your organization.

Don’t get me wrong. It’s early days for brands thinking about delivering genAI-powered chatbots or IVAs. This is still bleeding-edge stuff. Approaches to mitigating hallucinations and other genAI issues are in their infancy. However, the vendors in this wave all understood that the impact of genAI and LLMs is so powerful that they transformed themselves to embrace these new technologies. Now is the time to start taking advantage of what they have built. It’s a new ball game, and you need to play.

Read The Forrester Wave™: Conversational AI For Customer Service, Q2 2024.