You never know what’s coming at you next, which is why process agility is so important. Your organization must have a ready response for anything. And you must make sure that every process participant can identify, at their level, what that response might be, so they can take appropriate action.
I was speaking with Chordiant the other day about the role of predictive analytics in multi-channel customer conversations, and our particular conversation kept coming back to the killer app called “next best offer.” You should always be instrumenting all channels with the predictive logic to produce a steady stream of offers that are best suited for winning each particular customer, keeping them happy, and growing the relationship. Per the emerging best practice, those offers can be auto-generated by predictive models in conjunction with recommendation engines, customer relationship management systems, and business process platforms. And those offers must be tailored to each channel—including self-service portal, call center, and outbound sales force—through which you interact with customers.
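To make the "next best offer" idea concrete, here is a minimal sketch of channel-aware offer selection. Everything in it is illustrative: the `Offer` class, the channel names, and the hard-coded propensity scores (which a real predictive model would generate) are all assumptions, not anything from Chordiant's actual product.

```python
from dataclasses import dataclass

@dataclass
class Offer:
    name: str
    channels: set  # channels through which this offer can be delivered

def next_best_offer(propensity_scores, channel, offers):
    """Pick the offer with the highest predicted propensity score
    for this customer, restricted to offers valid in the channel."""
    eligible = [o for o in offers if channel in o.channels]
    if not eligible:
        return None
    return max(eligible, key=lambda o: propensity_scores.get(o.name, 0.0))

offers = [
    Offer("upgrade_plan", {"call_center", "portal"}),
    Offer("loyalty_discount", {"portal"}),
    Offer("field_demo", {"outbound_sales"}),
]
# In practice these scores come from a predictive model per customer;
# they are hard-coded here purely for illustration.
scores = {"upgrade_plan": 0.62, "loyalty_discount": 0.71, "field_demo": 0.15}
best = next_best_offer(scores, "portal", offers)
```

The point of the channel filter is the tailoring described above: the same customer can get different "next best" offers depending on whether the interaction happens in the portal, the call center, or through the outbound sales force.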
Of course that’s all motherhood and apple pie in today’s customer communications management arena. But, as we were talking, I realized that Chordiant’s approach—which some call “decisioning,” “decision automation,” or “decision management”—validates the core criteria that I applied in my soon-to-be-published Forrester Wave on Predictive Analytics and Data Mining Solutions. In particular, Chordiant’s focus on helping customers develop, deploy, and juggle multiple models—including those that are adaptive, dynamic, and self-learning—speaks to the critical importance of multi-scenario model optimization.
As discussed in my blog post from a couple of months ago, process professionals should be able to build complex models of multiple, linked scenarios across different business, process, and subject-area domains. In a customer-facing environment, you want to build and deploy in unison a range of predictive models for driving such linked processes as inbound call centers, order fulfillment, and billing. Just as important as the models themselves is the need to keep them continually optimized for maximum business benefit. You should instrument transactional applications so that continuously self-optimizing predictive models are always driving the next best actions in each of the linked processes.
Let’s call this the “next best model” approach. Done right, it would leverage such sophisticated capabilities as strategy maps, ensemble modeling, champion-challenger modeling, real-time model scoring, constraint-based optimization, and automatic best-model selection. To the extent that your company does predictive modeling and data mining with multiple vendor and open-source tools, you should be able to converge heterogeneous models into a unified repository with comprehensive metadata, governance, and deployment tools. And you should continue to score all of these models against continuous feeds of fresh information from applications, complex event processing streams, enterprise data warehouses, Web 2.0 environments, and other sources.
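One of the capabilities named above, champion-challenger modeling with automatic best-model selection, can be sketched in a few lines. This is a generic illustration of the pattern, not any vendor's implementation: the class name, the traffic split, and the promotion rule are all my own assumptions.

```python
import random

class ChampionChallenger:
    """Route a small share of traffic to a challenger model and
    automatically promote it if it outscores the champion on
    fresh feedback (the 'next best model' in action)."""

    def __init__(self, champion, challenger, challenger_share=0.1):
        self.champion = champion
        self.challenger = challenger
        self.challenger_share = challenger_share
        self.results = {"champion": [], "challenger": []}

    def predict(self, x, rng=random):
        # Small random slice of traffic goes to the challenger.
        if rng.random() < self.challenger_share:
            return "challenger", self.challenger(x)
        return "champion", self.champion(x)

    def record(self, arm, reward):
        # Feedback loop: observed business outcome for that prediction.
        self.results[arm].append(reward)

    def maybe_promote(self, min_samples=100):
        ch, cl = self.results["champion"], self.results["challenger"]
        if len(cl) >= min_samples and \
                sum(cl) / len(cl) > sum(ch) / max(len(ch), 1):
            self.champion, self.challenger = self.challenger, self.champion
            self.results = {"champion": [], "challenger": []}
            return True
        return False

# Stand-in "models": any callable that scores an input would do.
champion = lambda x: 0.3 * x
challenger = lambda x: 0.5 * x
cc = ChampionChallenger(champion, challenger)
```

The continuous scoring against fresh feeds described above is what drives `record` and `maybe_promote` here: as outcome data flows back in, the best-performing model is selected without a human in the loop.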
It’s clear that dynamic process agility depends on this arsenal of tools, as well as on equipping each operational process participant with the requisite business intelligence (BI), collaboration, and transactional applications for their roles. As we move into the new decade, process modelers will need to start incorporating multi-scenario predictive analytics into their designs in order to maximize agility.
It may make sense for process administrators to participate in the modeling exercise through what-if and simulation tools integrated with their BI environment. I noticed that Chordiant has that capability too. From an agility standpoint, it’s also important that you allow process participants to tweak the predictive logic in a collaborative environment.
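A what-if exercise of the kind described above can be as simple as sweeping one assumption in the predictive logic and comparing projected outcomes. The formula, parameter names, and numbers below are purely hypothetical placeholders, not drawn from any BI product.

```python
def expected_revenue(accept_prob, margin, volume):
    """Toy what-if metric: expected revenue from pushing an offer,
    given a predicted acceptance probability."""
    return accept_prob * margin * volume

# Sweep the acceptance probability a candidate model predicts,
# holding margin (per acceptance) and contact volume fixed.
scenarios = {
    p: expected_revenue(p, margin=40.0, volume=10_000)
    for p in (0.05, 0.10, 0.15)
}
```

Even a trivial sweep like this lets a process administrator see how sensitive the business outcome is to the model's predictions before any logic is tweaked in production.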
As organizations go deeper with predictive process agility, they must democratize the modeling. That’s because you can never tell where the next best process innovation might come from.