This past week’s Forrester Enterprise Architecture (EA) Forum was an excellent experience. Being new to Forrester, it was a splendid opportunity to introduce myself to our customers, engage them in face-to-face inquiries, and present my research priorities. Not being new to the analyst space, it was also a chance for me to re-introduce — hence recontextualize — myself and my focus areas within the Forrester universe of research client groups, orbits, and domains.
As far as “what I cover” elevator pitches go, mine is two-thirds nailed down, not requiring any excess breath to explain. First and foremost, within the Information and Knowledge Management (I&KM) orbit, I’m Forrester’s lead analyst for Data Warehousing (DW). Secondly, I’m also the lead analyst on Predictive Analytics and Data Mining, an area that clearly plays into my DW coverage (analytic data marts, for example).
But my third focus area is more of a mouthful. In my one-on-ones at last week’s forum, I had to resort to drawing messy diagrams full of acronym balloons and wayward connecting arrows to contextualize it in the larger Forrester — and enterprise architecture — schemes of things. Here now comes another attempt, purely verbal this time, not necessarily elevator-pitch-perfect (still needs intensive nuancing), but hopefully delivered in digestible morsels that Forrester customers will find thought-provoking and inquiry-generating.
My third lead focus area is Complex Event Processing (CEP) for I&KM. At the EA Forum, I was quick to point out that Forrester already has a lead analyst for CEP: Charles Brett, who approaches the topic from an Application Development and Program Management perspective and has just published an excellent report on CEP.
As the “for I&KM” tag indicates, I’ll be looking at CEP in a slightly different context: as an enabler for truly real-time BI, predictive analytics, and business performance optimization. Indeed, most CEP applications rely on the BI portfolio of interactive visualization, dashboarding, scorecarding, reporting, query, predictive analytics, and data mining tools to support agile response and proactive coordination around real-time, emerging, or breaking business opportunities and threats. Behind it all is a low-latency middleware fabric that enables continuous monitoring, aggregation, correlation, and filtering of event data captured from operational applications, business process management systems, databases, and other sources.
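To make that middleware pattern concrete, here is a minimal sketch (illustrative only — not any vendor’s actual API) of the core CEP loop described above: events captured from various sources are filtered by a predicate, held in a sliding time window, and correlated, with an alert emitted when a simple pattern (here, a count threshold) is detected. The class and field names are invented for this example.

```python
from collections import deque
from dataclasses import dataclass, field

@dataclass
class Event:
    source: str       # e.g. an operational application, BPM system, or database
    timestamp: float  # seconds since epoch
    payload: dict = field(default_factory=dict)

class SlidingWindowCorrelator:
    """Toy CEP engine: filters incoming events, keeps only those inside a
    fixed time window, and correlates them, emitting an alert when the
    number of events in the window reaches a threshold."""

    def __init__(self, window_seconds: float, threshold: int):
        self.window = window_seconds
        self.threshold = threshold
        self.events = deque()  # events currently inside the window

    def on_event(self, event: Event, predicate=lambda e: True):
        # Filtering: discard events that fail the predicate.
        if not predicate(event):
            return None
        self.events.append(event)
        # Expire events that have slid out of the time window.
        cutoff = event.timestamp - self.window
        while self.events and self.events[0].timestamp < cutoff:
            self.events.popleft()
        # Correlation: a pattern match across the distinct sources in the window.
        if len(self.events) >= self.threshold:
            return {"alert": "threshold_breached",
                    "count": len(self.events),
                    "sources": sorted({e.source for e in self.events})}
        return None
```

In real deployments this monitoring/aggregation/correlation logic runs continuously against live feeds, and the resulting alerts are what the BI layer’s dashboards and scorecards visualize.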
So far, CEP has had only a minimal footprint in the BI arena, mostly because BI applications still rely primarily on historical data that has been consolidated in DWs, and hence provide, at best, “near real-time” refreshes. Where enterprises have ventured into truly subsecond-latency CEP, they have traditionally implemented it as a stovepipe separate from their BI environments. CEP infrastructures typically incorporate their own, distinct, event-optimized service layers for interactive visualization, dashboarding, modeling, repository, rules engine, resource connection, and administration.
However, it’s only a matter of time before most BI vendors partner with CEP pure-plays, or acquire them outright, in order to strengthen their real-time functionality. We expect to see SAP/Business Objects, SAS, IBM/Cognos, Oracle/Hyperion, Microsoft, Information Builders, and MicroStrategy venture into the CEP arena in the next 1-2 years. Likewise, it’s very likely that a now-independent Teradata, which has taken the lead in real-time DW, will snatch up a CEP vendor to build out its real-time BI/DW portfolio. IBM’s recent acquisition of CEP pure-play AptSoft shows that it is serious about CEP for I&KM, as does TIBCO’s acquisition of Spotfire.
Rest assured: I won’t cover CEP as a stovepipe. You can expect to see CEP — as an enabler for real-time analytics and business optimization — incorporated as a cross-cutting theme throughout my Forrester coverage areas.