I’ve met many CIOs, all with their own unique challenges and approaches to overcoming them. But despite their differences, all CIOs ask me the same question: “What is the next big technology trend that I should look out for?”

It’s a tough question — not because there is a shortage of emerging tech trends out there. The tough part is whittling all of the trends down to the really big ones — I mean the ones that could really change the way we do business. So all through 2009, my answer was: 1) consumerization of IT (what we at Forrester refer to as Groundswell), 2) lean IT, and 3) cloud computing. For those interested, you can still view the Three Tech Movements CIOs Should Know webinar I did with colleagues Ted Schadler and John Rymer late last year.

But there may be another — one that frankly has been hanging around for years but hasn’t hit the big time, so to speak: Smart Computing. (FYI — you can replace “computing” with your favorite second word, like “technology” or “grid,” but I’ll stick with “computing” for simplicity’s sake.) What is Smart Computing? Smart Computing extends existing technologies by adding new real-time situational awareness and automated analysis to help firms solve smarter and more complex business problems. My colleague Andrew Bartels recently penned “Smart Computing Drives The New Era Of IT Growth,” which can give you more insight.

Like I said, it’s a concept that’s been around for a while — showing its face in stalled RFID deployments and interesting sensor experiments. But new advances and renewed focus from tech giants like IBM and Oracle could finally push it into the mainstream. Many tech pundits (Forrester included) have pegged Smart Computing as the next big thing. But has it crossed the CIO’s radar yet? Is Smart Computing on the horizon in your organization? And if so, how will Smart Computing change your role in the organization (if at all)?

I would love to hear your thoughts on Smart Computing and what it means for your role in IT and the business.