Like many connected with IBM as an employee, a customer, or an analyst, I watched IBM's Watson beat two smart humans in three games of Jeopardy.  However, I was able to do so under more privileged conditions than sitting on my couch.  Along with my colleague John Rymer, I attended an IBM event in San Francisco, at which two of the IBM scientists who had developed Watson provided background on Watson prior to, during commercial breaks in, and after the broadcast of the third and final Jeopardy game.  We learned a lot about the time, effort, and approaches that went into making Watson competitive in Jeopardy (including, in answer to John's question, that its code base was a combination of Java and C++).  This background information made clear how impressive Watson is as a milestone in the development of artificial intelligence.  But it also made clear how much work still needs to be done to take the Watson technology and deploy it against the IBM-identified business problems in healthcare, customer service and call centers, or security.

The IBM scientists showed a scattergram of the percentage of Jeopardy questions that winning human contestants got right vs. the percentage of questions they attempted to answer, which showed that these winners generally got 80% or more of their answers right while answering 60% to 70% of the questions.  They then showed line charts of how Watson performed on the same two variables over time, with Watson well below this zone at the beginning, then climbing month by month until, by the time of the contest, it was winning over two-thirds of its test matches against past Jeopardy winners.  But what I noted was how long the training process took before Watson became competitive, not to mention the amount of computing and human resources IBM put behind the project.
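The mechanism behind those two axes is worth spelling out: a question-answering system only buzzes in when its confidence in its top answer clears some threshold, so raising the threshold trades a lower percentage of questions attempted for a higher percentage answered correctly.  The short Python sketch below illustrates that tradeoff on simulated data; the confidence scores, the 10,000-question pool, and the accuracy curve are my assumptions for illustration, not figures from IBM.

```python
# Illustrative sketch (not IBM's code): how a confidence threshold trades off
# the share of questions attempted against the share answered correctly --
# the two axes of the scattergram described above.
import random

random.seed(42)

# Hypothetical pool of questions: each has a model confidence score and a flag
# for whether the model's top answer is actually correct. Higher-confidence
# answers are assumed to be right more often.
questions = []
for _ in range(10_000):
    confidence = random.random()
    is_correct = random.random() < 0.3 + 0.65 * confidence
    questions.append((confidence, is_correct))

def attempt_rate_and_precision(threshold):
    """Answer only when confidence exceeds the threshold; report
    (fraction attempted, fraction of attempts answered correctly)."""
    attempted = [(c, ok) for c, ok in questions if c >= threshold]
    if not attempted:
        return 0.0, 0.0
    precision = sum(ok for _, ok in attempted) / len(attempted)
    return len(attempted) / len(questions), precision

for threshold in (0.0, 0.3, 0.5, 0.7):
    rate, precision = attempt_rate_and_precision(threshold)
    print(f"threshold={threshold:.1f}  attempted={rate:.0%}  correct={precision:.0%}")
```

Watson's long training run amounted to pushing that whole curve upward: at any given attempt rate, getting more of its attempted answers right.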

As an analyst who has been promoting the concept of Smart Computing, I am interested in Watson as a step toward the truly smart computer, like HAL in 2001: A Space Odyssey or Data in Star Trek, that can interact with humans as if it were a person.  Of course, that still remains far off in the future.  What is more timely is what Watson implies for deploying real-time analytical tools that interpret human language (both spoken and written), define and understand a question, and generate a relevant answer.  Much of the technology that IBM built for Watson can be deployed against other types of tasks besides winning a Jeopardy game, to make solutions for those tasks "smarter."  This technology addresses all five of the A's of smart computing that we have identified: Awareness, Analysis, Alternatives, Actions, and Auditability.

However, my coverage of automated spend analysis products has made me conscious of the challenges that IBM faces in applying Watson's AI technology to other business problems.  Automated spend analysis products start with a set of neural network, artificial intelligence, and rules engines to cleanse, normalize, and classify raw data on enterprise spending from invoices, purchase orders, and other data sources.  These algorithms are necessary conditions for a competitive product, but they are not sufficient.  What is really needed is a knowledge base that understands how the specific information in an invoice or purchase order translates into a product category, a vendor, and so on.  That knowledge base is only built up over time, through viewing, analyzing, and classifying hundreds of thousands or millions of invoices and purchase orders.  As the IBM scientists made clear, Watson had to build up its own knowledge base of Jeopardy-related information over a long period of time and many trials.  As IBM attempts to apply the Watson technology to other business problems, it will have to go through similarly lengthy exercises of building up the knowledge bases relevant to those problems.
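To make concrete what that knowledge base does, here is a minimal Python sketch of a spend-classification lookup: raw vendor strings from invoices are cleansed and normalized, then matched against a mapping built up from previously classified records.  The vendor names, categories, and mapping entries are hypothetical examples of mine, not taken from any actual spend analysis product; the point is that the cleansing rules are simple, while the value sits in the accumulated mapping.

```python
# Minimal sketch of a hypothetical spend-classification step. The "knowledge
# base" below stands in for the mapping a vendor accumulates over millions of
# classified invoices; all entries are illustrative.

# Raw vendor strings map to a normalized vendor name and a spend category.
KNOWLEDGE_BASE = {
    "intl business machines": ("IBM", "IT Hardware & Services"),
    "ibm corp": ("IBM", "IT Hardware & Services"),
    "staples inc": ("Staples", "Office Supplies"),
}

def normalize(text: str) -> str:
    """Cleanse raw invoice text: lowercase, drop punctuation, collapse spaces."""
    cleaned = "".join(ch for ch in text.lower() if ch.isalnum() or ch.isspace())
    return " ".join(cleaned.split())

def classify(raw_vendor: str):
    """Classify an invoice line by vendor, falling back to 'Unclassified' when
    the knowledge base has no match -- the gap that takes years of accumulated
    data to close."""
    return KNOWLEDGE_BASE.get(normalize(raw_vendor), (raw_vendor, "Unclassified"))

for vendor in ["Int'l Business Machines", "IBM Corp.", "Acme Widgets LLC"]:
    name, category = classify(vendor)
    print(f"{vendor!r} -> vendor={name!r}, category={category!r}")
```

The code itself is trivial; what makes one product better than another is how many of those mappings it has already learned, which is exactly the asset Watson had to accumulate for Jeopardy and will have to accumulate again for each new domain.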

This situation has three implications for competitors to IBM in advanced analytics and artificial intelligence. 

First, they do have time to catch up and match Watson's technological sophistication, because of the learning curve that IBM will have to follow to develop commercial applications based on Watson.  The tools and techniques that IBM developed to allow Watson to win Jeopardy will leak out to other vendors, who will also be looking for ways to reverse engineer what IBM did and to identify even better tools and techniques to accomplish the same goal.  IBM does have a lead here, but it is not insuperable.

Second, they don't have time to wait if they want to build knowledge bases comparable to those IBM has already built and will start to build in other areas.  Knowledge bases take time and effort to establish, and there are few shortcuts.  Once one vendor gains a lead in a knowledge base, it is hard for competitors to catch up, because they have to travel the same learning curve, but at a later point in time.

Third, they should study the business and industry problems that IBM will focus Watson's technology on, then start looking for other business and industry problems.  Should IBM take the lead in building knowledge bases for healthcare diagnosis, for example, that would give it only a limited advantage in building a knowledge base for, say, legal matters.  There are many, many industries that need smart computing solutions, so there is no need for IBM's competitors to follow its path in terms of where it deploys Watson's technology to build smarter solutions.