Yesterday IBM announced its intent to acquire the advanced analytics vendor SPSS for $1.2 billion.
The acquisition fills what had been a void in IBM’s business intelligence capabilities and further expands its Information on Demand portfolio, which has grown through $10 billion in acquisitions over the past several years.
Predictive analytics has been a hot topic for years, but one with little consensus on how to bring it from the back room to the front lines. It’s clear that building good predictive models requires a high degree of expertise and is not something that will become mainstream any time soon. However, incorporating the results of those models into everyday decisions that even the most novice information users can consume and act upon has become the holy grail. So what does that mean in practice — store everything in the database? Execute the models at run time and surface the results in a report?
Not surprisingly, SAS has made the most inroads here, both by pushing analytics into the database for faster execution against larger data volumes and by surfacing the results in standard reports in its BI platform. Among the leading BI vendors, Microsoft and MicroStrategy also have good solutions that surface results in Excel 2007 and in a MicroStrategy report, respectively. (See the BI Scorecard Summary report for more information.)
Interestingly, SAP BusinessObjects began shipping an OEM’d version of SPSS last spring. The SAP BusinessObjects Predictive Workbench allows the BusinessObjects universe to act as a data source for Clementine modelers. It will be interesting to see whether this agreement remains in place once the acquisition closes. The challenge, of course, is that beyond SAS and SPSS, the predictive analytics space consists predominantly of smaller players.
Amid this announcement, IBM says the timing was purely coincidental that many press and analysts were at its R&D offices attending a briefing on its new “Smart Analytics System” (due out in September) and “Smart Analytics Optimizer” (due out in Q4). The Smart Analytics System is more than an appliance: it includes the hardware, software, and services optimized for a particular BI deployment, or what IBM calls an “analytic workload,” whether predictive or not. IBM guarantees a 50% lower cost of ownership versus the current approach of buying and deploying components a la carte. A lower cost of ownership and faster deployment time are of course appealing. However, it seemed odd to me that the first versions of the systems include Cognos only as an option, not as a standard component. The same applies to ETL capabilities. In this regard, while the vision is enticing, I can’t help but remain skeptical that customers will retain the degree of flexibility they currently have.
Or has the market gotten to the point that time to value and lower cost of ownership trump flexibility and the desire to mix and match components?