BI Is Dead! Long Live BI!

 

Executive Summary

We suggest a dozen best practices needed to move Business Intelligence (BI) software products into the next decade. While five “elephants” occupy the lion’s share of the market, the real innovation in BI appears to be coming from smaller companies. What is missing from BI today is the ability for business analysts to create their own models in an expressive way. Spreadsheet tools exposed this deficiency in BI long ago, but their inherent weaknesses in data quality, governance and collaboration make them a poor candidate to fill this need. BI is well-positioned to add these features, but must first shed its reliance on fixed-schema data warehouses and read-only reporting modes. Instead, it must provide businesspeople with the tools to quickly and fully develop their models for decision-making.

 

Why BI Must Transition From The Past Century

In this avant-garde era of Big Data, cloud, mobile and social, the whole topic of BI is a little “derrière.” BI is a phenomenon of the previous two decades, yet the worldwide market for BI tools (not including services or surrounding technologies such as data warehousing and data integration) is greater than $10 billion per year. It is still a very significant market and will, for some time, dwarf spending on the Big Data top gun, Hadoop, an open-source distribution that generates little direct software revenue. The lion’s share of revenue in the Big Data market will continue to be hardware and services, not software, unless you consider the application of existing technologies, especially database and data integration software, as part of Big Data.

Because BI is still alive, it’s worth revisiting some concepts I wrote about a few years ago: abstraction and model-driven design in BI. In the proto-BI days, when the field was still known as Decision Support Systems (DSS), high-level declarative languages were used to create business models that could handle user-entered (or background-loaded) data. These systems were interactive, allowing for what-if analysis, sensitivity testing of the models and even the application of advanced statistical techniques.

Data warehousing changed all of that. Performance of relational databases for interactive reporting from static schemas was a challenge. Adding data to the warehouse interactively through iterative analysis was out of the question. Adding entities to the schema to accommodate new models required data modeling and schema changes, often with data reloads and associated testing before implementation.

As data warehouses became the preferred way to provide data for reporting and analysis, BI vendors that focused on the read-only nature of data warehousing prospered and dominated the field. Business modeling fell from favor or, more precisely, fell to Excel. The only exception was found in budgeting and planning packages, mostly based on Multidimensional Online Analytical Processing (MOLAP) databases, predominantly Essbase and Microsoft BI.

 

The Era Of Big Data Means Business Analytics

The major BI vendors of the late 1990s through today are BusinessObjects (SAP), Cognos (IBM), SAS, Hyperion (Oracle), Microsoft and Microstrategy. Among these six vendors (who together comprise more than two-thirds of the BI market), only about 20 percent of combined revenue came from tools that provided modeling capabilities. The rest came from strictly read-only reporting against data warehouses and marts[i].

Now that the era of Big Data is here, modeling has to move up a notch. While newer technologies are in play for capturing and massaging Big Data, the need for business analytics is greater than ever. In the past, the BI calculus more or less ended with informing people. Today, BI must enable actions and decisions that are supported by deep insight. Excel may be a good container for interacting with models, but its internal capabilities aren’t sufficient. The need for BI tools will not disappear, but it’s time to break the read-only mold.

 

Moving From Data To Decisions Through Business Modeling

Business modeling can be an imprecise term. But in general, it means creating descriptive replicas of a part of a business — such as assets, processes or optimizations — in terms that are consonant with the people (and processes) that use them. Usually, the goal is to render these models into computer-based applications. In general, a business model is created by someone who has certain knowledge about a process or function in the business. This can range from a single fact, such as how operating cash flow is calculated, to something as broad as how the manufacturing plants operate.

To be effective, businesspeople need more than access to data: They require a seamless process that lets them interact with the data and drive their own models and processes. In today’s environment, these steps are typically disconnected and, therefore, expensive and slow to maintain, and they sit outside the data quality controls of the IT department. The solution: a better approach to business modeling coupled with an effective architecture that separates physical data models from semantic ones. In other words, businesspeople need tools that address physical data through an abstraction layer, allowing them to work only with the meaning of data, not its structure, location or format.
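
To make the idea concrete, here is a minimal sketch, in Python, of what such an abstraction layer might look like. Everything in it (the table, column and measure names, the single-table scope) is hypothetical; the point is only that the analyst asks for business terms while the mapping to physical structures lives in one governed place.

```python
# A minimal sketch of a semantic abstraction layer: business terms on top,
# physical schema details hidden underneath. All table and column names
# are hypothetical, and join logic across tables is omitted for brevity.

SEMANTIC_LAYER = {
    # business term -> (physical table, physical column, aggregation)
    "revenue":        ("gl_fact", "rev_amt_usd",  "SUM"),
    "operating cost": ("gl_fact", "opex_amt_usd", "SUM"),
}

def query_for(terms, table="gl_fact"):
    """Translate business terms into SQL without exposing the schema."""
    selects = []
    for term in terms:
        tbl, column, agg = SEMANTIC_LAYER[term]
        selects.append(f'{agg}({tbl}.{column}) AS "{term}"')
    return f"SELECT {', '.join(selects)} FROM {table}"

# The analyst asks for meaning, not structure:
print(query_for(["revenue", "operating cost"]))
```

If the physical schema changes, only the mapping changes; every model built on the business terms keeps working, which is the freedom the abstraction is meant to buy.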

Abstraction is applied routinely to systems that are somewhat complex and especially to systems that frequently change. A 2012-model car contains more processing power than most computers of only a decade ago. Driving the car, even under extreme conditions, is a perfect example of abstraction. Stepping on the gas doesn’t really pump gas to the engine; it alerts the engine management system to increase speed by sampling and signaling dozens of circuits, relays and microprocessors to achieve the desired effect. These actions are subject to many constraints, such as limiting engine speed and watching the fuel-air mixture for maximum economy or minimum emissions. If the driver had to attend to all of these things directly, he would never get out of the driveway.

Data warehouses and BI tools still rely on business users understanding at least some of the data models and semantics, and sometimes the intricacies of crafting queries. This is a huge barrier to progress. Businesspeople need to define their work in their own terms. A business modeling environment is needed for designing and maintaining structures such as data warehouses and everything associated with them, and it is especially important for handling the inevitable changes in those structures. It is likewise important for leveraging the latent value of those structures through analytical work, which is enhanced by understandable models that are relevant and useful to businesspeople.

BI isn’t standalone anymore. To close the loop, it has to be implemented in a standards-based architecture. And it has to be extensible and resilient, since the boundaries of BI are fuzzier and more porous than those of other software. While most BI applications are fairly static, the ones of most value to companies are the flexible and adaptable ones. In the past, it was acceptable for individual “power users” to build BI applications for themselves or their group without regard to any other technology or architecture in the organization. Over time, these applications became brittle and difficult to maintain because the initial design was not robust enough to absorb the continuous refinements and changes the organization needed. Because change is a constant, BI tools need to provide the same adaptability at the definitional level that they currently provide at the informational level. The answer is to provide tools that allow businesspeople to model.

 

Business Modeling Emerges As A Critical Skill

All businesspeople use models, though most are tacit or implied, not described explicitly. Evidence of these models can be found in the way workers go about their jobs (tacit models) or in the models they have authored in a spreadsheet (explicit ones). The problem with tacit models is that they can’t be communicated or shared easily. The problem with making them explicit is that it just isn’t convenient enough yet. Most businesspeople can conceptualize models: any person with an incentive compensation plan can explain a very complicated model. But most people will not make the effort to learn how to build a model if the technology is not accessible.

There are certain models that almost every business employs in one form or another. Pricing is a good example and, in a much more sophisticated form, yield management, such as the way airlines price seats. Most organizations look at risk and contingencies, a hope-for-the-best, prepare-for-the-worst exercise. Allocation of capital spending or, in general, allocation of any scarce resource is a form of trade-off analysis that has to be modeled. Decisions about partnering and alliances, as well as merger or acquisition analysis, are also common modeling problems.

Models are also characterized by their structure. Simple models are built from data inputs and arithmetic. More complicated models use formulas and even multi-pass calculations, such as allocations. The formulas themselves can be statistical functions and can perform projections or smoothing of data. Beyond this, probabilistic modeling is used to model uncertainty, such as calculating reserves for claims or bad loans.
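
As an illustration of that last category, here is a minimal, hypothetical sketch of a probabilistic reserve model: simulate a year of claims many times and hold reserves at a high percentile of the simulated totals. The claim-count and severity assumptions are invented for the example.

```python
# A minimal, hypothetical sketch of probabilistic modeling for reserves:
# simulate total claim cost many times and reserve at a high percentile.
import random

def simulate_total_claims(expected_claims=120, severity_scale=8_000.0):
    """One simulated year: a crude claim count, lognormal-ish severities."""
    count = sum(1 for _ in range(expected_claims * 2)
                if random.random() < 0.5)          # crude Poisson stand-in
    return sum(random.lognormvariate(0, 0.75) * severity_scale
               for _ in range(count))

def reserve(percentile=0.95, trials=10_000):
    """Reserve so that ~95 percent of simulated years fall under it."""
    totals = sorted(simulate_total_claims() for _ in range(trials))
    return totals[int(percentile * trials)]

print(f"Reserve at 95th percentile: {reserve():,.0f}")
```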

When logic is introduced to a model, it becomes procedural. Mixing calculations and logic yields a very potent approach. The downside is that procedural models are difficult to develop with most tools today and even more difficult to maintain and modify, because they require the modeler to interact with the system at a coding or scripting level; most businesspeople lack the temperament or the training, or both, to do so. It is not reasonable to assume that businesspeople, even the power users, will employ good software engineering technique, nor should they be expected to. Instead, the onus is on the vendors of BI software to provide robust tools that facilitate good design technique through wizards, robots and agents.
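
A short, hypothetical sketch shows why the mix is potent but hard to maintain: even a simple commission model accumulates branches, and each new policy exception is more procedural logic for someone to modify later.

```python
# A hypothetical procedural model: calculations interleaved with logic.
# Powerful, but every new rule is another branch someone must maintain.

def commission(rep_sales: float, quota: float, is_new_territory: bool) -> float:
    attainment = rep_sales / quota if quota else 0.0
    rate = 0.03                          # base rate
    if attainment > 1.0:                 # accelerator above quota
        rate += 0.02
    if attainment > 1.5:                 # second-tier accelerator
        rate += 0.02
    if is_new_territory:                 # policy exception, added later
        rate = max(rate, 0.05)
    return rep_sales * rate

print(commission(rep_sales=600_000, quota=400_000, is_new_territory=False))
```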

 

Learn From A Dozen Best Practices In Modeling

For any kind of modeling tool to be useful to businesspeople, supportable by the IT organization and durable enough over time to be economically justifiable, it must provide or allow the following capabilities:  

  1. Level of expressiveness. This must be sufficient for the specification, assembly and modification of common and complex business models without code; it should accommodate all but the most esoteric kinds of modeling.
  2. Declarative method. Such a method means that each “statement” is incorporated into the model without regard to its order, sequence or dependencies. The software handles calculation ordering and optimization, freeing modelers to design whatever they can conceive (see the sketch after this list).
  3. Model visibility. This enables the inspection, operation and communication of models without extra effort or resources. Models are collaborative, and unless they can be published and understood, no collaboration is possible.
  4. Abstraction from data sources. This allows models to be made and shared in language and terms unconnected to the physical characteristics of data; it gives managers of the physical data much greater freedom to pursue optimization and performance-improvement efforts.
  5. Extensibility. The native capabilities of the modeling tool must be robust enough to extend to virtually any business vertical, industry or function. Most of the leading BI tools are owned by much larger corporate parents, potentially limiting or directing the development roadmap of the BI offering in alignment with the wider vision of the parent. Smaller BI and analytics pure-plays tend to have a truer vision about BI (until they are acquired). Because analysts who think out of the box gain valuable insight, the BI tool cannot impose vendor-specific semantics and functions, lock analysts in, or limit distribution of insights with expensive licenses.
  6. Visualization. Early BI tools were constrained by scarce computing resources, but with today’s abundance of processing power, an effective BI platform should include visualization. Single-click visualization of models and results aids in understanding and communicating complicated models.
  7. Closed-loop processing. This is essential because business modeling is not an end-game exercise, or at least it shouldn’t be. It is part of a continuous execute-track-measure-analyze-refine-execute loop. A modeling tool must be able to operate cooperatively in a distributed environment, consuming and providing information and services through a standards-based protocol. The closed-loop aspect may be punctuated by steps managed by people, or it may operate as an unattended agent, or both.
  8. Continuous enhancement. This requirement is born of two factors. First, with the emerging standards of service-oriented architectures, web services and XML, the often talked-about phenomenon of organizations linked in a continuous value chain with suppliers and customers will soon become a reality and will put great pressure on organizations to be more nimble. Second, it has finally crept into the collective consciousness that development projects involving computers are consistently under-budgeted for maintenance and enhancement. The mindset of “phases” or “releases” is already beginning to fray, and forward-looking organizations are beginning to differentiate tool vendors by their ability to enable enhancement without extensive development and test phases.
  9. Zero code. In addition to the fact that most businesspeople are neither able nor inclined to write code, there is sufficient computing power at reasonable cost to allow ever more sophisticated layers of abstraction between modelers and computers. Code implies labor, error and maintenance; abstraction and declarative modeling imply flexibility and sustainability. Most software “bugs” are iatrogenic; that is, they are introduced by the programming process itself. When code is generated by another program, the range of programmatic errors is limited to the latent errors in the code generator, not the errors introduced by programmers.

  10. Core semantic information model (ontology). Abstraction between data and the people or programs that access the data isn’t very useful unless the meaning of the data and its relationships to everything else are available in a repository.

  11. Collaboration and workflow. These capabilities are essential to connecting analytics to every other process within and beyond the enterprise. A complete set of collaboration and workflow capabilities supplied natively within a BI tool is not necessary, though. Instead, the ability to integrate (this does not mean “be integrated,” which implies lots of time and money) with collaboration and workflow services across the network, without latency or conversion problems, is preferable.

  12. Policy. This may be the most difficult requirement of them all. Developing software to model business policy is tricky. For example: “Do not allow contractor hours in the budget to exceed 10 percent of non-exempt hours.” Simple calculations through statistical and probabilistic functions have been around for over three decades. Logic models that can make decisions and branch are more difficult to develop, but still not beyond the reach of today’s tools. But a software tool that allows businesspeople to develop models in a declarative way to actually implement policies is on a different plane. Today’s rules engines are barely up to the task, and they require expert programmers to set them up. Policy in modeling tools is in the future, but it will depend on all of the above requirements (the sketch below includes a policy rule of this kind).
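
To ground two of these requirements, the declarative method (No. 2) and policy (No. 12), here is a minimal sketch of a declarative modeling engine. It is hypothetical, not a description of any product: statements are declared in any order, the engine derives the calculation sequence from the dependencies itself, and the contractor-hours policy from requirement 12 is just another declaration checked against the results.

```python
# A hypothetical sketch of a declarative modeling engine. Statements can be
# declared in any order; the engine infers evaluation order from dependencies.
from graphlib import TopologicalSorter  # Python 3.9+

model = {
    # name: (list of inputs, function of those inputs) -- order doesn't matter
    "contractor_pct": (["contractor_hours", "nonexempt_hours"],
                       lambda c, n: c / n),
    "total_hours":    (["contractor_hours", "nonexempt_hours"],
                       lambda c, n: c + n),
    "contractor_hours": ([], lambda: 1_200.0),   # user-entered data
    "nonexempt_hours":  ([], lambda: 20_000.0),  # user-entered data
}

# A policy declared in business terms: contractor hours may not exceed
# 10 percent of non-exempt hours (the example from requirement 12).
policies = [("contractor_pct", lambda v: v <= 0.10)]

def evaluate(model):
    """Sort statements by dependency, then compute each one exactly once."""
    graph = {name: set(deps) for name, (deps, _) in model.items()}
    results = {}
    for name in TopologicalSorter(graph).static_order():
        deps, fn = model[name]
        results[name] = fn(*(results[d] for d in deps))
    return results

results = evaluate(model)
for name, check in policies:
    status = "OK" if check(results[name]) else "POLICY VIOLATION"
    print(name, results[name], status)
```

Because the ordering is inferred, a modeler can add or change a statement without reworking the rest of the model, which is exactly the adaptability at the definitional level argued for above.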

 

Today’s BI Will Not Be Tomorrow’s BI

It is an open question whether BI has been, in the long run, successful or not. The take-up of BI in large organizations has stalled at 10 to 20 percent, depending on which survey you believe. I believe that expectations of broad acceptance of BI were overly optimistic and that the degree to which it has been adopted is probably at the right level for the functionality it delivered.

Will BI survive? Yes, but we may not recognize it. The need to analyze and use data that are produced in other systems will never go away, but BI will be wrapped in new technologies that provide a more complete set of tools. Instead of managing from scarcity of computing resources, BI will be part of a “decision management” continuum — the amalgam of predictive modeling, machine learning, natural language processing, business rules, traditional BI and visualization and collaboration capabilities.

Pieces of this “new” BI are already here, but within two to three years, it will be in full deployment.


[i] These numbers are representative estimates only and are based on our discussions with vendors and other published information. It is difficult to be precise with BI because the market leaders offer a wide variety of software and services, some for licensing and some embedded in non-BI products. In addition, BI itself has a number of fuzzy definitions. For the purposes of this discussion, we use a definition of query and reporting tools, including online analytical processing (OLAP), but exclusive of data warehousing, extract/transform/load (ETL), analytic databases and advanced analytics/statistics packages.

 Copyright 2012 Neil Raden and Hired Brains Inc
