I read with interest Beth Schultz’s recent blog about Agile BI and its connection to Agile Integration. I would take it even further, to include nearly all aspects of the information infrastructure – in other words, “Agile Everything.”
Certainly, what Ms. Schultz describes as the benefit of Agile BI (being able to deliver actionable information to decision makers when it is most valuable) requires not only the right report, but also all the current data in the warehouse or data mart. However, achieving this goal on an ongoing basis requires agility in more than the BI layer and the data integration layer – it also requires an agile data warehouse and even agility in the definition of the entire infrastructure.
This is driven by the one thing that wasn’t mentioned: the data model – the context in which the information is presented. The one constant in business is change. Companies and products are acquired, divested, and realigned constantly. This requires that the underlying data structure, which is really the context of the information, be able to change at a rate that keeps pace with the business. Having a report with yesterday’s data does me little good if it doesn’t include the results of a major acquisition completed last month.
To be fair, Ms. Schultz does note that leaders in this area allow for “localized, one-off integration,” but what happens when the integration needs to be more than just one-off or local? Major changes in the information needs of an organization frequently extend over multiple subject areas and lines of business. Dealing with this level of change requires more than just agile integration. It requires agile requirements gathering, agile needs analysis, agile data modeling and agile project management. It requires a highly automated product suite that allows iterative development very early in the development cycle – during the design phase. Iterating during design requires automation that can deliver prototype results almost instantly. Effectively, you need to be able to do “what if” analysis of your entire BI infrastructure.
How can this be done? Imagine an environment that allows you to simply draw changes to the infrastructure on a diagram, and then have those changes automatically reflected in the table structure, the ETL processes and the data marts. Imagine all that happening without having to hand off designs and specifications between architects, ETL coders and DBAs. Imagine that this also automatically changed your data governance environment – updating the governed information and revalidating the existing data against the new rules, ensuring that the information that ultimately lands in a report or an analysis screen is not only technically accurate but also fit for purpose. This is what Kalido customers enjoy today with the Kalido Information Engine.
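To make the model-driven idea concrete, here is a deliberately tiny, purely illustrative sketch – not Kalido’s actual engine, whose internals are not described here – of how a declarative model change (say, adding an attribute after an acquisition) could regenerate schema statements automatically instead of requiring hand-edited tables and specs. All names (`Entity`, `dim_product`, `source_company`) are hypothetical:

```python
# Illustrative sketch only: a toy model-driven generator. The business
# model is declared as data, and DDL is derived from it, so changing
# the model changes the schema without a manual hand-off to a DBA.

from dataclasses import dataclass


@dataclass
class Entity:
    name: str
    attributes: dict  # attribute name -> SQL type


def generate_ddl(entity: Entity) -> str:
    """Render a CREATE TABLE statement from the model definition."""
    cols = ",\n  ".join(f"{a} {t}" for a, t in entity.attributes.items())
    return f"CREATE TABLE {entity.name} (\n  {cols}\n);"


def diff_entities(old: Entity, new: Entity) -> list:
    """Produce ALTER statements for attributes added to the model."""
    added = [a for a in new.attributes if a not in old.attributes]
    return [
        f"ALTER TABLE {new.name} ADD COLUMN {a} {new.attributes[a]};"
        for a in added
    ]


# Original model: a product dimension.
v1 = Entity("dim_product", {"product_id": "INT", "name": "VARCHAR(100)"})

# After an acquisition, the business needs a source-company attribute.
v2 = Entity("dim_product", {"product_id": "INT", "name": "VARCHAR(100)",
                            "source_company": "VARCHAR(50)"})

print(generate_ddl(v1))
for stmt in diff_entities(v1, v2):
    print(stmt)
```

In a real product this generation step would also cover the ETL mappings and governance rules, but even this toy version shows the point: the “what if” iteration happens on the model, and the plumbing follows.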
Can you employ an “agile everything” approach without such a tool? Perhaps, but it is extraordinarily difficult to manage the hand-offs that must occur when using more traditional tools. Lacking the automation of a single integrated environment, each iteration will necessarily grow longer, which will make it harder to manage scope. Ultimately, it stops being truly iterative and becomes “mini-waterfall” instead.
I’d love to hear other people’s experiences with these kinds of projects.