In our recent blog post, Connecting, Collecting and Understanding Data, we discussed the importance of semantic integration and focusing on the business view of data to help ensure key information isn’t lost during project development.
Too often, “disconnects” between IT and the business can delay or derail a master data management or data warehouse project, and/or cause significant cost overruns. The Magnitude Business Information Modeler tool (BIM) promotes semantic integration because it models the way data is consumed by the business, not the way it is stored. In this article, we will review 10 areas of disconnect these types of projects often encounter, then we will discuss how BIM, with its business information model driven approach, minimizes these disconnects and drives project development.
#1: Misunderstood Business Requirements
Business representatives usually document their requirements in business terms with lists, diagrams, report descriptions and conceptual diagrams. However, because of unclear definitions and inconsistent business rules, there can still be significant disagreement and conflict among business users. Business analysts shouldn’t expect 100 percent accurate requirements in the first round; trying to finalize requirement definitions in a single cycle is unrealistic.
#2: Logical Model Agreement
The data architect combines the initial understanding of business requirements (however incomplete) with analysis of the source data to create a “logical data model”. The data model is generally documented using a data modeler tool, then verified with the business representatives.
But there’s a problem here. This is a data model, and most business users will need some explanation of what they are viewing in these technical diagrams. Already unsure about the details of their business requirements, the business representatives tend to make only minimal corrections, then “agree” with the data architect on the logical model so that the project can progress.
#3: Understanding Physical Design
The logical model is converted to database schema designs. Then the database designers take on the physical design.
The architect might apply design best practices that differ from the logical model communicated to, and agreed with, the business representatives. Many of these design techniques are complex enough to prompt long discussions among the IT professionals, and nobody can expect the business representatives to follow such deeply technical concepts. The focus shifts to IT issues, and the business representatives once again take a back seat.
Now the first version of the design is sealed in the modeling tool, which generates an impressive database creation script. But both business and IT know that there’s already a semantic gap.
#4: Database Administrators Fine-Tune Design
From here the data architect hands the design responsibility to a separate development team. They build the solution and create the physical schema. DBAs make changes for standardization, performance tuning and naming conventions. Perhaps they adjust the schema for the underlying DBMS and hardware platform.
The business has no visibility into these changes because they are deemed to be just “technical” in nature. However, because the database is now different from what was designed, those changes might have an impact on the originally proposed business requirements.
#5: Design Changes During Development
During the development phase, the team might need to change the design many times, for technical or for business reasons.
Once the initial loading starts, and the tables are filled with actual data, they discover new issues, such as duplicate identifiers or non-standard data values. This typically results in minor changes (like adding a flag column or a new identifier).
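The kind of check that surfaces these issues can be sketched in a few lines. Everything below is illustrative: the rows, identifiers and code list are hypothetical, not taken from any particular project.

```python
from collections import Counter

# Hypothetical sample of rows loaded from a source system.
rows = [
    {"customer_id": "C001", "country": "US"},
    {"customer_id": "C002", "country": "USA"},  # non-standard data value
    {"customer_id": "C001", "country": "US"},   # duplicate identifier
]

# Flag identifiers that appear more than once.
id_counts = Counter(r["customer_id"] for r in rows)
duplicates = [cid for cid, n in id_counts.items() if n > 1]

# Flag values outside the agreed code list.
valid_countries = {"US", "DE", "FR"}
bad_values = [r for r in rows if r["country"] not in valid_countries]

print(duplicates)       # ['C001']
print(len(bad_values))  # 1
```

Each finding of this kind typically triggers one of those “minor” schema changes, such as the flag column mentioned above.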
Most people agree that the development should follow some sort of iterative method, so issues are raised and resolved with the business representatives. At the same time, the business team might propose changes of its own.
The question is where to apply those changes. Should they be incorporated into the data modeling tool, and the development instance be recreated? This is a matter of change management discipline. Very few projects go back to the data modeler due to time or resource restrictions. Most changes identified during the development phase are applied directly on the development instance. When this occurs, the logical model becomes totally disconnected from the physical database, and there is no longer a logical record that ties to a physical instance.
#6: Data Modeling Tool Offline
With today’s modeling tools, it’s possible to make changes to the logical or physical model, then run a database modification script. However, not all changes are supported, and it is unlikely to work on the development DBMS, where changes have already been made.
In addition, the modeling tool is not aware of any data already in the system. So something might be valid from the model perspective, but existing data prevents the model from deploying.
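A minimal illustration of this, using SQLite purely as a stand-in DBMS (the table and column are hypothetical): the change is perfectly valid from the model’s perspective, but the data already in the table blocks it.

```python
import sqlite3

# A table that already holds data on the development instance.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO customer (id, name) VALUES (1, 'Acme')")

rejected = False
try:
    # The modeling tool would happily emit this DDL: add a mandatory column.
    conn.execute("ALTER TABLE customer ADD COLUMN status TEXT NOT NULL")
except sqlite3.OperationalError as exc:
    # The DBMS refuses: the existing row has no value for the new column.
    rejected = True
    print("schema change rejected:", exc)
```

Any DBMS hits the same class of failure once tables contain rows that a new constraint cannot satisfy, which is exactly the situation the modeling tool cannot see.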
Data architects usually are not involved in changes made after their initial design, so they’re already disconnected from the “as-built” system. And the business representatives are cut off because IT is speaking a different language.
#7: Semantics in Business Dictionary
Experienced architects and business analysts are keen to capture the meanings of their business information objects and business rules. They usually document these in the data modeling tool or in a separate business dictionary document. Because the end-to-end software platform from sources to BI is so fragmented, very few projects carry these semantic definitions through into their development instances.
#8: Inadequate Master Data Management
When data comes from multiple sources, projects always hit data quality and integration issues, particularly around shared master data. This is a major consistency problem for business users, who expect a harmonized and complete view of their business. Sales, for example, typically organizes revenue differently than Finance. These inconsistencies are often patched with manual spreadsheets and/or ETL workarounds, so although the reports may look consistent, they aren’t fully accurate and don’t reflect the business. The real challenge comes when business needs change: without MDM, each update demands significant time and effort, compared with having a process, a tool and a designated steward to manage updates quickly.
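The mechanics of the problem can be sketched as follows. The source extracts, the keys and the steward-maintained cross-reference below are all hypothetical; the point is that a shared mapping makes conflicting figures visible instead of patching them away.

```python
# Hypothetical extracts: Sales and Finance key the same customers
# differently, so their totals disagree until the keys are mapped.
sales = {"ACME-01": 120_000, "GLOBEX-7": 80_000}
finance = {"10045": 115_000, "10099": 80_000}

# A master-data cross-reference, maintained by a designated steward,
# maps each source key to one golden record.
xref = {"ACME-01": "CUST-1", "10045": "CUST-1",
        "GLOBEX-7": "CUST-2", "10099": "CUST-2"}

# Collect every source figure under its golden record.
merged = {}
for source in (sales, finance):
    for key, amount in source.items():
        merged.setdefault(xref[key], []).append(amount)

# Golden records whose sources disagree are now explicit.
conflicts = {g: vals for g, vals in merged.items() if len(set(vals)) > 1}
print(conflicts)  # {'CUST-1': [120000, 115000]}
```

With the cross-reference in place, a change in business needs means updating one mapping rather than reworking every spreadsheet and ETL patch.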
#9: BI Inconsistencies
Most BI tools offer a conceptual layer, enabling the definition of data in business terms and mapping these to the data sources. These architectural components act as a buffer between the underlying complexity of the sources and the reports that users work with.
But this architectural layer is fragmented by nature because it does not hold persistent data across the business. There is no guarantee that a business rule defined for one object is consistent with a rule used on another report. These conceptual layer components are usually maintained independently, so they can become disconnected from their main source of data.
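For example, two independently maintained semantic layers can define the same measure differently. The field names and formulas below are purely illustrative:

```python
# A hypothetical order line shared by two reports.
order = {"gross": 1000.0, "discount": 100.0, "freight": 50.0}

# Report A's semantic layer defines net revenue as gross minus discount.
net_revenue_a = order["gross"] - order["discount"]

# Report B, maintained separately, also subtracts freight.
net_revenue_b = order["gross"] - order["discount"] - order["freight"]

# The same "Net Revenue" label now shows two different numbers.
print(net_revenue_a, net_revenue_b)  # 900.0 850.0
```

Neither definition is wrong in isolation; the inconsistency only appears when the two reports are compared, which is exactly when business users lose trust in the numbers.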
#10: Responsiveness to Change
The business world changes frequently. The business returns to IT with new requirements, such as a new chart of account structure, or something even more significant, like a company merger. As a result, changes to the data structures and infrastructure are needed, and the business usually requires a rapid response. Just as importantly, the business needs to keep a corporate memory, so that meaningful comparisons can be made between what was planned last year with today’s actual business.
It is very difficult to keep to design and development principles when business requirements, the data modeling tool, the underlying DBMS, the ETL tool and the BI tools are all disconnected and there is pressure to deliver quickly.
Here is how Magnitude can help…
Magnitude addresses all of these issues by using the business information model to drive all aspects of the system. Let’s look more closely at how a Magnitude business information model, created and managed using BIM, impacts a project and addresses the 10 disconnects discussed above.
A Magnitude project typically starts with a business modeling workshop, where the solutions architect leads a small team, which must include business representatives, in a discovery session to identify what is in scope and which key performance indicators (KPIs) matter. This first BIM model helps business people see how their data can be analyzed and compared with other data sources.
An architect then works with the source systems specialists to check that data with appropriate quality is available, and whether derivations for KPIs are possible. With this knowledge, the business information model can be improved and validated by the architect and the business representatives.
The model is deployed for the first development iteration and a complete database repository is generated automatically. Next, data is loaded and simple reports can be generated. The business representatives review the initial reports and, working on the model with the architect, they decide on changes to improve it.
When new versions of the model are deployed, the system is automatically modified without any coding or intervention by developers or DBAs. This cycle is repeated until the business users are satisfied that their requirements are met.
Subsets of the business information model are then exported into the tools used for reporting, ensuring consistency across the business. All semantic definitions and the technical metadata are passed across, which ensures a consistent, accurate and complete source for the reporting tools. Project owners can be confident that the data is defined and structured in the way that the business really needs. The business information model documents exactly what has been implemented.
Magnitude’s approach to complex integration projects using the BIM modeler can help eliminate disconnects and accelerate project development.
- The business requirements are reflected in a structured model understood by business people and the IT staff.
- The logical model is eliminated from the process because it is built into the software.
- The architect influences generation of the physical schema, which is generated using best practice techniques.
- The data is stored in conventional tables and can be tuned without impacting the design.
- Change during development is carried out at the business information model level so there is no disconnect.
- The implementation is always reflected in the business information model.
- Semantics are carried from the business information model to the system.
- The user gains control over inconsistent master data in the source systems.
- Semantic information can be carried through to the reporting tools.
- And when the world changes or the requirements change, the model and therefore the system changes with it.