How to be the master of mastering your master data

In my last post, I discussed how important it is to be able to effectively model your master data. In this post, the second in a series on master data, I want to discuss how you need to master it. By “master it,” I am referring to the workflow-driven process steps people go through to achieve clean, consistent, accurate and harmonized master data, and to the technical capabilities needed to make this process work smoothly and efficiently. And by “people,” I mean everyone involved.

There are a number of key capabilities your MDM application must provide to drive the mastering process effectively. Here I will highlight three:

First, the people involved in the process of mastering the data — data stewards, approvers, authorizers and MDM consumers — all need a means to interact with the data that aligns with their role. Even better is an interface that is also driven by the context of the data. If a step in the process of making accurate master data requires a form to be completed, ideally that form is generated automatically from the data and its elements, the step in the workflow, and the user's role and permissions. This capability eliminates the cost of developing a custom interface for every step in the process, and gives you a more flexible process that can accommodate changing stewardship actions, which may be necessary as you expand your master data domain management scope. More importantly, such a capability makes the process easier for the people involved, thereby increasing the efficiency and overall success of your MDM program.
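To make the idea concrete, here is a minimal sketch of metadata-driven form generation: the form is derived from the entity's attribute metadata, the current workflow step, and the user's role, rather than being hand-built per screen. All entity names, steps, and roles here are illustrative assumptions, not any vendor's actual API.

```python
# Attribute metadata for a master-data entity (illustrative).
PRODUCT_ATTRIBUTES = [
    {"name": "sku",     "type": "string", "editable_by": {"data_steward"}},
    {"name": "brand",   "type": "string", "editable_by": {"data_steward", "brand_manager"}},
    {"name": "segment", "type": "string", "editable_by": {"data_steward"}},
]

# Which attributes each workflow step exposes.
STEP_ATTRIBUTES = {
    "review":  {"sku", "brand", "segment"},
    "approve": {"brand", "segment"},
}

def generate_form(step, role):
    """Build a form definition: one field per attribute visible at this
    step, marked read-only unless the given role may edit it."""
    fields = []
    for attr in PRODUCT_ATTRIBUTES:
        if attr["name"] not in STEP_ATTRIBUTES[step]:
            continue
        fields.append({
            "label": attr["name"],
            "type": attr["type"],
            "read_only": role not in attr["editable_by"],
        })
    return fields

# A brand manager at the approval step sees brand and segment,
# but can only edit brand.
form = generate_form("approve", "brand_manager")
```

Because the form is computed from metadata, adding a new attribute or workflow step changes the data definitions, not the screens.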

One key area where we see more and more customers requiring this is authoring new master data in the MDM application itself. In many cases customers find it makes more sense to create new records in the MDM environment, where all the necessary elements can be tightly controlled, and then deploy the master record to the systems that need it. This can include both operational and analytical systems, for example when you are authoring things like market segments or other views that aren't typically created in a transaction system.

Second, you need the agility and flexibility to keep pace with changing business requirements. If you have reorganized, or you are adding new data sources, or you acquired a company, or your business process changes because the business needs a new way of viewing your data, your MDM environment ought to be able to quickly and easily model the change, and also implement the data governance processes needed to master it, whether it means new governance steps or new forms. I’ve blogged before on business modeling, and the same concepts need to drive the structure of your MDM repository.

Third, you need to be able to manage change “in flight” without disrupting current business processes. When there is a request to change published master data, you should be able to investigate and process that change without impacting what's used in existing business processes. Instead of making the data unavailable, “work in progress” data should be kept separate from published data while its validity is determined. Only when it is validated is it published and made available to users and business processes.
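One simple way to picture this draft-versus-published separation is a record that carries both a published copy (what consumers see) and a draft copy (the change in flight). This is a minimal sketch under assumed names, not a real MDM data model:

```python
class MasterRecord:
    """A master record with a published view and an optional in-flight draft."""

    def __init__(self, data):
        self.published = dict(data)  # what business processes consume
        self.draft = None            # work in progress, invisible to consumers

    def open_change(self):
        """Start an in-flight change as a copy of the published data."""
        self.draft = dict(self.published)
        return self.draft

    def publish(self, validate):
        """Promote the draft to published only if it passes validation."""
        if self.draft is None:
            raise ValueError("no change in flight")
        if not validate(self.draft):
            return False             # draft stays pending; published untouched
        self.published = self.draft
        self.draft = None
        return True

record = MasterRecord({"sku": "A100", "brand": "Acme"})
draft = record.open_change()
draft["brand"] = "Acme Premium"
# Consumers still see the original until the change is validated and published.
assert record.published["brand"] == "Acme"
record.publish(validate=lambda d: bool(d["brand"]))
```

The point is the isolation: validation and stewardship happen entirely on the draft, and publication is an atomic swap.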

Here’s an example of how this should work:

  • Suppose a CPG product manager changes the business model to add a new product grouping within a brand. This might be done because of a promotion or because they want to track similar products within a brand.
  • Your MDM tool identifies all affected product master data and either automates the change or flags it for review; which of these happens is determined by the relevant workflow step.
  • The product data steward, or one of several data stewards on a team, then reviews and makes changes from a user interface automatically generated based on the data and task at hand.
  • The brand manager may be the person who needs to approve the change, which would then be “published” to the master data repository for use in brand and category analysis.
  • The next time the CPG product manager views the data via the MDM consumer interface, the master data will be updated to show the new relationship in the data.
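The steps above can be sketched as an embedded workflow definition that drives alerts and dispatches each task to its handler. Step names, actors, and the dispatch logic are illustrative assumptions:

```python
# Workflow definition: each step names the actor responsible for it.
WORKFLOW = [
    {"step": "identify_affected", "actor": "mdm_tool"},       # find affected records
    {"step": "review_changes",    "actor": "data_steward"},   # auto-generated UI
    {"step": "approve",           "actor": "brand_manager"},  # approval gate
    {"step": "publish",           "actor": "mdm_tool"},       # push to repository
]

def run(workflow, handlers):
    """Drive each step in order: alert the responsible actor, then
    dispatch to the handler registered for that step."""
    log = []
    for task in workflow:
        log.append(f"alert {task['actor']}: {task['step']}")
        handlers[task["step"]]()
    return log

# Register a trivial handler per step that just records its execution.
done = []
handlers = {t["step"]: (lambda s=t["step"]: done.append(s)) for t in WORKFLOW}
log = run(WORKFLOW, handlers)
```

Because the workflow is data, adding an approval step or reassigning an actor is a configuration change rather than new screen development, which is exactly the flexibility argued for above.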

As you can see, one of the keys to enable this activity is being able to provide an interface that fits the task at hand as I mentioned in the first point above, and also drive the process through an embedded workflow capability with alerts and approval steps along the way. This would be a very cumbersome process indeed if IT had to anticipate this workflow and design custom screens for each step and each different user along the way. Imagine further how complex, as well as inconsistent, this would be if you were using multiple different MDM tools to handle different MDM domains.

We all know that master data management needs to be both people-driven and process-driven to be effective. But that doesn’t mean it needs to be manual. By developing an environment and selecting a tool that makes the process of master data management easier for all involved, you can significantly improve the success of your MDM program.

4 replies
  1. Simon says:

    “This can include both operational systems as well as analytical systems when you are authoring things like market segments or other views that aren’t typically created in a transaction system.”

    Could I ask you about this comment, please? My exposure to master data is predominantly in an SAP system. As I understand it, there is no reason why market segment type metadata can't be captured within the SAP master data. Are you saying that in your experience this doesn't tend to happen if the metadata is only needed for BI? The implication is that this metadata then gets captured somewhere else, which seems undesirable to me.

  2. Lovan Chetty says:

    Simon, you are correct that there is no reason why market segmentation type metadata cannot be captured within the SAP operational system, or any other operational system. The general problem we have seen is that the operational system has built-in functionality to ensure that transaction processing occurs as efficiently & consistently as possible. However, for elements like market segmentation, which traditionally do not affect the end-to-end transaction process, there is no built-in functionality. This usually means that you need to custom-build all the management processes to support this data. At the simplistic end these are just screens to allow people to manage customer segmentation definitions, ensure that segments are not duplicated, and ensure that if the same customer is in multiple segments then the appropriate allocation percentage is applied. However, customer segmentation has a direct impact on both Sales & Marketing. Who then has control of changes to segmentation? Who decides the allocation percentage? Do both parties need to be informed of changes? So the custom data entry fields have now expanded to include the desired data management process. Marketing would love to have the segmentation data in their campaign management system, so you now also have to build external interfaces for this data.

    So what happens is that a mini MDM or data management system is built within the ERP. This is usually not desirable, as that is not the core competency of these systems. This is not a view held only by MDM vendors like Kalido; ERP vendors like SAP & Oracle have recognized this as well, which is why they have MDM products too. The management of this kind of data is fundamentally different from transaction processing. It is usually also more widely applicable, which is what typically warrants its own system.

    This, of course, seems to reinforce your point about the metadata being captured in multiple systems. I would assume that your company has more than one system that holds customer data. Even if each of the operational systems is from the same vendor, the CRM module probably has a customer model that is slightly different from the customer model in the order processing module, which would be slightly different from the customer model in your campaign management system. I typically see systems (ERP, MDM, etc.) as implementations of a higher-level Business Model. This Business Model should hold the corporate definitions as well as the high-level associations between entities. Some people see this as a conceptual model. This conceptual model is then “deployed” to each system that supports a particular process. Nirvana, of course, is that deploying the model to each system is an automated process.

  3. Simon says:


    Thanks for the explanation.

    So *where* does the segmentation data get captured? If it's in a separate system from the ERP, that suggests to me there has to be a two-stage entry process: first in the ERP, and then additional metadata is added in the MDM system? Or am I missing the point, and it's all added at once in the MDM system and then dispatched to the systems that need it?

    …newbie to this MDM lark :-)


  4. Lovan Chetty says:


    If I understand your question correctly, you are asking where the actual segmentation data is captured, rather than the metadata about segmentation. The segmentation data would be captured and managed in the MDM system. There is no need for the segmentation data to reside in the ERP unless it is needed for ERP processing.

    So to expand on this, you could decide that the ERP is the master for customer code & name. These elements would be fed into the MDM system (both metadata & instance data). The metadata in the MDM system would be enriched to include customer segmentation. The segmentation data would then be created & managed in the MDM system. All systems that need customer segmentation information obtain a copy from the MDM environment.

    The more advanced option is to master both customer & customer segment in your MDM system and have the ERP access customer information from the MDM system the same way that all other systems would. This does not mean that you would be forced to author your customers in the MDM system. Most ERPs now allow callouts for master data, so you could still have the data capture occur in the ERP. The ERP then calls out to the MDM system, which ensures that the customer does not already exist & that the data entered conforms to the defined rules.

    This gives you a solution where all the ERP is expected to do is the portion that it excels at: processing transactions. You have a separate system that excels at the management of master / reference data perform that task.
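    The callout pattern described in this reply can be sketched roughly as follows: the ERP captures the record, but asks the MDM system to check for duplicates and apply the defined rules before committing. The service, function names, and rules here are hypothetical illustrations, not SAP or Kalido APIs:

```python
class MdmService:
    """Hypothetical MDM-side service holding mastered customer data."""

    def __init__(self):
        self.customers = {}  # keyed by customer code

    def validate_new_customer(self, code, name):
        """Return a list of problems; an empty list means the record is acceptable."""
        problems = []
        if code in self.customers:
            problems.append(f"duplicate customer code {code}")
        if not name.strip():
            problems.append("name must not be empty")
        return problems

    def register(self, code, name):
        self.customers[code] = name

def erp_create_customer(mdm, code, name):
    """ERP-side entry point: callout to MDM before committing the record."""
    problems = mdm.validate_new_customer(code, name)
    if problems:
        return False, problems       # ERP rejects the entry, showing the problems
    mdm.register(code, name)
    return True, []

mdm = MdmService()
ok, _ = erp_create_customer(mdm, "C001", "Acme Ltd")     # accepted
dup, errs = erp_create_customer(mdm, "C001", "Acme Ltd") # rejected as duplicate
```

    Data entry stays in the ERP, but the duplicate check and conformance rules live in one place, which is the division of labor the reply describes.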
