The 9 biggest BI challenges in the postmodern ERP era

A recent Gartner report, 2018 Strategic Roadmap for Postmodern ERP, describes how today’s organizations tend to focus on monolithic, vendor-first Enterprise Resource Planning (ERP) strategies that do not support their digital business initiatives. The report recommends a postmodern ERP strategy instead, one that integrates differentiated and innovative capabilities beyond the core strengths of common ERP platforms.

As more companies adopt a business technology strategy focused on customer outcomes, some functions tend to be decoupled from the ERP footprint and pushed to the cloud, but the need for core ERP transactional support, at a minimum, remains.

There are several cloud application adoption scenarios in the postmodern ERP era:

  • Supplementing an ERP system with Software as a Service (SaaS)
  • Quickly supporting mergers, acquisitions, and divestitures
  • Replacing aging ERP systems across the enterprise
  • ERP for fast-growing digital enterprises

In the next three years, most companies will be running hybrid architectures – a loosely coupled mix of cloud and on-premises ERP applications. Business Intelligence (BI) and analytics leaders face the complex challenge of keeping up with the varying architectures and integration possibilities of postmodern ERP, while trying to bridge the gap to the future needs of their user groups.

The following barriers stand in the way of harnessing the desired level of business insight in the postmodern ERP era:

  1. ERP data preparation challenge:

    Yes, it is still a challenge to transform ERP source data into meaningful, business-process-centric insights. The available ERP solutions often rely on complex and ever-growing data models. Unfortunately, the complexity of these data models makes it hard to pick the right data to analyze and visualize. Even a simple supplier-based report may draw on 10 tables with intricate links and rules. Requirements like genealogy, obsolete inventory, subledger financial documents, and reconciliation are more complex still, irrespective of the ERP system.

    The challenge, then, is to transform complex ERP data into something more easily understood and used by managers and reporting analysts alike, while still honoring the entity relations in the application. A lack of resources with a complete understanding of the ERP’s data models is the most common reason why reporting is complex, takes longer than expected, and may contain grievous errors.
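    To make the data-preparation task concrete, here is a minimal sketch in Python with pandas. The tables and column names (suppliers, purchase_orders) are invented for illustration and come from no particular ERP; a real supplier report would honor many more tables, codes, and validity rules.

```python
import pandas as pd

# Hypothetical, heavily simplified ERP tables. A real supplier report
# may span 10+ tables with intricate links and company-specific rules.
suppliers = pd.DataFrame({
    "supplier_id": [101, 102],
    "name": ["Acme Corp", "Globex"],
})
purchase_orders = pd.DataFrame({
    "po_id": [1, 2, 3],
    "supplier_id": [101, 101, 102],
    "amount": [500.0, 250.0, 900.0],
})

# Flatten the entity relations into a single business-friendly view,
# honoring the foreign-key link between orders and suppliers.
supplier_spend = (
    purchase_orders
    .merge(suppliers, on="supplier_id", how="left")
    .groupby("name", as_index=False)["amount"].sum()
    .rename(columns={"amount": "total_spend"})
)
print(supplier_spend)
```

    The point of the sketch is that even this toy report already requires knowing which key links the entities; without that data-model knowledge, the join and aggregation cannot be trusted.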

  2. Promise of self-service analysis:

    Business agility and decision-making have never depended more on data analysis than they do today. At the same time, analyzing data requires an ever deeper business understanding of the source applications.

    The challenge here is giving business users easy access to understandable, reliable data from the source applications. Without this capability, they will spend more time finding the relevant data and less time analyzing it, and will find themselves running more trial-and-error tests, resulting in frustration and the possibility of inaccurate reporting.

  3. Consolidation and data mash-ups from disparate applications:

    Consolidating data from multiple source applications and across business areas in one easily accessible location helps create meaningful reports. Users then get a business-centric view of all the organizational information required for a specific challenge, rather than being constrained to an application-specific view.

    As organizations move to a hybrid data management landscape, the complexity of integration grows with each additional application to be supported, each with its own data extraction format.
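    A minimal mash-up sketch, assuming the common case where two applications key the same entity differently and a cross-reference must be maintained. The system names (an ERP and a CRM) and all identifiers below are illustrative.

```python
import pandas as pd

# Hypothetical extracts from two disparate applications. Each system
# keys the same customer differently, so the CRM extract carries a
# cross-reference back to the ERP identifier.
erp_customers = pd.DataFrame({
    "erp_id": ["C001", "C002"],
    "open_balance": [1200.0, 300.0],
})
crm_accounts = pd.DataFrame({
    "crm_id": ["A-9", "A-7"],
    "erp_id": ["C002", "C001"],  # cross-reference maintained during integration
    "segment": ["Enterprise", "SMB"],
})

# One business-centric view across both applications.
unified = erp_customers.merge(crm_accounts, on="erp_id", how="left")
print(unified)
```

    Each additional source application adds another extract format and another cross-reference of this kind to build and maintain, which is where the integration complexity accumulates.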

  4. Real-time data ingestion:

    ERP transaction data is synced company-wide, so that data entered anywhere is available across the organization for timely, data-driven decisions. Real-time reporting has become the standard.

    The primary challenge here is keeping up with ever-growing volumes of source data to be processed via Extract, Transform, and Load (ETL) or Extract, Load, and Transform (ELT) within a shrinking batch window to support real-time reporting requirements.
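    One common way to cope with a shrinking batch window, sketched below under simplified assumptions, is incremental extraction: instead of reprocessing the full table on every run, pull only the rows changed since the last high-watermark. The row shape and `updated_at` column are illustrative.

```python
from datetime import datetime

# Toy stand-in for a source table; real extracts would come from the
# ERP's change-tracking or a last-modified timestamp column.
rows = [
    {"id": 1, "updated_at": datetime(2024, 1, 1, 9, 0)},
    {"id": 2, "updated_at": datetime(2024, 1, 1, 9, 30)},
    {"id": 3, "updated_at": datetime(2024, 1, 1, 10, 15)},
]

def extract_incremental(source_rows, watermark):
    """Return only rows modified after the last successful load,
    plus the advanced watermark for the next run."""
    delta = [r for r in source_rows if r["updated_at"] > watermark]
    new_watermark = max((r["updated_at"] for r in delta), default=watermark)
    return delta, new_watermark

# Last load finished at 09:15, so only rows 2 and 3 are re-extracted.
delta, wm = extract_incremental(rows, datetime(2024, 1, 1, 9, 15))
```

    The design choice is the same whether the pipeline is ETL or ELT: processing only the delta keeps the batch window proportional to the change rate rather than to the table size.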

  5. Pre-built Analytics and Reporting:

    Business Intelligence tool vendors strive to facilitate user-driven reporting from enterprise applications by providing generic report templates, the goal being to help jumpstart the reporting process. However, these vendors’ tools still require a significant amount of highly skilled resources and manual effort to create the data structures and business logic that make sense to business users and enable the long-promised self-service analysis.

    The key to success with any BI tool is the quality and relevance of the content delivered to users, including the data sources, business logic, and organization of the data elements. This critical component can take many months of requirements gathering, design, building, and testing to ensure acceptance by business users.

  6. Diversified BI tools in an environment:

    Today, a diverse set of BI tools is available to organizations. With BI tools offering similar functionality and requirements differing across business groups, progressive organizations simply offer a menu of tools rather than dictating a choice, allowing business users to pick the right tool for the job. The challenge with such a people-first policy is providing a standardized data access layer across all tools in the organization’s portfolio, giving users the comfort of a familiar data model in whatever tool they choose while simplifying how those tools are supported.

  7. Rise of citizen integrators:

    With an exponential rise in the variety of data sources, gone are the days when integrations were centrally owned by IT. With operating departments making their own software purchases, integration projects are increasingly led by the respective business teams.

    The challenge, then, is to create blended reports by “correctly” joining the entities across the source systems. Most of the issues with such joins will not surface until the report and its data are used in a real business setting.
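    The sketch below illustrates the kind of silent failure this produces, with invented data: two extracts whose keys look joinable but differ in casing and whitespace, so a naive join matches nothing, while normalizing the keys first restores the expected matches.

```python
import pandas as pd

# Hypothetical extracts produced by two different business teams. The
# keys *look* joinable, but one system stores trailing whitespace and
# different casing, an issue that only surfaces in real use.
orders = pd.DataFrame({"customer": ["Acme Corp", "Globex "],
                       "amount": [100, 200]})
billing = pd.DataFrame({"customer": ["acme corp", "globex"],
                        "credit_limit": [5000, 8000]})

# A naive join on the raw key silently matches zero rows.
naive = orders.merge(billing, on="customer", how="inner")

# Normalizing the keys before joining restores the expected matches.
for df in (orders, billing):
    df["customer_key"] = df["customer"].str.strip().str.lower()
matched = orders.merge(billing, on="customer_key", how="inner")
```

    Nothing errors out in the naive case; the blended report is simply empty or incomplete, which is exactly why such defects tend to surface only in production use.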

  8. Maintainability and Extensibility:

    The only constant with technology today is change. With a high frequency of change, organizations are challenged to keep pace. As data from source applications evolves, whether due to customization or upgrades, the dependent reports need to be modified as well. Modifications can range from simple naming changes to a complete redesign of a critical set of reports. If these necessary modifications are not identified and addressed, they can lead to user frustration or, worse, abandonment of IT initiatives and systems.

    The challenge is to build a capability, driven by technical metadata, to keep the reporting system current, which otherwise requires specialized resources or overloading existing resources that may be allocated to more strategic projects.
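    A minimal sketch of what a technical-metadata-driven check might look like, assuming metadata snapshots of each table's columns taken before and after an upgrade. The table and column names are illustrative, tied to no specific ERP.

```python
def detect_drift(expected, actual):
    """Compare the columns each report depends on against the live
    schema, returning per-table missing and newly added columns so
    dependent reports can be flagged before users hit broken queries."""
    drift = {}
    for table, cols in expected.items():
        live = actual.get(table, set())
        missing, added = cols - live, live - cols
        if missing or added:
            drift[table] = {"missing": missing, "added": added}
    return drift

# Hypothetical metadata snapshots before and after an ERP upgrade:
# a column the reports depend on was renamed.
expected = {"supplier_master": {"supplier_id", "name", "region"}}
actual = {"supplier_master": {"supplier_id", "name", "region_code"}}
drift = detect_drift(expected, actual)
```

    Run as part of each upgrade or customization cycle, a check like this turns schema changes from surprises discovered by end users into a maintenance worklist.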

  9. Enabling smart data discovery:

    BI vendors are moving to automate the analytics workflow, from preparing and modeling large data sets into meaningful information, to sharing insights and explaining findings using conversational analytics capabilities, including natural language processing, querying, and generation.

    The challenge is to build frameworks that allow users to associate business metadata while building the data platform, in order to support natural language analytics. Without such business metadata, the data platform cannot be future-ready.
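    As a rough illustration of why that business metadata matters, here is a toy glossary mapping business terms and their synonyms to physical columns, which is one simple form such metadata can take. All terms, synonyms, and column names are invented for the example.

```python
# Hypothetical business glossary attached to the data platform: each
# business term maps to a physical column and carries synonyms, so a
# natural-language layer can resolve words in a question to fields.
glossary = {
    "revenue": {"column": "fact_sales.net_amount",
                "synonyms": {"sales", "turnover"}},
    "customer": {"column": "dim_customer.name",
                 "synonyms": {"client", "account"}},
}

def resolve_terms(question):
    """Map words in a question to physical columns via the glossary."""
    words = {w.strip("?.,").lower() for w in question.split()}
    hits = {}
    for term, meta in glossary.items():
        if words & ({term} | meta["synonyms"]):
            hits[term] = meta["column"]
    return hits

mapping = resolve_terms("What was turnover by client last quarter?")
```

    Without this layer, a question phrased in business language ("turnover by client") has no path to the physical model, which is the sense in which a platform lacking business metadata cannot support natural language analytics.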

Not every barrier needs to be overcome on day one, but over time, most will become requirements for a future-ready, people-first data platform to support business analytics and provide meaningful, timely insights in a postmodern ERP era. Fortunately, Magnitude Unified Business Analytics has been built to simplify exactly these tasks and enable faster, more reliable implementation of modern BI.