Product information modeling concepts and techniques

Turn Information into Knowledge – Business Information Modeling Concepts and Techniques

Applying a model-driven approach to data management is key to meeting the speed-to-market and flexibility needs of today’s businesses.

Rather than requiring a complete model of enterprise-wide requirements before deploying the warehouse, Magnitude’s Business Information Modeler (BIM) allows you to start small, iterate, and deploy in increments. Read more

How to fix bad data: The benefits of MDM and DWA

Running your business and analyzing results with accurate and consistent data is mission critical, yet most organizations struggle with bad data. An IBM study estimates that $3.1 trillion of America’s GDP is lost to bad data, and that 1 in 3 business leaders don’t trust their own data.

Top 10 Disconnects Encountered During Complex Integration Projects and How to Avoid Them

In our recent blog post, Connecting, Collecting and Understanding Data, we discussed the importance of semantic integration and focusing on the business view of data to help ensure key information isn’t lost during project development.

Too often, “disconnects” between IT and the business can delay or derail a master data management or data warehouse project, or cause significant cost overruns. The Magnitude Business Information Modeler (BIM) promotes semantic integration because it models the way data is consumed by the business, not the way it is stored. In this article, we review 10 areas of disconnect these projects often encounter, then discuss how BIM, with its business information model driven approach, minimizes these disconnects and drives project development. Read more
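
To make the contrast concrete, here is a minimal sketch, assuming a purely hypothetical record layout and class names (this is not BIM code), of the difference between data as it is stored and data as the business consumes it:

```python
# Hypothetical sketch of semantic integration: the same record seen through a
# storage-oriented lens and a business-oriented lens. None of these names come
# from Magnitude BIM; they are illustrative only.
from dataclasses import dataclass

# How the data is physically stored: cryptic column codes.
storage_row = {"CUST_ID": 1042, "CUST_NM": "Acme Ltd", "SEG_CD": "B2B-EU"}

@dataclass
class Customer:
    """The concept the business actually talks about."""
    customer_id: int
    name: str
    segment: str

def to_business_view(row: dict) -> Customer:
    """Map a storage-level record onto the business concept it represents."""
    return Customer(customer_id=row["CUST_ID"],
                    name=row["CUST_NM"],
                    segment=row["SEG_CD"])

print(to_business_view(storage_row))
```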

Connecting, Collecting and Understanding Data

I read with interest a Gartner report, Modern Data Management Requires a Balance Between Collecting Data and Connecting to Data, that makes the case for a bi-modal approach to connecting and collecting data. The argument is that reacting “at the edges” of the broader data infrastructure (making decisions based on real-time data displayed on a tablet, for example) requires directly connecting processes and devices, while collecting data for operations and management insight requires a central collection point and rigorous validation of data accuracy and quality. However, ALL data requires a series of integration processes that describe, organize and integrate the information. That first step, the description, covers the location, trustworthiness and meaning of the data in question. Read more
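
As a rough illustration of that description step, here is a minimal sketch, with an assumed record structure not taken from the Gartner report or any Magnitude product, of capturing a source’s location, trustworthiness and meaning before it is organized and integrated:

```python
# Hypothetical "description" records for two data sources: one centrally
# collected and validated, one connected at the edge. Field names and values
# are made up for illustration.
from dataclasses import dataclass

@dataclass
class DataDescription:
    name: str            # what the business calls this data
    location: str        # where it lives (system, table, feed, device)
    meaning: str         # agreed business definition
    trust_level: str     # e.g. "validated" vs. "unverified edge feed"

catalog = [
    DataDescription(
        name="Daily sales",
        location="warehouse.sales_fact",
        meaning="Net invoiced sales per store per day",
        trust_level="validated",
    ),
    DataDescription(
        name="Shelf sensor readings",
        location="edge device feed (tablet dashboard)",
        meaning="Real-time stock levels, unreconciled",
        trust_level="unverified edge feed",
    ),
]

for d in catalog:
    print(f"{d.name}: {d.location} [{d.trust_level}] - {d.meaning}")
```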


Winning in Omnichannel begins with a 360 View of your Customer

Customer data has never been more valuable given the morphing of retail into an omnichannel world. In this age of the customer, the holy grail is capturing a 360 view of your customer both online and offline.

Just look at the recent Amazon and Whole Foods deal. According to the WSJ, the big prize in this deal lies in the data. By bringing together online and offline data, Amazon can entice customers to make more impulse purchases online. Coordinating all the online and offline data points into a single person’s profile is the ultimate quest for any customer-centric brand. Google’s recent announcement that it will bridge the “online ad–offline purchase” gap by tracking credit and debit card transactions and linking them to online consumer behavior is another leap forward in this area. Read more
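
For illustration only, here is a minimal sketch of that stitching idea, using entirely made-up data and a hashed email standing in for whatever matching key a real identity-resolution pipeline would use:

```python
# Hypothetical example of building a 360-degree customer profile by grouping
# online and offline touchpoints on a shared key. Data and field names are
# invented; real pipelines use far richer matching logic.
from collections import defaultdict

online_events = [
    {"email_hash": "a1b2", "channel": "web", "item": "organic coffee"},
    {"email_hash": "c3d4", "channel": "app", "item": "yoga mat"},
]
offline_purchases = [
    {"email_hash": "a1b2", "channel": "store", "item": "avocados"},
]

def build_profiles(*sources):
    """Group every touchpoint by customer key into a single profile."""
    profiles = defaultdict(list)
    for source in sources:
        for event in source:
            profiles[event["email_hash"]].append(
                (event["channel"], event["item"])
            )
    return dict(profiles)

print(build_profiles(online_events, offline_purchases))
```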

CTO perspective: Is your business equipped for the API economy?

As a CTO, I find it fascinating to watch the explosive growth of the API economy. APIs have become the fuel that’s enabling companies to create and launch new business models at unprecedented speeds. According to programmableweb.com, there are more than 16,000 APIs. HBR cites that Salesforce.com generates 50% of its revenue through APIs, Expedia.com generates 90%, and eBay, 60%. Read more

Finally: Analyst Coverage of Data Warehouse Automation

In response to my colleague Stephen Pace’s blog post about the lack of a “magic quadrant” for data warehouse automation, we’re happy to report that industry analyst coverage of this growing market segment has finally arrived!

Read more

Are Graph Databases the Shiny New Object in MDM?

I recently read a great blog post on smartdatacollective.com, From Master Data to Master Graph by Peter Perera. I found myself agreeing with almost everything in the post, particularly once I realised he was using terminology slightly differently to how we would here at Magnitude Software (in particular, I suspect he and I think of different things when we refer to “MDM Applications”). What’s interesting is that although I’m in broad agreement with the arguments made, I’m not yet convinced by the post’s conclusion (which, as I understood it, is that graph database technology is the best foundation for master data management systems). Read more
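
To show what is being debated, here is a minimal sketch, with hypothetical entity and relationship names, of master data held as a small in-memory graph. It illustrates the relationship-traversal style of query that graph advocates emphasize, not any particular product’s design:

```python
# Master data records and their relationships as a tiny adjacency list.
# Entities and relationship labels are illustrative only.
master_entities = {
    "Acme Ltd": {"type": "Customer"},
    "Acme Holdings": {"type": "Customer"},
    "Widget-X": {"type": "Product"},
}

# Each edge: (source, relationship, target)
relationships = [
    ("Acme Ltd", "subsidiary_of", "Acme Holdings"),
    ("Acme Ltd", "purchases", "Widget-X"),
]

def related_to(entity: str):
    """Walk outgoing relationships for one entity - the kind of traversal
    graph-database advocates argue relational MDM stores handle poorly."""
    return [(rel, target) for source, rel, target in relationships
            if source == entity]

print(related_to("Acme Ltd"))
```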

Announcing Kalido 9.1 SP2 – And a New Product

We were excited today to announce the latest release of the Kalido Information Engine, version 9.1 SP2. This release includes enhancements to both our data warehouse automation and our master data management capabilities. In this post I’ll recap a few of the most notable ones.
Read more

Fear or Greed: What’s Driving Your Information Management?

Lately we’ve been spending a fair amount of time talking to banks and investment firms about their information management challenges. One industry insider recently summed it all up for me by stating that organizations in this industry are motivated by one of two things – fear or greed. It might sound a bit crass, but that’s not a bad thing, and I think it reflects the reality of the market we all operate in, not just banks and investment firms.
Read more