Nucleus Research released a new study this week, “Measuring the Half-Life of Data,” introducing a framework for gauging data’s value and the effect of time on that value. The study borrows the scientific concept of “half-life” to reinforce the notion that data’s value diminishes over time, and at predictable rates. The rate of decline depends on whether the data is used for tactical, operational, or strategic purposes.
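To make the half-life idea concrete, here is a minimal sketch of exponential decay with a different half-life per decision horizon. The specific half-life values are hypothetical, chosen only for illustration; they are not figures from the Nucleus Research study.

```python
# Hypothetical half-lives per decision horizon, in days.
# These numbers are illustrative assumptions, not study findings.
HALF_LIVES_DAYS = {"tactical": 1, "operational": 30, "strategic": 365}

def remaining_value(initial_value, age_days, horizon):
    """Value left after age_days, given the horizon's half-life."""
    half_life = HALF_LIVES_DAYS[horizon]
    # Standard half-life decay: value halves every half_life days.
    return initial_value * 0.5 ** (age_days / half_life)

# After 30 days, tactical data is nearly worthless, operational data
# has lost half its value, and strategic data has barely decayed.
for horizon in HALF_LIVES_DAYS:
    print(horizon, round(remaining_value(100.0, 30, horizon), 2))
```

The shorter the half-life, the more the payoff depends on integrating and acting on the data quickly, which is the study’s core argument.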
This is an interesting concept, and one with clear application as companies plan data management investments. Value should be the key discriminator in deciding how much to spend managing data. In other words, if data’s value is “x” times greater than the cost of managing it (you get to fill in the “x”), you should make the investment.
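That decision rule can be written down in a few lines. This is a hypothetical sketch of the post’s reasoning, not a formula from the study; the function name and the multiplier are assumptions.

```python
def should_invest(data_value, management_cost, x):
    """Invest when data's value exceeds x times the cost of managing it.

    x is the reader-supplied multiplier from the post ("fill in the x").
    """
    return data_value > x * management_cost

# Example: data worth 500 against a 100 management cost clears a 3x bar.
print(should_invest(500, 100, x=3))   # clears the threshold
print(should_invest(250, 100, x=3))   # does not
```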
It is important not to infer that this framework benefits only product manufacturers. Any company, in any industry, will find itself making decisions along tactical, operational, and strategic horizons. The point here is that creating a data foundation that is both model-driven and agile is key, for two reasons:
- A model-driven approach provides structure for rapid data integration, which is key to exploiting data with a shorter half-life.
- Agility enables rapid response as decision horizons and data requirements change.
There’s that word again…“agile”. Here it makes perfect sense. To fully leverage this half-life framework, you will need an agile data foundation at the base. Otherwise it will be difficult to capture data’s peak value, or to mine its lingering value over time.