The Big Data explosion has arrived and is here to stay. Industry commentators who tried to dismiss the Big Data phenomenon as vendor-generated hype must now be eating their words. Recent industry analyses and forecasts show that Big Data has moved rapidly from the sandpit into the heart of many enterprise data architectures.
A recent Wikibon study calculated that global spending on Big Data (including hardware, software and services) reached $18.6 billion in 2013, a growth rate of 58% over 2012. In 2014 this is expected to rise further to $28.5 billion. Big Data related investment is expanding at fully six times the rate of overall IT spend. Even the normally cash-strapped and cost-cutting UK government has recognised its potential value, announcing last month that it is putting £73 million ($122 million) into Big Data research, acknowledging the potential benefit to the whole British economy and the need for the UK to be at the vanguard of the explosion.
But this boom comes with significant risks. Of the $4 billion spent on software last year, only a tiny fraction was invested in data quality and assurance tools. And, as we all know, Big Data, like all data, is only a valuable asset if it is fit for purpose. Without active data quality controls and assurance, there is a real danger that many companies will build massive Big Data repositories filled with data that is at best less than fit for purpose and at worst useless. We have all seen it before with Data Warehousing, but this time it is on a far bigger scale.
And the probable outcome? Much of the billions invested will fail to deliver any business benefit, and the Big Data boom will eventually go out with a big bang. So let's hope that some of this massive growth in expenditure in 2014 is invested in the data itself, not just in the infrastructure needed to store and process it. I live in hope. As John Wayne once said, 'Tomorrow hopes we have learned something from yesterday.'
VP Information Management Strategy, Trillium Software
Nigel Turner works with Trillium Software clients to start, expand and accelerate their enterprise data quality initiatives. He spent much of his career at British Telecommunications plc (BT), where he led an internal enterprise-wide data quality improvement programme. This ten-year programme was praised by Gartner, Forrester, Ovum Butler and others, both for its approach and its proven benefits. Nigel has published several papers on data management and is a regular invited speaker at CRM and Information Management events. He is also a part-time lecturer at Cardiff University, where he teaches data management.