Preparing for a Genuine Big Data Future
Posted on Wednesday, May 18, 2016
Every forward-thinking business is now aware of the value of Big Data, and most have begun building programs accordingly. In fact, IDC forecasts that the market will grow at almost 25% annually between now and 2018.
But all is not well in the majority of Big Data programs. Gartner predicts that 60% of projects will fail to make it past the initial pilot stage and will then be abandoned completely.
Speed is not the only issue
Being able to generate and access insights quickly is essential to the success of a Big Data program. But it’s not the only factor businesses need to address if they want to succeed.
Of greater long-term importance is how data is treated. Because so many Big Data programs are run on a project basis, opportunities to realise additional value from data are missed, simply because the data is viewed as a disposable resource.
The reality is that the true value of Big Data comes from recognising that information assets are multi-use and multi-purpose, and then building a data infrastructure that natively supports data re-use to provide further ‘value creation opportunities’.
Ease of use and improved data management will therefore be of far greater importance than the raw speed of the hardware used to store unstructured data.
More storage, not new storage
Instead of investing in brand-new, standalone infrastructure to support single-goal Big Data programs, enterprises should focus on adding capacity to their existing systems. With the right Big Data management tools in place, volume becomes more important than speed.
At that point, the need for “new” storage hardware also diminishes. Repurposing post-warranty hardware, or even purchasing second-user units to increase capacity, could help businesses reduce their Big Data project costs, even if they are unable to enact the mindset changes needed for their programs to succeed long term.
Need further advice? Get in touch.