Expert Warns Legacy Systems Will Not Deliver Big Data Benefits
Posted on Friday, May 1, 2015
In the age of Big Data, many businesses are struggling to design and implement platforms that can process vast, unstructured data sets. With extensive legacy data warehousing infrastructure already in place, many organizations begin their Big Data efforts by attempting to ‘add’ a processing layer on top of it.
However, Big Data relies on lightning-fast access to, and processing of, vast data sets, a workload that traditional spinning-disk systems struggle with. When trialling Big Data technologies and techniques, legacy storage provides a perfectly adequate data layer. But once workloads move into production, the lags and delays quickly prove unsatisfactory.
Alex Mullan, Field CTO at Pure Storage and a storage expert, says: 'The key challenge with big data is that the cost of manipulating and conducting complex analysis on the volume of data collected can be prohibitive. It can also be cripplingly slow - if you can only ask a question once a week, you might be missing out on business-critical insight in the intervening days. If you're stuck with spinning hard disks, be prepared to spend significantly to attempt to make it move faster.'
Aged but not useless
With OEMs pushing clients towards flash-based system upgrades, and experts openly proclaiming the end of spinning-disk storage, CTOs have legitimate reason to be concerned about the ongoing value of their legacy hardware. Yet despite this negative outlook, and the apparent unsuitability of legacy storage for Big Data, older systems continue to deliver value for their owners.
Backed by a suitable post-warranty support and maintenance agreement, legacy storage systems can continue to serve data warehousing and archiving purposes indefinitely. But to fully embrace Big Data, CTOs will need to begin making the business case for new investment.