Sixty-five million years ago, a six-mile-wide asteroid ended the reign of the dinosaurs, reopening ecological niches that were promptly filled by members of the class Mammalia, one of which, much later, went on to invent things like writing, the wheel, and information technology.
Big Data, the ultra-large-scale storage and analysis of data, is the storage market’s equivalent of that big rock. Rather than causing a mass extinction, though, it’s opening up a lot of new ecological niches for storage technologies, especially advanced solutions like Fusion-io ioMemory modules and the new release of DataCore’s SANsymphony-V storage hypervisor. The advent of Big Data also offers a unique opportunity to re-architect your storage management infrastructure in a way that prolongs the life of storage “dinosaurs” while making it more adaptable in every respect and more easily aligned to business needs.
As profiled in the New York Times, Big Data is transforming business, government, and education. One researcher reported that a study of 179 large companies showed that those adopting the data-driven decision-making that Big Data makes possible “achieved productivity gains that were 5 percent to 6 percent higher than other factors could explain.”
Big Data is more than just big. It’s restless, too, and best used when hot. Let it cool off, and you lose the situational awareness that can lead to big-time financial rewards. It’s not just a matter of storing a gazillion bytes—you can’t possibly store it all, so your retention policies have to change, and the need to widely share data as quickly as possible means your networking strategies have to change as well.
Fortunately, a storage hypervisor can be a big help in adapting to Big Data. Even better, the benefits of this software layer, which insulates you from all the hardware variables that Big Data can throw your way, kick in long before Big Data arrives. A scalable and comprehensive storage hypervisor like DataCore’s SANsymphony-V is an agent of change: you get the pay-off today and a future-proof storage infrastructure. It also, as we’ll see, can give you an even better return on your Fusion-io ioMemory module investments.
SANsymphony-V provides a complete “storage management stack” and gives you a centralized console that enables you to efficiently pool all your storage resources, mirror and replicate data for high availability, cache data near applications for higher performance, automatically allocate space, and direct traffic to the optimal tier.
Resource pooling has the most immediate impact, because you can aggregate all of your storage capacity, without regard for brand, model, or interface, and easily reclaim unused space. These pooled resources can be easily mirrored locally for high availability, or replicated remotely for disaster recovery. Thin provisioning gives you just-in-time storage allocation for highly efficient use of disk space, and RAM caching speeds up “spindle-based” storage dramatically to turbocharge native disk array performance. The fact that all of these advantages are available to every storage resource managed by SANsymphony-V means that older storage that formerly might have been shuffled off to the dinosaur’s graveyard remains useful longer, leading to a higher ROI for all your storage investments.
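To make the thin-provisioning idea concrete, here is a minimal conceptual sketch (illustrative only, not DataCore’s implementation): a volume advertises a large virtual size to the host, but draws physical blocks from the shared pool only when a block is first written.

```python
# Conceptual thin-provisioning sketch (illustrative; not DataCore's code).
# A volume advertises a large virtual size but consumes physical blocks
# from the shared pool only on first write.

class ThinVolume:
    def __init__(self, virtual_blocks: int, pool: list):
        self.virtual_blocks = virtual_blocks  # size the host sees
        self.pool = pool                      # shared free physical blocks
        self.mapping = {}                     # virtual block -> physical block

    def write(self, vblock: int) -> int:
        if vblock not in self.mapping:        # allocate on first write only
            self.mapping[vblock] = self.pool.pop()
        return self.mapping[vblock]

    def physical_usage(self) -> int:
        return len(self.mapping)

pool = list(range(100))                       # 100 physical blocks shared by all volumes
vol = ThinVolume(virtual_blocks=1_000_000, pool=pool)
vol.write(42); vol.write(42); vol.write(7)
print(vol.physical_usage())  # 2 -- only blocks actually written consume real capacity
```

The point of the sketch is the gap between `virtual_blocks` and `physical_usage()`: capacity is committed to applications up front but consumed just in time, which is what lets pooled storage be used so efficiently.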
Fusion-io Fast Flash Memory and DataCore Auto-tiering Software
When it comes to Big Data, however, auto-tiering is likely to be of most interest to customers who rely on Fusion-io technology. SANsymphony’s auto-tiering can dynamically direct workloads to the right storage resource based on either access frequency or business rules, so that the hottest data gets the most attention. Older storage can be moved down-tier as new hardware is installed, again prolonging its service life. SANsymphony-V also offers a “cloud gateway” to leverage cloud service providers for both disaster recovery and archival of virtually unlimited capacity—a necessity to keep from getting squashed by Big Data.
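The frequency-based side of auto-tiering can be sketched in a few lines. The tier names and capacities below are toy assumptions, and the real product’s placement logic is proprietary; this simply shows the core idea of ranking blocks by recent access count and filling the fastest tier first.

```python
# Conceptual sketch of frequency-based auto-tiering (illustrative; not
# DataCore's actual algorithm). Blocks are ranked by recent access count
# and the hottest ones are promoted to the fastest tier until it fills.

from collections import Counter

TIERS = ["ioMemory", "SSD", "HDD"]                     # fastest to slowest (assumed)
TIER_CAPACITY = {"ioMemory": 2, "SSD": 3, "HDD": 10}   # blocks per tier (toy numbers)

def place_blocks(access_counts: Counter) -> dict:
    """Assign each block to a tier: hottest blocks go to the fastest tier."""
    placement = {}
    ranked = [block for block, _ in access_counts.most_common()]  # hottest first
    tier_iter = iter(TIERS)
    tier = next(tier_iter)
    free = TIER_CAPACITY[tier]
    for block in ranked:
        while free == 0:                               # current tier full: drop a tier
            tier = next(tier_iter)
            free = TIER_CAPACITY[tier]
        placement[block] = tier
        free -= 1
    return placement

accesses = Counter({"a": 90, "b": 70, "c": 40, "d": 10, "e": 5, "f": 1})
print(place_blocks(accesses))
# the two hottest blocks land on ioMemory, the next three on SSD, the rest on HDD
```

Rerunning the placement as access counts shift is what makes the hierarchy “dynamic”: yesterday’s hot blocks drift down to slower, cheaper tiers on their own.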
This enables SANsymphony-V to put Fusion-io’s server class memory tier at the very top of an agile, easily-managed storage hierarchy that offers unprecedented levels of performance and availability. You can easily balance data value and the need for speed against price/capacity constraints—something that Big Data is going to make ever more necessary—and make sure that you get the utmost benefit from ioMemory modules.
The fallout from Big Data is going to transform business computing at every level, so if you don’t want to end up a data dinosaur, now’s the time to transform your infrastructure with a storage hypervisor. A good place to start is Jon Toigo’s Storage Virtualization for Rock Stars series, starting with Hitting the Perfect Chord in Storage Efficiency, which will give you a good overview of how a storage hypervisor can help you increase engineering, operational, and financial efficiency.