Data Protection with HPE SimpliVity, Part 1: Hyperconvergence Designed to Protect Data
Protecting data is no small task, which is why the IT industry is filled with an ever-growing list of data storage and protection products. The long list of products in any given datacenter usually starts with a software package to read data off the servers and place it somewhere else. The “somewhere else” could be a tape library, a disk-based backup appliance, or another array – or all three. The industry-standard 3-2-1 backup rule is widely applied to customers’ data protection needs: keep at least 3 copies of your data, store backed-up data on 2 different types of media, and store at least 1 copy off site.
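The 3-2-1 rule is easy to state but worth checking mechanically. The sketch below, in Python, validates a backup inventory against the rule; the data model (`BackupCopy`, the media and site names) is purely illustrative and not any vendor's API.

```python
# Illustrative 3-2-1 rule check. BackupCopy and the site/media names
# are hypothetical, used only to make the rule concrete.
from dataclasses import dataclass


@dataclass
class BackupCopy:
    media: str  # e.g. "disk", "tape", "cloud"
    site: str   # e.g. "primary-dc", "dr-site"


def meets_3_2_1(copies, primary_site="primary-dc"):
    """At least 3 copies, on 2 media types, with 1 copy off site."""
    enough_copies = len(copies) >= 3
    two_media = len({c.media for c in copies}) >= 2
    one_offsite = any(c.site != primary_site for c in copies)
    return enough_copies and two_media and one_offsite


copies = [
    BackupCopy("disk", "primary-dc"),  # production data
    BackupCopy("tape", "primary-dc"),  # local tape copy
    BackupCopy("disk", "dr-site"),     # replicated off-site copy
]
print(meets_3_2_1(copies))  # True: 3 copies, 2 media types, 1 off site
```

Dropping any leg of the inventory (say, the off-site copy) makes the check fail, which is exactly the gap the rule is meant to expose.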
In addition to backup, customers also implement replication, which usually means another array and possibly additional software to manage it. All of this reading, writing, and copying of data may require upgrades to the local network infrastructure, WAN connections, WAN accelerators, and/or storage acceleration to make up for the extra load. This quickly increases the operational overhead, rack space, and total cost of ownership of the IT infrastructure.
Part of the simplicity that HPE offers with HPE SimpliVity powered by Intel® is reducing the need for so many products. A hyperconverged platform that truly makes IT simple must consolidate multiple components and redefine how data is managed and protected. This architecture started with some very specific design goals.
Design Goals: Simple management, data efficiency, data protection
The HPE SimpliVity product line is built on the Data Virtualization Platform, software that delivers three core capabilities:
- Global VM-centric data management and mobility
- Data efficiency
- Built-in data protection and resiliency
All three of these capabilities are tightly interconnected and unique in the industry: they simply cannot be bolted on to a product that wasn’t designed and built from the ground up with them in mind.
Global VM-centric data management and mobility is focused on reducing the management of data across its lifecycle, simplifying it by eliminating interfaces and manual tasks, and making the data visible and movable across a global infrastructure. To do this, all HPE SimpliVity management has been integrated into the native hypervisor management tools, completely eliminating any new interfaces and putting management right where virtualization admins would want their data controls. By combining clusters from multiple geographies into a single federation, customers gain a single interface to manage their global virtualization infrastructure.
Data efficiency is a key enabling technology for the entire platform, applying deduplication, compression, and optimization to all data. Placing data in this efficient state the moment it is created reduces both capacity and consumed I/O, and keeping it in that state lets the platform perform data operations faster and far more efficiently. For example, HPE SimpliVity clones, local backups, and local restores are guaranteed to complete in 60 seconds or less. When moving data between clusters, deduplication means only unique data is transported, and compression means the data that does need to be moved is already in compact form.
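The general mechanics behind that claim can be sketched in a few lines. The Python below shows generic content-addressed deduplication with compression: data is split into blocks, each block is identified by its hash, and only blocks not already in the store need to be written (or shipped to a remote cluster). This is a minimal illustration of the technique, not HPE SimpliVity’s actual implementation; the block size and store layout are assumptions.

```python
# Generic content-addressed deduplication sketch (not SimpliVity's code).
import hashlib
import zlib

BLOCK_SIZE = 4096  # assumed fixed block size for illustration


def dedupe_and_compress(data, store):
    """Split data into fixed-size blocks and store each unique block compressed.

    Returns the 'recipe' (ordered list of block hashes needed to rebuild
    the data) and the number of new blocks actually written."""
    recipe, new_blocks = [], 0
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        if digest not in store:  # only unique data is written or moved
            store[digest] = zlib.compress(block)
            new_blocks += 1
        recipe.append(digest)
    return recipe, new_blocks


store = {}
recipe, written = dedupe_and_compress(b"A" * 8192 + b"B" * 4096, store)
# Three logical blocks, but only two unique ones are stored.
print(len(recipe), written)  # 3 2
```

Replicating the same data again against the same store writes zero new blocks, which is why, in a deduplicated system, only unique data ever crosses the wire.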
Built-in data protection and resiliency means that the platform was designed to absorb component failures while still keeping data available, and also provides the ability to easily recover data should loss occur. Utilizing the high quality and redundant components within the HPE ProLiant DL380 and Apollo platforms means nodes rarely go down. When they do fail, all data is already available and ready to use on another node. Data efficiency combined with built-in data protection means that recovery point objectives can be measured in moments and recovery time objectives can be measured in seconds.
Simplicity isn’t just about hiding complexity. Hidden complexity almost always rears its ugly head, so architecting the complexity out is the only way to completely eliminate the problems it can cause. A great example is Western Tool & Supply, who were so focused on refreshing infrastructure that they wanted to “get IT out of the way.” They found many advantages after implementing HPE SimpliVity, and found data protection particularly useful in making IT simple, including the implementation of a full disaster recovery solution.
That’s a high-level look at how HPE SimpliVity was designed to simplify IT, including the sometimes-overwhelming effort to protect valuable data. This blog is the first in a 3-part series; look for future posts that dig deeper into how data protection actually works on HPE SimpliVity infrastructure. Until then, I recommend the Gorilla Guide to Hyperconverged Infrastructure for Data Protection to learn more about the basic concepts of hyperconvergence and how it changes the approach to protecting data.
For a deeper dive into HPE SimpliVity technology, check out our 4-part blog series on Defining Hyperconverged Infrastructure:
- Part 1: The anatomy of the HPE SimpliVity 380
- Part 2: Lifecycle of a write I/O on HPE SimpliVity 380
- Part 3: The importance of data locality
- Part 4: HPE SimpliVity data storage built for resiliency