Business resiliency doesn’t have to be a one trick pony

In my twenty years of enterprise infrastructure experience, I’ve noticed a few things that are universal to every organization.  One of the most universally time-consuming parts of working in IT is disaster recovery testing. We all know that business continuity is extremely important, but that doesn’t make testing and executing recovery plans any less expensive.  It takes compute power to take full and incremental copies of the data and, of course, storage to house the backups. Organizations also spend weeks of people’s time planning, documenting, executing, and remediating disaster recovery plans.  Until it’s needed, business resiliency often seems like a waste of money and time – but that all changes the moment you need it. When it’s finally needed, everyone remembers what a great investment data protection is, but what about all the rest of the time?  Can’t data resilience be more than a one-trick pony?

The simple answer is “yes” – it is possible to put all the data copies created for resilience to work.  This concept is starting to deliver some seriously impressive results for organizations embracing it.  One of the early pioneers, Actifio, coined the term “copy data virtualization,” though other vendors use different names for it. The goal is to take backup data and put it to use for more than just backups. This makes perfect sense to me, since the data is sitting around doing nothing most of the time. Actifio lets IT do some really awesome stuff, but at its heart it lets users provision copies of data for other purposes.  What exactly does that mean in the real world?

Let’s say we have an Oracle database administrator who is planning to make some major database changes.  He doesn’t want to run them against the current development environment because that could impact the development teams. Traditionally, the DBA would ask for a new environment to be provisioned, copy the data over, and handle all the associated work, like getting firewall rules in place. That is a huge effort for everyone involved. This is where Actifio can shine.  Using its software, Actifio takes the copy of production, makes an instant, space-efficient clone, and presents it to an existing system.  The DBA can then import the data and test his script.
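Under the hood, “instant, space-efficient” clones are generally built on the copy-on-write idea: the clone shares the original blocks and only stores the blocks it changes. Here is a toy Python sketch of that concept – this is an illustration of the general technique, not Actifio’s actual implementation:

```python
class CowClone:
    """Toy copy-on-write clone: reads fall through to the shared base
    image; writes land in a private overlay, so the clone only costs
    as much storage as the blocks it actually changes."""

    def __init__(self, base):
        self.base = base      # shared, read-only "production" blocks
        self.overlay = {}     # private blocks written by this clone

    def read(self, block_id):
        # Prefer the clone's own copy; otherwise share production's block.
        return self.overlay.get(block_id, self.base[block_id])

    def write(self, block_id, data):
        self.overlay[block_id] = data  # base stays untouched


prod = {0: b"users", 1: b"orders", 2: b"logs"}
clone = CowClone(prod)
clone.write(2, b"test-logs")

print(clone.read(2))        # clone sees its own change: b'test-logs'
print(prod[2])              # production is unchanged: b'logs'
print(len(clone.overlay))   # storage cost of the clone: 1 changed block
```

This is why presenting a clone of a multi-terabyte database to a test server can happen in seconds – almost nothing is copied up front.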

We can take this to the next level because Actifio has an extensive REST API, which can be driven by automation tools such as Salt or Jenkins.  This allows a developer to take a copy of data for their own development efforts.  It becomes trivial for development teams to build a robust workflow that clones data and servers for testing. Actifio has a great demo of this using Ansible below for those who are interested.
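To make the automation angle concrete, here is a minimal Python sketch of what a Jenkins or Salt job might assemble to ask the appliance to mount a clone over REST. The endpoint path, field names, and auth scheme here are assumptions for illustration – Actifio’s real API will differ, so consult their documentation:

```python
import json
from urllib import request

# Hypothetical appliance URL and endpoint -- placeholders, not Actifio's
# real API surface.
API_BASE = "https://actifio.example.com/actifio/api"


def build_mount_request(image_id, target_host, session_token):
    """Build (but do not send) a POST request asking the appliance to
    mount a space-efficient clone of a backup image on a target host."""
    payload = {
        "image": image_id,    # which point-in-time copy to clone
        "host": target_host,  # the dev/test server receiving the mount
        "nowait": True,       # return immediately; poll the job status later
    }
    return request.Request(
        f"{API_BASE}/mountimage",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {session_token}",
        },
        method="POST",
    )


req = build_mount_request("Image_12345", "dev-oracle-01", "SESSION_TOKEN")
print(req.full_url)
print(json.loads(req.data)["host"])
# A pipeline step would send it with urllib.request.urlopen(req)
# and then poll the returned job until the mount completes.
```

The point is that once provisioning a data copy is a single API call, it slots into any CI pipeline or configuration-management run like any other build step.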

[Embedded video: Actifio demo using Ansible]

This idea of making use of data we already have is really cool, but it can be hard to understand.  At Tech Field Day 11 we had some time to talk to the Actifio crew, and I have to say they are doing some really impressive stuff.  It’s so impressive, in fact, that other vendors are starting to use their marketing terms.  Check out Actifio and the other vendors doing copy data management.  I think something like this is key to the future of data protection.
