Wednesday, June 6, 2012

Data explosion and big data demand new strategies for data management, backup and recovery, say experts

Listen to the podcast. Find it on iTunes/iPod. Read a full transcript or download a copy. Sponsor: Quest Software.


Businesses clearly need a better approach to their data recovery capabilities -- across both their physical and virtualized environments. The current landscape for data management, backup, and disaster recovery (DR) too often ignores the transition from physical to virtualized environments and sidesteps the heightened real-time role that data now plays in the enterprise.

What's more, major trends like virtualization, big data, and calls for comprehensive and automated data management are also driving this call for change.

What's needed are next-generation, integrated, and simplified approaches to fast backup and recovery that span all essential corporate data. The solution therefore means bridging legacy and new data, scaling to handle big data, implementing automation and governance, and integrating the functions of backup protection and DR.

To share insights into why data recovery needs a new approach and how that can be accomplished, the next BriefingsDirect discussion joins two experts, John Maxwell, Vice President of Product Management for Data Protection at Quest Software, and Jerome Wendt, President and Lead Analyst of DCIG, an independent storage analyst and consulting firm. The discussion is moderated by Dana Gardner, Principal Analyst at Interarbor Solutions. [Disclosure: Quest Software is a sponsor of BriefingsDirect podcasts.]

Here are some excerpts:

Gardner: Is data really a different thing than, say, five years ago in terms of how companies view it and value it?

Wendt: Absolutely. There's no doubt that companies are viewing it much more holistically. The focus used to be almost entirely on data in structured databases, or in semi-structured formats such as email. Clearly, in the last few years, we've seen a huge change, where unstructured data is now the fastest growing part of most enterprises and where even a lot of their intellectual property is stored. So I think there is a huge push to protect and mine that data.

But we're also just seeing more of a push to get to edge devices. We talk a lot about PCs and laptops, and there is more of a push to protect data in that area, but all you have to do is look around and see the growth.

When you go to any tech conference, you see iPads everywhere, and people are storing more data in the cloud. That's going to have an impact on how people and organizations manage their data and what they do with it going forward.

Gardner: Now, for more and more companies, data is the business, or at least the analytics that they derive from it.

Mission critical

Maxwell: It’s funny that you mention that, because I've been in the storage business for over 15 years. I remember just 10 years ago, when studies would ask people what percentage of their data was mission critical, it was maybe around 10 percent. That aligns with what you're talking about, the shift and the importance of data.

Recent surveys from multiple analyst groups have now shown that people categorize their mission-critical data at 50 percent. That's pretty profound, in that a company is saying half the data that we have, we can't live without, and if we did lose it, we need it back in less than an hour, or maybe in minutes or seconds.

Gardner: So how is the shift and the change in infrastructure impacting this simultaneous need for access and criticality?

Maxwell: Well, the biggest change from an infrastructure standpoint has been the impact of virtualization. This year, well over 50 percent of all the server images in the world are virtualized images, which is just phenomenal.

Quest has really been at the forefront of this shift in infrastructure. We have been, for example, backing up virtual machines (VMs) for seven years with our Quest vRanger product. We've seen that evolve from when VMs and virtual infrastructure were used more for test and development. Today, I've seen studies that show that the shops that are virtualized are running SQL Server, Microsoft Exchange -- very mission-critical apps.

We have some customers at Quest that are 100 percent virtualized. These are large organizations, not just some mom and pop company. That shift to virtualization has really made companies assess how they manage it, what tools they use, and their approaches. Virtualization has a large impact on storage and how you backup, protect, and restore data.

Once you implement and have the proper tools in place, your virtual life is going to be a lot easier than your physical one from an IT infrastructure perspective. A lot of people initially moved to virtualization as a cost savings, because they had under-utilization of hardware. But one of the benefits of virtualization is the freedom, the dynamics. You can create a new VM in seconds. But then, of course, that creates things like VM sprawl, the amount of data continues to grow, and the like.

At Quest we've adapted and exploited a lot of the features that exist in virtual environments, but don't exist in physical environments. It’s actually easier to protect and recover virtual environments than it is physical, if you have tools that are exploiting the APIs and the infrastructure that exists in that virtual environment.

Significant benefits

Wendt: We talk a lot these days about having different silos of data. One application creates data that stays over here. Then, it's backed up separately. Then, another application or another group creates data back over here.

Virtualization not only means consolidation and cost savings, but it also facilitates a more holistic view into the environment and how data is managed. Organizations are finally able to get their arms around the data that they have.

Before, it was so distributed that they didn't really have a good sense of where it resided or how to even make sense of it. With virtualization, there are initial cost benefits that help bring it all together, but once it's all together, they're able to go to the next stage, and it becomes a business enabler at that point.

Gardner: The key now is to be able to manage, automate, and bring the comprehensive control and governance to this equation, not just the virtualized workloads, but also of course the data that they're creating and bringing back into business processes.


How do we move from sprawl to control and make this flip from being a complexity issue to a virtuous adoption and benefits issue?

Maxwell: Over the years, people had very manual processes. For example, when you brought a new application online or added hardware, a server, that type of thing, you asked, "Oops, did we back it up? Are we backing that up?"

One thing that’s interesting in a virtual environment is that the backup software we have at Quest will automatically see when a new VM is created and start backing it up. So it doesn't matter if you have 20 or 200 or 2,000 VMs. We're going to make sure they're protected.
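
To make that concrete, here is a minimal sketch -- using the open-source pyVmomi library against the vSphere API rather than anything Quest-specific -- of how a backup tool can discover every VM in an environment and flag any that are not yet in its protection catalog. The host, credentials, and protected_vms set are hypothetical placeholders.

```python
# Minimal sketch (not Quest vRanger code): discover every VM via the vSphere API
# and flag any that are not yet in the backup catalog. Uses the open-source
# pyVmomi library; host, credentials, and protected_vms are placeholders.
import ssl
from pyVim.connect import SmartConnect, Disconnect
from pyVmomi import vim

def find_unprotected_vms(host, user, password, protected_vms):
    """Return names of VMs that vCenter knows about but the backup catalog does not."""
    context = ssl._create_unverified_context()  # demo only: skip certificate checks
    si = SmartConnect(host=host, user=user, pwd=password, sslContext=context)
    try:
        content = si.RetrieveContent()
        view = content.viewManager.CreateContainerView(
            content.rootFolder, [vim.VirtualMachine], True)  # walk all VMs recursively
        all_vms = {vm.name for vm in view.view}
        view.Destroy()
        return sorted(all_vms - set(protected_vms))
    finally:
        Disconnect(si)

# Any newly created VM shows up in this list and can be added to the next backup job:
# new_vms = find_unprotected_vms("vcenter.example.com", "backup", "secret", catalog)
```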

Where it really gets interesting is that you can protect the data a lot smarter than you can in a physical environment. I'll give you an example.

In a VMware environment, there are services that we can use to do a snapshot backup of a VM. In essence, it’s an immediate backup of all the data associated with that machine or those machines. It could be on any generic kind of hardware. You don’t need to have proprietary hardware or more expensive software features of high-end disk arrays. That is a feature that we can exploit built within the hypervisor itself.
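
The snapshot service John refers to is exposed through the same vSphere API. As a hedged illustration -- again pyVmomi, not Quest's own code, and assuming a content object obtained as in the earlier sketch -- a backup job could request a quiesced, memory-less snapshot and then copy from that frozen image:

```python
# Illustrative only -- not Quest's implementation. Request a quiesced snapshot of a
# VM through the hypervisor API so a backup job has a consistent point-in-time image
# to copy from, with no proprietary array features required. Assumes a vSphere
# 'content' object obtained as in the previous sketch; vm_name is a placeholder.
from pyVim.task import WaitForTask
from pyVmomi import vim

def snapshot_for_backup(content, vm_name):
    """Create a temporary, quiesced snapshot and return a reference to it."""
    view = content.viewManager.CreateContainerView(
        content.rootFolder, [vim.VirtualMachine], True)
    vm = next(v for v in view.view if v.name == vm_name)
    view.Destroy()
    task = vm.CreateSnapshot_Task(
        name="backup-temp",
        description="transient snapshot for image backup",
        memory=False,   # no RAM image needed for a data backup
        quiesce=True)   # ask VMware Tools to flush guest I/O for consistency
    WaitForTask(task)   # block until the hypervisor finishes the snapshot
    return vm.snapshot.currentSnapshot
```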

Image backup


Even the way that we move data is much more efficient, because we have a process that we pioneered at Quest called "backup once, restore many," where we create what's called an image backup. From that image backup I can restore an entire system, an individual file, or an application. But I've done all of that from that one pass, that one very effective snapshot-based backup.
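
As a rough mental model of "backup once, restore many" -- a toy illustration, not Quest's actual on-disk format -- think of a single image capture paired with an index of the files and application objects inside it, so one backup can serve full-system, file-level, and application-level restores:

```python
# Toy model of "backup once, restore many": one image capture plus an index of what
# is inside it, so the same backup serves whole-system, file-level, and
# application-level restores. Purely illustrative -- not Quest's actual format.
from dataclasses import dataclass
from typing import Dict

@dataclass
class ImageBackup:
    image_blob: bytes               # the single snapshot-based capture of the VM
    file_index: Dict[str, slice]    # file path -> byte range inside the image
    app_index: Dict[str, slice]     # app object (e.g. a mailbox) -> byte range

    def restore_system(self) -> bytes:
        return self.image_blob                          # whole-VM restore

    def restore_file(self, path: str) -> bytes:
        return self.image_blob[self.file_index[path]]   # single-file restore

    def restore_app_object(self, name: str) -> bytes:
        return self.image_blob[self.app_index[name]]    # e.g. one mailbox or database
```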

If you look at physical environments, there are separate concepts of physical machine backups, file-level backups, and specific application backups, and for some systems you even have to employ hardware-based snapshots or actually bring the applications down.

So from that perspective, we've gotten much more sophisticated in virtual environments. Again, we're moving data by not impacting the applications themselves and not impacting the VMs. The way we move data is very fast and is very effective.

Wendt: One of the things we're really seeing is a lot more intelligence going into this backup software. These products are moving well beyond just "doing backups" anymore. There's much more awareness of what data is included in these data repositories and how they're searched.


And also with more integration with platforms like VMware vSphere Operations, administrators can centrally manage backups, monitor backup jobs, and do recoveries. One person can do so much more than they could even a few years ago.

And the expectations of organizations are evolving; they don't necessarily want separate backup admins and system admins anymore. They want one team that manages their virtual infrastructure. That all rolls up to your point that it makes it easier to govern, manage, and execute on corporate objectives.

Gardner: Is this really a case, John Maxwell, where we are getting more and paying less?

Maxwell: Absolutely. Just as the cost per gigabyte has gone down over the past decade, the effectiveness of the software and what it can do is way beyond what we had 10 years ago.

Simplified process

Today, in a virtual environment, we can provide a solution that simplifies the process, where one person can ensure that hundreds of VMs are protected. They can literally right-click and restore a VM, a file, a directory, or an application.

One of the focuses we have had at Quest, as I alluded to earlier, is that there are a lot of mission-critical apps running on these machines. Jerome talked about email. A lot of people consider email one of their most mission-critical applications. And the person responsible for protecting the environment that Microsoft Exchange is running on may not be an Exchange administrator, yet they're tasked with being able to recover Exchange.

That's why we've developed technologies that allow you to go out there and, from that one image backup, restore an email conversation or an email attachment from someone's mailbox. That person doesn't have to be a guru with Exchange. Our job is to figure out, behind the scenes, how to do this and make it available via a couple of mouse clicks.

Wendt: As John was speaking, I was going to comment. I spoke to a Quest customer just a few weeks ago. He clearly had some very specific technical skills, but he's responsible for a lot of things, a lot of different functions -- server admin, storage admin, backup admin.


I think a lot of individuals can relate to this guy. I know I certainly did, because that was my role for many years, when I was an administrator in the police department. You have to try to juggle everything, while you're trying to do your job, with backup just being one of those tasks.

In his particular case, he was called upon to do a recovery, and, to John’s point, it was an Exchange recovery. He never had any special training in Exchange recovery, but it just happened that he had Quest Software in place. He was able to use its FastRecover product to recover his Microsoft Exchange Server and had it back up and going in a few hours.

What was really amazing in this particular case is that he was traveling at the time it happened. So he had to talk his manager through the process, and was able to get it up and going. Once he had the system up, he was able to log on and get it going fairly quickly.

That just illustrates how much the world has changed and how much backup software and these products have evolved to the point where you need to understand your environment, probably more than you need to understand the product, and just find the right product for your environment. In this case, this individual clearly accomplished that.

Gardner: How do organizations approach this being in a hybrid sort of a model, between physical and virtual, and recognizing that different apps have different criticality for their data, and that might change?

Maxwell: Well, there are two points. One, we can't have a bunch of niche tools, one for virtual, one for physical, and the like. That's why, with our vRanger product, which has been the market leader in virtual data protection for the past seven years, we're coming out with physical support in that product in the fall of 2012. Those customers are saying, "I want one product that handles that non-virtualized data."

The second part gets down to what percentage of your data is mission-critical and how complex it is, meaning is it email, or a database, or just a flat file, and then asking if these different types of data have specific service-level agreements (SLAs), and if you have products that can deliver on those SLAs.

That's why at Quest, we're really promoting a holistic approach to data protection that spans replication, continuous data protection, and more traditional backup, but backup mainly based on snapshots.

Then that can map to the service level, to your business requirements. I just saw some data from an industry analyst showing that the replication software market is basically the same size now as the backup software market. That shows the desire people have for that kind of real-time failover for some applications, and you get that with replication.


When it comes to the example that Jerome gave with that customer, the Quest product that we're using is NetVault FastRecover, which is a continuous data protection product. It backs up everything in real-time. So you can go back to any point in time.

It’s almost like a time machine, when it comes to putting back that mailbox, the SQL database, or Oracle database. Yet, it's masking a lot of the complexity. So the person restoring it may not be a DBA. They're going to be that jack of all trades who's responsible for the storage and maybe backup overall.
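
Conceptually, continuous data protection keeps an append-only journal of writes that can be replayed up to any chosen moment. The toy sketch below illustrates that "time machine" idea in general terms; it is not NetVault FastRecover's internal design.

```python
# Toy illustration of the continuous-data-protection "time machine" idea: every
# write is appended to a journal, and a restore replays the journal up to a chosen
# point in time. Generic concept only -- not NetVault FastRecover's internal design.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class JournalEntry:
    timestamp: float    # when the write happened
    offset: int         # logical block or byte offset that was written
    data: bytes         # new contents of that location

@dataclass
class CDPJournal:
    entries: List[JournalEntry] = field(default_factory=list)

    def record_write(self, timestamp: float, offset: int, data: bytes) -> None:
        """Every production write is also appended to the journal in real time."""
        self.entries.append(JournalEntry(timestamp, offset, data))

    def restore_as_of(self, point_in_time: float) -> Dict[int, bytes]:
        """Rebuild the volume as it looked at any chosen moment."""
        image: Dict[int, bytes] = {}
        for entry in sorted(self.entries, key=lambda e: e.timestamp):
            if entry.timestamp <= point_in_time:
                image[entry.offset] = entry.data  # later writes overwrite earlier ones
        return image

# restore_as_of(just_before_the_corruption) yields the pre-corruption state, which is
# what lets a generalist admin "rewind" a mailbox or database without DBA skills.
```
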
Gardner: John, in talking with Quest folks, I've heard them refer to a next-generation platform or approach, or a whole greater than the sum of the parts. How do you define next generation when it comes to data recovery in your view of the world?

New benefits

Maxwell: Well, without hyperbole, for us, our next generation is a new platform that we call NetVault Extended Architecture (XA), and this is a way to provide several benefits to our customers.

One is that with NetVault Extended Architecture we are now delivering a single user experience across products. This spans SMB and enterprise: a customer that's using maybe one of our point solutions for application or database recovery gets that consistent look and feel, that consistent approach. We have some customers that use multiple products. With this, they now have a single pane of glass.

Also, it's important to offer a consistent means for administering and managing the backup and recovery process, because, as we've been discussing, why should a person have to have multiple skill sets? If you have one view, one console into data protection, that's going to make your life a lot easier than having to learn a bunch of other types of solutions.

That’s the immediate benefit that I think people see. What NetVault Extended Architecture encompasses under the covers, though, is a really different approach in the industry, which is modularization of a lot of the components to backup and recovery and making them plug and play.

Let me give you an example. With the increase in virtualization, a lot of people just equate virtualization with VMware. Well, we've got Hyper-V. We have initiatives from Red Hat. We have Xen, Oracle, and others. Jerome, I'm kind of curious about your views, but just as we saw in the 90s and the 00s with people having multiple platforms -- whether it was Windows and Linux, or Windows, Linux and, as you said, AIX -- I believe we're going to start seeing multiple hypervisors.


So one of the approaches that NetVault Extended Architecture is going to bring us is a capability to offer a consistent approach to multiple hypervisors, meaning it could be a combination of VMware and Microsoft Hyper-V and maybe even KVM from Red Hat.

But, again, the administrator, the person who is managing the backup and recovery, doesn't have to know any one of those platforms. That's all hidden from them. In fact, if they want to restore data from one of those hypervisors -- say, restore a VMware VMDK, which is their volume format in VMware speak, into what's called a VHD in Hyper-V -- they could do that.
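
Quest hasn't published exactly how that cross-hypervisor restore works under the covers; as a generic illustration of the disk-format gap involved, the open-source qemu-img utility can convert a VMware VMDK into a Hyper-V VHD. The sketch below simply wraps that conversion, with placeholder file names.

```python
# Generic illustration of the disk-format gap in cross-hypervisor recovery: convert
# a VMware VMDK into a Hyper-V VHD with the open-source qemu-img utility. This is
# not Quest's restore path; file names are placeholders.
import subprocess

def vmdk_to_vhd(src_vmdk: str, dst_vhd: str) -> None:
    """Convert a VMDK disk image to VHD ("vpc" is qemu-img's name for the VHD format)."""
    subprocess.run(
        ["qemu-img", "convert", "-f", "vmdk", "-O", "vpc", src_vmdk, dst_vhd],
        check=True)  # raise if qemu-img reports an error

# vmdk_to_vhd("web01.vmdk", "web01.vhd")
```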

That, to me, is really exciting, because this is exploiting these new platforms and environments and providing tools that simplify the process. That's going to be one of the many benefits of our next-generation NetVault Extended Architecture, where we can provide that singular experience for our customer base, get new solutions to market faster, and deliver in a modular approach.

Customers can choose what they need, whether they're an SMB customer, or one of the largest customers that we have with hundreds of petabytes or exabytes of data.

Wendt: DCIG has a lot of conversations with managed-service providers, and you'd be surprised, but there are actually very few that are VMware shops. I find the vast majority are actually either Microsoft Hyper-V or using Red Hat Linux as their platform, because they're looking for a cost-effective way to deliver virtualization in their environments.

We've seen this huge growth in replication, and people want to implement disaster recovery plans or business continuity planning. I think this ability to recover across different hypervisors is going to become absolutely critical, maybe not today or tomorrow, but I would say in the next few years. People are going to say, "Okay, now that we've got our environment virtualized, we can recover locally, but how about recovering into the cloud or with a cloud service provider? What options do we have there?"

More choice

If they're using VMware and their provider isn't, they're almost forced to find a provider that is, whereas your platform gives them much more choice among managed service providers that are using platforms other than VMware. It sounds like Quest will really give them the ability to back up VMware hypervisors and then potentially recover into Red Hat or Microsoft Hyper-V at MSPs. So that could be a really exciting development for Quest in that area.

Gardner: Jerome, do you have any use cases or examples that you're familiar with that illustrate this concept of next-generation and lifecycle approach to data recovery that we have been discussing?

Wendt: Well, it's not an example, just a general trend I'm seeing in products, because most of DCIG's focus is on analyzing the products themselves, comparing and contrasting them, and identifying broader trends within those products.


There are two things we're seeing. One, we're struggling to keep calling backup software "backup software" anymore, because it does so much more than that. You mentioned earlier how much more intelligence is in these products. We call it backup software, because that's the context in which everyone understands it, but going forward, the industry is probably going to have to find a better way to refer to these products. What Quest offers is a whole lot more than just running a backup.

And second, as people view backup and how they manage their infrastructure, they really have to get away from the reactive mode of "Okay, today I'm going to have to troubleshoot 15 backup jobs that failed overnight." Those days are over. And if they're not over, you need to be looking for new products that will get you over that hump, because you should no longer be troubleshooting failed backup jobs.

You should really be looking more toward how you can make sure your whole environment is protected and recoverable, and moving to the next phase of doing disaster recovery and business continuity planning. The products are there. They're mature, and people should be moving down that path.

Crystal ball

Gardner: John, when we look into the crystal ball, even not that far out, it seems that, in order to manage what you need to do as a business, getting good control over your data, and being able to ensure that it's going to be available anytime, anywhere, regardless of the circumstances, is not a luxury, not a nice-to-have. It's really just going to support the viability of the business.

Maxwell: Absolutely. And what’s going to make it even more complex is going to be the cloud, because what's your control, as a business, over data that is hosted some place else?

I know that at Quest we use seven SaaS-based applications from various vendors, but what's our guarantee that our data is protected there? I can tell you that a lot of these SaaS-based companies or hosting companies may offer an environment that says, "We're always up," or "We have a higher level of availability," but most recovery is driven by logical corruption of data, which availability alone doesn't protect against.

As I said, with some of these smaller vendors, you wonder what would happen if they went out of business, because I've heard stories of small service providers closing their doors, and you say, "But my data is there."

So the cloud is really exciting, in that we're looking at how we're going to protect assets that may be off-premise to your environment and how we can ensure that you can recover that data, in case that provider is not available.

Then there's something that Jerome touched upon, which is that the cloud is going to offer so many opportunities. The one that I'm most excited about is using the cloud for failover. That's really getting beyond recovery into business continuity.


And something that has only been afforded by the largest enterprises, Global 1000 type customers, is the ability to have a standby center with a SunGard or someone like that, which is very costly and not within reach of most customers. But with virtualization and with the cloud, there's a concept that I think we're going to see become very mainstream over the next five years, which is failover recovery to the cloud. That's something that's going to be within reach of even SMB customers, and that's really more of a business continuity message.

So now we're stepping up even more. We're now saying, "Not only can we recover your data within seconds, but we can get your business back up and running, from an IT perspective, faster than you probably ever presumed that you could."
Listen to the podcast. Find it on iTunes/iPod. Read a full transcript or download a copy. Sponsor: Quest Software.
