
Q&A: Legato looks ahead to future with EMC

Aug 11, 2003 | 11 mins
Backup and Recovery, Data Center

When EMC last month agreed to acquire Legato Systems for $1.3 billion, it was hoping to get a bigger footprint in the storage software market. Legato’s established storage management and Networker backup and recovery software seemed like a good fit. Legato CTO George Symons spoke recently with Senior Editor Deni Connor and Executive Editor Bob Brown of Network World about how the merger plans are progressing, and how Symons sees the future of Legato, now that it will be operated as a software division of EMC.

They discussed: integration since the acquisition announcement, information lifecycle management, intelligent data movement and virtualization, e-mail archiving and litigation support, and disaster recovery since 9/11.

Where are Legato and EMC in the process of integrating their companies?

We are talking a lot about information lifecycle management – this is a major part of the fit between Legato and EMC as we bring things together. Things are progressing. The filings got done on time for an early October close. We are doing integration planning now. There’s a limit to what the regulations allow us to do at this stage. There are teams in place with a steering committee driving it.

What are the goals of those teams?

The real goal is to have an effective combined organization on Day One. Beyond that we are looking at the acquisition as a way to grow revenue vs. get cost synergies. Clearly there will be cost synergies in the back office. So all of our planning is around that. Even though Legato will be a separate division, there will be dotted-line responsibility to Mark Lewis, CTO [of EMC] and executive vice president of [the company’s] open software organization. The reason for doing this is that to have an integrated solution, you need to pull the engineering organizations together.

How do you plan to integrate Legato Networker and EMC’s Data Manager (EDM) product?

The plan right now is to take the functionality from EDM that is specific to backing up EMC’s Symmetrix and migrate it into Networker, so Networker will be the long-term direction for the company. Clearly, there are some very large companies using EDM, so we aren’t going to ‘end-of-life’ it. It will continue to be supported, but we will offer a migration path to Networker. EDM [doesn’t have] a large installed base, but some of EMC’s biggest, most important customers [use it].

EMC has put in place an open software organization under Lewis. Will there be a separate EMC software division also, longer term?

Most of the software in the open software organization falls into the category of infrastructure software – things that are closely tied to the operating system like dual-path support in PowerPath, some of the replication technologies and volume management and virtualization. The Legato division will focus more on data management – backup and recovery, media management, archiving, replication management, content and hierarchical storage management. There is some overlap. Some of the Legato products like High Availability may move into the open software organization.

How do you plan to keep Legato neutral – without a hardware agenda – now that it is part of EMC?

That’s one of the important reasons initially for us to be a division of EMC. If you look at it, EMC needs to work in both directions, and it’s important that the data management software be heterogeneous. If it’s not heterogeneous, it doesn’t add enough value. The customers aren’t homogenous, so application software needs to be heterogeneous. Just like on the hardware side, EMC’s products aren’t going to support just Legato-branded products. There needs to be a bit of a wall in the corporation so partnering can happen on both sides. The Legato team would be much happier if EMC didn’t work with KVS on message archiving because we have OTG.

Perhaps we can talk about information lifecycle management (ILM). Chairman Joe Tucci of EMC says that is EMC’s biggest next goal, and yet with Legato operating as a separate division, you say that is Legato’s goal also. How does Legato view ILM differently from EMC, and how do you view it the same?

Information lifecycle management covers just about everything once all is said and done. There are some key technologies that need to be developed, particularly in the area of integrating metadata, that will drive the market. If you think about having to track all your data throughout its lifecycle, whether you are replicating data, backing it up or archiving it, what you need to be able to do is understand where all the copies of data reside. Some may be on tape, on local disks or at remote sites. You need this common metadata because many different applications are going to be moving the data around. You need to know where it is at all times, so you can recover it as fast as possible, and also so that when the time comes to destroy the data, you can destroy it in every location.
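The metadata problem Symons describes can be sketched as a small catalog that records every copy of a piece of data, so it can be recovered from the fastest location or destroyed everywhere at once. This is an illustrative sketch in Python, not Legato's or EMC's actual design; all class names, locations and paths are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Copy:
    location: str  # e.g. "local-disk", "tape", "remote-site" (hypothetical tiers)
    path: str

@dataclass
class DataItem:
    name: str
    copies: list = field(default_factory=list)

    def add_copy(self, location: str, path: str) -> None:
        self.copies.append(Copy(location, path))

    def fastest_copy(self) -> Copy:
        # Prefer a local-disk copy over tape or remote copies for quick recovery.
        return sorted(self.copies, key=lambda c: c.location != "local-disk")[0]

    def destroy_everywhere(self) -> list:
        # Hand back every copy that must be deleted so none survive,
        # and empty this catalog entry.
        doomed, self.copies = self.copies, []
        return doomed

item = DataItem("q3-financials.db")
item.add_copy("local-disk", "/data/q3-financials.db")
item.add_copy("tape", "TAPE0042:block-17")
item.add_copy("remote-site", "dr:/replica/q3-financials.db")
```

The point of the sketch is the shared catalog: every application that replicates, backs up or archives the data records its copy in one place, which is what makes both fast recovery and complete destruction possible.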

That’s probably the greatest common vision when you talk about information lifecycle management and where the two companies are coming together. We have to get this global view of the metadata to be able to manage information throughout its lifecycle. One thing about bringing the companies together that has expanded EMC’s view is that Legato has had a very strong view that ILM needs to be anchored in the application and in knowledge of the data associated with that application.

A lot of vendors look at how you move data around through its lifecycle, which is a component of it, but the core of managing information is understanding the value of data for that particular application. For example, what are the service-level objectives from a business point of view for that application? What are the recovery time objectives? What are the retrieval time objectives? Once you start looking at the value of that application to the business, then you can start planning how you manage the data through its lifecycle, and, of course, because that value changes over time, you need to account for that as well.
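The idea of mapping an application's business objectives to how its data is stored can be illustrated with a toy policy function. The tier names and thresholds below are hypothetical, invented for this sketch, not anything Legato or EMC shipped.

```python
# Toy sketch: derive a storage tier from an application's service-level
# objectives. Tier names and hour thresholds are hypothetical.
def choose_tier(recovery_time_hours: float, retrieval_time_hours: float) -> str:
    if recovery_time_hours <= 1:
        # Mission-critical: must come back almost immediately.
        return "replicated-disk"
    if retrieval_time_hours <= 24:
        # Important, but tolerant of a same-day restore.
        return "nearline-disk"
    # Everything else can wait for a tape mount.
    return "tape-archive"

print(choose_tier(0.5, 1))   # a mission-critical application
print(choose_tier(48, 720))  # long-term archival data
```

Because, as Symons notes, the value of data changes over time, such a policy would be re-evaluated periodically rather than applied once.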

Will Legato and EMC be focusing their ILM efforts on different tiers of the market? For instance, will Legato be focusing on the small-medium enterprise or will you be going after the same market?

Let me answer that in two ways. At least today, ILM is a process and a strategy without a lot of deliverables behind it. There are lots of products that meet different phases of the lifecycle, but there isn’t integration around them. In terms of delivering ILM, we are delivering a process and then the individual components around it that either we or our partners supply.

If you look at the market, EMC has been very successful with its direct sales force at the enterprise level; Legato, much more with its channels and salespeople hitting what EMC calls the commercial market. And then there’s the small-medium enterprise, where we’ve started to make some inroads. Clearly, from a channel perspective, one of the reasons to keep Legato as a division is that the channels will continue to focus on their particular areas.

What will happen is there will be some bleed-over of products, so you will see Legato salespeople and the channel selling some of the EMC products, especially some of the SAN management, storage resource management technologies. You’ll see EMC pulling Legato into their accounts.

Virtualization or pooling of storage data so it can be managed more easily is a boon for storage. Do you envision that virtualization and added intelligence will take place at many places within the storage fabric?

Ignore the term virtualization for a second. There is absolutely going to be a hierarchy of intelligent devices that will aid in automation. I expect you will find backup moving closer to both the tape library and to the storage arrays. I expect you will find more and more intelligence in the network infrastructure, whether it lives on Cisco or Brocade devices. At the server level this will also exist. Based on policies and the value of the data, it will migrate through a host of intelligent devices. It’s generally policy-based storage automation. As virtualization is implemented today, it costs companies more than it saves them.
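The policy-based storage automation Symons describes, where data migrates through a hierarchy of intelligent devices based on its value, can be sketched as a simple placement policy. The age and access-count thresholds and device names here are invented for illustration.

```python
# Toy sketch: policy-based placement of data across a hierarchy of devices,
# driven by how often the data is used and how old it is. All thresholds
# and device names are hypothetical.
def placement(age_days: int, accesses_last_30_days: int) -> str:
    if accesses_last_30_days > 10:
        return "primary-array"   # hot data stays on the fastest storage
    if age_days < 90:
        return "nearline-disk"   # warm data moves to cheaper disk
    return "tape-library"        # cold data migrates out to tape
```

In a real system the policy engine, not an application, would run rules like this continuously, whether it lives in the array, the switch fabric or the server.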

It’s implemented in so many different areas – server, array and switches – that it must be confusing for customers.

The ability to automatically migrate data or go to the appropriate device in a transparent manner will grow. The only thing I don’t know is where that virtualization will happen. There’s not a clear winner. One of the interesting questions is how do you back out of an environment like this and recover data if something goes wrong.

Let’s talk for a minute about Legato’s OTG e-mail archiving software. Are customers putting in e-mail archiving as a result of litigation or getting their businesses in place?

We’ve seen a lot of interest in litigation support from all industries. In most cases, companies are buying it to solve litigation problems, though we do see companies look at it to get prepared. You can run your existing message store through it to find the message you need. That’s the real issue, because it can cost anywhere from a couple hundred thousand to over a million dollars in a single case to search the message store for the messages that have been subpoenaed.

Is that all disk-based or mounting tapes?

You can do it either way. What you are doing is bringing Exchange .PST files in and making them part of the message store – that is, if you can read the tape or, going forward, the changing versions of Exchange. We’ve had a customer who was able to recover the data just fine, but found that the current version of Exchange couldn’t read the data. The archive technology is independent, and we can bring it back. We are helping Microsoft with Exchange 5.5 to Exchange 2003 migration.

Is there a disconnect between what CEOs think they have in place for disaster recovery and what IT knows they have implemented? What is your experience?

Pre-9/11, CEOs universally thought they were protected, and the CIOs knew they weren’t. Post-9/11, the questions have gotten a lot tougher, and there are very few CEOs who don’t know the state of their businesses. Budgets have made it difficult to implement some of this. But companies needed to clean up some of the basics before they even started to worry about disaster recovery. Many couldn’t even recover data locally, so they needed to get that working before transitioning to other forms of disaster recovery. We saw a hierarchy of needs that put some of the things you would have expected on the back burner, because companies had to do their homework first.

So it’s more of a long-term process instead of something that was implemented after 9/11.

There was a tremendous amount of talk after 9/11 along the lines of ‘we’re going to jump on disaster recovery,’ and budgets were not an issue. Then reality started to set in. This isn’t easy. It takes resources, time and a basic infrastructure that can support it properly.

When did reality set in?

A year later.

So in the last year, we realized there was more to this than just putting in remote replication?

Exactly. Businesses have really started to prioritize by applications that are really critical to their businesses. They need to understand the order they bring things back, because they can’t support all applications on the same level.