When EMC last month agreed to acquire Legato Systems for $1.3 billion, it was hoping to get a bigger footprint in the storage software market. Legato's established storage management and Networker backup and recovery software seemed like a good fit. Legato CTO George Symons spoke recently with Senior Editor Deni Connor and Executive Editor Bob Brown of Network World about how the merger plans are progressing, and how Symons sees the future of Legato, now that it will be operated as a software division of EMC.

They discussed:

- Integration since the acquisition announcement
- Information lifecycle management
- Intelligent data movement and virtualization
- E-mail archiving and litigation support
- Disaster recovery since 9/11

- - - - -

Where are Legato and EMC in the process of integrating their companies?

We are talking a lot about information lifecycle management – this is a major part of the fit between Legato and EMC as we bring things together. Things are progressing. The filings got done on time for an early October close. We are doing integration planning now. There's a limit to what the regulations allow us to do at this stage. There are teams in place with a steering committee driving it.

What are the goals of those teams?

The real goal is to have an effective combined organization on Day One. Beyond that, we are looking at the acquisition as a way to grow revenue rather than to get cost synergies.
Clearly there will be cost synergies in the back office, so all of our planning is around that. Even though Legato will be a separate division, there will be dotted-line responsibility to Mark Lewis, CTO [of EMC] and executive vice president of [the company's] open software organization. The reason for doing this is that to have an integrated solution, you need to pull the engineering organizations together.

How do you plan to integrate Legato Networker and EMC's Data Manager (EDM) product?

The plan right now is to take the functionality from EDM that is specific to backing up EMC's Symmetrix and migrate that into Networker, so Networker will be the long-term direction for the company. Clearly, there are some very large companies using EDM, so we aren't going to 'end-of-life' it. It will continue to be supported, but we will offer a migration path to Networker. EDM [doesn't have] a large installed base, but some of EMC's biggest, most important customers [use it].

EMC has put in place an open software organization under Lewis. Will there be a separate EMC software division also, longer term?

Most of the software in the open software organization falls into the category of infrastructure software – things that are closely tied to the operating system, like dual-path support in PowerPath, some of the replication technologies, and volume management and virtualization. The Legato division will focus more on data management – backup and recovery, media management, archiving, replication management, content and hierarchical storage management. There is some overlap. Some of the Legato products like High Availability may move into the open software organization.

How do you plan to keep Legato neutral - without a hardware agenda - now that it is part of EMC?

That's one of the important reasons initially for us to be a division of EMC.
If you look at it, EMC needs to work in both directions, and it's important that the data management software be heterogeneous. If it's not heterogeneous, it doesn't add enough value. The customers aren't homogeneous, so application software needs to be heterogeneous. Just like on the hardware side, EMC's products aren't going to support just Legato-branded products. There needs to be a bit of a wall in the corporation so partnering can happen on both sides. The Legato team would be much happier if EMC didn't work with KVS on message archiving, because we have OTG.

Perhaps we can talk about information lifecycle management (ILM). Chairman Joe Tucci of EMC says that is EMC's biggest next goal, and yet with Legato operating as a separate division, you say that is Legato's goal also. How does Legato view ILM differently from EMC, and how do you view it the same?

Information lifecycle management just about covers everything once all is said and done. There are some key technologies that need to be developed, particularly in the area of integrating metadata, that will drive the market. If you think about having to track all your data throughout its lifecycle, whether you are replicating data, backing it up or archiving it, what you need to be able to do is understand where all the copies of data reside. Some may be on tape, local disks or remote. You need to have this common metadata because many different applications are going to be moving it around. You need to know where it is at all times, so you can recover it as fast as possible, but also, when the time comes that you can destroy the data, you can destroy it from all possible locations.

That's probably the greatest common vision where the two companies are coming together when you talk about information lifecycle management: we have to get this global view of the metadata to be able to manage information throughout its lifecycle.
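The global metadata view Symons describes - knowing every copy of a piece of data so it can be recovered from the fastest location, or destroyed everywhere once retention ends - can be illustrated with a small sketch. This is only an illustration of the concept, not Legato or EMC code; every name, tier and location here is hypothetical.

```python
# Illustrative sketch of a global copy catalog: for each data item, track
# every copy (tape, local disk, remote replica) so it can be recovered from
# the fastest location or destroyed from all locations at end of life.
# All class names, tiers and locations are hypothetical, not actual
# Legato/EMC APIs.

# Lower number = faster to recover from.
TIER_SPEED = {"local_disk": 0, "remote_disk": 1, "tape": 2}

class CopyCatalog:
    def __init__(self):
        self._copies = {}  # item id -> list of (tier, location)

    def record_copy(self, item_id, tier, location):
        """Register a copy made by backup, replication or archiving."""
        self._copies.setdefault(item_id, []).append((tier, location))

    def fastest_copy(self, item_id):
        """Pick the copy that can be restored most quickly, if any."""
        copies = self._copies.get(item_id, [])
        return min(copies, key=lambda c: TIER_SPEED[c[0]], default=None)

    def destroy_everywhere(self, item_id):
        """At end of retention, return every location that must be purged."""
        return self._copies.pop(item_id, [])

catalog = CopyCatalog()
catalog.record_copy("invoice-2003-07", "tape", "offsite-vault/T0042")
catalog.record_copy("invoice-2003-07", "local_disk", "array-1/lun-7")
catalog.record_copy("invoice-2003-07", "remote_disk", "dr-site/array-3")

print(catalog.fastest_copy("invoice-2003-07"))      # the local-disk copy
print(catalog.destroy_everywhere("invoice-2003-07"))  # all three locations
```

The point of the single catalog is exactly the one made above: backup, replication and archiving applications all record into the same metadata, so recovery can pick the fastest copy and deletion can find every copy.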
One of the things about bringing the companies together that has expanded EMC's view is that Legato has had a very strong view that ILM needs to be anchored in the application and in knowledge of the data associated with that application. A lot of vendors look at how you move data around through its lifecycle, which is a component of it, but the core of managing information is understanding the value of data for that particular application. For example, what are the service-level objectives from a business point of view for that application? What are the recovery time objectives? What are the retrieval time objectives? Once you start looking at the value of that application to the business, then you can start planning how you manage the data through its lifecycle, and, of course, because that data changes over time, you need to understand that.

Will Legato and EMC be focusing their ILM efforts on different tiers of the market? For instance, will Legato be focusing on the small-medium enterprise, or will you be going after the same market?

Let me answer that in two ways. At least today, ILM is a process and a strategy without a lot of deliverables behind it. There are lots of products that meet different phases of the lifecycle, but there isn't integration around them. In terms of delivering ILM, we are delivering a process and then delivering the individual components around that, which either we or partners supply.

If you look at the market, EMC has been very successful with its direct sales force at the enterprise level; Legato much more with its channels and salespeople hitting what EMC calls the commercial market. And then there's the small-medium enterprise, where we've started to make some inroads.
Clearly, from a channel perspective - and this is one of the reasons to keep Legato as a division - the channels will continue to focus on their particular areas. What will happen is there will be some bleed-over of products, so you will see Legato salespeople and the channel selling some of the EMC products, especially some of the SAN management and storage resource management technologies. You'll see EMC pulling Legato into their accounts.

Virtualization, or the pooling of storage so that data can be managed more easily, is a boon for storage. Do you envision that virtualization and added intelligence will take place at many places within the storage fabric?

Ignore the term virtualization for a second. There is absolutely going to be a hierarchy of intelligent devices that will aid in automation. I expect you will find backup moving closer to both the tape library and the storage arrays. I expect you will find more and more intelligence in the network infrastructure, whether it lives on Cisco or Brocade devices. At the server level this will also exist. Based on policies and the value of the data, it will migrate through a host of intelligent devices. It's generally policy-based storage automation. As virtualization is implemented today, it costs companies more than it saves them.

It's implemented in so many different areas - server, array and switches - that it must be confusing for customers.

The ability to automatically migrate data, or go to the appropriate device in a transparent manner, will grow. The only thing I don't know is where that virtualization will happen. There's not a clear winner. One of the interesting questions is how you back out of an environment like this and recover data if something goes wrong.

Let's talk for a minute about Legato's OTG e-mail archiving software.
Are customers putting in e-mail archiving as a result of litigation, or to get their businesses in order?

We've seen a lot of interest in litigation support from all industries. In most cases, companies are buying it to solve litigation problems, though we do see companies look at it to get prepared. You can run your existing message store through it to find the message you need. That's the real issue, because it can cost anywhere from a couple hundred thousand to over a million dollars in a single case to search the message store for the messages that have been subpoenaed.

Is that all disk-based, or mounting tapes?

You can do it either way. What you are doing is bringing Exchange .PST files in and making them part of the message store - that is, if you can read the tape or, going forward, the changing versions of Exchange. We've had a customer who was able to recover the data just fine, but found that the current version of Exchange couldn't read the data. The archive technology is independent, and we can bring it back. We are helping Microsoft with Exchange 5.5 to Exchange 2003 migration.

Is there a disconnect between what CEOs think they have in place for disaster recovery and what IT knows they have implemented? What is your experience?

Pre-9/11, CEOs universally thought they were protected, and the CIOs knew they weren't. Post-9/11, the questions have gotten a lot tougher, and there are very few CEOs who don't know the state of their businesses. Budgets have made it difficult to implement some of this. But they needed to clean up some of the basics before they even started to worry about disaster recovery. They couldn't even recover data locally, and so you needed to get that working before you started transitioning to other forms of disaster recovery.
We saw a hierarchy of needs that put some of the things you would have expected on the back burner, because of having to do your homework first.

So it's more of a long-term process, instead of something that was implemented after 9/11?

There was a tremendous amount of talk after 9/11 - 'we're going to jump on disaster recovery' - and budgets were not an issue. Then reality started to set in. This isn't easy. It takes resources, time and a basic infrastructure that can support it properly.

When did reality set in?

A year later.

So in the last year, we realized there was more to this than just putting in remote replication?

Exactly. Businesses have really started to prioritize by the applications that are really critical to their businesses. They need to understand the order in which they bring things back, because they can't support all applications at the same level.
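The prioritization Symons closes on - deciding the order in which applications come back, because not all can be supported at the same level - amounts to sorting the application portfolio by business criticality and the recovery time objectives he mentioned earlier. A minimal sketch of that idea; every application name and number below is invented for illustration:

```python
# Illustrative sketch: order disaster-recovery work by business criticality
# and recovery time objective (RTO), since not every application can be
# brought back at the same level. All names and values are hypothetical.

apps = [
    # (application, criticality: 1 = most critical, RTO in hours)
    ("order-entry",    1, 2),
    ("e-mail",         2, 8),
    ("data-warehouse", 3, 72),
    ("payroll",        2, 24),
]

# Most critical applications first; among equally critical ones,
# the tightest RTO first.
recovery_order = sorted(apps, key=lambda a: (a[1], a[2]))

for name, crit, rto in recovery_order:
    print(f"{name}: criticality {crit}, RTO {rto}h")
```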