
The heat is on

Sep 27, 2004 | 9 mins
Backup and Recovery | HIPAA | Messaging Apps

These four technologies are sparking renewed industry attention.

Message archiving

If you were to stand at the intersection of “electronic communication” and “corporate compliance” you would see a large uncovered manhole – the black hole of “message retention.” Most corporations’ ability to sort, retain, locate and mine messages hasn’t caught up with regulatory requirements and business demands.

But that’s changing. Message archiving systems are going to be the next big thing in e-mail and instant messaging, if you believe word on the street. True, tools that catalog e-mails have been around for a while – mostly used for e-mail management and archiving to lower-cost media, and occasionally to mine content for knowledge management. However, corporations will adopt more advanced electronic communication archiving and indexing tools en masse, and soon.

“The worldwide message archiving market will grow from $197 million in 2003, peaking in 2006 at $994 million, with a CAGR of 38%. There will then be a decline in revenues in 2007 to $917 million, continuing down to $660 million in 2008,” says Erica Rugullies, an analyst with Forrester Research. “Demand will grow . . . based on new regulations, more companies facing legal discovery issues and growing mailbox environment costs,” she says.

Thanks to Sarbanes-Oxley, the Health Insurance Portability and Accountability Act, and other government initiatives, corporations might be forced to locate messages by content and then hand them over for legal discovery. While most companies back up their e-mail stores, few index them at the same time. Fewer still have any system in place to back up IMs or to prevent users from deleting e-mails from personal in-boxes. Most e-mail management systems automatically delete messages by date, without regard to content.

A $2.7 million judgment against Philip Morris U.S.A. for deleting e-mail is viewed by the messaging industry as a test case for e-mail retention liability. The judgment, issued in July by the U.S. District Court for the District of Columbia, fined Philip Morris for deleting e-mail more than two months old, according to court documents. In 1999, the court ordered Philip Morris to preserve “all documents and other records” that might contain information about a government case pending against the company, the motion said. In 2002, the routine deletions were discovered. While Philip Morris has protested the ruling, the message is clear – if a company is told to turn over messages pertaining to a specific topic, it had better be able to locate all of them.

Vendors see this as an incredible opportunity, and dozens are coming out with message-archiving tools and services. These include e-mail archiving specialists such as iLumin Software Services and Zantaz; document management service vendors such as Iron Mountain; enterprise content management firms such as Open Text (through its acquisition of e-mail archiving specialist IXOS Software), EMC (through its acquisitions of Documentum and Legato Systems), and IBM.

Such products keep e-mail and IMs stored centrally so they can be searched quickly. They also let messages be stored, managed and retrieved by content, among other identifiers. Compliance managers can use them to review people’s e-mail on a random basis, or per some sort of lexicon, to ensure compliance. A company can expect to pay between $25 and $100 per user, depending on volume and features.
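The lexicon-driven review described above can be reduced to a simple idea: scan an indexed message store against a watch-list of terms. The sketch below is purely illustrative — the message format, function names and lexicon are hypothetical, not any vendor's actual API.

```python
# Hypothetical sketch of lexicon-based compliance review: archived
# messages are held centrally, then scanned against a watch-list of
# terms. All names and sample data here are illustrative only.

COMPLIANCE_LEXICON = {"insider", "guarantee", "off the books"}

def flag_messages(archive, lexicon):
    """Return (message_id, matched_terms) for every archived message
    whose body contains at least one watched term."""
    flagged = []
    for msg in archive:
        body = msg["body"].lower()
        hits = {term for term in lexicon if term in body}
        if hits:
            flagged.append((msg["id"], sorted(hits)))
    return flagged

archive = [
    {"id": "m1", "body": "Quarterly numbers attached."},
    {"id": "m2", "body": "Keep this off the books for now."},
]

print(flag_messages(archive, COMPLIANCE_LEXICON))  # → [('m2', ['off the books'])]
```

A production system would of course run this over a full-text index rather than raw message bodies, but the compliance logic — match, flag, route for human review — is the same.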

Product information management

Does the IT world need yet another form of information management? When it comes to coping with structured and unstructured product information, apparently so. “Everyone and their brother is getting into” product information management (PIM), Rugullies says. Vendors include specialists such as Flow Systems, Full Degree, Full Tilt, Riversand Technologies, Stibo Catalog and Velosel, and larger players such as IBM and SAP. In May, IBM released WebSphere Product Center Version 5, the offspring of its Trigo Technologies acquisition. SAP countered in July by buying A2i, positioned in the nascent PIM industry as Trigo’s major rival. SAP says it aims to integrate A2i’s PIM features into its NetWeaver application platform for fall availability.

Other vendors clamoring to add PIM to their marketing spiel include supply-chain management companies such as i2 Technologies and JDA Software Group, with its June acquisition of QRS, and electronic data interchange, e-commerce and catalog vendors. Rugullies anticipates more mega-sized software vendors to enter the market via acquisitions.

So what’s the big deal about PIM – is it just another name for enterprise content management? Not exactly, Rugullies says. These offerings are the next generation of the product catalog software spawned during the business-to-business mania days. They differ from other content management systems in that they combine the ability to publish product information to online and printed media. They also function as a central repository for both structured data (such as stock numbers and item measurements) and unstructured data (information sheets, video clips, long descriptions).

For companies in need, PIM systems offer much to like. They can cull data from scores of sources, clean and standardize the data, designate gold sources of clean data, and easily make this information available to other sources through APIs, XML and the like, Rugullies says. The first wave of adopters are manufacturers and distributors – companies with hefty product catalogs. But financial services firms, such as credit card providers, also are expected to gravitate toward PIM. From there, other marketers of many highly customized products or services that produce both online and printed documentation likely will follow.
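The pipeline Rugullies describes – cull, clean, standardize, designate a gold source, expose via APIs and XML – can be sketched in a few lines. Everything here (feed names, fields, sample records) is hypothetical, meant only to show the shape of the workflow, not any vendor's implementation.

```python
# Illustrative PIM consolidation pipeline: pull product records from
# several feeds, standardize them, let a designated "gold" source win
# on conflicts, and expose the result as XML. All names are hypothetical.
import xml.etree.ElementTree as ET

feeds = {
    "erp":     [{"sku": "A-100", "name": "widget ", "weight_kg": "1.2"}],
    "catalog": [{"sku": "a-100", "name": "Widget",  "weight_kg": "1.25"}],
}
GOLD_SOURCE = "catalog"  # this feed's values win when sources disagree

def consolidate(feeds, gold):
    merged = {}
    # Process the gold feed last so its values overwrite the others.
    order = [s for s in feeds if s != gold] + [gold]
    for source in order:
        for rec in feeds[source]:
            sku = rec["sku"].strip().upper()       # standardize the key
            clean = {"name": rec["name"].strip(),  # clean the values
                     "weight_kg": float(rec["weight_kg"])}
            merged.setdefault(sku, {}).update(clean)
    return merged

def to_xml(products):
    """Publish the consolidated repository as XML for downstream systems."""
    root = ET.Element("products")
    for sku, attrs in sorted(products.items()):
        item = ET.SubElement(root, "product", sku=sku)
        for key, value in attrs.items():
            ET.SubElement(item, key).text = str(value)
    return ET.tostring(root, encoding="unicode")

print(to_xml(consolidate(feeds, GOLD_SOURCE)))
```

The same consolidated store could just as easily feed a print-catalog template, which is the dual online/print publishing the article highlights.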

Grid middleware

The mainframe world is abuzz over the term “grid middleware.” But some experts note that the term is really just another name for grid software, as it previously has been called. All grid software is middleware, stitching together disparate CPUs so that they can operate as one supercomputer.

“Adoption of the term ‘grid middleware’ doesn’t signify a grand new development, but is simply a new name for an emerging concept,” says Ian Foster, professor, national lab scientist and author of The Grid: Blueprint for a New Computing Infrastructure.

The term grid middleware has gained cachet lately because it has been trotted out to label highly autonomous grid systems – which most grid systems are. People seem to better grasp what grid software is when the term “middleware” is attached.

“Anyone that has software that vaguely relates to distributed computing is calling it ‘grid’ and now ‘grid middleware’ . . . what is new is the sense of recognition that one needs a distinct set of grid software. And the work that’s going on to develop that software and the standards on which grids will be built is new, too,” Foster says.

While grid software is still an emerging technology and worthy of buzz, the truly exciting stuff is going on in the world of standards for interoperable grid middleware, Foster says. Some folks in the grid community are “taking concepts and standards from Web services and adding to them to support the issues that arise when performing resource sharing, federated sharing,” he says.

One such standards initiative, the Open Grid Services Architecture (OGSA), in which Foster is involved, is working on exactly that task. OGSA is writing a group of specifications that converge grid and Web services. The Web Services Resource Framework is one result, generated from an effort between the Globus Alliance, IBM, HP and others, with a first version released in April. The framework, which consists of a family of specifications, deals in large part with modifying accepted Web services protocols so that they maintain state. Grids require maintenance of state.
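The statefulness requirement above is the crux: a plain Web-service call is stateless, but a grid job's progress must persist between calls. The toy sketch below illustrates that idea only – it is not WSRF itself, and every name in it is invented for illustration.

```python
# Toy sketch of the "stateful resource" idea motivating WSRF, not the
# spec itself: each otherwise-independent service call recovers per-job
# state through an identifier the client holds onto between calls.
import uuid

class JobResourceHome:
    """Holds per-job state between otherwise stateless service calls."""
    def __init__(self):
        self._jobs = {}

    def create_job(self, task):
        job_id = str(uuid.uuid4())
        self._jobs[job_id] = {"task": task, "status": "PENDING"}
        return job_id              # client keeps this reference

    def set_status(self, job_id, status):
        self._jobs[job_id]["status"] = status

    def get_status(self, job_id):
        # Each call is independent; the identifier recovers the state.
        return self._jobs[job_id]["status"]

home = JobResourceHome()
jid = home.create_job("render frame 42")
home.set_status(jid, "RUNNING")
print(home.get_status(jid))        # → RUNNING
```

WSRF standardizes this pattern for real Web services – resource properties, endpoint references, lifetimes – rather than leaving each grid vendor to invent its own.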

While such standards efforts are young, some vendors are announcing support. For instance, IBM’s Grid Toolbox V3 for Linux on xSeries supports OGSA standards, IBM says.

Wireless security

802.11i is finally here, and that’s a relief to many a network executive following the years-long buzz over Wi-Fi security issues. 802.11i incorporates the powerful Advanced Encryption Standard into 802.11 devices. Products based on the standard are expected to hit the market before year-end.

Corporations can expect vendors to go through the usual interoperability testing gyrations, common whenever many vendors flock to support a new standard. Earlier this month, the Wi-Fi Alliance began testing and certifying products for 802.11i compliance. The advanced encryption of 802.11i is expected to help enterprises deal with a variety of security threats, including “stumbler” programs that help Wi-Fi users detect nearby signals. While the standard won’t stop people from detecting the Wi-Fi network with stumblers, “if 802.11i proves to be secure, it won’t be of much use” as unauthorized users will not be able to gain access, says Allen Nogee, wireless technology analyst at In-Stat/MDR.

“A bigger dilemma is that, with so many different security standards out there, your network is only as secure as the least-secure card,” he says. “I don’t see a lot of enterprises jumping on 802.11i for existing equipment, although they will want their new products to be 11i.” Because Wi-Fi gear has become so affordable, many companies will be able to refresh their gear by buying anew, rather than upgrading firmware to the new standard.

Buzz on the wireless security scene also is centering around smarter gear. As wireless LANs (WLAN) gain in popularity, they have become a weak link for security concerns such as virus propagation. Vendors are addressing that by introducing access points that perform anti-virus scanning, Nogee says. Such products include Fortinet’s FortiWiFi 60, Symantec’s Gateway Security 300 Series (an all-in-one security device with an optional Wi-Fi access point module), and SonicWall’s SOHO TZW with optional anti-virus software.

Controlling viruses on the WLAN “is not a bad strategy. If you can stop a virus at the access point, it doesn’t matter if the virus is coming in or going out, it’s still blocked. The problem is that otherwise, Wi-Fi allows users to connect to inside the network and never be bothered with the virus checking of the firewall used in the corporate perimeter,” Nogee says.

He further suggests that deploying Windows XP Service Pack 2, which includes updated security protections, can reduce the danger of wireless-spread viruses. SP2, released in August, is Microsoft’s endeavor to increase native security on XP clients. SP2 includes easier-to-configure wireless network connections, Microsoft says.

The next security issue that Nogee says he would like to see the Wi-Fi industry address is better authentication and identity management. Watch for more buzz on this front in 2005 as vendors mature their WLAN security tools.