Our methodology for testing the various desktop management applications.

We installed each product's desktop management console and server software on a Compaq DL380 server (dual 1.33-GHz processors, 512M bytes of RAM and a 36G-byte disk). In every case we used Microsoft Windows 2000 Server with Service Pack 3 installed. For products requiring a database back end, we used Microsoft SQL Server 2000.

The network consists of 10 Compaq workstations of varying processor speeds, running Windows 2000, XP and 98. We also used a Compaq Evo N1000v laptop with built-in 802.11b wireless networking; a high-end PCLaptops E-Pro Max 585 laptop with a 2.8-GHz processor, 1G byte of memory and 64M bytes of video memory; and a Compaq iPAQ 3865 Pocket PC connected to the network remotely to test how the products deal with mobile devices. The network also contains two 10/100 16-port switches, and a DSL connection to the Internet with a four-port switch and an 802.11b wireless access point.

To test the software inventory portion, we loaded various versions of Internet Explorer and Microsoft Office to see whether the software could detect the differences. The Internet Explorer versions were 5.0, 5.5 and 6.0, with different service packs and encryption levels installed. For Microsoft Office we used Office 2000 Standard Edition, Office 2000 Small Business Edition and Office XP Standard Edition. For operating systems we used Windows 98 Second Edition, Windows 2000 Professional with Service Pack 2 and with Service Pack 3, and Windows XP with and without Service Pack 1.

For the software distribution task, we looked at how difficult it was to create a distribution package for a simple program, as well as what capabilities each product had for pushing software updates such as service packs and hot fixes.
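The inventory tests above hinge on version-level detection: the same product name can hide meaningfully different installs (Internet Explorer 5.0 vs. 5.5 vs. 6.0, or different service packs). A minimal sketch of that comparison, with entirely hypothetical record formats (no vendor's actual API), might look like this:

```python
# Hypothetical sketch of the version-level comparison an inventory tool must
# perform. Each inventory maps a product name to a (version, service pack)
# tuple; the record format is an assumption for illustration only.

def inventory_diff(baseline, scanned):
    """Return products in `scanned` whose details differ from `baseline`.

    Result maps product name -> (baseline detail, scanned detail).
    """
    diffs = {}
    for product, detail in scanned.items():
        if baseline.get(product) != detail:
            diffs[product] = (baseline.get(product), detail)
    return diffs

reference = {"Internet Explorer": ("6.0", "SP1"),
             "Microsoft Office": ("2000", "Standard")}
machine = {"Internet Explorer": ("5.5", "SP2"),
           "Microsoft Office": ("2000", "Standard")}

# Only Internet Explorer differs at the version level.
print(inventory_diff(reference, machine))
```

A tool that reports only "Internet Explorer installed" would pass a naive check but fail the test we ran, which is why we varied service packs and encryption levels as well as major versions.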
We paid particular attention to each product's ability to send those updates based on need: if a workstation already had an update, the software shouldn't send it again.

In the mobile management arena, we looked at the features offered and how well each product worked with different devices. For remote control, we judged responsiveness: jerky screen redraws, slow cursor movement and the overall feel of controlling a remote computer.

We also examined each vendor's unique features, which included asset management and software metering. Documentation provided by each vendor was also examined and scored.
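The need-based distribution check we looked for can be sketched in a few lines: before pushing a service pack or hot fix, consult each target's recorded inventory and skip machines that already have it. The hostnames and update IDs below are hypothetical, and real products would query a live inventory database rather than an in-memory dict:

```python
# Minimal sketch of need-based update distribution, assuming a hypothetical
# inventory: each hostname maps to the set of update IDs already installed.

def targets_needing_update(update_id, machines):
    """Return hostnames whose inventory lacks `update_id`, sorted for output."""
    return sorted(host for host, installed in machines.items()
                  if update_id not in installed)

fleet = {"ws01": {"Q816093", "Q323172"},
         "ws02": {"Q323172"},
         "ws03": set()}

print(targets_needing_update("Q816093", fleet))  # ws01 already has it
```

The point of the test was exactly this filter: a product that re-sends a multi-megabyte service pack to machines that already have it wastes bandwidth on every distribution run.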