What are the barriers to DLP? I’ve heard it can take a lot of time and the costs add up. Is there a way to get around this?

It’s always daunting to consider deploying a new security technology, but with the proper preparation Data Loss Prevention (DLP) is less painful to deploy than many of our other tools. The keys to a successful DLP deployment are setting the right expectations, proper planning during the selection process, and a controlled roll-out.

The most common failure in deploying DLP is failing to set appropriate expectations before and during the selection process. Some organizations jump into DLP before they know exactly how they would like to use the technology. They either make a snap purchase to close a single existing problem, or they focus on technology-driven expectations that do not reflect business needs. Before you start product evaluations, pull together a team of major stakeholders – including IT, IT security, legal, risk, compliance, HR, and major business units. Determine what data you want to protect, and the degree of protection you would like.

For example, one organization may focus on detecting credit card numbers on the network, in storage, and on endpoints, and on preventing any transfer outside the enterprise. Another organization might be more interested in protecting unstructured sensitive engineering plans, with an emphasis on detection rather than enforcement. Make a prioritized list of the content you want to protect, and map it to a list of desired protective actions. Follow this with a rough outline of workflow so you know which users, technical and non-technical, will need to use the system. This feeds directly into your product selection requirements. Armed with knowledge of what you would like to protect and how, you can then inventory your existing infrastructure and integration requirements.
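To make the credit card example concrete, here is a minimal sketch of the kind of content analysis a DLP tool performs. This is purely illustrative and not taken from any particular product: it pairs a candidate regular expression with a Luhn checksum to weed out random digit strings, whereas commercial tools layer on far more sophisticated techniques (proximity analysis, validated issuer ranges, database fingerprinting).

```python
import re

# Candidate pattern: 13-16 digits, optionally separated by spaces or dashes.
# (Hypothetical example rule, not a production-grade detector.)
CARD_RE = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def luhn_valid(digits: str) -> bool:
    """Return True if the digit string passes the Luhn checksum."""
    total = 0
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 1:      # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9      # equivalent to summing the two digits
        total += d
    return total % 10 == 0

def find_card_numbers(text: str) -> list[str]:
    """Scan text and return candidate card numbers that pass Luhn."""
    hits = []
    for match in CARD_RE.finditer(text):
        digits = re.sub(r"[ -]", "", match.group())
        if 13 <= len(digits) <= 16 and luhn_valid(digits):
            hits.append(digits)
    return hits
```

Even this toy version shows why policy tuning matters: a bare regex would flag invoice numbers and timestamps, and the checksum alone cannot tell a real customer card from a published test number, so real deployments combine detection rules with context before they ever move to blocking.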
You do not need a detailed assessment of every device on the network, just an overview of major gateways, data repositories, and endpoint management infrastructure. The combination of your data protection, workflow, and infrastructure integration requirements will form the heart of your product requirements. You will know what kind of content analysis techniques to focus on, whether you need a point solution (e.g., an endpoint-only solution) or a complete DLP suite, and what your key workflow requirements are.

After selecting a DLP solution, you can stage its roll-out to minimize costly errors. Most DLP customers start with passive network monitoring and a basic rule set. They tune the policies until they are happy with the results, then move into active blocking and data-at-rest scanning. The last frontier is typically endpoints, which are deployed on a workgroup basis (where organizations typically use passive alerts, not active blocking). This again lets you tune rules, understand potential business impact, and properly plan for capacity.

Your process, from selection to deployment, has the greatest impact on reducing costs. My final recommendation is to give preference to full-suite vendors or to partners whose products are fully integrated. Point solutions force you to create different policies in different systems and then manage incidents on different management consoles that do not necessarily integrate. This situation quickly increases the cost and complexity of a DLP solution.

Rich Mogull is a security consultant; read more about him at securosis.com. Have an insider threat question? Drop us a line.