Lay a foundation to maximize the value from business-process outsourcing. A vice president of product strategy offers a guide to preparing a network for business-process outsourcing.

Wouldn't it be nice if you could toss every scrap of paper related to your IRS tax return onto your accountant's desk and let him figure out the mess? Alas, it doesn't work that way. It's incumbent on taxpayers to sort and categorize W-2s, 1099s and expense receipts, and to assemble the information about capital gains, charitable contributions, IRA deductions and so on. If taxpayers don't do this preparation before the accountant gets to work, they risk confusion, inaccuracies, missed deadlines and IRS penalties, all for what boils down to bad data.

It's much the same in business-process outsourcing, when an organization turns over responsibility for a functional area such as payroll to a BPO provider. Companies are increasingly outsourcing complex processes that directly affect their bottom line, such as procurement, CRM, sales and marketing, and finance.

BPO basics

There's much upfront work to do before you can hand responsibility for a process over to an outsourcer:

1. Data access and integration: Establish connections to in-house sources.
2. Data quality: Profile the content and structure of data and correct errors.
3. Data delivery: Determine the optimal formula for moving data across the network and firewall.
4. Security and administration: Ensure security mechanisms and failover capabilities throughout the process.
5. Management and control: Track data lineage from start to finish to ensure 360-degree visibility.

Because the data for these processes is scattered among a variety of systems (for example, SAP, Oracle databases and mainframes), it has different formats and semantics. The stakes become higher when companies send this mission-critical data across a firewall, particularly with the Sarbanes-Oxley Act and other regulations that demand auditing. To realize the greatest long-term value from BPO, organizations need to focus first on the foundation of BPO: mission-critical data.

Data access and integration

The first step is to establish non-invasive access to applications to enable data exchange with an outsourcer. Companies used to rely on custom COBOL and PL/SQL code for data access. A key reason many have deployed data integration software is that it reduces the time and cost of such hand-coding, often by 30%. Data integration tools provide prebuilt connectors to dozens of data sources and generate mappings that can be reused across multiple projects.

It is also important for companies to evaluate the capacity of their dedicated Internet connection, given the amount of data being moved and how quickly that data will be required. One option available from some outsourcers is multitransport networking services, which support a wide range of port speeds in addition to traditional private data-networking services such as Frame Relay and Ethernet over an MPLS-enabled IP backbone. Appropriate bandwidth and flexible network technologies help ensure that businesses remain agile and can scale with market demands as they grow and their data volumes increase.
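To make the idea of reusable mappings concrete, here is a minimal sketch in Python. The sources and field names are invented for illustration (an in-memory SQLite table and a small CSV extract stand in for an ERP table and a payroll export); commercial data integration tools supply prebuilt connectors and generate such mappings for you.

```python
# A minimal sketch of reusable source-to-canonical mappings, assuming
# hypothetical "employee" records pulled from two different systems.
import csv
import io
import sqlite3

# Reusable mappings: source field name -> canonical field name.
ERP_MAPPING = {"EMP_ID": "employee_id", "EMP_NM": "name", "SAL": "salary"}
PAYROLL_FILE_MAPPING = {"id": "employee_id", "full_name": "name", "base_pay": "salary"}

def apply_mapping(rows, mapping):
    """Translate source records into the canonical schema."""
    return [{mapping[k]: v for k, v in row.items() if k in mapping} for row in rows]

# Source 1: a relational table (stand-in for an ERP system).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE emp (EMP_ID TEXT, EMP_NM TEXT, SAL REAL)")
conn.execute("INSERT INTO emp VALUES ('101', 'Ada Lovelace', 98000)")
cols = ["EMP_ID", "EMP_NM", "SAL"]
erp_rows = [dict(zip(cols, r)) for r in conn.execute("SELECT EMP_ID, EMP_NM, SAL FROM emp")]

# Source 2: a flat-file export (stand-in for a legacy payroll extract).
payroll_csv = "id,full_name,base_pay\n102,Grace Hopper,105000\n"
file_rows = list(csv.DictReader(io.StringIO(payroll_csv)))

# Both sources land in one canonical format, ready to hand to the outsourcer.
canonical = apply_mapping(erp_rows, ERP_MAPPING) + apply_mapping(file_rows, PAYROLL_FILE_MAPPING)
print(canonical)
```

Because the mapping is just data, the same definition can be reused by every project that draws on the same source, which is where the hand-coding savings come from.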
Data quality

Once a company has established its data-sourcing framework, it needs to make its network data consistent and accurate. Typically, data is "dirty," with missing elements and small inconsistencies that can magnify into big problems, particularly in BPO arrangements. First, companies should tackle their nitty-gritty data-quality issues with data profiling, which analyzes the content, structure and quality of the data. Next, data cleansing standardizes and validates data, removes duplicates, and corrects problems such as conflicting definitions of "product" or "customer" (the first sketch at the end of this article illustrates these steps). Without this attention to data quality, BPO customers run the risk of the old computer axiom coming true: garbage in, garbage out.

Data delivery

The next step is data delivery: establishing the optimal formula for the speed, frequency and volume of the bidirectional data movement between a BPO customer and provider. Customers have many delivery options, from traditional batch delivery to real-time feeds, and they need to determine which is best for them. One option is changed-data-capture technology, which can substantially reduce data volumes by moving only updated data (see the second sketch at the end of this article). Companies should also consider data compression, parallel processing and partitioning, which can be tuned to accelerate throughput.

Data security and availability

Security is a hot-button issue for BPO customers. They will want to fortify existing security mechanisms on both sides of their firewalls with built-in encryption, digital certificates and support for Lightweight Directory Access Protocol for authentication and authorization. Another customer concern is data availability; an interrupted data exchange between customer and BPO provider can result in costly downtime. Companies should consider a private data network between themselves and their outsourcers. Without a private network, companies risk compromising data transfer rates as well as data security. Many available private-line services not only guarantee certain levels of network availability and speed but also make networks self-healing.

Data management and control

Last, but definitely not least, BPO customers should rigorously establish technologies and methodologies that give them 360-degree visibility into, and control of, their outsourced data. They should structure their outsourcing relationship to ensure they are provided with self-generating metadata that tracks data lineage (who changed what and when) and equipped with robust auditing capabilities that satisfy regulatory requirements.

There's no doubt that BPO is big and getting bigger. Market researcher IDC predicts that worldwide BPO spending will grow at a compound annual rate of 10.9% between 2004 and 2009, from $382.5 billion to $641.2 billion per year over those five years.

Amid the buzz over BPO's benefits (huge cost savings, improved agility and optimized performance), it's easy to overlook the nuts-and-bolts data issues that have to be addressed upfront. Companies that take pains to prepare their network data for BPO, however, stand to reap rewards for years to come. It's sort of like getting a windfall tax refund from the IRS because you had all your paperwork in order.

Lyle is the vice president of product strategy at data integration software vendor Informatica. He can be reached at dlyle@informatica.com.
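The first sketch below is a minimal, hypothetical illustration of the profiling and cleansing steps described in the data-quality section. The customer records and standardization rules are invented; real data-quality tools apply far richer reference data and matching logic.

```python
# A minimal sketch of data profiling and cleansing on invented customer records.
from collections import Counter

customers = [
    {"customer_id": "C1", "name": "ACME Corp ", "country": "US"},
    {"customer_id": "C2", "name": "acme corp", "country": "us"},    # duplicate in disguise
    {"customer_id": "C3", "name": "Globex", "country": None},       # missing element
]

# Profiling: measure completeness and value distributions per field.
for field in ("name", "country"):
    filled = sum(1 for c in customers if c[field])
    print(f"{field}: {filled}/{len(customers)} populated,",
          Counter(str(c[field]).strip().lower() for c in customers))

# Cleansing: standardize formats, then remove duplicates on the standardized key.
seen, cleansed = set(), []
for c in customers:
    name = c["name"].strip().title() if c["name"] else ""
    country = c["country"].strip().upper() if c["country"] else "UNKNOWN"
    key = (name.lower(), country)
    if key not in seen:
        seen.add(key)
        cleansed.append({"customer_id": c["customer_id"], "name": name, "country": country})

print(cleansed)   # one "Acme Corp" record instead of two; missing country flagged
```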
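The second sketch shows, again with invented record layouts and names, one simple way to combine the changed-data-capture and lineage ideas from the data-delivery and data-management sections: fingerprint each record, send only records whose fingerprint has changed, and write an audit entry for everything that crosses the firewall. Production CDC usually reads database transaction logs or relies on vendor tooling rather than hashing full rows.

```python
# A minimal sketch of changed-data capture plus a simple lineage/audit entry.
import hashlib
import json
from datetime import datetime, timezone

def row_hash(row):
    """Stable fingerprint of a record, used to detect changes between runs."""
    return hashlib.sha256(json.dumps(row, sort_keys=True).encode()).hexdigest()

def capture_changes(previous_hashes, current_rows, user="payroll-sync"):
    """Return only new or changed rows, plus audit entries for the data sent."""
    changed, audit_log = [], []
    for row in current_rows:
        key = row["employee_id"]
        digest = row_hash(row)
        if previous_hashes.get(key) != digest:
            changed.append(row)
            audit_log.append({
                "record": key,
                "changed_by": user,
                "sent_at": datetime.now(timezone.utc).isoformat(),
                "fingerprint": digest,
            })
            previous_hashes[key] = digest
    return changed, audit_log

# First run sends everything; the second run sends only the updated salary.
state = {}
rows = [{"employee_id": "101", "salary": 98000}, {"employee_id": "102", "salary": 105000}]
print(capture_changes(state, rows)[0])    # both rows
rows[0]["salary"] = 99500
print(capture_changes(state, rows)[0])    # only employee 101
```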