
Upstarts speed past BI vendors in data loading

By Eric Lai, Computerworld
March 19, 2009 10:00 AM ET

Computerworld - One of the more prosaic parts of data warehousing is, well, getting the data into the warehouse.

This has long been handled by vendors that are expert in the field of extract, transform and load (ETL). Even there, innovation focused more on the problem of transforming the data. Loading the data seemed a piece of cake by comparison.

That is, until business intelligence (BI) and analytics started becoming a round-the-clock affair. Also, today's biggest BI users -- banks, telecommunications providers, Web advertisers -- operate data warehouses larger than a petabyte and import huge swaths of data -- 50TB per day, in the case of one Teradata Inc. customer.

BI and ETL vendors are responding. The past several months have seen a number of start-ups and lesser-known firms touting screaming-fast data-loading speeds, both in the lab and in the field.

Database start-up Greenplum Inc. said it has a customer routinely loading 2TB of data in half an hour, for an effective throughput of 4TB per hour.

Rival database start-up Aster Data Systems Inc. claimed that its nCluster technology can enable customers to reach almost 4TB (specifically, 3.6TB) per hour.

Data-integration vendor Syncsort Inc. said third-party-validated lab tests show its software can load 5.4TB of data into a Vertica Systems Inc. columnar data warehouse in under an hour.

Not to be outdone, semantic data integration start-up Expressor Software Corp. claimed that in-house tests show its data-processing engine can scale to nearly 11TB per hour.

"If they are really performing at this rate, it's quite significant and really impressive," said Jim Kobielus, an analyst at Forrester Research Inc., since "anything above a terabyte per hour is good."

Blazing past the incumbent BI and ETL vendors

What about the established firms? Two years ago, SAS Institute Inc. and Sun Microsystems Inc. demonstrated a SAS data warehouse running on Sun hardware with StorageTek arrays that pushed through 1.7TB in 17 minutes, the equivalent of 6TB per hour.

But apart from SAS, other big-name vendors have posted data-integration performance benchmarks that fall well short of these upstarts.

Three years ago, Informatica claimed its PowerCenter 8 software loaded data at a rate of 1.33TB per hour. The company, which declined to comment, hasn't posted any updated performance benchmarks.

Oracle and Hewlett-Packard last fall released the BI-oriented HP Oracle Database Machine, which they said loads data at up to 1TB per hour.

Microsoft claimed at the launch of SQL Server 2008 a year ago that its SQL Server Integration Services 2008 had loaded the equivalent of 2.36TB in an hour.

So how do they do it?

Most of these faster data integrators rely on the same basic secret sauce: software that shreds the data to be loaded before delivering it over fast networks to dozens or more of data warehousing servers running in a massively parallel grid.

That's how Greenplum's "scatter-gather streaming" technology works. The company's customer, Fox Interactive Media Inc., operator of MySpace.com, can load 2TB of Web usage data in half an hour into its 200TB Greenplum data warehouse, according to Ben Werther, director of product management at Greenplum.
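In outline, that shred-and-scatter pattern can be sketched in a few lines of Python. This is a hypothetical illustration, not Greenplum's actual code: the segment server names, chunk size and transfer function are all invented for the example. The source file is cut into chunks, and a small pool of worker threads streams each chunk to a different segment server, so transfers overlap instead of queuing through a single pipe.

    from concurrent.futures import ThreadPoolExecutor

    SEGMENT_SERVERS = ["seg-01", "seg-02", "seg-03", "seg-04"]  # invented hosts
    CHUNK_SIZE = 64 * 1024 * 1024  # 64MB chunks; size chosen only for illustration

    def read_chunks(path, chunk_size=CHUNK_SIZE):
        """Shred the source file into fixed-size chunks."""
        with open(path, "rb") as f:
            while True:
                chunk = f.read(chunk_size)
                if not chunk:
                    break
                yield chunk

    def send_to_segment(server, chunk):
        """Stand-in for the network transfer to one segment server."""
        # A real loader would open a socket or HTTP stream here; this
        # sketch just reports what would be sent.
        print(f"streaming {len(chunk)} bytes to {server}")

    def parallel_load(path):
        """Round-robin chunks across segment servers, one worker per server."""
        with ThreadPoolExecutor(max_workers=len(SEGMENT_SERVERS)) as pool:
            for i, chunk in enumerate(read_chunks(path)):
                server = SEGMENT_SERVERS[i % len(SEGMENT_SERVERS)]
                pool.submit(send_to_segment, server, chunk)

    if __name__ == "__main__":
        parallel_load("web_usage.log")  # hypothetical input file

The design point is simply that load throughput scales, in principle, with the number of servers receiving data at once rather than with the speed of any single machine.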
