The IT team at Weather Underground, a Web-based weather service, deals with internal traffic storms whenever severe weather hits, as it did during Hurricane Katrina.
Besting the storm
Weather Underground uses Linux clusters and low-cost bandwidth to keep from being pelted by traffic storms.
For storm chasers, the excitement is in facing Mother Nature's wrath and coming out unscathed. The dynamic is similar for the IT team at Weather Underground, a Web-based weather service that experiences internal storms of its own whenever bad conditions hit. For example, page views on Weather Underground topped 14 million - triple the average - on Aug. 29, the day Katrina hit New Orleans. With users requesting animated radar maps that can be as large as 800K bytes, bandwidth requirements surged to more than five times the average, says Chris Schwerzler, IT operations manager at the San Francisco company.
To survive such traffic storms, the IT team devised a flexible infrastructure that scales easily and delivers bandwidth-intensive data cost-effectively. Computing clusters built from low-end, Intel-based servers running Linux give Weather Underground its processing oomph, while connectivity from Cogent provides big bandwidth at a reasonable cost. Without this infrastructure, Weather Underground could not deliver high-quality weather content and still be profitable, Schwerzler says.
"It would be hard to advertise and break even for the cost of the bandwidth," he says.
Weather Underground's efforts to create an inexpensive, scalable infrastructure to deliver weather data via the Web began in early 2000, coincident with a National Weather Service (NWS) policy change. Rather than restrict availability of that information to a few companies through private contracts, the NWS made the data widely accessible. "There was suddenly the availability of a lot of interesting radar data, and it was our project to develop an infrastructure that could scale and bring that data to a large number of end users," Schwerzler says.
With this 2005 Enterprise All-Star Award, Weather Underground earns recognition for creatively using network technology to capitalize on a new business opportunity.
From the sky to the Web
To take advantage of the newly available weather data, Weather Underground installed a large satellite dish at its headquarters in San Francisco. The dish receives weather data retransmitted by the National Oceanic and Atmospheric Administration from 144 NWS field offices. A receiver uses the Digital Video Broadcasting-Satellite (DVB-S) standard to convert the satellite signal into packets, which travel via fiber to the Linux clusters running in the company's data center, also in San Francisco. Weather Underground uses the User Datagram Protocol (UDP) to broadcast the data to what Schwerzler calls radar servers.
"There you have a very raw product sitting on the machines," he says.
When a user requests a weather map, for example, a custom application running on an Apache Web server packages the raw data and sends it out via the Cogent network in a customized form the user can understand.
Because it sends the raw data to the end nodes in the cluster for packaging, rather than packaging the data before it receives a request, Weather Underground can deliver weather information faster than the NWS, Schwerzler says.
[Photo: Weather Underground's All-Star project leader, Chris Schwerzler]
"If you sit side-by-side with its radar and our radar and hit reload on both, you'll see ours will update sometimes 30 seconds before theirs," he says.
Schwerzler wouldn't specify the number of clusters at Weather Underground but says each ranges in size from eight to 40 servers, with hundreds of servers in operation overall. Gigabit Ethernet over copper provides connectivity for the clustered servers. The majority sit in a Weather Underground data center, although the company recently began colocating some servers and connecting them to its data center via fiber.
No slacking off
Standardizing on Linux wasn't a big leap, as Schwerzler and his team had worked with Unix at the University of Michigan (U-M). Plus, a free operating system made sense for a self-funded private company running on a tight budget.
"Linux has appeal since you can replicate it 100, 200 times and not have to pay additional site licenses," Schwerzler says. "Plus, you have the ability to tweak and tune at the finest levels that you can't really do with some of the other flavors of Unix."
Weather Underground uses a trimmed-down version of Slackware, one of the few Linux distributions available in the mid-1990s when Schwerzler was at U-M. In addition, the servers have no hard drives and boot via the Preboot Execution Environment (PXE), which means they can be configured and deployed over the network.
With that setup, Weather Underground can easily add and retask servers as needed, Schwerzler says.
"We can say, 'OK, you're no longer a radar server; you're now a frontline Web server, or you're now serving our icons,' and reboot the machine, and 30 seconds later you have a new setup," he says.
When Katrina hit, for example, the IT team quickly brought online 48 RackLogic servers it had on hand.
"We couldn't possibly have gotten the servers up that we needed to handle the [Katrina] demand if we hadn't devised this system of being able to quickly replicate our servers," Schwerzler says. "We basically take the machines as they are delivered to us, screw them into the rack, cable them and turn them on with a little bit of configuration. They're up in 30 seconds."
Thanks to the additional nodes, Weather Underground handled the load without a hitch.