Google is shedding some of the secrecy around its data center practices, with a new video that shows extensive security measures and the destruction of old hard drives to prevent leakage of customer data.
Google "rigorously tracks the location and status" of each hard drive, destroying failed hard drives with a multistep process before gathering the mangled bits in boxes to send off to recycling centers.
"One device that is used to destroy old hard drives is known as the crusher," the narrator of a Google video says. "A steel piston is pushed through the center of the drive and the platters are deformed, making them unreadable."
The video then shows a powerful shredder spewing out pieces of drives once used to hold customer data. "As you can see no one will be likely to get Google's customer data from these drives," the narrator states. Finally, we see a half-dozen or so boxes filled with the shredded remains of former hard drives, ready to be shipped off to recycling centers.
Google, of course, has been under fire for collecting and storing private data, including search records and location data from Android phones. The video makes clear that it's unlikely Google will ever lose data it intends to store.
Google was operating more than 30 data centers in the United States and overseas as of 2008, according to an article in Data Center Knowledge. Google's new video shows the operations of one, in Hamina, Finland, and describes practices used broadly across all of Google's data centers. However, Google said there are "additional safeguards that we do not disclose publicly."
Each data center has "thousands upon thousands of machines" serving up search results, e-commerce transactions, or services for Google Apps customers. Each server is custom-built by Google and runs a stripped-down version of Linux containing only the software and hardware needed for its specific job, reducing the attack surface for potential vulnerabilities.
Google said all customer data is "stored in multiple locations to help ensure reliability. ... The files that store data are given random file names and are not stored in clear text, so they're not humanly readable."
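The practice Google describes — random file names and contents that aren't humanly readable — can be illustrated with a minimal sketch. Everything here (the function name, the XOR obfuscation, the key handling) is an illustrative assumption, not Google's actual, undisclosed scheme; a production system would use an authenticated cipher rather than XOR:

```python
import os
import secrets

def store_chunk(data: bytes, key: bytes, directory: str) -> str:
    """Illustrative sketch only: write a data chunk under a random,
    meaningless file name, obfuscated so the file on disk is not
    human-readable plain text. Not Google's real (non-public) design."""
    name = secrets.token_hex(16)  # random file name reveals nothing about contents
    # Repeat the key to cover the data, then XOR byte-by-byte.
    keystream = (key * (len(data) // len(key) + 1))[:len(data)]
    obfuscated = bytes(a ^ b for a, b in zip(data, keystream))
    path = os.path.join(directory, name)
    with open(path, "wb") as f:
        f.write(obfuscated)
    return path
```

The point of the sketch is the property the video claims: even with physical access to a disk, an attacker sees randomly named files of scrambled bytes, not recognizable customer records.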
Even after old drives are destroyed, the data they held survives: Google says it keeps backups on tape, providing "a level of redundancy to help safeguard its customers' data."
The tape storage came in handy a couple of months ago, when Google used it to restore email after a Gmail outage deleted messages from thousands of accounts.
Google data centers connect to the Internet with multiple, redundant high-speed fiber optic cables to protect against failure, and have emergency backup generators in case of power outages. Customer data access automatically shifts from one data center to another in the event of fire.
Amazon's recent high-profile cloud outage was a different kind of failure, though. It stemmed from what Amazon called a "networking event" that "triggered a large amount of re-mirroring" of storage volumes, creating a capacity shortage and taking virtual machines offline.
The Google video places much of its emphasis on physical security measures. Access to data center locations is tightly controlled, with no public tours or site visits. Cars are verified upon entry at a checkpoint manned around the clock, while difficult-to-forge badges are used for access inside the buildings. Some data centers even use iris scans to verify employees' identities.
Automated video analytics detect anomalies and alert security staff, and some data centers use "sophisticated thermal imaging cameras" to identify potential intruders by their heat signatures. Google security staff use carts, jeeps and scooters to respond to problems and maintain relationships with local law enforcement in case police backup is needed.