Dissatisfied with chips on the market, Google makes a video-transcoding chip to deliver better-quality YouTube videos.

You know Google has more money than it could ever spend when it invests in a custom chip to do one task. And now it has done so for the third time. The search giant has developed a new chip and deployed it in its data centers to compress video content. The chips, called Video (Trans)Coding Units, or VCUs, do that work faster and more efficiently than traditional CPUs.

In a blog post discussing the project, Jeff Calow, a lead software engineer at Google, said the VCU delivers the highest YouTube video quality possible on your device while consuming less bandwidth than before.

"An important thing to understand is that video is created and uploaded in a single format, but will ultimately be consumed on different devices, from your phone to your TV, at different resolutions," he wrote.

Some viewers will be streaming to a 4K TV at home while others watch on their phones. The infrastructure team's job is to get those videos ready to watch by sending the smallest amount of data to the chosen device at the highest possible quality.

"But it's costly and slow, and doing that processing using regular computer 'brains' (called CPUs) is pretty inefficient, especially as you add more and more videos," he wrote.

Google claims the VCU is 20 to 33 times more compute-efficient than its previous optimized system, which ran on traditional x86 servers.

The project has been in the works since 2015. YouTube saw that consumers wanted higher-quality video but needed to shift to more efficient video codecs to deliver it. The VP9 codec fits the bill, but it requires five times more compute resources than the older H.264 codec, Calow said. The VCU supports both of them, and the next-generation VCU will support AV1, an even more efficient codec than VP9.

"A dedicated, hard-wired processor is always going to be the fastest. Transcoding is one of those operations that doesn't change much, so a programmable device isn't needed," said Jon Peddie, president of Jon Peddie Research, who follows the graphics market.

But he doubts developers will be able to use the VCU the way they use Google's Tensor Processing Unit (TPU) for AI. "I don't know for sure, but I'd say it's made by YouTube for YouTube and only YouTube. The support and documentation issues would not be worth it to them, and it would only arm a competitor," he said.

So for now, enjoy those pretty 4K YouTube videos. And for those keeping track, the VCU is the third custom data center chip Google has designed. Before it came the Tensor Processing Unit (TPU) for AI workloads and the Titan chip for security.
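For readers who want a concrete picture of the "one upload, many renditions" work Calow describes, the sketch below shells out to ffmpeg to turn a single master file into several VP9 streams at different resolutions. It is a minimal illustration, not YouTube's actual pipeline: it assumes ffmpeg built with libvpx-vp9 is installed, and the file names and bitrate ladder are made up for the example.

```python
# Toy version of the "one upload, many renditions" pipeline: take one
# uploaded master file and transcode it into several VP9 renditions at
# different resolutions. This is the kind of repetitive encode work the
# VCU offloads from general-purpose CPUs.
# Assumes ffmpeg with libvpx-vp9 is available on the PATH.
import subprocess
from pathlib import Path

# Illustrative bitrate ladder; real streaming ladders are tuned per title.
RENDITIONS = [
    (2160, "12000k"),  # 4K
    (1080, "2700k"),
    (720, "1800k"),
    (360, "500k"),
]

def transcode_to_vp9(master: Path, out_dir: Path) -> None:
    """Encode the master file into one VP9/.webm file per rendition."""
    out_dir.mkdir(parents=True, exist_ok=True)
    for height, bitrate in RENDITIONS:
        out_file = out_dir / f"{master.stem}_{height}p.webm"
        cmd = [
            "ffmpeg", "-y", "-i", str(master),
            "-vf", f"scale=-2:{height}",       # keep aspect ratio, even width
            "-c:v", "libvpx-vp9", "-b:v", bitrate,
            "-c:a", "libopus", "-b:a", "128k",
            str(out_file),
        ]
        subprocess.run(cmd, check=True)

if __name__ == "__main__":
    # Hypothetical input and output locations for the example.
    transcode_to_vp9(Path("upload_master.mp4"), Path("renditions"))
```

Every rendition here is an independent, CPU-heavy encode, which is why Peddie's point about fixed-function hardware applies: the work is repetitive and well-defined, so a dedicated chip like the VCU can do it far more efficiently than software on x86 servers.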