Google Cloud Next 2024 has begun, and the company made some big announcements at the event, including its new Axion processor. This is Google's first Arm-based CPU designed specifically for data centers, built on Arm's Neoverse V2 core design.
According to Google, Axion performs 30% better than the fastest Arm-based general-purpose instances available in the cloud today and 50% better than comparable current-generation x86-based instances. The company also claims it is 60% more energy efficient than those same x86-based instances. Google is already using Axion in services such as Bigtable and Google Earth Engine, and plans to expand it to more services in the future.
The launch of Axion pits Google against Amazon, which leads the field of Arm-based CPUs for data centers. Amazon's cloud business, Amazon Web Services (AWS), released its first Graviton processor back in 2018 and has shipped several subsequent generations since. Chip developer NVIDIA announced its first Arm-based data center CPU, called Grace, in 2021, and companies such as Ampere are also making progress in this field.
Google has been developing its own processors for years, though they have primarily targeted consumer products. The original Arm-based Tensor chip first shipped in the Pixel 6 and 6 Pro smartphones released in late 2021, and subsequent Pixel phones have been powered by newer versions of Tensor. Before that, Google developed the Tensor Processing Unit (TPU) for its data centers: the company began using TPUs internally in 2015, announced them publicly in 2016, and made them available to third parties in 2018.
Arm-based processors are often a lower-cost, more energy-efficient option. Ahead of Google's announcement, Arm CEO Rene Haas warned about the energy use of artificial intelligence models in an interview with the Wall Street Journal. He called the power demands of models such as ChatGPT "insatiable." "The more information they gather, the smarter they are, but the more information they gather to get smarter, the more power it takes," Haas said. "By the end of the decade, AI data centers could consume as much as 20 to 25 percent of U.S. power requirements. Today that's probably four percent or less. That's hardly very sustainable." He stressed the need for greater efficiency to keep the pace of breakthroughs going.