
Google Unveils Specialized TPUs for AI Training and Inference
Google is launching a major update to its Tensor Processing Unit (TPU) line, splitting the eighth-generation chip's functionality into two specialized processors: one for training AI models and another for inference. The split is aimed at improving efficiency for the fast-growing field of AI agents. The new chips post significant performance gains, with the training unit delivering 2.8 times the performance of the previous generation at the same cost. Industry adoption is accelerating: Citadel Securities and all 17 U.S. Department of Energy national laboratories are already using the technology. The launch solidifies Google's position as a key provider of custom AI silicon alternatives.
