
During the I/O 2018 keynote, Google announced its new TPU 3.0 chip.

Following a string of machine learning demonstrations, including Smart Compose for Gmail and intelligent editing suggestions in Google Photos, Sundar Pichai took a moment to put the spotlight on the new chip that brings these features to life. Tensor Processing Units, or TPUs, are the custom hardware that powers most of Google's AI and machine learning workloads, including AlphaGo.

These TPUs can be leased by developers through Google Cloud. A pod of third-generation TPUs delivers up to 100 petaflops, roughly eight times the performance of the previous generation. The new chips run hot enough that Google had to add liquid cooling to its data centers to cope with the additional heat.
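For developers, the usual way to tap that leased hardware is through TensorFlow's TPU support. The snippet below is a minimal sketch, not an official example: it assumes a Cloud TPU has already been provisioned in your project, and the TPU name "my-tpu" and the tiny Keras model are purely illustrative placeholders.

```python
import tensorflow as tf

# Hypothetical TPU name; replace with the Cloud TPU you provisioned.
resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="my-tpu")
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)

# Distribute work across the TPU's cores.
strategy = tf.distribute.TPUStrategy(resolver)

with strategy.scope():
    # Placeholder model just to show where your own model would go.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
        tf.keras.layers.Dense(10),
    ])
    model.compile(
        optimizer="adam",
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    )
```

From there, training calls like `model.fit()` run on the TPU rather than your local machine, which is how Google exposes this class of hardware without anyone having to install liquid cooling at home.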
