Google plans to add its next-generation Tensor Processing Units to its Google Compute Engine as a way of increasing the power of its machine learning services.

Scott Ferguson, Managing Editor, Light Reading

May 18, 2017

Google's TPU Chips Beef Up Machine Learning

Google is offering more powerful hardware for its machine learning cloud customers as it launches its next generation of Tensor Processing Unit (TPU) chips through its Google Compute Engine.

The company announced these second-generation TPUs at Google I/O this week, and Jeff Dean, a Google Senior Fellow, and Urs Hölzle, senior vice president for technical infrastructure, wrote a detailed blog post on May 17 about these chips.

Officially called Cloud TPUs, these chips can work alongside Intel's Skylake processors, as well as Nvidia's GPUs. The new generation of TPUs offers 180 teraflops of floating-point performance per device, and Google has designed these chips so that they can be stacked in what the company calls a TPU pod, which houses 64 TPUs and can provide up to 11.5 petaflops of compute power.
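A quick back-of-the-envelope check shows how Google's pod figure follows from the per-device number; this sketch just multiplies the two figures quoted in the article (the variable names are illustrative, not Google terminology):

```python
# Figures quoted in the article for second-generation Cloud TPUs.
tflops_per_tpu = 180    # peak floating-point performance per TPU, in teraflops
tpus_per_pod = 64       # number of TPUs housed in one TPU pod

# Aggregate pod performance: 180 TF x 64 devices = 11,520 TF = 11.52 PF,
# which Google rounds to "up to 11.5 petaflops."
pod_pflops = tflops_per_tpu * tpus_per_pod / 1000

print(f"{pod_pflops:.1f} petaflops per pod")  # prints "11.5 petaflops per pod"
```

The quoted "up to 11.5 petaflops" is thus simply the theoretical peak of 64 devices running flat out, not a sustained benchmark.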

Taken together, these TPUs are designed to accelerate machine learning while giving customers access to the power of Google's cloud, which can lower the barrier to entry for many businesses trying to build machine learning and artificial intelligence (AI) applications.

Figure 1: A TPU pod (Source: Google)

As Dean and Hölzle wrote in their blog:

"Our goal is to help you build the best possible machine learning systems from top to bottom. While Cloud TPUs will benefit many ML applications, we remain committed to offering a wide range of hardware on Google Cloud so you can choose the accelerators that best fit your particular use case at any given time."

Google has made machine learning and AI essential building blocks of new products, using the technology to improve search results as well as in the development of DeepMind's AlphaGo program.


In addition, Google is making 1,000 Cloud TPUs available to researchers for free through its TensorFlow Research Cloud. The one catch is that anyone conducting machine learning or AI research on the platform must share their findings, either in a scientific journal or by releasing their work as open source.

Google has been busy with cloud announcements this week. It pushed its Cloud Spanner database into general availability and released its Cloud IoT Core, a new service designed for businesses to help them manage all the data collected through Internet of Things devices. (See Google Cloud Spanner Hits General Availability.)


— Scott Ferguson, Editor, Enterprise Cloud News. Follow him on Twitter @sferguson_LR.

About the Author(s)

Scott Ferguson

Managing Editor, Light Reading

Prior to joining Enterprise Cloud News, he was director of audience development for InformationWeek, where he oversaw the publication's newsletters, editorial content, email and content marketing initiatives. Before that, he served as editor-in-chief of eWEEK, overseeing both the website and the print edition of the magazine. For more than a decade, Scott has covered the enterprise IT industry with a focus on cloud computing, datacenter technologies, virtualization, IoT and microprocessors, as well as PCs and mobile. Before covering tech, he was a staff writer at the Asbury Park Press and the Herald News, both located in New Jersey. Scott has degrees in journalism and history from William Paterson University, and is based in Greater New York.
