Google Cloud Offering 'Preemptible' GPUs Plus Price Cut

Google Cloud now offers what it calls 'preemptible' GPUs to help accelerate batch computing and machine learning workloads. Google is also cutting prices.

Scott Ferguson, Managing Editor, Light Reading

January 5, 2018

Google Cloud is looking to accelerate batch computing, machine learning and other high-throughput workloads with what the company calls "preemptible" Nvidia GPUs, which cloud customers can use for up to 24 hours at a time.

In addition, Google (Nasdaq: GOOG) is cutting the prices of these GPU resources as part of the beta release, according to a January 4 blog post.

Google is offering these preemptible GPUs to its cloud customers for a maximum of 24 hours at a time, although it warns that Compute Engine can shut them down with only 30 seconds' notice.
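In practice, that 30-second window means a batch job needs to checkpoint its work before the instance disappears. As a rough sketch (not drawn from Google's post), a job running on a preemptible instance can poll the Compute Engine metadata server's "preempted" flag and save its state as soon as the flag flips; the checkpoint_and_exit and run_batch_step functions below are hypothetical stand-ins for whatever state-saving and per-step work a given workload actually does.

import time
import urllib.request

# Compute Engine metadata endpoint that reports whether this instance has
# been preempted (it returns "TRUE" once the shutdown notice begins).
PREEMPTED_URL = (
    "http://metadata.google.internal/computeMetadata/v1/instance/preempted"
)

def is_preempted() -> bool:
    req = urllib.request.Request(
        PREEMPTED_URL, headers={"Metadata-Flavor": "Google"}
    )
    try:
        with urllib.request.urlopen(req, timeout=2) as resp:
            return resp.read().decode().strip() == "TRUE"
    except OSError:
        # Metadata server unreachable (e.g. running outside Compute Engine);
        # treat as not preempted.
        return False

def checkpoint_and_exit():
    # Hypothetical stand-in: persist model weights or batch progress to
    # durable storage before the instance is shut down.
    print("Preemption notice received; saving checkpoint...")
    raise SystemExit(0)

def run_batch_step():
    # Hypothetical unit of work, e.g. one training step or one batch of records.
    time.sleep(1)

while True:
    if is_preempted():
        checkpoint_and_exit()
    run_batch_step()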

This allows Google to offer the maximum amount of computing power for these types of workloads while keeping the price down. To start, the company is offering Nvidia Corp. (Nasdaq: NVDA) K80 GPUs for $0.22 per hour and P100 GPUs for $0.73 per hour.

Figure 1: (Source: ECN)

"This is 50% cheaper than GPUs attached to on-demand instances, which we also recently lowered," Google Cloud product managers Chris Kleban and Michael Basilyan wrote in the blog post. "Preemptible GPUs will be a particularly good fit for large-scale machine learning and other computational batch workloads as customers can harness the power of GPUs to run distributed batch workloads at predictably affordable prices."

These preemptible GPUs follow Google's introduction of preemptible virtual machines (VMs), which offer powerful but short-lived compute instances to help with machine learning, batch computing, research and other intensive workloads.


Last year, Google added local solid state drives (SSDs) to these preemptible VMs to allow high-performance storage for these workloads.

Over the past several years, Google has used its Cloud Platform to expand its investments in machine learning and other technologies under the artificial intelligence umbrella. Enterprise Cloud News editor Mitch Wagner recently wrote about Google's plans for video analytics and other machine learning ambitions. (See Google & Amazon Heat Up Machine Learning Rivalry.)

Google also developed what it calls Tensor Processing Unit (TPU) chips to accelerate machine learning within the cloud. (See Google's TPU Chips Beef Up Machine Learning.)


— Scott Ferguson, Editor, Enterprise Cloud News. Follow him on Twitter @sferguson_LR.

About the Author(s)

Scott Ferguson

Managing Editor, Light Reading

Prior to joining Enterprise Cloud News, he was director of audience development for InformationWeek, where he oversaw the publication's newsletters, editorial content, email and content marketing initiatives. Before that, he served as editor-in-chief of eWEEK, overseeing both the website and the print edition of the magazine. For more than a decade, Scott has covered the enterprise IT industry with a focus on cloud computing, datacenter technologies, virtualization, IoT and microprocessors, as well as PCs and mobile. Before covering tech, he was a staff writer at the Asbury Park Press and the Herald News, both located in New Jersey. Scott has degrees in journalism and history from William Paterson University, and is based in Greater New York.
