Facebook Open Sources AI Servers

Giving away 'Big Sur' AI hardware designs.

Mitch Wagner, Executive Editor, Light Reading

December 10, 2015

4 Min Read

Facebook plans to open source its "Big Sur" AI server hardware and will submit the design to the Open Compute Project, the company said Thursday.

The Facebook AI Research (FAIR) team is more than tripling its investment in GPU hardware as it looks to apply neural networks more widely across Facebook's products and services, Kevin Lee and Serkan Piantino, two engineers on Facebook's infrastructure team, said in a blog post Thursday.

Recent advances in machine learning and artificial intelligence have been driven by larger publicly available research datasets and more powerful computers, specifically those built around GPUs. "Faster hardware and software allows us to explore deeper and more complex systems," Lee and Piantino write.

Figure 1: "Big Sur," an Open Rack V2-compatible 8-GPU server. (Photo source: Facebook.)

"At Facebook, we've made great progress thus far with off-the-shelf infrastructure components and design," the two Facebook men say. "We've developed software that can read stories, answer questions about scenes, play games and even learn unspecified tasks through observing some examples. But we realized that truly tackling these problems at scale would require us to design our own systems. Today, we're unveiling our next-generation GPU-based systems for training neural networks, which we've code-named 'Big Sur.' "

Big Sur is compatible with Open Rack and "designed for AI computing at a large scale," according to the blog post. In addition to improved performance, the hardware is optimized for thermal and power efficiency, eliminating the need for special cooling and other unique infrastructure often required for high-performance computing. The servers can even be operated in Facebook's free-air-cooled Open Compute standard data centers.

Big Sur was designed for efficient operation and serviceability: little-used components have been eliminated, and components that fail relatively frequently, such as hard drives and DIMMs, can be removed and replaced in seconds.

Facebook has a strong track record of backing open source: it founded the Open Compute Project to drive open source designs into data center hardware, and has contributed designs including the Wedge switch and Yosemite, a system-on-a-chip compute server. (See Facebook Reinvents Data Center Networking and Facebook Releases Data Center Tech.)

Facebook is in an AI race with Google, which recently open sourced its TensorFlow AI software library.

Deep learning is useful for identifying images, recognizing spoken words and translating languages -- just the thing Facebook, Google and other web-scale companies need to make sense of the oceans of unstructured content on social networks and the wider Internet.
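For readers who want a feel for what a library like Google's TensorFlow actually does, the snippet below is an illustrative sketch only -- it is not Facebook's or Google's production code. It trains a small image classifier on the public MNIST handwritten-digit dataset using TensorFlow's Keras API; the layer sizes and training settings are arbitrary choices made for demonstration.

```python
# Illustrative sketch: a minimal image classifier with TensorFlow's Keras API.
# Dataset, layer sizes and epoch count are assumptions chosen for demonstration.
import tensorflow as tf

# Load the MNIST handwritten-digit dataset (28x28 grayscale images, 10 classes)
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0  # scale pixels to [0, 1]

# A tiny fully connected neural network
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),    # image -> 784-element vector
    tf.keras.layers.Dense(128, activation="relu"),    # hidden layer
    tf.keras.layers.Dense(10, activation="softmax"),  # one output per digit class
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.fit(x_train, y_train, epochs=5)      # train on GPU if one is available
model.evaluate(x_test, y_test)             # report accuracy on held-out images
```

Training workloads like this are exactly what GPU-heavy systems such as Big Sur are built to run at much larger scale.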

Comms companies looking to make money from NFV, cloud and security services will need big data and analytics, along with implementation tools, says Heavy Reading analyst Jim Hodges. (See NFV, Cloud, Security & 5G Need Analytics – Heavy Reading and Customer Care Drives CSP Demand for Analytics – Report.)

AI, deep learning and neural nets are among the most powerful of those tools.

Find out more about key developments related to the systems and technologies deployed in data centers on Light Reading's Data Center Infrastructure Channel.

What's unclear at this point is how companies like Google and Facebook plan to make money if they're giving away strategic technology. Asked how open sourcing Big Sur will help drive revenue and profit, a Facebook spokeswoman talked about how open source will make AI more effective.

However, as a general rule, Facebook open sources its hardware designs because it doesn't see hardware as strategic. Its service is strategic; the hardware is merely a means to that end. Facebook believes the hardware advances that come from open source collaboration outweigh the benefits of keeping technology secret and proprietary. It's a classic open source business strategy -- use open source wherever you can, and reserve proprietary technology for the cases that truly deliver competitive advantage. (See AT&T Describes Next Steps for Network Virtualization and The Business Case for Open Source.)

In the case of AI, arguably the hardware and algorithms don't deliver competitive advantage; it's the data collected from user behavior and content that does. And Facebook and Google aren't giving that data away.


— Mitch Wagner, West Coast Bureau Chief, Light Reading. Got a tip about SDN or NFV? Send it to [email protected].

About the Author(s)

Mitch Wagner

Executive Editor, Light Reading

San Diego-based Mitch Wagner is many things. As well as being "our guy" on the West Coast (of the US, not Scotland, or anywhere else with indifferent meteorological conditions), he's a husband (to his wife), dissatisfied Democrat, American (so he could be President some day), nonobservant Jew, and science fiction fan. Not necessarily in that order.

He's also one half of a special duo, along with Minnie, who is the co-habitor of the West Coast Bureau and Light Reading's primary chewer of sticks, though she is not the only one on the team who regularly munches on bark.

Wagner, whose previous positions include Editor-in-Chief at Internet Evolution and Executive Editor at InformationWeek, will be responsible for tracking and reporting on developments in Silicon Valley and other US West Coast hotspots of communications technology innovation.

Beats: Software-defined networking (SDN), network functions virtualization (NFV), IP networking, and colored foods (such as 'green rice').
