Facebook Puts Photos in Cold Storage

Social network built custom-designed open storage hardware to house the photos its users shared long ago.

Mitch Wagner, Executive Editor, Light Reading

May 7, 2015


Facebook is using open hardware to redesign two of its data centers to accommodate "cold storage" for old photos shared by users. The company described its plans in a blog post Thursday.

Facebook users share 2 billion photos daily, but not every photo is equally urgent. Your child's birthday party photos from this morning will get passed around and cooed over all week. Five years from now, it's likely nobody will look at those photos at all. And yet the photos will need to remain online and available instantly in case someone does want to look at them.

"Finding a place for these images to live so they can be instantly available is a recurring scale challenge for our infrastructure team," says the post, written by infrastructure engineer Krish Bandaru and software engineer Kestutis Patiejunas.

Figure 1: On Ice. A tray of hard drives in a cold storage rack. Source: Facebook.

Facebook needed to make sure old photos were "just as accessible as the latest popular cat meme but took up less storage space and used less power," the post says.

"Instead of trying to utilize an existing solution -- like massive tape libraries -- to fit our use case, we challenged ourselves to revisit the entire stack top to bottom," the blog post says.

Facebook decided to redesign the data center building, and the hardware and software within it.

"The result was a new storage-based data center literally built from the ground up, with servers that power on as needed, managed by intelligent software that constantly verifies and rebalances data to optimize durability. Two of these 'cold storage' facilities have opened within the past year, as part of our data centers in Prineville, Oregon, and Forest City, North Carolina," Facebook says.
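The "servers that power on as needed, managed by intelligent software that constantly verifies and rebalances data" design can be pictured with a toy model. The sketch below is purely illustrative and not Facebook's actual code: the `Drive` class, checksum scheme and repair loop are all hypothetical simplifications of the idea that disks stay powered down until accessed, while a background process rechecks stored data and rewrites any corrupted replica from a good copy.

```python
import hashlib

# Hypothetical, heavily simplified model of the approach described above:
# drives stay powered off until needed, and a background task verifies
# stored blobs against checksums, repairing corrupted replicas.

class Drive:
    def __init__(self):
        self.powered = False
        self.blobs = {}              # blob_id -> bytes

    def read(self, blob_id):
        self.powered = True          # spin up only when accessed
        data = self.blobs[blob_id]
        self.powered = False         # power back down afterward
        return data

    def write(self, blob_id, data):
        self.powered = True
        self.blobs[blob_id] = data
        self.powered = False

def checksum(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def verify_and_repair(drives, manifest):
    """Recheck every blob; restore corrupted copies from a good replica."""
    repaired = []
    for blob_id, expected in manifest.items():
        good = None
        bad_drives = []
        for d in drives:
            if blob_id not in d.blobs:
                continue
            data = d.read(blob_id)
            if checksum(data) == expected:
                good = data          # at least one intact replica found
            else:
                bad_drives.append(d)
        for d in bad_drives:         # rewrite bad replicas from the good one
            if good is not None:
                d.write(blob_id, good)
                repaired.append(blob_id)
    return repaired
```

The point of the sketch is the shape of the system, not the details: drives are only "powered" for the duration of an access, and durability comes from continuous verification rather than from always-on redundant hardware.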

One major design goal was to reduce operating power while maximizing floor space. "The data centers are equipped with less than one-sixth of the power available to our traditional data centers, and, when fully loaded, can support up to one exabyte (1,000 PB) per data hall," Facebook says. Facebook also removed redundant electrical systems, including uninterruptible power supplies and generators.
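To get a feel for the exabyte-per-hall figure, a quick back-of-the-envelope calculation helps. The 4 TB per-drive capacity below is an assumption (a typical hard drive size circa 2015), not a number from the article:

```python
# Back-of-the-envelope scale check for the 1 EB (1,000 PB) per-hall figure.
# The 4 TB drive size is an assumed typical capacity circa 2015.
EXABYTE_IN_PB = 1000
PB_IN_TB = 1000
drive_tb = 4                                      # assumed per-drive capacity
drives_needed = EXABYTE_IN_PB * PB_IN_TB // drive_tb
print(drives_needed)                              # on the order of 250,000 drives per hall
```

Under that assumption, filling a single data hall takes roughly a quarter-million drives, which makes the weight problem described below easy to believe.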

Hardware is based on the Open Vault specification from the Open Compute Project, modified to cut power consumption and strip down the cooling and power-supply hardware.


Some of the challenges were surprises. For example, the denser racks weighed more than 1,100 kg, which crushed their rubber wheels and made them immobile, Facebook says.

While Facebook offers a unique service, communications service providers share the social network's need to balance live data with cold storage. CSPs serving enterprise customers must provide immediate access to the data a business needs now, while using cold storage for records retained for archiving, legal discovery and regulatory purposes. Like those old family photos, business records in cold storage will probably never be accessed again -- but they might be, and if they are, they need to be available right away.

Cold storage isn't Facebook's first foray into open hardware design. Facebook was a founder of, and takes a lead role in, the Open Compute Project, which develops open source designs for data center compute, storage and networking infrastructure. (See Facebook Releases Data Center Tech and Facebook in Production Testing of Open 'Wedge' Switch.)

— Mitch Wagner, West Coast Bureau Chief, Light Reading. Got a tip about SDN or NFV? Send it to [email protected].

About the Author

Mitch Wagner

Executive Editor, Light Reading

San Diego-based Mitch Wagner is many things. As well as being "our guy" on the West Coast (of the US, not Scotland, or anywhere else with indifferent meteorological conditions), he's a husband (to his wife), dissatisfied Democrat, American (so he could be President some day), nonobservant Jew, and science fiction fan. Not necessarily in that order.

He's also one half of a special duo, along with Minnie, who is the co-habitor of the West Coast Bureau and Light Reading's primary chewer of sticks, though she is not the only one on the team who regularly munches on bark.

Wagner, whose previous positions include Editor-in-Chief at Internet Evolution and Executive Editor at InformationWeek, will be responsible for tracking and reporting on developments in Silicon Valley and other US West Coast hotspots of communications technology innovation.

Beats: Software-defined networking (SDN), network functions virtualization (NFV), IP networking, and colored foods (such as 'green rice').
