LinkedIn is building out a private cloud as a foundation for diversifying into a full-service social network for business.
"We see LinkedIn transitioning to be the platform of choice for every professional in the world," Yuval Bachar, principal engineer, architecture and strategy, tells Light Reading. LinkedIn is looking to go beyond its current mainstay business as an extended job search and address book site, to allow professionals to do research, take classes, and more. "We're looking at this as the focal point of every professional in the world, white collar or blue collar," Bachar says. These include workers at every stage of education, from high school to college to advanced degree.
LinkedIn calls its vision the "Economic Graph," a digital map of the global economy, including a profile of every one of the 3 billion members of the global workforce, representing their professional identities, to help them find opportunities. The Economic Graph also includes profiles for every company in the world, with personal connections to help potential workers get their foot in the door, as well as a digital presence for every higher education institution in the world that can help build professional skills.
But that's the future. For now, some 60% of LinkedIn's revenue comes from Talent Solutions, which helps recruiters search for candidates, and most of that revenue comes from corporate clients.
"We will have to enable features that are much richer in content than we have today," Bachar says.
To achieve those goals, LinkedIn needs to build out its data center infrastructure, enhancing automation for improved efficiency. The company describes the new LinkedIn Platform as a Service (LPS) private cloud infrastructure in a blog post scheduled to go up Wednesday at 1:00 p.m. EDT.
"Over time we've built up great automation for a lot of low-level operations, tracking hosts we buy, updating software versions on individual hosts, and things like that," Steven Ihde, head of the LTS project and principal staff software engineer for LinkedIn, tells Light Reading. "What we're really trying to do here is take the automation to the next level, by having an API-driven system for developers and operations staff, automating a lot of things that are currently done manually with ticket-based tracking systems, such as selecting hosts to turn on new services, and expanding and contracting services in reaction to traffic and other demands."
The biggest shift is managing topology -- which services and jobs are running where in the data center -- automatically, a process previously handled manually, by operators' judgment calls. That process is now managed by a system LinkedIn calls "Rain," which uses containerization to protect services from each other.
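Rain's internals aren't public, but a generic bin-packing sketch shows the kind of placement decision an automated topology manager makes when it assigns service instances to hosts. This first-fit example is purely illustrative and is not LinkedIn's algorithm.

```python
# Minimal first-fit placement sketch: which service instances land on which hosts.
# A generic bin-packing illustration, not Rain's actual algorithm.
def place(instances, hosts):
    """instances: list of (name, cpu_demand); hosts: dict host -> free CPU cores.
    Returns a mapping instance -> host, or raises if capacity runs out."""
    assignment = {}
    for name, demand in sorted(instances, key=lambda x: -x[1]):  # largest first
        for host, free in hosts.items():
            if free >= demand:
                assignment[name] = host
                hosts[host] = free - demand
                break
        else:
            raise RuntimeError(f"no host has {demand} cores free for {name}")
    return assignment

hosts = {"host-a": 16.0, "host-b": 16.0}
instances = [("search-1", 6.0), ("feed-1", 4.0), ("search-2", 6.0), ("feed-2", 4.0)]
print(place(instances, hosts))  # packs four instances onto two hosts
```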
Topology automation results in a whopping 50% resource savings. "We can run the same number of services on half the hosts once we manage policy this way," Ihde says.
LinkedIn has been building LPS for the past two years. Rain is already in production, and additional elements will roll out later this year and in 2017.
Like other companies running infrastructure at hyperscale, including Facebook and Google (Nasdaq: GOOG), LinkedIn chose to build its infrastructure internally rather than buy from vendors. (See Facebook Reinvents Data Center Networking and Google: 'Great' Data Center Networks Essential.)
"LinkedIn operates at a scale that is larger than most," Ihde said. "This put us in a position where many of the available commercial and open source solutions didn't work for us, or weren't designed to solve the problems we were trying to solve. That was really the primary motivation for building LPS." LinkedIn uses a mix of open source and home-built components.
Likewise, outsourcing to an external cloud provider -- as, for example, Netflix Inc. (Nasdaq: NFLX) has done with Amazon Web Services Inc. -- is not an option. "We can operate this ourselves more efficiently than purchasing services from a public cloud provider," Ihde says. "We want to build our data centers as quickly as we can and we feel like we're the ones best placed to do that." (See Netflix Cloud Casts Long Shadow Over Cable.)
Security was key to the project -- specifically, preventing services from interfering with each other when sharing resources. "We were looking at it from the efficiency angle, but as we went on we realized that to make things work operationally we needed to isolate those processes from each other as much as possible," Ihde says.
LinkedIn also found it challenging to optimize CPU resource utilization. "There is a tradeoff between making maximum use of idle capacity -- if there are idle CPU cycles on a machine, why not offer them to any process that wants them? -- versus consistency and predictability," Ihde says. "If the idle capacity varies over time, that impacts predictability. That's been a challenging tradeoff to work through."
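That tradeoff maps roughly onto the difference between work-conserving CPU sharing (letting a service borrow whatever is idle, as cgroup CPU weights do) and hard quotas. The toy calculation below illustrates the effect with made-up numbers; it is not drawn from LinkedIn's implementation.

```python
# Toy illustration of the tradeoff Ihde describes, with made-up numbers:
# a work-conserving policy gives high but variable capacity, a hard quota
# gives lower but predictable capacity.
def effective_cores(reserved, idle_on_host, policy):
    if policy == "work-conserving":
        return reserved + idle_on_host  # borrow whatever is currently idle
    if policy == "hard-quota":
        return reserved                 # never exceed the reservation
    raise ValueError(policy)

for idle in (6.0, 2.0, 0.0):            # idle capacity fluctuates over time
    ws = effective_cores(4.0, idle, "work-conserving")
    hq = effective_cores(4.0, idle, "hard-quota")
    print(f"idle={idle:>4} cores  work-conserving={ws:>5} cores  hard-quota={hq} cores")
```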
Standardized APIs helped LinkedIn integrate the various components into the overall LPS architecture.
LinkedIn believes its experience is applicable to other companies that rely on software written internally for competitive advantage, whether in the Internet industry or some other vertical. "If you have sizable operations running your own software, this is where the approach makes the most sense," he said.
The LPS transition complements bringing online a new data center and data center architecture -- Project Altair.
"Project Altair represents for LinkedIn the transition from operating as a large enterprise that has a significant data center foot print to an environment more like a mega-data-center," LinkedIn's Bachar says.
LinkedIn is looking to increase its user base drastically -- 5x or 10x growth -- by offering a wide variety of content and apps, including video and other applications requiring significant bandwidth. "Creating another level of smaller data centers would not have been very efficient for us," Bachar said.
The new data center isn't Facebook- or Google-scale, with hundreds of thousands of servers per data center, but it's still pretty big: 100,000 to 200,000 servers.
LinkedIn currently operates four smaller data centers in the US and one in Singapore, with the new Project Altair data center going up in Hillsboro, Oregon. "We have the keys and we're running the tests on it. It's not live yet," he said. As the Project Altair center comes online, LinkedIn may retrofit the older data centers with new technology, expand them or shut them down.
Project Altair uses high-speed interconnects and high-density racks. Each rack can hold 96 servers, connecting at 10 or 25 Gbit/s. The next level of the network is a 100 Gbit/s interface, split into 2x50 Gbit/s, optimized for bandwidth and cost. All data centers are standardized on a single 1U switch.
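At 25 Gbit/s per server, those figures work out to roughly 2,400 Gbit/s of server-facing capacity per rack. The quick calculation below assumes a hypothetical eight uplinks per top-of-rack switch, since the article doesn't specify the uplink count; it simply shows how the oversubscription ratio falls out of the numbers.

```python
# Quick arithmetic on the rack figures above. The uplink count is an assumption
# for illustration; the article does not specify it.
servers_per_rack = 96
nic_gbps = 25        # per-server link speed (article cites 10 or 25 Gbit/s; using 25)
uplinks = 8          # hypothetical number of uplinks per top-of-rack switch
uplink_gbps = 100    # each run as 2x50 Gbit/s per the article

downlink = servers_per_rack * nic_gbps
uplink = uplinks * uplink_gbps
print(f"server-facing capacity: {downlink} Gbit/s")          # 2400 Gbit/s
print(f"uplink capacity: {uplink} Gbit/s")                    # 800 Gbit/s
print(f"oversubscription ratio: {downlink / uplink:.1f}:1")   # 3.0:1
```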
Light Reading talked with Bachar about data center innovation and LinkedIn's work with the Facebook-sponsored Open Compute Project in a recent video interview. (See LinkedIn on OCP & Data Center Innovation.)
LinkedIn has been broadening its services for several years, for example, offering groups for discussing professional interests and letting members publish articles on its platform. But it has recently stepped up the pace, acquiring online education company Lynda.com for $1.5 billion last year.
The stakes for LinkedIn's diversification have become higher. The stock price dropped more than 40% in early February after LinkedIn reported its fourth-quarter earnings. LinkedIn traded at $113.70 after hours Tuesday, down from $265.35 almost exactly a year ago -- April 10 -- and $191.25 on Feb. 3, just prior to reporting 4Q earnings.
The drop came after LinkedIn reported a fourth-quarter loss and a weaker-than-expected outlook for 2016. LinkedIn forecast revenue of $820 million in the first quarter of this year, missing analyst expectations of $867 million.
LinkedIn hit 414 million members in the fourth quarter, with a loss of $8.4 million, compared with a $3 million profit in the year-ago quarter. Revenue was $861 million, up 34% and topping analyst expectations of $857.6 million.
— Mitch Wagner, West Coast Bureau Chief, Light Reading.