Some of the nation's largest MSOs, including Comcast Corp. and Time Warner Cable Inc., are developing video-optimized content distribution networks (CDNs) that will allow them to vastly expand their on-demand libraries, help fuel their future TV Everywhere plans, and grapple with Netflix Inc. and other over-the-top video aggregators.
Using a hierarchical architecture borrowed from the way the traditional Internet operates, these video CDNs will use high-speed fiber links to connect a massive, central server library to regional and local edge cache servers that will stream out the most popular content. As the concept goes, a key benefit of keeping popular titles closer to subscribers is that it cuts down transport costs.
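The edge-caching scheme described above can be sketched in a few lines of code. What follows is a toy illustration only; the class and variable names are invented here, and real video CDNs weigh popularity, file size, and link bandwidth far more carefully than a simple least-recently-used policy:

```python
from collections import OrderedDict

class EdgeCache:
    """Toy edge server: keeps the most recently requested titles locally,
    pulling anything else from the central library over the fiber link
    (counted here as a 'transport fetch')."""

    def __init__(self, capacity, central_library):
        self.capacity = capacity
        self.central = central_library     # title -> asset, the full vault
        self.cache = OrderedDict()         # LRU order: oldest entry first
        self.transport_fetches = 0         # backbone pulls from the library

    def request(self, title):
        if title in self.cache:
            self.cache.move_to_end(title)  # local hit: no backbone traffic
            return self.cache[title]
        asset = self.central[title]        # miss: fetch from central storage
        self.transport_fetches += 1
        self.cache[title] = asset
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False) # evict least recently used title
        return asset
```

Repeated requests for the same hot titles are served from the edge copy, so `transport_fetches` grows only when demand shifts to titles not already cached, which is exactly the transport savings the hierarchical design is after.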
Although these CDNs steal a page from the Internet playbook, they must also address a new set of technical and operational challenges because they are being used to send sensitive (and much larger) video files. And getting this work to scale will require a set of new standard -- and perhaps quasi-standard -- interfaces (more on all this later).
"There are some less than subtle differences" between traditional Internet CDNs and newer architectures that are being tailored for video, says John Schanz, executive VP of national engineering and technical operations at Comcast, who joined the MSO in 2006 after serving as EVP of network and data center operations for America Online.
"If you look at some of the CDN technologies available today on the Internet, they're optimized and built around millions of objects that are smaller in size and duration," says Schanz. "For us to take that technology into the cable infrastructure and [use it] to deliver tens of thousands of movies, the size of the object is much bigger. So you actually take similar technologies, but optimize around what you do to deliver on-demand movies."
Another key driver for this work centers on the development of massive libraries that don't have to be replicated in every MSO region. Having to reinvent the video-on-demand (VoD) storage and streaming wheel in every hub and headend would be not only inefficient but also super expensive.
"The initial goal [of the video CDN] is sheer economics to ingest and distribute that content efficiently," says John Wheeler, director of marketing and business development for video and content platforms at Cisco Systems Inc., a company that boosted its video server and streaming capabilities about four years ago via its purchase of Arroyo Video Solutions Inc. (See Cisco Snatches VOD Vendor Arroyo.)
Plus, if all that content is replicated and stored at all the individual system "islands" (Comcast alone has more than 100 of them), operators will also have to worry that all the VoD assets are being pitched and caught correctly and arrive at their final destinations free of faults and artifacts.
"It just becomes an operational nightmare," says Tom Rosenstein, VP of product marketing at Verivue Inc., a startup that has developed a line of Flash-based video switches with big CDN aspirations. (See Verivue Tech Gets a Toehold at Shaw, Verivue Surfaces With Comcast Backing, and Arris Pumps Up Video With Dolce's Verivue.)
And if only 30 percent of a library that's being replicated over and over again is actually being viewed over a period of time, the operator could end up spending on unnecessary infrastructure and operational overhead to support it, Rosenstein notes.
One million titles?
But how much content will these CDNs of the (near) future support? Estimates vary, but the numbers cited tend to be pretty high, since they will need to keep pace with Netflix, whose "Watch Instantly" vault already has tens of thousands of titles prepped for streaming.
Some of those future plans involve about 100,000 titles, but "we're talking about 300,000 titles with some folks," Wheeler says.
SeaChange International Inc. SVP of software engineering Steve Davi cited similar figures, but he says the infrastructure that's being put in place has some operators talking about more than 1 million titles. (See SeaChange Unveils Cable CDN Lineup.)
That will lead to a paradox of choice and present a significant challenge: how will consumers even begin to sift through that massive sea of content to find what they want without curling into the fetal position and whimpering softly? But there appears to be no real limit on the number of titles the video CDN of the future will hold.
And the idea of offering almost unlimited choices tracks back to Comcast's "Project Infinity" initiative, which the MSO introduced at the 2008 Consumer Electronics Show, where it originally identified 1,000 high-definition (HD) choices as one target for the end of that year. (See Comcast Launches 'Project Infinity'.)
But the name Comcast selected for the initiative, by design, identifies no clear end-game on how many titles could be offered. However, in May, Comcast announced it had vastly expanded its vault of VoD choices, suggesting that the MSO had started to stretch the legs of a production-ready video CDN.
Schanz tells Light Reading Cable that the MSO's CDN is up and running at scale in its Philadelphia "Freedom Region" with a "significant number of customers" tapped into a platform that offers about 17,000 on-demand choices. Comcast is now in the process of getting its Washington, D.C., system CDN-ready. [Ed. note: We'll be posting a more extensive piece on the status of Comcast's CDN project on Friday.]
TV here, there, everywhere
A side benefit of a national CDN is that it offers a path to TV Everywhere and the technical ability to deliver video, not just to TVs, but to PCs and mobile devices.
Some of that can be done by retrofitting older servers or installing new ones optimized for multi-screen delivery. For example, Concurrent Computer Corp. says its new CDN-optimized MediaHawk VX server can send video to any screen off the same storage, because it supports video apps from all comers, including Adobe Systems Inc., Apple Inc., and Microsoft Corp.
But a new software release allows its older servers, including the MediaHawk 4500, to be upgraded to the VX platform and support CDNs, says Jim Brickmeier, Concurrent's VP and GM of video solutions.
In the early going, however, MSOs will likely transcode those different formats ahead of time rather than on the fly off of a pristine mezzanine file, SeaChange's Davi says. That approach will jack up storage requirements on the CDN until new do-it-all boxes -- such as one being touted by RGB Networks Inc. -- hit the scene and show they can handle the heavier processing load required by real-time transcoding. (See RGB's TV Everywhere Offer: A Video God Box.)
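The storage trade-off Davi describes is easy to quantify with back-of-the-envelope math. The title counts and file sizes below are illustrative assumptions, not figures from the article:

```python
# Rough storage math: pre-transcoding every device rendition up front
# vs. keeping only a single pristine mezzanine file per title.
# All numbers are assumptions chosen for illustration.

def pretranscode_storage_gb(titles, mezzanine_gb, renditions_gb):
    """Library size if every title is stored in every device rendition
    ahead of time, alongside its mezzanine source."""
    return titles * (mezzanine_gb + sum(renditions_gb))

def mezzanine_only_storage_gb(titles, mezzanine_gb):
    """Library size if only the mezzanine is kept and device formats
    are produced on the fly by a real-time transcoder."""
    return titles * mezzanine_gb

# Hypothetical: 100,000 titles, an 8 GB mezzanine, three renditions
# (say set-top, PC, and mobile bitrates).
renditions = [4.0, 1.5, 0.5]   # GB per rendition
pre = pretranscode_storage_gb(100_000, 8.0, renditions)
mezz = mezzanine_only_storage_gb(100_000, 8.0)
print(f"pre-transcoded: {pre / 1000:.0f} TB vs mezzanine-only: {mezz / 1000:.0f} TB")
```

Under these assumed sizes, pre-transcoding nearly doubles the storage bill, which is the cost that real-time transcoding boxes would eliminate.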
The video CDN concept looks terrific on paper, but will it work as advertised and achieve scale?
One challenge is automating the movement of all that content around the CDN based on popularity, something that companies like Cisco, SeaChange, Concurrent, and Motorola Inc. are all trying to address.
They're all working on predictive and dynamic ways of doing that, and even the basic idea of distributing only part of the central library to the regional and edge servers makes financial sense. "Twenty percent to the edge is more economical than 100 percent to the edge," says Cisco's Wheeler.
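Wheeler's "20 percent to the edge" economics rest on skewed demand. As a rough sketch, assuming VoD requests follow a Zipf popularity distribution (an assumption made here for illustration, not measured MSO data), caching the top fifth of the library captures most of the traffic:

```python
# Assuming Zipf-distributed title popularity (illustrative, not MSO data),
# estimate the share of requests served by the titles cached at the edge.

def zipf_hit_ratio(num_titles, edge_fraction, s=1.0):
    """Fraction of requests covered when the top edge_fraction of titles
    (by popularity rank) is held at the edge, under Zipf(s) demand."""
    weights = [1.0 / (rank ** s) for rank in range(1, num_titles + 1)]
    cached = int(num_titles * edge_fraction)
    return sum(weights[:cached]) / sum(weights)

ratio = zipf_hit_ratio(num_titles=100_000, edge_fraction=0.20)
print(f"Top 20% of titles serve {ratio:.0%} of requests")
```

Under this assumed distribution, roughly 87 percent of requests would be served from the edge copy, which is why pushing only a slice of the library out of central storage pays off.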
Another challenge is latency -- something regular Web pages can live with to a degree, but which requires much tighter controls when dealing with latency-sensitive video packets. Maintaining QoS for VoD at the local and division level is challenging enough, and the job is amplified when an operator tries to manage and coordinate it across a national framework, according to Brickmeier.
Also of importance is the back office that will operate these CDNs. Traditionally, cable operators have used back-office systems to manage VoD in each market. There's a drive now toward a system that's built on a core set of standards that defines how set-tops talk to the back office and how the back office sets up the streams and manages the resources.
Comcast is said to be putting its stake in the ground by developing CDN extensions for its Next Generation On Demand (NGOD) interfaces that will allow the back office to hook into servers and streamers from multiple vendors and offer supplier options for the central storage component. NGOD's CDN components also specify how the edge servers handshake with the central library. There are similar and competing specifications, however, that may not match up with NGOD at every technical layer. For example, Time Warner Cable's Interactive Services Architecture (ISA) uses different interfaces.
Others, however, may not want to lean on quasi-standards developed by cable operators, and that may lead to more official standards from organizations outside of cable, like the Alliance for Telecommunications Industry Solutions (ATIS), or cause an org like CableLabs to take this on by combining the hard work of the MSOs with contributions from other parties.
Given those questions, some operators may prefer to stay on the video CDN sidelines for a bit longer and let Comcast and TWC handle the early heavy lifting. "Some may be hedging their bets until it appears that the hard problems are solved in a standardized way," says one cable industry insider.
But even with standards in place, operators will still have to figure out how and when to migrate to a CDN model, and how much value they can squeeze out of making that move. Some may need to retrofit their existing servers -- or buy new ones -- and calculate how quickly they can recoup those investments.
"For many operators, the deployments aren't homogeneous, so the operational considerations are pretty significant," Verivue's Rosenstein says. "That's been primarily the biggest issue with regard to why CDNs aren't deployed everywhere right now."
And the approach may not suit all cable operators. Smaller operators tend to have smaller libraries and fewer "islands" to interconnect. But if the majors stick with common interfaces, that could open the door for smaller players to strike a deal to feed off of Comcast's CDN or go with another centralized aggregator.
Davi says it's possible that the studios themselves might end up becoming second-tier CDNs by building their own storage and streaming capabilities. They could use those systems to deliver content to pay-TV partners, or opt to go directly to consumers and take orders by releasing content in more attractive (i.e., earlier) windows. They could also go that route to create internal content libraries offering more of their content than what's available via cable operators and telco TV operators.
Davi acknowledges that this is a "futuristic concept," but one that would give studios more control of their content and how it's distributed.
MSO CDNs: How soon?
Among cablers, Comcast appears to be the furthest ahead (we've asked TWC to provide an update on its progress), but most agree that the general trend is still in its early days.
"It's starting to happen, but it's not in mass scale," Rosenstein says. "The infrastructure planning is underway."
Brickmeier sees "initial trial activities" occurring during the balance of this year and into the early part of 2011, with some scaled deployments starting up by the later part of next year.
"You'll see some commercial activity in 2010, at least some of the initial phases of it," says Davi, whose company already supplies Comcast with streamers, memory, back-office software, and centralized storage. But he expects "more wide adoption [of CDNs] in 2011 across multiple MSOs."
— Jeff Baumgartner, Site Editor, Light Reading Cable