Service Provider Cloud

OpenStack Goes Inside Atoms, Across Galaxy

BARCELONA -- OpenStack Summit 2016 -- OpenStack is going from the infinitesimal to the infinite. Researchers are using the cloud platform to look inside atoms today, while the builders of an innovative telescope array hope to use the software to peer across the galaxy and back 400 million years in time.

The two projects are very different from each other. What they have in common is that they're generating prodigious amounts of data, and looking to OpenStack to manage it all.

CERN, the European Organization for Nuclear Research, is using OpenStack to manage data for several of its key experiments. These include the Large Hadron Collider, which at 27 kilometers around is "the largest machine on Earth," said Tim Bell, compute and monitoring group leader at CERN, during a keynote at OpenStack Summit Tuesday. The collider fires beams of protons at each other, measuring the results to determine the properties of subatomic particles. One key component for detecting collisions is a machine called the Compact Muon Solenoid. "It's a very strange term, given that it weighs 14,000 tonnes, to call it compact," Bell said.

CERN's computing infrastructure has to be able to handle 1 billion collisions per second. That demand is driving the need for OpenStack, Bell said.

And that's not the only experiment CERN is running. "I have the honor of having an antimatter factory just down the road from my office," Bell said. The apparatus combines positrons and antiprotons to make anti-hydrogen, which scientists experiment on to determine antimatter properties, such as whether antimatter rises or falls under gravity.

All that science drives the need for a lot of compute. CERN stores 160 petabytes of data on tape, adding about 0.5 PB per day between June and August of this year. The organization anticipates a 60x compute increase by 2023, but the budget outlook for servers and people is flat, Bell said.

OpenStack helps CERN keep up with demand. CERN is using OpenStack on more than 190,000 cores in production, with more than 90% of CERN compute resources virtualized, 5,000 virtual machines migrated from old hardware in 2016, and more than 100,000 cores to be added in the next six months.

After Bell described how OpenStack is exploring the infinitesimal, the University of Cambridge's Dr. Rosie Bolton talked about OpenStack in the infinite. Or near-infinite, at any rate.

To Boldly Go
Dr. Rosie Bolton, University of Cambridge

Bolton is part of a consortium building the Square Kilometer Array, a vast radio telescope due to go online in 2023. One part of the SKA will be located in the Western Australian desert, with 130,000 individual antennas in 512 clusters spread over 80 kilometers. The other part will be in the Karoo desert in South Africa, with 197 antennas over 150 kilometers. Each site will send data to a Science Data Processor about 500 kilometers from its antennas -- in Perth, Australia, and Cape Town, South Africa -- which will then distribute the information around the world.

The antennas will be used to pick up signals going 400 million years back in time, to observe the formation of the first stars. Separately, the SKA will observe several dozen pulsars spread around the galaxy. Pulsars send out pulses of radio activity with extremely precise regularity; by observing changes in that radio activity, astronomers hope to be able to detect gravitational waves that span the galaxy.


The compute needs for the SKA will be enormous. Computers will need to ingest 400 gigabytes per second, generate and then discard some 1.3 zettabytes of intermediate data, and preserve and ship 1 petabyte per day of science data, Bolton said.
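As a rough sketch of what those figures imply (using only the article's numbers and decimal units, where 1 PB = 10^15 bytes), the daily raw ingest dwarfs the science data that actually gets preserved:

```python
# Back-of-envelope check of the SKA data-flow figures quoted above.
# All inputs come from the article; this is illustrative arithmetic only.

INGEST_BYTES_PER_SEC = 400e9   # 400 gigabytes per second into the processors
SECONDS_PER_DAY = 86_400
PRESERVED_PB_PER_DAY = 1.0     # science data preserved and shipped per day

daily_ingest_pb = INGEST_BYTES_PER_SEC * SECONDS_PER_DAY / 1e15

print(f"Raw ingest per day: {daily_ingest_pb:.2f} PB")                        # ~34.56 PB
print(f"Fraction preserved: {PRESERVED_PB_PER_DAY / daily_ingest_pb:.1%}")    # ~2.9%
```

In other words, only a few percent of what the telescope takes in survives as shipped science data -- which is why Bolton talks about generating and destroying data at zettabyte scale.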

The SKA consortium will build the compute facility toward the end of the first phase of construction of the telescope arrays, which is due in 2023. Bolton said she hopes to pique the OpenStack community's interest now, so OpenStack becomes a suitable platform for that kind of science when the SKA is ready to build its compute center. "It's a long way off, but if we start now we hope to get the OpenStack community interested," Bolton said.

The SKA also wants the compute facility to be futureproof. The consortium plans to have the telescope arrays online for 50 years, and needs a platform that can mature over that time rather than be completely replaced.

— Mitch Wagner, Editor, Light Reading Enterprise Cloud

pmahajan 10/26/2016 | 1:51:10 PM
400 billion years? > The antennas will be used to pick up signals going 400 billion years back in time, to observe the formation of the first stars.

Ah, we've broken the time-space barrier predating the Big Bang. The article does say 400 million years at the beginning.

jbtombes 10/26/2016 | 10:49:28 AM
Re: Fantastic! Fantastic, indeed! I once wrote an article on the optical transport requirements of the Large Hadron Collider, which happens to straddle the Swiss-French border in numerous places, but wonder whether it represents a (really large) corner case or outlier - or whether it stands closer to the trendline of the mainstream compute future.
Mitch Wagner 10/26/2016 | 8:33:40 AM
Re: Fantastic! At this morning's keynote, Jonathan Bryce, OpenStack Foundation executive director, pointed out that the Square Kilometer Array is "basically a distributed software-defined telescope." The compute and software are integral parts of the telescope itself.

I'm not actually sure what that means, but it sounds great.

Bolton noted during her presentation that the way radio astronomy historically works is that astronomers book time on the telescope, get their observations, and then receive data on tape (historically) or via FTP. The SKA's integrated compute will enable data to go out much more responsively.
Mitch Wagner 10/26/2016 | 3:02:47 AM
Fantastic! I loved these presentations. So fantastic! Answering the fundamental questions of the universe.

And it's amazing to consider building a computing architecture today that will still be usable in 50 years. The hardware and software would all be different (obviously) but it would be descended from the work done today, rather than rip-and-replace. 

Imagine someone trying to do that in 1966 with a compute platform that would still be in use, and current, today. 