Google Launches West Coast Cloud Region, Natural Language Tools

Reducing latency for West Coast cloud customers and making it easier for apps to understand plain spoken English.

Mitch Wagner, Executive Editor, Light Reading

July 20, 2016

4 Min Read

Google is taking steps to reduce latency for its West Coast cloud customers and launching APIs that let apps recognize speech and understand natural language.

Google said in a blog Wednesday that it is opening an Oregon Cloud Region, with support for Google Compute Engine, Google Cloud Storage and Google Container Engine. Users in Vancouver, Seattle, Portland, San Francisco, Los Angeles and other locations can expect to see a 30% to 80% reduction in latency for applications served from the new region, compared with the same applications served from its central US region, Google says. The company plans to bring a Tokyo region online later this year and ten additional regions in 2017, as disclosed in March. (See Google: 'Dead Serious' About Enterprise Cloud.)

Google, along with Microsoft Corp. (Nasdaq: MSFT) and IBM Corp. (NYSE: IBM), is chasing Amazon Web Services Inc., which has an overwhelming lead in cloud market share. Latency is a vulnerability for Amazon; a recent analysis by Light Reading and Cedexis, based on measurements collected by Cedexis, showed AWS lagging behind Microsoft Azure and IBM on latency. Google was not included in the test. (See By the Numbers: AWS' Latency Vulnerability.)

Another area where competitors like Google and Microsoft are taking on AWS is in moving beyond basic infrastructure and platform tools like compute, storage, network and database. Amazon is competing in that race too, with its own analytics tools, for example. In that vein, Google launched two Machine Learning products in beta: Cloud Natural Language and Cloud Speech.

The Natural Language tools, in open beta, let developers extract the structure and meaning of text in multiple languages, initially English, Spanish and Japanese. Capabilities include sentiment analysis; entity recognition for people, organizations, locations, events, products and media; and syntax analysis that identifies parts of speech and builds dependency parse trees for sentences.
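For a sense of what those three capabilities look like in practice, here is a minimal Python sketch. It assumes the current google-cloud-language client library and its method names, which postdate this launch; the package name, field names and sample text are illustrative, not Google's documented usage at the time.

```python
# Sketch: Cloud Natural Language from Python (assumed package: google-cloud-language).
# Requires Google Cloud credentials configured in the environment.
from google.cloud import language_v1


def analyze(text: str) -> None:
    """Run sentiment, entity and syntax analysis on a snippet of text."""
    client = language_v1.LanguageServiceClient()
    document = language_v1.Document(
        content=text, type_=language_v1.Document.Type.PLAIN_TEXT
    )

    # Sentiment analysis: score runs from -1.0 (negative) to 1.0 (positive).
    sentiment = client.analyze_sentiment(
        request={"document": document}
    ).document_sentiment
    print(f"sentiment score={sentiment.score:.2f} magnitude={sentiment.magnitude:.2f}")

    # Entity recognition: people, organizations, locations, events, products, media.
    for entity in client.analyze_entities(request={"document": document}).entities:
        print(f"entity: {entity.name} ({language_v1.Entity.Type(entity.type_).name})")

    # Syntax analysis: part-of-speech tags and dependency parse information per token.
    for token in client.analyze_syntax(request={"document": document}).tokens:
        tag = language_v1.PartOfSpeech.Tag(token.part_of_speech.tag).name
        print(f"token: {token.text.content} [{tag}]")


if __name__ == "__main__":
    # Hypothetical sample text, e.g. a line from an online product review.
    analyze("The checkout was fast, but delivery from the Portland warehouse was slow.")
```

The same three calls map onto the use cases Google describes next: scoring review sentiment, pulling out product and brand entities, and parsing transcribed sentences.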

Google says digital marketers could use the tools to analyze online product reviews, and service centers could put them to work on transcribed customer calls. It cited British online marketplace Ocado Technology as an alpha customer.

Cloud Speech API, also in beta today, offers speech-to-text conversions in more than 80 languages for apps and IoT devices, using the same technology as Google Search and Google Now.
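A comparable Python sketch for the Speech API is below. Again, it assumes the current google-cloud-speech client rather than the beta interface that shipped with this launch; the file name and audio settings are placeholders.

```python
# Sketch: Cloud Speech transcription from Python (assumed package: google-cloud-speech).
# Requires Google Cloud credentials configured in the environment.
from google.cloud import speech


def transcribe(path: str, language_code: str = "en-US") -> None:
    """Send a short local audio clip to the Speech API and print transcripts."""
    client = speech.SpeechClient()

    with open(path, "rb") as audio_file:
        audio = speech.RecognitionAudio(content=audio_file.read())

    config = speech.RecognitionConfig(
        encoding=speech.RecognitionConfig.AudioEncoding.LINEAR16,  # 16-bit PCM WAV
        sample_rate_hertz=16000,
        language_code=language_code,  # any of the 80+ supported language codes
    )

    # Synchronous recognition suits short clips; longer audio would go through
    # the asynchronous long_running_recognize call instead.
    response = client.recognize(config=config, audio=audio)
    for result in response.results:
        print(result.alternatives[0].transcript)


if __name__ == "__main__":
    transcribe("support_call_snippet.wav")  # hypothetical file name
```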

Google's shout-out to its own Search and Now services is part of an overall strategy to gain competitive advantage in the cloud. Google is relatively late to the enterprise cloud, and a big part of its pitch is that it can give enterprises access to the same tools, stability, expertise and scale that power its own widely used consumer services. Amazon used a similar pitch when it launched AWS a decade ago: power your app with the same platform that runs the world's largest online retailer.


Natural language recognition is an example of one of the great benefits of the cloud: it lets enterprises experiment with advanced technology without a lot of time and hassle. When on-premises computing was the only option, a company wanting to try an advanced technology had to dedicate hardware to the job, license and install the application, and train IT staff to keep the app up and running, while software developers wrote code to take advantage of the new capabilities. Then, if the enterprise found the service useful, it had to find the resources to scale it up.

In the cloud, there's no software to install and no need for servers to run the software on or IT staff to keep it running. All of that is handled by the cloud provider; the enterprise can focus on just trying the service out. And if the experimental service proves useful, the enterprise just signs up with the cloud provider for greater resources, with no need to find on-premises hardware and real estate to run the additional applications.

For example, Jabil used capabilities from Microsoft Azure to experiment with manufacturing analytics. (See Jabil Leverages Cloud to Improve Manufacturing.)

— Mitch Wagner, Editor, Light Reading Enterprise Cloud

About the Author(s)

Mitch Wagner

Executive Editor, Light Reading

San Diego-based Mitch Wagner is many things. As well as being "our guy" on the West Coast (of the US, not Scotland, or anywhere else with indifferent meteorological conditions), he's a husband (to his wife), dissatisfied Democrat, American (so he could be President some day), nonobservant Jew, and science fiction fan. Not necessarily in that order.

He's also one half of a special duo, along with Minnie, who is the co-habitor of the West Coast Bureau and Light Reading's primary chewer of sticks, though she is not the only one on the team who regularly munches on bark.

Wagner, whose previous positions include Editor-in-Chief at Internet Evolution and Executive Editor at InformationWeek, will be responsible for tracking and reporting on developments in Silicon Valley and other US West Coast hotspots of communications technology innovation.

Beats: Software-defined networking (SDN), network functions virtualization (NFV), IP networking, and colored foods (such as 'green rice').

