Intel is giving researchers and scientists access to free cloud compute power to aid the development of artificial intelligence, machine learning and other technologies.
At the O'Reilly Artificial Intelligence Conference in San Francisco this week, the chip maker unveiled the Nervana DevCloud, which will give about 200,000 researchers access to hardware and software platforms to help further research into a host of new technologies.
Because Intel is sponsoring the Nervana DevCloud, its own Xeon processors will serve as the hardware underlying the service, which opens in October; the company is now accepting applications.
In addition to Nervana, Intel and Tata Consultancy Services announced on September 19 that the two companies would create the Artificial Intelligence Center of Excellence (CoE) to further develop AI technologies, as well as give space to academics and startups to help build real-world products.
Intel, which has been shifting its focus away from traditional IT hardware toward cutting-edge technologies such as machine learning, AI, connected cars and cloud computing, also announced this week that it has invested about $1 billion in AI startups and research and development projects. (See Intel, Mobileye $15.3B Deal Has Cloud Under the Hood.)
That money is coming from Intel Capital, the company's investment arm.
"I believe Intel will be the AI platform of choice, offering unmatched reliability, performance, security and integration. We are 100 percent committed to creating the roadmap of optimized products to support emerging mainstream AI workloads," Intel CEO Brian Krzanich wrote in a blog post this week.
In a way, Intel is following the lead of several other large tech firms that are looking not only to invest in AI and machine learning research, but also to offer data scientists, researchers and academics resources to investigate what is possible with these technologies.
For example, earlier this year Google announced that it is making 1,000 of its Tensor Processing Unit (TPU) chips available to researchers through the cloud. The one catch is that any AI discoveries need to be published in an academic journal or made available through open source. (See Google's TPU Chips Beef Up Machine Learning.)
Not to be outdone, Microsoft has also opened its own AI lab in the UK. (See Microsoft Establishes New AI Research Lab.)
At the same time, Intel is trying to keep up with other chip makers, especially Nvidia, which has been pushing its graphics processing units (GPUs) as the true driver of AI development. Nvidia's efforts have made it a leader in artificial intelligence, according to at least one analyst.