Intel CEO Brian Krzanich opened the Intel Developer Forum '16 (IDF) with two announcements meant to promote virtual reality (VR) from its current status as a curiosity to a common tool within a matter of months. He unveiled Intel Corp. (Nasdaq: INTC)'s Project Alloy, an Intel-based headset that merges virtual and physical reality, and a collaboration with Microsoft under which every Windows 10 PC will be upgraded early next year with software that enables it to support VR applications.
Krzanich also promised more developer kits, more APIs and more open-source code to make it easier for developers to use a wide array of Intel products, including the RealSense camera module, designed for any application that requires vision, and the Euclid sensor module, a pumped-up extension of RealSense designed for robotics applications.
Project Alloy does something new that the VR community has been striving to achieve: it brings representations of "real" objects into a VR environment. This is distinct from augmented reality (AR), which typically overlays something virtual onto the physical environment -- think of seeing a virtual Snorlax sleeping on an actual park bench when you're playing Pokémon Go. AR adds something virtual to physical reality.
On stage at IDF today, Krzanich introduced a demonstrator who he identified only as Craig. As Craig donned a Project Alloy VR headset, his view was recreated on the giant screen on stage. When Craig raised his hands, video representations of his hands -- mildly pixelated -- appeared in his VR environment. He used a virtual finger to carve an item on a virtual lathe, and then pulled a physical dollar bill out of his pocket, and used the video representation of the dollar that the headset imported into the VR environment to carve some more. Blind in the real world with his headset on, Craig shuffled perilously close to Krzanich, and as he did so, a video representation of Krzanich appeared in the virtual environment, trying to wave him off, just as Krzanich was actually doing on stage.
("If you don't want to injure the CEO of a Fortune 500 company, this is the system for you," Craig quipped.)
Krzanich promised that in the second half of 2017, Intel will open up the Alloy hardware and provide APIs so that developers can create their own applications. "Anyone can take Alloy and a headset and create any application they choose," he said.
Krzanich then introduced Terry Myerson, EVP of Microsoft's Windows and Devices Group, who made the announcement about the Windows 10 upgrade. "Next year, we'll release an update to all Windows 10 PCs that will include a holographic shell. Those PCs will be able to run Windows HoloLens and interact with both 3D and 2D applications," Myerson said.
The Windows 10 update builds on Microsoft's HoloLens project, announced in January 2015, which at the time included access to a headset and a software development kit for creating applications for it.
Myerson said Microsoft Corp. (Nasdaq: MSFT) is working on a common holographic version 1.0 specification, which it intends to release at the Windows Hardware Engineering Conference (WinHEC) in December.
In May, Intel bought Replay Technologies, a company that creates 3D representations of events, most notably basketball games. Krzanich did not mention Replay by name, but touted the capability of recording events and then letting users view the action from quite literally anywhere within the represented space. The demo at IDF was of Game 7 of the NBA Finals between the Warriors and the Cavaliers; the point of view swung smoothly from several points in the stands to several points courtside, and the system is capable of putting the viewer on the floor. The demo showed how it is possible to provide not just a 360-degree view from any single point, but a 360-degree view from any point within a given volume.
That's not unprecedented, but it was an impressive demo, and one that requires a tremendous amount of processing power. Krzanich was happy to point that out in order to tout the company's upcoming Broadwell microprocessor, which incorporates ten processing cores and will be used for such applications. He said the device is now shipping to partners.
Last May, Intel and BMW said they were collaborating on autonomous cars. At IDF, Krzanich invited BMW Group Senior Vice President Elmar Frickenstein to talk about how the VR capabilities that Intel is helping to enable will be useful to that project.
Frickenstein explained that autonomous vehicles will need not just radar and lidar on board, but also sensors to build real-time 3D models of the environment, taking into account other vehicles, pedestrians and other objects. They will also require artificial intelligence and machine learning, and all of those capabilities will depend on connectivity for real-time processing, he said.
In another demo, Intel showed how game developers can use a VR headset when designing game environments with the Unreal Engine (a set of game development tools). Using the headset, developers can simply point and click at virtual objects to insert or delete them, or to change their properties (colors, textures). Light rendering, a tremendously compute-intensive process, was changed in seconds.
"This is something game developers do 30 to 50 times a day. Game development could be reduced by weeks, maybe months," Krzanich said.
At CES last January, Intel showed a drone equipped with its RealSense camera. At IDF, Krzanich said Intel is releasing two new developer kits. Project Aero is an all-in-one board that developers can use to build their own drones; it supports LTE communication and can be pre-ordered today for $399. The other is essentially a complete drone that developers can use to host applications. Krzanich said it is coming in either the fourth quarter of 2016 or the first quarter of 2017.
Intel's Euclid is a version of RealSense that integrates computing, motion sensors and communications in addition to RealSense cameras. It's a plug-and-play module for robotics applications. The original is roughly the size of a candy bar; Krzanich showed the next version, a thin card just a few inches in length.
— Brian Santo, Senior Editor, Components, T&M, Light Reading