Oculus Expanding Quest Mixed Reality Capabilities With Enhanced Developer Tools – Road to VR

Oculus plans to further open up the mixed reality capabilities of Quest with new tools that will allow developers to build apps which more intelligently integrate with the user’s real room. In the near future developers will also be permitted to distribute mixed reality apps to customers via the Quest store or Oculus App Lab for the first time.

Oculus first began unlocking Quest’s mixed reality capabilities earlier this year with the Passthrough API, which allowed developers to tap into the headset’s passthrough video view for the first time. Now the company is announcing a more advanced set of tools, called the Presence Platform, which will allow developers to build more sophisticated mixed reality applications.

The Presence Platform includes the Insight SDK, Interaction SDK, and Voice SDK.

Insight SDK

The main building block of the Insight SDK is the Passthrough feature, which developers previously had access to in experimental form. That feature is moving out of its experimental state and into general availability starting with the next developer update.

Additionally, the Insight SDK includes Spatial Anchors, which give developers the ability to place virtual objects in the scene and have them persist between sessions. For instance, a piano-learning app could allow you to mark the location of your piano, and the app could then remember where the piano is any time you open it.
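Conceptually, a spatial anchor is a real-world pose saved under a persistent ID that an app can reload in a later session. The following Python sketch illustrates that idea only; the `AnchorStore` class and its JSON persistence are assumptions for illustration, not the actual Oculus API:

```python
import json
import os
import tempfile
import uuid

class AnchorStore:
    """Illustrative stand-in for session-persistent spatial anchors."""

    def __init__(self, path):
        self.path = path
        self.anchors = {}
        # A later session reloads anchors saved by an earlier one.
        if os.path.exists(path):
            with open(path) as f:
                self.anchors = json.load(f)

    def create(self, label, position):
        """Save a labeled world-space position (e.g. the user's piano)."""
        anchor_id = str(uuid.uuid4())
        self.anchors[anchor_id] = {"label": label, "position": position}
        self._flush()
        return anchor_id

    def find(self, label):
        """Resolve previously saved anchors by label."""
        return [a for a in self.anchors.values() if a["label"] == label]

    def _flush(self):
        with open(self.path, "w") as f:
            json.dump(self.anchors, f)

# First session: mark the piano's location.
path = os.path.join(tempfile.gettempdir(), "demo_anchors.json")
if os.path.exists(path):
    os.remove(path)
store = AnchorStore(path)
store.create("piano", [1.2, 0.0, -0.5])

# Later session: the app reloads and finds the piano where it was left.
later = AnchorStore(path)
print(later.find("piano")[0]["position"])  # [1.2, 0.0, -0.5]
```

The key property is only the persistence contract: whatever is anchored in one session can be located again in the next.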

The Insight SDK further includes Scene Understanding, which Oculus says allows developers to build “scene-aware experiences that have rich interactions with the user’s environment.” This includes a geometric and semantic representation of the user’s space, meaning developers can see the shape of the room and get a useful idea of what’s in it. For instance, the Scene Understanding feature will allow developers to know which parts of the scene are walls, ceilings, floors, furniture, and so on, all of which can be used as surfaces on which virtual content can be naturally placed.

Oculus says the developer will see a “single, comprehensive, up-to-date representation of the physical world that is indexable and queryable.” You can think of this like the headset building a map of the space around you that developers can use as a guide upon which to build a virtual experience that understands your physical space.
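An “indexable and queryable” scene can be pictured as a list of labeled surfaces that an app filters by semantic type and size. This Python sketch is a hypothetical illustration of that pattern; the labels, fields, and `query` helper are assumptions, not the Insight SDK's actual interface:

```python
# Hypothetical scene model: each entity pairs a semantic label
# with simple geometry (orientation and surface area).
scene = [
    {"label": "wall",    "vertical": True,  "area_m2": 8.75},
    {"label": "floor",   "vertical": False, "area_m2": 12.0},
    {"label": "ceiling", "vertical": False, "area_m2": 12.0},
    {"label": "desk",    "vertical": False, "area_m2": 0.96},
    {"label": "wall",    "vertical": True,  "area_m2": 6.25},
]

def query(scene, labels=None, min_area=0.0):
    """Return scene entities matching the given semantic labels and size."""
    return [e for e in scene
            if (labels is None or e["label"] in labels)
            and e["area_m2"] >= min_area]

# Find horizontal surfaces big enough to place virtual content on.
placement = query(scene, labels={"floor", "desk"}, min_area=0.5)
print([e["label"] for e in placement])  # ['floor', 'desk']
```

The point is the shape of the workflow: the headset supplies a labeled map, and the app queries it rather than processing raw camera data itself.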

However, users will need to do some work on their end in order to generate this map for apps that need it, including marking their walls and tracing over their furniture.

Crucially, Oculus says that the Insight SDK will enable developers to build feature-rich mixed reality apps “without needing access to the raw images or videos from your Quest sensors.” We’ve reached out to the company to clarify whether Oculus itself will send raw sensor footage off the headset for any processing, or whether it will all happen on-device.

The Scene Understanding portion of the Insight SDK will launch in an experimental form early next year, according to the company.

Interaction SDK

Another part of the Presence Platform is the Interaction SDK which will give Unity developers a ready-made set of simple interactions for hands & controllers, like poking buttons, grabbing objects, targeting, and selecting. This saves developers time in building their own versions of these commonly used interactions in their apps.

Oculus says the goal of the Interaction SDK is to “offer standardized interaction patterns, and prevent regressions [in tracking performance of specific interactions] as the technology evolves,” and further says that the system will make it easier for developers to build their own interactions and gestures.
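One way to picture a “standardized interaction pattern” is a small state machine that any hand- or controller-driven pointer feeds: hover when targeting, select on pinch or trigger, and fall back on release. The state names and inputs in this Python sketch are illustrative assumptions, not the Interaction SDK's actual types:

```python
class GrabInteraction:
    """Minimal hover/select state machine for a grabbable object."""

    def __init__(self):
        self.state = "idle"

    def update(self, targeting, pinching):
        # Targeting without pinching: highlight the object.
        if self.state == "idle" and targeting:
            self.state = "hover"
        # Pinch (or trigger) while hovering: begin the grab.
        elif self.state == "hover" and pinching:
            self.state = "selected"
        # Releasing the pinch drops back to hover.
        elif self.state == "selected" and not pinching:
            self.state = "hover"
        # Looking away while hovering returns to idle.
        elif self.state == "hover" and not targeting:
            self.state = "idle"
        return self.state

grab = GrabInteraction()
print(grab.update(targeting=True, pinching=False))  # hover
print(grab.update(targeting=True, pinching=True))   # selected
print(grab.update(targeting=True, pinching=False))  # hover
```

Centralizing the pattern this way is what lets the platform tune it for tracking improvements without every app rewriting its own version.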

The company says that the Interaction SDK (and the previously announced Tracked Keyboard SDK) will become available early next year.

Voice SDK

The Voice SDK portion of the Presence Platform will open up voice-control to Quest developers, which Oculus says can drive both simple navigation functions (like quickly launching your favorite Beat Saber song with your voice) and gameplay (like casting a voice-activated spell).

The system is based on Facebook’s Wit.ai natural language platform which is free to use. Oculus says the Voice SDK will arrive in an experimental form in the next developer release.
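Natural-language platforms like Wit.ai typically resolve an utterance into an intent plus extracted entities, which the app then maps to an action. This toy keyword-based Python sketch shows only the shape of that flow; the intent names and matching logic are assumptions, and a real Voice SDK app would receive the intent from the service rather than parsing text itself:

```python
def resolve_intent(utterance):
    """Toy intent resolver: map an utterance to (intent, entities)."""
    text = utterance.lower()
    if text.startswith("play "):
        return ("play_song", {"song": utterance[5:]})
    if "cast" in text and "spell" in text:
        return ("cast_spell", {})
    return ("unknown", {})

# Each resolved intent dispatches to an app-defined handler.
HANDLERS = {
    "play_song":  lambda e: f"Launching: {e['song']}",
    "cast_spell": lambda e: "Spell cast!",
    "unknown":    lambda e: "Sorry, I didn't catch that.",
}

def handle(utterance):
    intent, entities = resolve_intent(utterance)
    return HANDLERS[intent](entities)

print(handle("Play Beat Saber favorites"))  # Launching: Beat Saber favorites
print(handle("Cast a fire spell"))          # Spell cast!
```

This mirrors the split the article describes: navigation commands and gameplay actions are just different handlers behind the same intent pipeline.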

Mixed Reality Apps on the Quest Store and App Lab

While not all of the Presence Platform SDKs will arrive at the same time, as of the next Quest developer release, devs will be allowed to ship mixed reality apps via the Quest store or App Lab. That release is expected next month.

The World Beyond Sample App

Oculus says that early next year it will release a sample project called The World Beyond, which developers can use as a starting point for building atop the Presence Platform features. The app will also be made available to users.
