Meta Releases ‘First Hand’ Demo to Showcase Quest Hand-tracking to Developers

Meta this week released a new demo app called First Hand to showcase the kind of experiences that developers can build with the company’s controllerless hand-tracking tools.

Controllerless hand-tracking has been available on Quest for years at this point, and while it's a more accessible input modality, controllers remain the primary form of input for the vast majority of games and apps on the headset.

Meta has been increasingly pushing for developers to embrace hand-tracking as more than a novelty, and to that end has been building tools to make it easier for developers to take advantage of the feature. But what’s better than a good hands-on example?

This week Meta released a new demo exclusively built around hand-tracking called First Hand (named in reference to an early Oculus demo app called First Contact). Although the demo is largely designed to showcase hand-tracking capabilities to developers, First Hand is available for anyone to download for free from App Lab.

Over at the Oculus developer blog, the team behind the app explains that it was built with the ‘Interaction SDK’, which is part of the company’s ‘Presence Platform’, a suite of tools made to help developers harness the mixed reality and hand-tracking capabilities of Quest. First Hand is also released as an open-source project, giving developers a way to look under the hood and borrow code and ideas for building their own hand-tracking apps.

The development team explained some of the thinking behind the app’s design:

First Hand showcases some of the Hands interactions that we’ve found to be the most magical, robust, and easy to learn but that are also applicable to many categories of content. Notably, we rely heavily on direct interactions. With the advanced direct touch heuristics that come out of the box with Interaction SDK (like touch limiting, which prevents your finger from accidentally traversing buttons), interacting with 2D UIs and buttons in VR feels really natural.

We also showcase several of the grab techniques offered by the SDK. There’s something visceral about directly interacting with the virtual world with your hands, but we’ve found that these interactions also need careful tuning to really work. In the app, you can experiment by interacting with a variety of object classes (small, large, constrained, two-handed) and even crush a rock by squeezing it hard enough.
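
Meta doesn't publish the internals of these heuristics, but the ‘touch limiting’ behavior described above can be sketched in a few lines: while the real fingertip is behind a button's face, the virtual fingertip is clamped to the surface so it can't pass through and trigger elements behind it. The plane-based button model and function names below are hypothetical illustrations, not Interaction SDK code:

```python
def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def limited_tip(real_tip, button_point, button_normal):
    """Clamp the virtual fingertip to a button's front face while the
    real fingertip is behind it. button_normal is a unit vector pointing
    out of the face; all names here are hypothetical, not SDK API."""
    d = dot(sub(real_tip, button_point), button_normal)  # signed distance
    if d >= 0.0:
        return real_tip  # in front of the face: track the real finger 1:1
    # behind the face: project the tip back onto the surface
    offset = (button_normal[0] * d, button_normal[1] * d, button_normal[2] * d)
    return sub(real_tip, offset)
```

A production version would also need a press depth, hysteresis, and per-finger state so buttons don't flicker between pressed and released, but the clamp is the piece that keeps a finger from traversing the UI.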
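
Likewise, the rock-crushing interaction hints at a common pattern: hand tracking exposes a continuous grab strength, and the app reacts once that value stays above a threshold for long enough. This is a speculative sketch of that pattern; the threshold and timing values are invented for illustration and are not First Hand's actual tuning:

```python
def update_crush(grab_strength, dt, stress, threshold=0.9, hold_time=0.25):
    """Accumulate squeeze time while grab_strength (0..1) stays above a
    threshold; report the rock as crushed once it has been squeezed hard
    enough for long enough. threshold and hold_time are invented values."""
    stress = stress + dt if grab_strength >= threshold else 0.0
    return stress, stress >= hold_time

# Per-frame usage: feed in the tracked grab strength and frame delta time.
stress = 0.0
for strength, dt in [(0.5, 0.016), (0.95, 0.016), (0.95, 0.3)]:
    stress, crushed = update_crush(strength, dt, stress)
print(crushed)  # True once the squeeze has been held long enough
```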

The team also shared 10 tips for developers looking to make use of the Interaction SDK in their Quest apps; check them out in the full post on the Oculus developer blog.
