Microsoft's draft OpenXR extension has support for requesting and viewing the "feature level" for the hand tracker. This basically exposes whether the hand tracking is tracking all joints, only some joints (potentially due to occlusion), or is just "guessing" joint locations based on button state.
We could potentially allow vendors to expose simulated hands if they're duly identified in the API.
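For reference, this is roughly how partial tracking already surfaces in the WebXR Hand Input API: per-joint poses simply come back null when a joint can't be tracked, so a page can count tracked joints but has no way to tell occlusion apart from controller-based emulation. A minimal sketch, assuming the standard `XRHand`/`getJointPose` API (type declarations from e.g. `@types/webxr`):

```ts
// Sketch: count how many joints of a hand are actually tracked this frame.
// Today a joint that cannot be tracked just yields a null pose, so a page
// cannot distinguish occlusion from controller-derived "guessed" joints.
function countTrackedJoints(
  frame: XRFrame,
  hand: XRHand,
  referenceSpace: XRReferenceSpace
): { tracked: number; total: number } {
  let tracked = 0;
  let total = 0;
  for (const jointSpace of hand.values()) {
    total++;
    // getJointPose returns null for untracked joints; the optional call
    // also covers UAs that don't implement hand input at all.
    if (frame.getJointPose?.(jointSpace, referenceSpace)) {
      tracked++;
    }
  }
  return { tracked, total };
}
```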
As pointed out by @toji, we may want to also expose simulated hands without requiring user consent.
It might be worth making this a feature-level distinction: you can either request full hand tracking (and have it denied when unavailable), or request simulated hand tracking (which doesn't require consent, but may not be supported). Requesting both gives you the most powerful tier allowed by the consent prompts, and applications that only need derived hand tracking can just request simulated hand tracking; see the sketch below.
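A hypothetical sketch of what that dual request could look like at session creation. `'hand-tracking'` is the existing feature descriptor; `'simulated-hand-tracking'` is a made-up placeholder name for the proposed consent-free tier:

```ts
// Hypothetical: 'simulated-hand-tracking' is a placeholder descriptor for
// the proposed consent-free tier; only 'hand-tracking' is specced today.
async function requestBestHands(): Promise<XRSession> {
  const session = await navigator.xr!.requestSession('immersive-vr', {
    optionalFeatures: ['hand-tracking', 'simulated-hand-tracking'],
  });
  // Under this proposal the UA would grant the most faithful tier the user
  // consents to; the page could then inspect what it actually got, e.g. via
  // the (newer) session.enabledFeatures attribute.
  console.log(session.enabledFeatures);
  return session;
}
```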
In OpenXR you need to specify how faithful you want the hand tracking to be before you create the hand tracker, and once it's created you can see which level was actually picked. Our permissions API already enables us to do this.
This could be either a faithful-hand-tracking vs. simulated-hand-tracking feature distinction, or a new non-string feature type: {hand_tracking: "faithful"}.
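Sketching the second option; the dictionary-valued feature entry is the hypothetical part (the spec types feature descriptors as `any` in XRSessionInit, so this would parse, but nothing implements it today):

```ts
// Hypothetical: a non-string feature descriptor carrying the requested
// fidelity, per the {hand_tracking: "faithful"} suggestion above.
async function requestFaithfulHands(): Promise<XRSession> {
  return navigator.xr!.requestSession('immersive-vr', {
    // Not implemented anywhere: illustrates the shape of a non-string
    // feature descriptor that requests a fidelity level up front,
    // OpenXR-style, and fails session creation if it can't be granted.
    requiredFeatures: [{ hand_tracking: 'faithful' }],
  });
}
```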