The Design & Implementation of Oculus Quest Hand-tracking in Myst

Using Presence Platform’s upgraded Hand Tracking API, we introduced Hand Tracking with our most recent update to Myst on the Meta Quest Platform, titled ‘Hands & More’. We’re super excited to finally let folks play Myst on Quest without physical controllers! In this post, we’ll discuss the evolution and iteration of implementing hand tracking in Myst—and in particular, adding more support for it in Unreal Engine 4.27.2.

Guest Article by Hannah Gamiel

Hannah Gamiel is the Development Director at Cyan—the studio behind the original ‘Myst’ games—and helped develop the new ‘Myst (2020)’, which includes VR support. Originally coming from a purely technical background, she now helps lead production on all titles and manages business & tech efforts at Cyan. She has worked on titles such as ‘Myst’ (2020), ‘The Witness’, ‘Braid, Anniversary Edition’, ‘Obduction’, ‘Firmament’ (coming soon!), and more.

Design Phase & Considerations

Designing Navigation for Hand Tracking

Picture indicating where you’d like to go. You likely thought of pointing, right? That’s why we opted to use a ‘pointing’ method for movement in Myst.

When you’re in teleport mode, you can point to where you’d like to go and the teleport ring appears at your destination. When you ‘un-point’ (by extending the rest of your fingers, or simply pulling your pointer finger back into your palm), the teleport is executed.

When you’re in smooth movement mode, pointing with your free-movement-dominant hand (which can be configured in our controls settings, but is the Left hand by default) will begin smoothly moving you around in the direction you’re pointing.

When playtesting movement with pointing, we found that hand tracking can sometimes be unreliable for your index and middle fingers when they’re occluded by the rest of your hand. The system isn’t sure whether those fingers are fully pointing or fully ‘enclosed’ in your hand. We added a bit of a ‘fudge’ factor to the code to make movement initiation and execution more stable on this front—which we’ll go into a bit later when we discuss the changes we made to out-of-the-box Hand Tracking support in Unreal Engine.

Turning

The ‘point’ method doesn’t work for every kind of navigation, though. For turning, we initially combined pointing with wrist rotation: comparing the rotation of the player’s wrist to the camera’s forward vector indicated the direction of the turn (and how large the turn should be). We tried this first because it seemed intuitive to keep the ‘pointing’ theme consistent across all movement modes.

Complications arose in comfort tests, however. In playtesting, most players would point forward with their palm facing the ground, as one would when pointing at something outside of a game. Rotating your wrist to the left and right (around the up axis of your wrist) while your palm faces the ground is challenging and has a very limited range of motion, especially when trying to turn away from your chest.

The issue is the same if you ask a player to point at something in front of them with their palm facing inward. You can bend your wrist in toward your body quite a bit, but you won’t get the same range of motion bending your wrist away from your body.

So how did we solve this? We ended up assigning turning to a ‘thumbs-up’ gesture instead of a pointer-finger-pointing gesture.

Imagine giving a thumbs-up. Now turn your wrist left and right. Even though you don’t have a huge range of motion, you can point your thumb either ‘left’ or ‘right’ fairly consistently in this gesture.

This is what we settled on for turning in hand tracking mode. Although pointing with your thumb doesn’t seem like the most intuitive way to turn, it did end up being the most comfortable and consistent way of doing so.

With snap turning, rotating your wrist to the left or right from a thumbs-up position initiates a single snap turn. You then have to return your hand to the ‘center’ (straight-up) position to reset the snap, and wait out a very short cooldown before another snap turn can be initiated.

With smooth turning, turning your wrist while in a thumbs-up position will begin rotating you left or right—once you leave a ‘dead zone’ that prevents a turn from occurring until you pass the threshold.
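
To make that concrete, here’s a minimal sketch (in Unreal-style C++) of how thumbs-up turning with a dead zone and snap cooldown could be wired up. The GetThumbsUpYawOffset() helper, the member variables, and the threshold values are illustrative assumptions, not Myst’s actual code.

// Sketch: thumbs-up turning with a dead zone, in snap or smooth mode.
// GetThumbsUpYawOffset() is a hypothetical helper returning the signed yaw
// (in degrees) between the wrist's pointing direction and the camera forward.
static constexpr float TurnDeadZoneDegrees        = 15.f;  // ignore small wrist wobble
static constexpr float SnapTurnDegrees            = 30.f;
static constexpr float SnapCooldownSeconds        = 0.3f;
static constexpr float SmoothTurnDegreesPerSecond = 90.f;

void AHandTrackedCharacter::TickHandTurning(float DeltaSeconds)
{
    if (!bThumbsUpDetected)
    {
        bSnapArmed = true; // gesture ended, so re-arm the next snap turn
        return;
    }

    const float YawOffset = GetThumbsUpYawOffset(); // hypothetical helper

    if (bUseSnapTurning)
    {
        SnapCooldownRemaining = FMath::Max(0.f, SnapCooldownRemaining - DeltaSeconds);

        if (FMath::Abs(YawOffset) < TurnDeadZoneDegrees)
        {
            bSnapArmed = true; // wrist is back near center: allow another snap
        }
        else if (bSnapArmed && SnapCooldownRemaining <= 0.f)
        {
            AddActorWorldRotation(FRotator(0.f, FMath::Sign(YawOffset) * SnapTurnDegrees, 0.f));
            bSnapArmed = false;
            SnapCooldownRemaining = SnapCooldownSeconds;
        }
    }
    else if (FMath::Abs(YawOffset) > TurnDeadZoneDegrees)
    {
        // Smooth turning: rotate continuously once the wrist leaves the dead zone.
        AddActorWorldRotation(FRotator(0.f, FMath::Sign(YawOffset) * SmoothTurnDegreesPerSecond * DeltaSeconds, 0.f));
    }
}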

Handling Conflicts Between Movement & Object Interaction Poses

Of course, a pointed finger is too broad a gesture to assume it’s only ever used for navigation. People make the same pointing gesture to press buttons or interact with other things in the world, out of habit or simple expectation. It would be pretty jarring to walk up to (but not right up to) a button, point your finger to press it, and then suddenly (and unwantedly) move closer to it in-game, or trigger a teleport you never intended!

We prevent movement from occurring while the player may be interacting with something by blocking any movement code from firing when the hand making the ‘move’ gesture is within a certain range of an interactable object. This range was tweaked multiple times to find a good ‘sweet spot’ based on playtesting.

We’ve found that this sweet spot is around 25 cm from the world space location of the bone of the tip of the index finger. Myst is full of interactive objects of various sizes (everything from small buttons to very large levers) arrayed in both wide-open spaces and narrow hallways, so it took us quite a bit of testing to settle on this number. We initially tried 60 cm (about two feet), but that prevented movement from occurring when players still needed to get closer to an object. Likewise, anything below 25 cm caused undesired player movement to trigger when players were trying to grab or touch an object.

One of our best testing areas was the generator room on Myst Island, where you make your way through a narrow entryway and are then immediately greeted by a panel full of buttons. When the interaction testing area was too large, players were unable to move through the entry and toward the panel because it detected buttons within range of the index finger.

That said, 25 cm is what worked specifically for Myst. Other games may need to adjust this number if they want to implement something similar, with their own criteria in mind.
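
As a rough sketch of what that gating can look like in Unreal C++ (the overlap channel, class names, and helper names here are illustrative assumptions; only the 25 cm radius comes from our tuning):

// Sketch: suppress hand-tracking locomotion while the gesturing hand is close
// to an interactable. ECC_GameTraceChannel1 stands in for whatever collision
// channel your interactable objects actually use.
static constexpr float InteractionSuppressRadiusCm = 25.f;

bool AHandTrackedCharacter::IsHandNearInteractable(const FVector& IndexTipWorldLocation) const
{
    const FCollisionShape Sphere = FCollisionShape::MakeSphere(InteractionSuppressRadiusCm);
    return GetWorld()->OverlapAnyTestByChannel(
        IndexTipWorldLocation, FQuat::Identity, ECC_GameTraceChannel1, Sphere);
}

void AHandTrackedCharacter::TryBeginPointMovement(const FVector& IndexTipWorldLocation)
{
    // If the point gesture is probably aimed at a nearby button or lever,
    // don't start a teleport or smooth move from it.
    if (IsHandNearInteractable(IndexTipWorldLocation))
    {
        return;
    }
    BeginPointMovement(); // hypothetical: shows the teleport ring / starts smooth movement
}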

Designing Object Interactions for Hand Tracking

Right now, all grabbable interactions in Myst are built to work with hand tracking—turning valves, opening doors, pressing buttons, turning book pages, and so on.

The interactions piggy-back off what we had already set up for Myst with Touch controllers. There, pressing the grip button automatically blends the in-game mesh representation of your hand into a ‘grabbed’ pose, either putting your hand into a fist (if empty) or grabbing an object. With hand tracking, we’ve added code that will make a qualified guess as to when you have curled your fingers enough to ‘grab’ something and initiate the same logic as mentioned before.

For example, when you’re using hand tracking and your hand hovers over something that’s grabbable, your hand color turns orange (this is exactly what happens when you don’t use hand tracking in Myst VR as well). When you grab an interactable object by beginning to curl your fingers into a fist, an orange sphere replaces your hand mesh and represents where the hand is attached to the object.
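
A simplified version of that ‘qualified guess’ might look like the sketch below, which estimates how curled the fingers are from the poseable mesh’s bone rotations and feeds the result into the same grab path the Touch grip button uses. The bone names, the 90° normalization, and the 0.6/0.4 thresholds are illustrative assumptions rather than Myst’s exact values.

// Sketch: infer a 'grab' once the fingers are curled past a threshold, then
// reuse the grab logic that the Touch controller grip button already drives.
float GetAverageFingerCurl(UPoseableMeshComponent* Hand)
{
    // Pairs of (proximal, intermediate) finger bones; names are illustrative.
    static const TArray<TPair<FName, FName>> FingerBonePairs = {
        { TEXT("index_01"),  TEXT("index_02")  },
        { TEXT("middle_01"), TEXT("middle_02") },
        { TEXT("ring_01"),   TEXT("ring_02")   },
        { TEXT("pinky_01"),  TEXT("pinky_02")  },
    };

    float TotalCurl = 0.f;
    for (const auto& Pair : FingerBonePairs)
    {
        const FQuat Proximal = Hand->GetBoneTransformByName(Pair.Key,   EBoneSpaces::ComponentSpace).GetRotation();
        const FQuat Middle   = Hand->GetBoneTransformByName(Pair.Value, EBoneSpaces::ComponentSpace).GetRotation();

        // Map the bend between the two bones to roughly 0 (straight) .. 1 (fully curled).
        const float BendRadians = Proximal.AngularDistance(Middle);
        TotalCurl += FMath::Clamp(BendRadians / FMath::DegreesToRadians(90.f), 0.f, 1.f);
    }
    return TotalCurl / FingerBonePairs.Num();
}

void AHandTrackedCharacter::UpdateHandGrab(UPoseableMeshComponent* Hand)
{
    const float Curl = GetAverageFingerCurl(Hand);

    if (!bIsGrabbing && Curl > 0.6f)
    {
        OnGripPressed();   // same code path the Touch grip button triggers
    }
    else if (bIsGrabbing && Curl < 0.4f) // hysteresis so the grab doesn't flicker
    {
        OnGripReleased();
    }
}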

We went with this method, rather than making custom poseable meshes for your hands or letting your fingers appear to physically manipulate portions of these objects, because we wanted the interactions to be at parity with what we offer on the Touch controller side for now.

Pushing buttons works differently though. There’s no need for abstraction since buttons aren’t grabbable objects, and instead we allow you to simply push a button using generated capsule colliders between each of the finger joints on the poseable hand mesh. You can do all sorts of weird and fun things because of this—like using only your pinky or the knuckle of your ring finger to interact with every button in the game, if you really want to.

This implementation differs slightly from how Touch controllers interact with buttons in the game: there, we usually expect players to hold the grip button to pose the hand into a ‘finger pointing’ mesh in order to get an accurate in-game button press. With hand tracking, there’s obviously significantly more flexibility in the pose you can create with your hand, and therefore significantly more ways to press buttons with the same level of accuracy.
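
Here’s a rough sketch of how those per-segment colliders could be generated on the poseable hand mesh; the bone names, capsule radius, and collision profile are illustrative assumptions. Overlap events from these capsules would then feed the same button-press logic the posed Touch-controller hand uses.

// Sketch: spawn a capsule collider along each finger segment so any part of
// any finger can press a button.
void AHandTrackedPawn::CreateFingerColliders(UPoseableMeshComponent* Hand)
{
    // (proximal bone, next bone) pairs; only a few segments shown here.
    static const TArray<TPair<FName, FName>> BoneSegments = {
        { TEXT("index_01"), TEXT("index_02") },
        { TEXT("index_02"), TEXT("index_03") },
        { TEXT("thumb_02"), TEXT("thumb_03") },
    };

    for (const auto& Segment : BoneSegments)
    {
        const FVector Start = Hand->GetBoneLocationByName(Segment.Key,   EBoneSpaces::WorldSpace);
        const FVector End   = Hand->GetBoneLocationByName(Segment.Value, EBoneSpaces::WorldSpace);

        UCapsuleComponent* Capsule = NewObject<UCapsuleComponent>(this);
        Capsule->SetCapsuleSize(0.8f, (End - Start).Size() * 0.5f); // ~8 mm finger radius
        Capsule->SetCollisionProfileName(TEXT("OverlapAllDynamic"));
        Capsule->SetGenerateOverlapEvents(true);
        Capsule->RegisterComponent();

        // Attach to the proximal bone so the capsule follows the finger as it animates.
        Capsule->AttachToComponent(Hand, FAttachmentTransformRules::KeepWorldTransform, Segment.Key);
        Capsule->SetWorldLocation((Start + End) * 0.5f);
        Capsule->SetWorldRotation(FRotationMatrix::MakeFromZ(End - Start).Rotator());
    }
}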

Menu/UI Interactions

For interacting with menus, we ended up going with the same interaction paradigm that Meta uses for the Quest Platform: a two-finger pinch between thumb and index finger, on either hand. This can be used both to open our in-game menu and interact with elements in the menu. No sense in reinventing the wheel here when players are already taught to do this in the OS-level menus when they first enable hand tracking on Quest!
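
If you’re rolling this yourself, a pinch can be approximated from the distance between the thumb and index fingertip bones, as in the sketch below. The bone names and the 2.5 cm threshold are illustrative assumptions; Meta’s plugin also reports its own pinch state.

// Sketch: treat a thumb/index pinch as a 'menu click' by measuring the
// distance between the two fingertip bones on the poseable hand mesh.
bool IsIndexPinching(UPoseableMeshComponent* Hand)
{
    const FVector ThumbTip = Hand->GetBoneLocationByName(TEXT("thumb_03"), EBoneSpaces::WorldSpace);
    const FVector IndexTip = Hand->GetBoneLocationByName(TEXT("index_03"), EBoneSpaces::WorldSpace);
    return FVector::Dist(ThumbTip, IndexTip) < 2.5f; // world units are centimeters
}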

Communicating All of This to the Player

Because hand tracking isn’t as common an input on Quest as Touch controllers, and because there may be some people playing Myst for the very first time (or even playing their very first VR game!), we tried to be considerate with how we communicate all of this information about hand tracking to the player. We made sure to include another version of our “controller diagram” specifically tailored to describe hand tracking interactions (when enabled in Myst), and we show the player specialized notifications that tell them exactly how to move around with their hands.

Additionally, we thought it would be vital to remind the player how to have a smooth hand tracking experience, once enabled. The player is notified in Myst’s menu that hand tracking stability is much better if they ensure they’re in a well-lit room and keep their hands within their field of view.

Meta also informs players that these conditions are key to reliable hand tracking, but we recognize that some players might jump into a game without having parsed Meta’s notices about this first, so we’ve chosen to remind folks in case they forgot.

Engine Modifications Made in Unreal

This is where it’s about to get a bit more technical… buckle up!

We’re currently using Unreal Engine 4.27.2 for Myst. Meta has some base code for hand tracking in Unreal—but not enough for a game like ours, which requires better gesture and confidence detection. Much of Meta’s gesture detection and physicality libraries for hand tracking are only available in Unity at this time, so we needed to do some groundwork on that front to get simple gestures like our ‘thumbs-up’ and ‘finger pointing’ gestures recognized in Unreal.

Additionally, there are some other elements folks will need to implement themselves for a shippable hand tracking project, which I’ll detail below.

System Gesture Spam Fix

The Oculus left-hand system gesture (the one that opens the menu) triggers as soon as you begin a pinch, rather than waiting to confirm that the pinch has been held in the same state for a period of time. We fixed this by changing the event in the Oculus Input library to wait for the pinch to complete (i.e. for the system gesture to fill in its confirmation circle) before firing a notify event, instead of firing while the pinch is still in progress.
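
Outside of the plugin, the same idea looks roughly like this: only fire the menu event once the left-hand pinch has been held for a confirmation window. The 0.6-second window and the IsLeftSystemPinchActive() helper are illustrative assumptions.

// Sketch: debounce the left-hand system gesture so the menu only opens after
// the pinch has been held continuously, not the instant a pinch begins.
void AHandTrackedPawn::TickSystemGesture(float DeltaSeconds)
{
    if (IsLeftSystemPinchActive()) // hypothetical query of the pinch state
    {
        PinchHeldSeconds += DeltaSeconds;
        if (!bMenuGestureFired && PinchHeldSeconds >= 0.6f)
        {
            OnMenuGesture();       // open the in-game menu exactly once
            bMenuGestureFired = true;
        }
    }
    else
    {
        PinchHeldSeconds  = 0.f;
        bMenuGestureFired = false;
    }
}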

Multi-Platform Build Stability

The Oculus Hand Component being serialized with any Blueprint will cause builds for other platforms (such as Xbox) to break during the nativization step. This is because the Oculus Hand Component is a part of the OculusVR plugin, which is only enabled for Windows and Android and therefore can’t have any of its components referenced in Blueprints when other platforms are built.

Nativization isn’t officially supported in Unreal Engine 5, but for folks in Unreal Engine 4, it may still be beneficial to keep it enabled depending on your project’s needs. Therefore, it isn’t feasible to include a hand component at the Blueprint level for games that are packaged for all platforms.

Our solution is to spawn and destroy the Oculus Hand Component in C++ on our Touch controllers whenever hand tracking is detected as enabled or disabled, and to compile this functionality only into Android builds for Quest. The Hand Component source and all of our hand tracking code are excluded from every other platform.
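
In sketch form, the guarded spawn/destroy looks something like this (class and member names follow the OculusVR plugin as we recall them; treat the details as illustrative):

// Sketch: create and destroy the Oculus Hand Components from C++ only in
// Android (Quest) builds, so no other platform ever references the plugin type.
void AHandTrackedPawn::UpdateHandComponents(bool bHandTrackingEnabled)
{
#if PLATFORM_ANDROID
    if (bHandTrackingEnabled && LeftHand == nullptr)
    {
        LeftHand  = NewObject<UOculusHandComponent>(this, TEXT("LeftHand"));
        RightHand = NewObject<UOculusHandComponent>(this, TEXT("RightHand"));
        LeftHand->SkeletonType  = EOculusHandType::HandLeft;
        RightHand->SkeletonType = EOculusHandType::HandRight;
        LeftHand->SetupAttachment(GetRootComponent());
        RightHand->SetupAttachment(GetRootComponent());
        LeftHand->RegisterComponent();
        RightHand->RegisterComponent();
    }
    else if (!bHandTrackingEnabled && LeftHand != nullptr)
    {
        LeftHand->DestroyComponent();
        RightHand->DestroyComponent();
        LeftHand = RightHand = nullptr;
    }
#endif // PLATFORM_ANDROID
}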

Unfortunately, this means that if you have a Blueprints-only project that targets multiple platforms and uses nativization in Unreal Engine 4, and you’re considering implementing hand tracking for Quest, you may have to convert it to a source project in order to avoid nativization issues when building for platforms other than Quest.

Custom Whole-Hand Gesture Recognition

There’s no meaningful whole-hand-based gesture recognition built into the Oculus input library (other than finger pinches for system gestures) in Unreal. This means that if you make a thumbs-up gesture or a finger-pointing gesture that requires all other fingers to be tucked in, there isn’t anything built into Unreal that notifies you of that specific gesture happening.

Our solution for this was to implement our own bone rotation detection in the Oculus Hand Component, with adjustable tolerances, to infer when:

  • A finger point (with the index finger) is occurring
  • A grab is occurring
  • A thumbs-up gesture is occurring

All of them get fired off as input events that we can bind to in C++, which is where we house most of our base player controller, character, and Touch controller code.
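
A trimmed-down sketch of that classification step is below. The per-finger curl helper, the tolerance members, and the delegate names are illustrative; the real component also applies per-finger thresholds and confidence handling, which we cover next.

// Sketch: classify whole-hand gestures from per-finger curl each frame and
// broadcast them as events that gameplay code can bind to in C++.
// GetFingerCurl() returns 0 for a straight finger and 1 for a fully curled one.
DECLARE_MULTICAST_DELEGATE(FOnHandGesture);

void UGestureTrackingComponent::ClassifyGesture()
{
    const float ThumbCurl  = GetFingerCurl(EHandFinger::Thumb);
    const float IndexCurl  = GetFingerCurl(EHandFinger::Index);
    const float MiddleCurl = GetFingerCurl(EHandFinger::Middle);
    const float RingCurl   = GetFingerCurl(EHandFinger::Ring);
    const float PinkyCurl  = GetFingerCurl(EHandFinger::Pinky);

    // CurlTolerance and ExtendTolerance are adjustable UPROPERTY members.
    const bool bOthersCurled =
        MiddleCurl > CurlTolerance && RingCurl > CurlTolerance && PinkyCurl > CurlTolerance;

    if (ThumbCurl < ExtendTolerance && IndexCurl > CurlTolerance && bOthersCurled)
    {
        OnThumbsUp.Broadcast();     // drives snap/smooth turning
    }
    else if (IndexCurl < ExtendTolerance && ThumbCurl > CurlTolerance && bOthersCurled)
    {
        OnFingerPoint.Broadcast();  // drives teleport / smooth movement
    }
    else if (IndexCurl > CurlTolerance && ThumbCurl > CurlTolerance && bOthersCurled)
    {
        OnGrab.Broadcast();         // drives the grab path shared with Touch
    }
}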

Gesture & Tracking Stability Adjustments

When implementing and testing Hand Tracking support for Myst in Unreal, we noted some quirks with the tracking stability for certain fingers when they’re occluded by the rest of your hand. For example, if you’re:

  • Grabbing something with all of your fingers facing away from you
  • Pointing your index finger directly away from you

In the case of grabbing something with all of your fingers facing away from you, we noted that hand tracking may occasionally decide that your pinky finger isn’t enclosed in your fist, as if it had been relaxed slightly. In fact, tracking accuracy of all fingers in a closed fist, with the fingers occluded by the back of your hand, isn’t particularly high, even when the system doesn’t consider tracking confidence to be low. This is a problem when we expect the player to grab onto things like valves or levers and keep turning or moving them without letting go too quickly—the hand tracking system might decide you’re no longer grabbing the object because it thinks you relaxed your fingers.

In the case of pointing your index finger directly away from you, the hand tracking system would sometimes consider your middle finger to be more relaxed than it actually was, or even to be extended and pointing alongside your index finger. This was an issue for our navigation and movement systems: if the system no longer thinks you’re pointing with just your index finger, it might instead assume you’re trying to grab something and stop you from moving, or execute a teleport you weren’t ready to initiate.

Our solution for both scenarios was to add individual finger thresholds for how much we allow the problematic fingers to relax before we consider a hand ‘not grabbing’ or ‘not pointing’. More often than not, the tracking system thought fingers were more relaxed than they actually were, rather than the other way around. We built these thresholds right into the place where we decide to notify the user of the gestures the hand is making: the Oculus Hand Component itself.
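
A sketch of those per-finger thresholds, applied to the grab case, might look like this (the numbers are illustrative; only the general shape reflects what we did):

// Sketch: per-finger 'relax' limits layered on top of the basic grab check.
// Fingers the tracker tends to misread while occluded (especially the pinky
// in a closed fist) get a more forgiving limit before the grab is released.
bool UGestureTrackingComponent::IsStillGrabbing()
{
    // Maximum allowed 'relaxation' per finger, where 0 = fully curled and
    // 1 = fully straight. Values here are illustrative, not Myst's.
    const float MaxRelax[] = {
        /* Index  */ 0.35f,
        /* Middle */ 0.35f,
        /* Ring   */ 0.45f,
        /* Pinky  */ 0.60f,
    };

    const float Curl[] = {
        GetFingerCurl(EHandFinger::Index),
        GetFingerCurl(EHandFinger::Middle),
        GetFingerCurl(EHandFinger::Ring),
        GetFingerCurl(EHandFinger::Pinky),
    };

    for (int32 i = 0; i < 4; ++i)
    {
        // (1 - curl) is how relaxed the finger currently looks to the tracker.
        if ((1.f - Curl[i]) > MaxRelax[i])
        {
            return false; // this finger has relaxed too far: stop grabbing
        }
    }
    return true;
}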

Other Handy Utilities for Oculus Hand Component

We made plenty of modifications to the Oculus Hand Component source for our own custom gesture recognition, but we also added some utility functions to it. One of them finds the closest point on the hand’s collision to some other point in world space, and also returns the name of the closest bone. We used this function for a variety of input verifications across different interactions.
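
The real utility works against the hand’s collision, but a simplified bone-location approximation of it looks like this (names are illustrative):

// Sketch: find the bone on the hand closest to a world-space point, returning
// its location and name. Myst's version queries the hand's collision shapes;
// this simplified form just compares bone positions.
FVector GetClosestPointOnHand(UPoseableMeshComponent* Hand, const FVector& WorldPoint, FName& OutClosestBone)
{
    float   BestDistSq = TNumericLimits<float>::Max();
    FVector BestPoint  = FVector::ZeroVector;
    OutClosestBone     = NAME_None;

    for (int32 BoneIndex = 0; BoneIndex < Hand->GetNumBones(); ++BoneIndex)
    {
        const FName   BoneName = Hand->GetBoneName(BoneIndex);
        const FVector BoneLoc  = Hand->GetBoneLocationByName(BoneName, EBoneSpaces::WorldSpace);

        const float DistSq = FVector::DistSquared(WorldPoint, BoneLoc);
        if (DistSq < BestDistSq)
        {
            BestDistSq     = DistSq;
            BestPoint      = BoneLoc;
            OutClosestBone = BoneName;
        }
    }
    return BestPoint;
}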

For what it’s worth, we found that tracking reliability at the wrist bone was the most consistent regardless of hand depth, so we ran these tests against that bone location more often than any other.

Closing Thoughts

Hand tracking can be a really powerful, accessible addition to your game. For Myst, it took some work, but we worked ‘smart’ in that we tried to tie it into our existing input systems so we wouldn’t need to make too many overarching changes to the game as a whole. Our goal was to create a user experience that was intuitive and comfortable, without building an entirely separate input system on the back end.

Meta’s branch of Unreal Engine comes with hand tracking support out of the box and can definitely be used by a team capable of making engine changes. That said, it needs some modifications to recognize useful whole-hand gestures. We’re really looking forward to seeing Meta’s hand tracking support in Unreal reach parity with what they offer in Unity. In the meantime, teams comfortable working with source-based projects in Unreal should find they have enough flexibility to get hand tracking to fit their project.

– – — – –

We’re open to hearing what folks think about our process—and learning if there’s any interest in us providing a system like this in Unreal for folks to use and build upon. You can contact us at [email protected] for more information, or contact me (Hannah) on Twitter @hannahgamiel. Thanks for reading!