A Brief Look at eXtended Reality Interaction Methods

By Ayush Bhargava

Recent advancements in the field of eXtended Reality (XR) have given rise to a new era of interaction devices and techniques. Exploration and experimentation with XR interaction methods have yielded some very interesting ways to interact in immersive environments, all with the goal of making the user experience (UX) more immersive. Examples include tracking gaze with eye-tracking technology, using neural interfaces to detect gestures, camera-based hand tracking, and, of course, handheld controllers.

Although eye tracking and neural interfaces might require the least physical effort and seem like the ideal choice, both are limited in their abilities and often require a good bit of training to master. Camera-based hand tracking and handheld controllers, on the other hand, are far more natural and intuitive, because they resemble using our hands or handheld tools to do things in the real world, such as grasping a cup of coffee or using a spatula to scramble eggs. In short, hand tracking and controllers let us interact with the virtual environment in a way that is familiar to us. In this article we are going to look at some of the most common devices and methods for 3D interaction in virtual reality (VR) and augmented reality (AR) that rely on cameras and handheld controllers.

Handheld Controllers

To start off, let’s talk about the most common form of XR interaction in virtual worlds: handheld controllers. Since the early days of gaming, we have used gamepads to control events in virtual worlds, and it is only natural to carry that over to immersive technology. With massive improvements in tracking technology over the last decade, we saw the rise of handheld controllers built for VR. Some of the most commonly used controllers include the Wand controllers by HTC, the Touch controllers by Oculus, the PlayStation Move controllers, and the Valve Index controllers. All four provide positional and rotational tracking, several buttons for triggering events and interacting with objects in the virtual world, and some level of haptic feedback. Together these create a compelling sense of presence and agency, positively impacting the immersiveness of the experience. The Valve Index and Oculus Touch controllers also offer some degree of finger tracking, which further aids natural interaction.


Popular handheld controllers for immersive interactions

Although handheld controllers feel familiar after years of gamepad use, they remain largely confined to VR and often require some training. Using them can also become complex and increase a user’s cognitive load, depending on the interactions being designed, how the controller is represented in VR (for example, seeing an animated virtual hand instead of the controller itself), and the user’s past experience with unfamiliar devices.

Hand Tracking

The next set of interaction techniques that facilitate natural interaction in virtual worlds uses cameras to track hand movements. This is arguably more natural, as it is the modality we are most used to in everyday life. The technology has been on the rise lately and usually involves specialized hardware, such as depth and RGB cameras, capturing images of your hands as you move them. These images are used to recognize how your finger joints and palm are moving so that those movements can be mirrored in the virtual setting in real time.

The level of movement tracked depends on the task a user needs to accomplish in VR or AR. For example, many AR applications only need to track finger position and pinching gestures, as they mostly require selection operations. VR applications, on the other hand, employ more complex interactions and often combine simple gestures with meticulous tracking of each finger joint.


Simple pinch selection in Augmented Reality

Source: Crunchfish
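At its core, a pinch-selection check like the one pictured above can be very simple: once a tracker reports 3D fingertip positions, the application just measures the distance between the thumb and index tips. The sketch below is a minimal illustration of that idea, not the API of any real tracker; the landmark coordinates, threshold value, and function names are all assumptions for the example.

```python
import math

# Hypothetical landmark input: (x, y, z) fingertip positions in meters,
# as a camera-based hand tracker might report them each frame.
PINCH_THRESHOLD_M = 0.02  # treat tips closer than ~2 cm as a pinch (assumed value)

def distance(a, b):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

def is_pinching(thumb_tip, index_tip, threshold=PINCH_THRESHOLD_M):
    """Consider the hand 'pinching' when the fingertips nearly touch."""
    return distance(thumb_tip, index_tip) < threshold

# Fingers apart -> no selection; fingers together -> selection.
print(is_pinching((0.00, 0.00, 0.30), (0.08, 0.02, 0.31)))  # False
print(is_pinching((0.00, 0.00, 0.30), (0.01, 0.00, 0.30)))  # True
```

A real application would typically add debouncing (requiring the pinch to hold for a few frames) so that tracking jitter does not trigger accidental selections.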

One of the first devices to track hands for interaction in VR was the Leap Motion sensor. It used an array of cameras to capture multiple images of your hands and applied image recognition to identify poses and gestures. The device was extremely versatile, and the company has since used this technology to create more complex devices. A major factor in hand tracking becoming increasingly popular is the adoption of camera-based inside-out tracking in several AR and VR headsets, such as the Microsoft HoloLens and Oculus Quest. With an array of onboard cameras, these headsets can track hand gestures and movements in real time, much like the Leap Motion sensor. However, one of the biggest drawbacks of this technology is that tracking is lost the moment your hands leave the cameras' field of view, which can be frustrating if the experience requires a lot of hand movement. The other limitation is the complete lack of haptics, which can be jarring for complex interactions; some experiences, such as surgical training simulations, rely on proprioceptive and kinesthetic information to be effective.
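One common way applications soften the field-of-view problem is to coast on the last known hand pose for a brief grace period before hiding the virtual hand, so a momentary dropout does not make the hand vanish and reappear abruptly. The sketch below shows that pattern under assumed names; the pose format, grace period, and class are illustrative, not taken from any particular SDK.

```python
GRACE_PERIOD_S = 0.25  # how long to coast on the last known pose (assumed value)

class HandPoseFilter:
    """Smooths over brief tracking dropouts from a camera-based hand tracker."""

    def __init__(self):
        self.last_pose = None
        self.last_seen = None

    def update(self, pose, timestamp):
        """pose is the tracker's report for this frame, or None if tracking was lost."""
        if pose is not None:
            self.last_pose = pose
            self.last_seen = timestamp
            return pose
        # Tracking lost: reuse the last pose briefly, then report no hand at all.
        if self.last_seen is not None and timestamp - self.last_seen < GRACE_PERIOD_S:
            return self.last_pose
        return None

f = HandPoseFilter()
print(f.update({"palm": (0, 0, 0.4)}, 0.00))  # tracked: returns the fresh pose
print(f.update(None, 0.10))                   # lost, within grace: returns last pose
print(f.update(None, 0.50))                   # grace expired: returns None
```

The grace period is a trade-off: too short and the hand still flickers, too long and a stale pose lingers visibly after the real hand has moved away.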

Hand tracking in Virtual Reality

Source: Oculus

Hand tracking in Augmented Reality

Source: Microsoft

Experimental Devices

We are also seeing a rise in more customized devices that aim to afford more natural interaction by providing proprioceptive and kinesthetic information in the form of directed force feedback, haptic sensations from multiple sources, and physical shapes that closely resemble the virtual subject of the interaction. Most of these devices are not commercially available yet and are often byproducts of research explorations. Good examples include the Haptic Revolver, which aims to provide haptic feedback for touching surfaces in VR, and a 3D-printed fishing rod for VR. Such devices are often created by combining different technologies, such as lighthouse tracking, 3D printing, actuators, and smart tangibles, and the results are often impressive and highly immersive. Big tech companies have also invested in research on such devices; one recent byproduct of this interest is the haptic gloves presented at CES 2021.

I think it is safe to say that the future of interaction in virtual worlds is going to be very exciting, and I for one can’t wait to try experiences that combine these interaction techniques with other immersive technologies like haptic backpacks and olfactory devices. It might take a hot minute for such devices to become easily accessible, but believe me, it’ll be worth it!
