Thursday, January 5, 2017

State of the Art in Positional Tracking and Haptic Feedback for Consumer VR

Positional Tracking


A truly immersive and interactive VR experience requires full positional tracking. Rotational tracking, i.e., detecting how the user turns their head, is already a standard feature. Additionally, some systems, such as the HTC Vive, also offer full positional tracking of the user's head (i.e., the headset) and hands (i.e., the controllers). However, precise and low-latency positional tracking of the user's other body parts, such as the fingers, is not yet provided by commercially available VR systems.

Perhaps the most advanced commercially available system is HTC Vive's Lighthouse positional tracking platform. Lighthouse uses base stations, small boxes placed around a room, that sweep the room with non-visible light. Photosensors on the tracked devices (the headset and the controllers) detect these sweeps and use their timing to work out where the devices are in relation to the boxes. Using multiple boxes lets the system determine where the devices are in 3D space. The limitation of Lighthouse is that it tracks only the controllers and the headset – tracking of the user's fingers and of body parts other than the hands and the head is missing.
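The core geometric idea, timing a light sweep and converting it to an angle, can be sketched in a few lines. This is a minimal sketch assuming a single rotor and the commonly cited 60 Hz sweep rate; the function and variable names are mine, not Valve's API:

```python
import math

SWEEP_HZ = 60.0  # each Lighthouse rotor completes a full sweep 60 times per second

def sweep_angle(t_sync, t_hit):
    """Convert the delay between the base station's sync flash and the
    moment the laser sweep hits a photosensor into an angle in radians.
    The rotor turns at a known constant rate, so elapsed time maps
    linearly to rotation angle."""
    return 2.0 * math.pi * SWEEP_HZ * (t_hit - t_sync)

# Example: a sensor hit 2.5 ms after the sync flash sits at about 54 degrees
angle = sweep_angle(0.0, 0.0025)
```

In the real system each base station alternates a horizontal and a vertical sweep, so every photosensor yields two angles per station; combining the angles from many sensors with the known sensor layout on the device gives the full pose.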

PlayStation VR uses a single camera to track the movements of the PS Move controllers and the headset via visible light [JSG]. Due to its single camera, PlayStation VR does not allow the user to roam around the physical room in order to move within the virtual world (something the Vive can do with its base stations).

Oculus Rift comes with a single Constellation tracking camera, which uses infrared light for positional tracking. Although the system is primarily meant for seated and standing experiences, it can support room-scale experiences if more tracking cameras are added (Oculus recommends at least three Constellation cameras). However, the tracking is not as versatile as that of the Vive [DT]. The Oculus Touch system, which is sold separately from the headset, adds a second Constellation camera and two controllers that, like the Rift headset, are equipped with infrared LEDs [TV]. One interesting feature of the controllers is that they can detect whether the user's fingers are resting on the controller's surface. This means that games can tell the difference between a lightly balled fist, a pointing finger, and a thumbs-up gesture [AT-1].
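To illustrate the kind of inference this touch sensing enables, here is a toy gesture classifier. The sensor names and the gesture mapping are illustrative assumptions, not the actual Oculus SDK:

```python
def classify_hand_pose(thumb_on_rest, index_on_trigger):
    """Toy classifier in the spirit of Touch's capacitive sensing:
    each sensor only reports whether a finger is resting on it, and
    the combination maps to a coarse hand pose. Sensor names and the
    mapping are illustrative, not the Oculus SDK."""
    if thumb_on_rest and index_on_trigger:
        return "fist"
    if thumb_on_rest:
        return "point"       # index finger lifted off the trigger
    if index_on_trigger:
        return "thumbs-up"   # thumb lifted off the rest
    return "open hand"

print(classify_hand_pose(True, False))  # point
```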

Haptic Feedback


Oculus Touch can also provide haptic feedback through vibration [Oculus]. Vibration is a good start, since truly immersive VR experiences must engage human perceptual senses beyond vision and hearing [SA]. Vibration is also the current state of the art in haptic feedback for consumer VR. Other types of tactile feedback that are still needed include the feeling of pressure, touch, and texture. Besides tactile feedback, kinesthetic feedback is also needed, including the feeling of the size and weight of objects and of their position relative to the hand, arm, joints, and body.

Microsoft Research has developed two experimental fully tracked controllers called NormalTouch and TextureTouch [RV2]. These controllers can provide both tactile and kinesthetic feedback [MS]. NormalTouch uses three servo motors to operate a small disc with tilt and extrusion movements. TextureTouch uses 16 servos to operate a 4x4 array of small blocks that move up and down to match virtual shapes and structures.
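The TextureTouch idea of rendering shape through a pin grid can be sketched as follows: sample a virtual surface under the controller on a 4x4 grid and quantize each sample to a servo step. The cell spacing and servo step count below are made-up values, not the actual device parameters:

```python
def pin_heights(surface, x0, y0, cell=0.01, max_step=8):
    """Sample a virtual height field on a 4x4 grid under the controller
    and quantize each sample to a servo step, in the spirit of
    TextureTouch's pin array. `surface` is any f(x, y) -> height in
    [0, 1]; cell size and step count are illustrative guesses."""
    grid = []
    for i in range(4):
        row = []
        for j in range(4):
            h = surface(x0 + j * cell, y0 + i * cell)
            row.append(round(max(0.0, min(1.0, h)) * max_step))
        grid.append(row)
    return grid

# A simple ramp surface: height grows along x, so each row of pins
# rises from left to right
ramp = lambda x, y: x * 10
grid = pin_heights(ramp, 0.0, 0.0)
```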

Another example of advancements in haptic feedback is the Dexmo robotic exoskeleton glove from Dexta Robotics [RV3]. The glove can provide force feedback to simulate the act of touching objects in virtual reality. Yet another interesting approach comes from the Tokyo-based company H2L, which has developed an advanced haptic feedback armband that can target the muscles in the user's arm that control each finger and deliver precise responses that mirror on-screen actions [TC].

Tesla Studios is a startup developing a full-body haptic feedback suit called the Teslasuit, which uses neuromuscular electrical stimulation (i.e., mild localized electric shocks) to trick the senses [EG]. Tesla Studios claims that besides basic interactions with objects in the virtual world, the suit can also mimic, for example, the impact of bullets and explosions (which, by the way, does not sound entirely pleasant).

Ultrahaptics is a startup that uses an array of ultrasound emitters and clever algorithms to enable VR users to feel and manipulate virtual objects in mid-air [BB]. The limitation of the technology is that while ultrasound can simulate the sensation of touching the outline of an object, it cannot create the illusion of solidity – the user can push their fingers through the area of vibration.
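The basic focusing trick behind such a phased array is to offset the phase of each emitter so that all the waves arrive at the chosen focal point in step, creating a localized pressure spot. A minimal sketch using typical values (40 kHz transducers, a small emitter patch) rather than Ultrahaptics' actual parameters:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air
FREQ = 40_000.0         # typical ultrasonic transducer frequency (40 kHz)

def phase_delays(emitters, focus):
    """For each emitter position, compute the phase lead (radians)
    that makes its wave arrive at the focal point in sync with the
    wave from the farthest emitter, which fires with zero offset."""
    wavelength = SPEED_OF_SOUND / FREQ  # ~8.6 mm at 40 kHz
    dists = [math.dist(e, focus) for e in emitters]
    ref = max(dists)  # farthest emitter is the timing reference
    return [2 * math.pi * ((ref - d) / wavelength) % (2 * math.pi)
            for d in dists]

# A 2x2 patch of emitters 1 cm apart, focusing 10 cm above one corner
emitters = [(0, 0, 0), (0.01, 0, 0), (0, 0.01, 0), (0.01, 0.01, 0)]
delays = phase_delays(emitters, (0, 0, 0.10))
```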

A long list of other companies is also working on haptic controllers for VR; further examples are available in [VT].

Inside-out Tracking


The type of tracking used by the HTC Vive, PlayStation VR, and Oculus Rift is called outside-in tracking, since it relies on external cameras or base stations. In contrast, inside-out tracking places the tracking cameras within the item being tracked, i.e., the headset [XR]. Inside-out tracking could be a game changer for VR (and especially AR), since it enables a self-contained headset that requires no external sensors. As an example, Microsoft HoloLens has multiple cameras around the headset, on the front and sides. They capture video of the surroundings, track the user's hands and gestures, and, together with the headset's other sensors, track head movements [Wareable].

Another example comes from Oculus, which is working on a prototype headset called Santa Cruz that provides inside-out tracking [RV]. Santa Cruz uses four outward-facing cameras built into the device itself (embedded in the four corners on the front of the headset), combined with computer vision algorithms, to let the headset calculate the user's position and head angle [AT-2]. The cost of this approach is that running computer vision algorithms on input from four cameras adds overhead to an untethered headset that, even without inside-out tracking, is already at a disadvantage in processing power compared to PC-based VR.
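One reason these headsets combine cameras with other sensors is drift correction: inertial sensors update the pose at a high rate but drift over time, while camera-based estimates arrive more slowly but do not drift. A heavily simplified one-dimensional sketch of that fusion pattern, with an illustrative blend factor and a constant vision measurement standing in for a real pipeline:

```python
def fuse_pose(vision_pos, imu_delta, prev_pos, alpha=0.98):
    """Toy complementary filter for one position axis: dead-reckon
    from the IMU between camera frames, then blend a little toward
    the slower but drift-free vision estimate."""
    predicted = prev_pos + imu_delta  # fast IMU prediction (drifts)
    return alpha * predicted + (1 - alpha) * vision_pos

# With no IMU motion, repeated updates pull the estimate toward the
# vision measurement of 1.0
pos = 0.0
for _ in range(100):
    pos = fuse_pose(vision_pos=1.0, imu_delta=0.0, prev_pos=pos)
```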

Summary


When it comes to positional tracking and haptic feedback, the current state of the art in consumer VR is outside-in tracking of the headset and controllers, and use of basic vibrations for tactile feedback. Important future additions that different companies are working on include advancements in inside-out tracking, and advancements in both kinesthetic feedback and tactile feedback.

References


[AT-1] Oculus finally answers VR’s “where are my hands” problem, and it’s great, http://arstechnica.com/gaming/2016/10/why-oculus-has-my-favorite-vr-hand-tracking-controller/

[AT-2] Rift goes wireless: Ars walks around in Oculus’ Santa Cruz VR prototype, http://arstechnica.com/gaming/2016/10/rift-goes-wireless-ars-walks-around-in-oculus-santa-cruz-vr-prototype/

[BB] Meet the Man Who Made Virtual Reality 'Feel' More Real, https://www.bloomberg.com/news/features/2016-02-03/uk-startup-ultrahaptics-is-making-virtual-reality-feel-more-real

[DT] Spec Comparison: Does the Rift’s Touch Update Make It a True Vive Competitor? http://www.digitaltrends.com/virtual-reality/oculus-rift-vs-htc-vive/

[EG] Teslasuit does full-body haptic feedback for VR, https://www.engadget.com/2016/01/06/teslasuit-haptic-vr/

[JSG] PlayStation VR Tracking Guide, http://jobsimulatorgame.com/psvrfaq/

[MS] NormalTouch and TextureTouch: High-fidelity 3D Haptic Shape Rendering on Handheld Virtual Reality Controllers, https://www.microsoft.com/en-us/research/wp-content/uploads/2016/10/NormalTouch-TextureTouch-VR_3D_Shape_Controllers-2016.pdf

[Oculus] Haptic Feedback, https://developer3.oculus.com/documentation/pcsdk/latest/concepts/dg-input-touch-haptic/

[RV] Hands-on: Oculus’ Wireless ‘Santa Cruz’ Prototype Makes Standalone Room-scale Tracking a Reality, http://www.roadtovr.com/hands-on-oculus-wireless-santa-cruz-prototype-makes-standalone-room-scale-tracking-a-reality/

[RV2] Microsoft Research Demonstrates VR Controller Prototypes With Unique Haptic Technology, http://www.roadtovr.com/microsoft-research-haptic-vr-controller-prototype-normaltouch-texturetouch/

[RV3] Dexta Shows Off Latest Exoskeleton Gloves That Let You Touch VR, http://www.roadtovr.com/dexta-dexmo-exoskeleton-vr-glove-haptic-force-feedback-touch-vr/

[SA] Not the future: I tried the HTC Vive VR headset and was seriously underwhelmed, http://siliconangle.com/blog/2016/01/24/not-the-future-i-tried-the-htc-vive-vr-headset-and-was-seriously-underwhelmed/

[TC] H2L Launches Their Next-Gen UnlimitedHand VR Haptic Controller, https://techcrunch.com/2015/09/21/h2l-launches-their-next-gen-unlimitedhand-vr-haptic-controller/

[TV] Oculus Touch Review: The Oculus Rift Is Finally Complete, http://www.theverge.com/2016/12/5/13811232/oculus-touch-rift-vr-motion-controller-review

[VT] List of Haptic Controllers under Development for Virtual Reality, http://www.virtualrealitytimes.com/2015/03/13/list-of-haptic-controllers-virtual-reality/

[Wareable] Microsoft HoloLens: Everything you need to know about the $3,000 AR headset, https://www.wareable.com/microsoft/microsoft-hololens-everything-you-need-to-know-about-the-futuristic-ar-headset-735

[XR] Marker-less, Inside-Out Tracking, http://xinreality.com/wiki/Marker-less,_Inside-Out_Tracking
