
Understanding the Changes Required for Hand Interaction Features from Version 0.19 to 0.21

I'd like to understand the specific changes in the update from version 0.19 to 0.21, and the reasons behind them.

Previously, in version 0.19, if the QCHT package was installed, hand interaction capabilities were automatically available from the Main Menu scene.

However, after updating to version 0.21, the prefab must be updated manually to enable hand interactions.


Additionally, within the Hand Tracking scene, switching the controller transitions normally between interaction modalities: hand to controller, controller to gaze pointer, and gaze pointer back to hand. Is there a specific reason why this functionality is limited to the Hand Tracking scene and excluded from other areas?
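For illustration only, the cycling behavior described above can be sketched as a tiny state machine. This is a hypothetical Python sketch, not the actual QCHT or Snapdragon Spaces API; the modality names are assumptions taken from the description above:

```python
# Hypothetical sketch of the modality cycle described above.
# These names are illustrative; the real SDK API differs.
MODALITY_CYCLE = ["hand", "controller", "gaze_pointer"]

def next_modality(current: str) -> str:
    """Return the next modality in the hand -> controller -> gaze -> hand cycle."""
    i = MODALITY_CYCLE.index(current)
    return MODALITY_CYCLE[(i + 1) % len(MODALITY_CYCLE)]
```

Under this model, repeatedly calling `next_modality` walks the full loop: `hand` to `controller`, `controller` to `gaze_pointer`, and `gaze_pointer` back to `hand`.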


As MR devices become more important, we decided to make controller/device pointer input the default input modality for the Snapdragon Spaces Samples. 

Some points here:

  • The input modalities available differ from scene to scene now. 
  • The XRIT sample scene allows using all input methods, as in 0.19.1.1. If you are interested in that configuration, please take a look at the XRIT sample scene's setup, or describe your needs in more detail.
  • The QCHT samples are affected by this change as well. When you open the QCHT sample scene, the default input modality should switch to Hand Tracking, with other input options unavailable. I suspect the modifications you made allowed you to use other input modalities within the QCHT samples. What modifications did you make to get this working?
  • All other scenes are using controller/device pointer as default. 
  • Can you specify which device you are using? If it is a VRX, please be aware that 0.21.0 isn't officially supported on that device yet. More information is available at Think Reality VRX SDK Compatibility.
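To summarize the points above, the per-scene defaults could be modeled as a simple lookup. This is a hypothetical Python sketch; the scene names and modality labels are assumptions based on the bullet points, not actual SDK identifiers:

```python
# Hypothetical mapping of sample scenes to their available input modalities
# in 0.21, based on the points above. Not actual SDK code.
# The first entry in each list is taken to be the scene's default.
SCENE_INPUT_MODALITIES = {
    "XRIT sample":  ["controller", "hand", "gaze_pointer"],  # all methods, as in 0.19.1.1
    "QCHT sample":  ["hand"],                                # Hand Tracking only
    "other scenes": ["controller"],                          # controller/device pointer default
}

def default_modality(scene: str) -> str:
    """Return the default input modality for a given scene."""
    return SCENE_INPUT_MODALITIES[scene][0]
```

This makes the stated behavior explicit: the QCHT sample defaults to hand tracking with no alternatives, while every other scene defaults to controller/device pointer input.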