Hi, are you referring to QCHT or XRI hand tracking with your question?
Well, I mean the Interaction version 4 pre 9 package that comes with the latest SDK 0.12.1, so I'm not sure what to say; I think it's both? The docs are for version 3, but when using it the hands don't get drawn for some reason.
XRIT offers its own hand tracking solution as well, which is what I was referring to with the question.
We are working on updating the documentation for QCHT 4 at the moment. The in-editor hand simulation as it existed in version 3.4 is outdated and has been replaced by the XRIT device simulator. See here for some info on how to set it up in the meantime. FYI, we are also investigating a potential bug where the simulator does not let go of grabbed objects.
As a sample of how to get input, you could for example take a look at QCHT Sample - Draw, select DrawingManager > LeftPencilPointer, and see the input used by XRIT in ActionBasedController.cs.
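To make that concrete, here is a minimal sketch of reading the select (pinch) state from an `ActionBasedController` like the one on LeftPencilPointer. This is an assumption-laden example, not code from the sample itself: the component and field names here are hypothetical, and it only relies on the `selectAction` property that `ActionBasedController` exposes.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Hypothetical helper: logs when the controller's select action
// (mapped to pinch in the Draw sample setup) is pressed.
public class PinchLogger : MonoBehaviour
{
    // Assign e.g. the LeftPencilPointer's ActionBasedController in the Inspector.
    [SerializeField] private ActionBasedController controller;

    private void Update()
    {
        // selectAction is an InputActionProperty; .action gives the underlying InputAction.
        var select = controller.selectAction.action;
        if (select != null && select.WasPressedThisFrame())
        {
            Debug.Log("Select (pinch) started this frame");
        }
    }
}
```

The same pattern works for the other actions on the controller (activate, UI press, etc.) if you need more than select.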
Are the input mappings enough or do you need access to the raw gesture data?
Well, before you could call QCHTInput.Getsure() or something like that to check whether the user is pinching, for example. What is the closest way to do this now?
You can get the kind of gesture via XRHandTrackingManager.RightHand.GestureId, and I believe the certainty of the gesture can be accessed via XRHandTrackingManager.RightHand.GestureRatio; I will need to confirm that with the team. You can also check gestures via the new Unity Input System and XRIT. You can set them up similar to this, where trigger corresponds to pinch/select and grip to grab. On the code side it would look something like this (right-hand grab action): if (RightGrabAction.WasPressedThisFrame()) { // do sth.. }
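Putting the two routes above together, a sketch might look like the following. Caveat: `XRHandTrackingManager.RightHand`, `GestureId`, and `GestureRatio` are used exactly as named earlier in this thread (their namespace and exact types are assumptions), and `rightGrabAction` is a hypothetical `InputActionReference` you would bind to the grip/grab action in your input actions asset.

```csharp
using UnityEngine;
using UnityEngine.InputSystem;
// plus the QCHT namespace that contains XRHandTrackingManager (name not shown in this thread)

// Sketch: polls both the Input System action and the raw QCHT gesture data.
public class RightHandGestureReader : MonoBehaviour
{
    // Bind this to the right-hand grab (grip) action in the Inspector.
    [SerializeField] private InputActionReference rightGrabAction;

    private void OnEnable() => rightGrabAction.action.Enable();

    private void Update()
    {
        // Input System route: grip mapped to grab, as in the setup described above.
        if (rightGrabAction.action.WasPressedThisFrame())
        {
            Debug.Log("Right hand grab started");
        }

        // QCHT route: raw gesture id and (presumably) its confidence value.
        var hand = XRHandTrackingManager.RightHand;
        Debug.Log($"Gesture: {hand.GestureId}, ratio: {hand.GestureRatio}");
    }
}
```

If you only need pinch/grab semantics, the Input System route is the more portable one; the GestureId accessor is what gives you the raw gesture data.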
I only realized now that I needed to read the docs for the new Input System first; it was really confusing before I did. Thanks for the help!
I am trying to use the latest version of the Interactions package, which is using XRI, but I don't see how to access the gesture data from code?