
Colliders for hand / fingers

I want to be able to detect gameobject collisions with the hands. Is there an example of adding colliders to the hands or joints? What's the best way to do this?


The easiest approach seems to be to create a variant of the Default Spaces Hand prefab and add a collider to it. I suppose just having a sphere for the hand would be OK for most uses, but it would be pretty cool to have a somewhat accurate collision mesh for the hand that's properly rigged to work with hand joints.
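Something like this minimal sketch is what I had in mind, assuming the prefab variant exposes a palm/wrist transform to hang the collider on (the field names and the radius are placeholders, not anything from the Spaces API):

```csharp
using UnityEngine;

// Sketch only: add to a variant of the hand prefab and point "palm" at
// whatever transform the prefab uses for the palm/wrist.
public class HandSphereCollider : MonoBehaviour
{
    [SerializeField] private Transform palm;        // placeholder: palm/wrist joint transform
    [SerializeField] private float radius = 0.05f;  // rough guess, tune to your hand mesh

    private void Start()
    {
        // A kinematic rigidbody lets the moving hand generate collision/trigger
        // events without being pushed around by physics itself.
        var rb = palm.gameObject.AddComponent<Rigidbody>();
        rb.isKinematic = true;
        rb.useGravity = false;

        var col = palm.gameObject.AddComponent<SphereCollider>();
        col.isTrigger = true;   // a trigger is enough for "did the hand touch this?" checks
        col.radius = radius;
    }
}
```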


With a really good mesh you could do some hand occlusion and then we'd be able to have users pick up virtual objects with their hand somewhat convincingly.


What's the proper size/scale of a sphere collider when attaching it to the hand prefab?


Best Answer

Hi,

Hand occlusion and collision are two different things, but everything you need is already there!

The only thing you need to do is import both hand tracking packages (QCHT Core and QCHT Interactions) into your Unity project. You'll find them directly in the Spaces archive you downloaded (see the Spaces Hand Tracking documentation for the details).

Now you'll be able to add a QCHTAvatar to your scene; it contains the complete hand mesh, with colliders to handle all proximal interactions.
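For detecting those collisions from the scene side, a plain Unity trigger callback against the avatar's hand colliders is enough. Here is a minimal sketch (the "Hand" tag is just an assumption; use whatever tag or layer you assign to the hand colliders in your project):

```csharp
using UnityEngine;

// Generic Unity-side sketch: put this on any object (with a collider) that
// should react when one of the hand colliders touches it.
public class HandTouchable : MonoBehaviour
{
    private void OnTriggerEnter(Collider other)
    {
        // "Hand" is an assumed tag; substitute your own tag or a layer check.
        if (other.CompareTag("Hand"))
        {
            Debug.Log($"{name} was touched by {other.name}");
        }
    }
}
```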

Additionally, you'll be able to play with some cool features like Virtual Force feedback and object Snapping, allowing you to include physics rules in your gameplay.

Regarding hand occlusion, you can customise the hand mesh skin by applying any material to it. This way, the hand will occlude virtual objects when it's in front of them, and you can adjust the hand's look to match your scene's style.
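Swapping that skin at runtime is standard Unity. A minimal sketch, assuming you have your own occlusion material (for example a depth-only shader) and a reference to the hand's skinned mesh renderer:

```csharp
using UnityEngine;

// Sketch: replace the tracked hand's skin with your own material, e.g. a
// depth-only "occluder" material so the hand hides virtual objects behind it.
public class HandSkinSwapper : MonoBehaviour
{
    [SerializeField] private SkinnedMeshRenderer handRenderer; // the avatar's hand mesh renderer
    [SerializeField] private Material occlusionMaterial;       // your own material/shader

    private void Start()
    {
        handRenderer.sharedMaterial = occlusionMaterial;
    }
}
```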

I really encourage you to follow the extended hand tracking part of the Spaces documentation to discover in depth what hand tracking can offer in Spaces.

And many more features are coming, so stay tuned!


I've requested these details from our hand tracking team, and either they or we will post back here once we have that information.

In addition to this, what I really need is a generic rigged hand model that works with the joints. It could be used for collision (though that's probably overkill; capsule colliders attached to the joints might be good enough), but it's necessary for hand occlusion. The hand needs to occlude virtual objects in the scene that are behind it. What I'd do is put an occlusion material on the rigged hand mesh that's driven by the hand tracking joints. Snap Lens Studio has this.
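Roughly what I mean by capsule colliders on the joints, as a sketch only (it assumes I can get at pairs of adjacent joint transforms; nothing here is Spaces API, and the bone axis and radius would need tuning per rig):

```csharp
using UnityEngine;

// Sketch: given pairs of adjacent joint transforms (e.g. proximal -> intermediate),
// add a capsule collider spanning each bone segment.
public class FingerCapsules : MonoBehaviour
{
    [SerializeField] private Transform[] jointA;        // start joint of each bone segment
    [SerializeField] private Transform[] jointB;        // end joint of each bone segment
    [SerializeField] private float boneRadius = 0.008f; // rough finger radius, tune per hand mesh

    private void Start()
    {
        for (int i = 0; i < jointA.Length; i++)
        {
            // Parent the capsule to the start joint so it follows the tracked hand.
            var capsule = jointA[i].gameObject.AddComponent<CapsuleCollider>();
            Vector3 localEnd = jointA[i].InverseTransformPoint(jointB[i].position);

            capsule.direction = 2;  // assumes bones point along local Z; adjust to your rig
            capsule.radius = boneRadius;
            capsule.height = localEnd.magnitude + 2f * boneRadius;
            capsule.center = localEnd * 0.5f;  // midpoint of the bone segment
            capsule.isTrigger = true;
        }
    }
}
```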


Just dumping more questions here: what's the deal with the hand pose? It seems like it's relative to the head? How do you get the hand / joint coordinates in world space?

For those playing along at home: YES! They are relative to the head.
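In case it helps anyone else, converting a head-relative joint pose to world space is just a transform through the head (camera) pose. A minimal sketch, assuming the main camera is the tracked head and `localJointPosition` / `localJointRotation` are whatever the hand tracking API returns:

```csharp
using UnityEngine;

// Sketch: hand joint poses come back relative to the head, so push them
// through the head (camera) transform to get world-space coordinates.
public static class HandSpaceUtil
{
    public static Vector3 ToWorldPosition(Transform head, Vector3 localJointPosition)
    {
        return head.TransformPoint(localJointPosition);
    }

    public static Quaternion ToWorldRotation(Transform head, Quaternion localJointRotation)
    {
        return head.rotation * localJointRotation;
    }
}

// Usage (assuming Camera.main is the tracked head):
//   Vector3 worldPos = HandSpaceUtil.ToWorldPosition(Camera.main.transform, localJointPosition);
```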

Thanks for that! That's exactly what I need!

BTW just got these samples up and running and they are quite nice!!
