Unfortunately, there is no webcam-based testing available for the Snapdragon Spaces SDK.
While you cannot test perception features such as anchors or plane detection in the editor, you can use XR Interaction Toolkit (XRIT) device simulation for hand tracking. Take a look at the instructions:
https://docs.spaces.qualcomm.com/unity/handtracking/BasicSceneSetup.html#in-editor-simulation
https://docs.unity3d.com/Packages/[email protected]/manual/xr-device-simulator.html
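For reference, here is a minimal sketch of loading the simulator only for in-editor Play Mode runs. The script name, and the assumption that the XR Device Simulator prefab from the XR Interaction Toolkit samples has been copied into a Resources folder, are illustrative and not part of the Snapdragon Spaces SDK:

```csharp
using UnityEngine;

// Minimal sketch: instantiate the XR Device Simulator only when running in the
// editor, so device builds are unaffected. Assumes the simulator prefab from
// the XR Interaction Toolkit samples was copied into a Resources folder and is
// named "XR Device Simulator" (adjust the name to match your project).
public class XRDeviceSimulatorLoader : MonoBehaviour
{
    void Awake()
    {
#if UNITY_EDITOR
        var simulatorPrefab = Resources.Load<GameObject>("XR Device Simulator");
        if (simulatorPrefab != null)
        {
            Instantiate(simulatorPrefab);
        }
        else
        {
            Debug.LogWarning("XR Device Simulator prefab not found under a Resources folder.");
        }
#endif
    }
}
```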
We do have plans for a host simulator on Windows for both Unity and Unreal Engine. Feel free to upvote it on our public roadmap: https://portal.productboard.com/bdx5dufjd6ka3jnwpbqldqmu/c/29-pc-host-simulator-for-unity?utm_medium=social&utm_source=portal_share
Can you specify what the RealSense camera would be used for: Editor testing, or in combination with a Snapdragon Spaces device?
Yes, that's right. I was trying to ask whether I could use an Intel RealSense where a webcam would be used.
Using RealSense or any depth camera is currently not possible.
I would need to file a feature request for it. Could you give some more details:
1. I don't need RealSense for any special reason, but RealSense is what I currently have.
2. This is for testing in the editor, no other purpose.
Thank you
Rlarbwls113
I am currently developing in the Unity Editor.
I want to use an RGB camera via a webcam in Play Mode, without building. Is there a way?
Furthermore, I would like support for an IR camera such as the Intel RealSense.
Do you have any plans to add this?
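For a plain RGB preview in Play Mode, independent of Snapdragon Spaces perception features (which, as noted above, cannot be tested with a webcam), Unity's built-in WebCamTexture can display the webcam feed. A minimal sketch, assuming the script sits on an object with a Renderer whose material should show the feed:

```csharp
using UnityEngine;

// Minimal sketch: show a plain webcam RGB feed on this object's material while
// in Play Mode. This only displays raw camera frames; it does not provide any
// Snapdragon Spaces perception features (anchors, plane detection, etc.).
[RequireComponent(typeof(Renderer))]
public class WebcamPreview : MonoBehaviour
{
    WebCamTexture webcamTexture;

    void Start()
    {
        if (WebCamTexture.devices.Length == 0)
        {
            Debug.LogWarning("No webcam devices found.");
            return;
        }

        // Use the first available device; pick one by name if you have several.
        webcamTexture = new WebCamTexture(WebCamTexture.devices[0].name);
        GetComponent<Renderer>().material.mainTexture = webcamTexture;
        webcamTexture.Play();
    }

    void OnDisable()
    {
        if (webcamTexture != null)
        {
            webcamTexture.Stop();
        }
    }
}
```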