
2D Mode with 6DOF Tracking done entirely by Glasses AR Chip independent of PC GPU/CPU

Hi Qualcomm


I am excited for the upcoming Snapdragon Summit in November and have a feature request related to the 2D Mode Productivity use case outlined below:

 

One of my primary use cases on PC/Mac/Phones will be 2D Mode Productivity, i.e. screen output via DP Alt Mode.

At present, 2D Mode does not react to 6DOF tracking sensor data.

 

I would like to request the ability to utilise the 6DOF sensors in 2D Mode.

In particular, the only way this would make sense is if the SoC on the glasses fed tracking sensor data into the Display Controller, which would then shift the 2D image based on Head/Body Position.

This perspective rotation of the 2D screen input would be done by the chips integrated into the glasses, completely independently of the CPU/GPU on the PC/Mac/Phone.

In other words, Task Manager should report 0-3% GPU/CPU utilisation when I am looking at the 2D desktop whilst Head Tracking is fully functional on the glasses.
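To make the request concrete, here is a purely hypothetical sketch of the kind of loop I imagine running on the glasses themselves. None of these functions, types, or registers exist today, and the scale factors are made up; it only illustrates the intended behaviour.

```cpp
// Purely hypothetical sketch: none of these types or functions exist today.
// The idea: the glasses' own silicon turns head rotation into a simple 2D pan
// of the incoming DP Alt Mode image, while the host PC/Mac/Phone GPU stays idle.

struct HeadOrientation { float yawDeg = 0; float pitchDeg = 0; };

// Stand-ins for reading the on-glasses IMU/6DOF block and writing a
// hypothetical pan/scroll register in the display controller.
HeadOrientation readHeadOrientation() { return {}; }
void setDisplayPanOffset(int /*xPixels*/, int /*yPixels*/) {}

constexpr float kPixelsPerDegreeX = 40.0f;  // assumed FOV-to-pixel mapping
constexpr float kPixelsPerDegreeY = 40.0f;

void headTracked2DLoop() {
    for (;;) {  // once per display refresh, e.g. 60-90 Hz
        const HeadOrientation h = readHeadOrientation();
        // Shift the 2D desktop opposite to the head rotation so it appears world-fixed.
        setDisplayPanOffset(static_cast<int>(-h.yawDeg  * kPixelsPerDegreeX),
                            static_cast<int>(-h.pitchDeg * kPixelsPerDegreeY));
    }
}
```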

 

This request does not apply to 3D Mode use case with OpenXR enabled apps or 3D Mode Gaming.

It does, however, still apply to 2D Mode Gaming.

 

Is the above something Qualcomm are working on specifically for 2D Mode use on PC/Mac?

Will this be supported by XR2 with internal firmware?

Or will this be supported by the next AR chipset?

Could this be expected in 2022?

 

Many thanks


Hi Ron


Thank you for investigating this with me.

Mirroring the screen into a HUD is an interesting proposition; however, I suspect it will use a lot of GPU power.


Before succumbing to the OpenXR render pipeline, I would very much like to explore the other three pathways:

- DDC/CI code running on the internal DSP - this would be ideal and completely transparent. Hopefully Lenovo can provide info on screen tearing and whether it is responsive enough at 60-90 Hz.

Wondering if 6DoF data is available to the DSP?

- DDC/CI code running on PC/Mac.

Is OpenXR supported by Spaces on PC at this stage?

Is OpenXR the only way to access 6DoF data?

- Use a GPU-brand-specific API for screen position adjustment, i.e. screen position can generally be set manually from GPU settings and there should be a way to do it with an API.

This pathway will definitely have less screen tearing.


Which of these three options is most realistic?

Thanks


Thanks

- DDC/CI code running on the internal DSP - this would be ideal and completely transparent.

.....So yes - on most Qualcomm chips there's a programmable DSP, but each one might have different capabilities - plus it might already be in use (like for hand tracking)


Wondering if 6DoF data is available to the DSP?

.....You would need to access that information through the OpenXR interface and then pass it along.
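.....For reference, a bare-bones sketch of how that head pose is typically fetched through core OpenXR. The instance/session/frame-loop setup is omitted; `session` and `predictedDisplayTime` are assumed to come from the usual xrCreateSession / xrWaitFrame flow.

```cpp
#include <openxr/openxr.h>

// Locate the head (VIEW space) relative to a LOCAL reference space.
// Assumes `session` is a running XrSession and `predictedDisplayTime`
// comes from xrWaitFrame for the current frame.
XrPosef getHeadPose(XrSession session, XrTime predictedDisplayTime) {
    XrReferenceSpaceCreateInfo info{XR_TYPE_REFERENCE_SPACE_CREATE_INFO};
    info.poseInReferenceSpace.orientation.w = 1.0f;  // identity pose

    info.referenceSpaceType = XR_REFERENCE_SPACE_TYPE_LOCAL;
    XrSpace localSpace;
    xrCreateReferenceSpace(session, &info, &localSpace);

    info.referenceSpaceType = XR_REFERENCE_SPACE_TYPE_VIEW;
    XrSpace viewSpace;
    xrCreateReferenceSpace(session, &info, &viewSpace);

    XrSpaceLocation location{XR_TYPE_SPACE_LOCATION};
    xrLocateSpace(viewSpace, localSpace, predictedDisplayTime, &location);
    // In real code, check location.locationFlags before trusting the pose.

    xrDestroySpace(viewSpace);
    xrDestroySpace(localSpace);
    return location.pose;  // 6DOF head position + orientation
}
```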

- DDC/CI code running on PC/Mac.

.... Spaces runs on Android

Is OpenXR supported by Spaces on PC at this stage?

.... No, although running with a PC and using the glasses as the XR display is something we're looking at, but it's not POR

Is OpenXR the only way to access 6DoF data?

.... currently yes

- Use a GPU-brand-specific API for screen position adjustment, i.e. screen position can generally be set manually from GPU settings and there should be a way to do it with an API.

...Spaces is designed to be hardware agnostic, but in general we'll support Spaces Ready hardware. Currently that's just the Motorola Edge+ and the Lenovo A3. Theoretically, other OpenXR-compatible hardware might work, but Spaces is designed to add layers on top of the basic OpenXR functionality. And there's more than one flavor of OpenXR that could work (they can all co-exist)


Which of these three options is most realistic?


...So, while you might be able to feed DDC/CI commands to the display, it would be tricky to sync them, as each eye in an OpenXR frame has its own timing and interactions with things like space/time warp, etc. The left eye gets rendered with a particular 6DOF location and then the right eye with a (potentially) different 6DOF value. Making small changes would work; large ones would cause issues.


....You'd still need to use OpenXR to fetch the 6DOF position. I don't know if there's a way to get the "predicted" position for the right eye. Since you're already in the middle of the OpenXR rendering path, it'd be easy to adjust the position of a rendered object and let the render pipeline worry about the left/right 6DOF differences, since that's built in. There are OpenXR 2D layers that are designed for headlocked/displaylocked purposes.
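....For completeness, here's a rough sketch of one of those 2D layers: a core-spec quad composition layer placed in a VIEW reference space, so the compositor keeps it head-locked and handles the per-eye 6DOF and warp. The swapchain holding the mirrored 2D image, and the surrounding frame loop, are assumed to be set up elsewhere.

```cpp
#include <openxr/openxr.h>

// Sketch: submit the 2D screen content as a head-locked quad layer.
// `viewSpace` is an XrSpace created with XR_REFERENCE_SPACE_TYPE_VIEW,
// and `swapchain` already holds the rendered/mirrored 2D desktop image.
void submitHeadLockedQuad(XrSession session, XrTime displayTime,
                          XrSpace viewSpace, XrSwapchain swapchain,
                          int32_t width, int32_t height) {
    XrCompositionLayerQuad quad{XR_TYPE_COMPOSITION_LAYER_QUAD};
    quad.space = viewSpace;                       // VIEW space => head-locked
    quad.eyeVisibility = XR_EYE_VISIBILITY_BOTH;  // compositor handles both eyes
    quad.subImage.swapchain = swapchain;
    quad.subImage.imageRect = {{0, 0}, {width, height}};
    quad.pose.orientation.w = 1.0f;               // identity orientation
    quad.pose.position = {0.0f, 0.0f, -1.5f};     // 1.5 m in front of the eyes
    quad.size = {1.6f, 0.9f};                     // quad size in metres

    const XrCompositionLayerBaseHeader* layers[] = {
        reinterpret_cast<const XrCompositionLayerBaseHeader*>(&quad)};

    XrFrameEndInfo endInfo{XR_TYPE_FRAME_END_INFO};
    endInfo.displayTime = displayTime;
    endInfo.environmentBlendMode = XR_ENVIRONMENT_BLEND_MODE_OPAQUE;  // or ADDITIVE, device-dependent
    endInfo.layerCount = 1;
    endInfo.layers = layers;
    xrEndFrame(session, &endInfo);
}
```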

Thanks Ron


OpenXR not being available on PC is a bit of a bummer, as the premise is to use full pro apps as opposed to the Android versions.


Come to think of it, 3DoF should be sufficient in this case. Full OpenXR would be overkill here.

Could Qualcomm expose just 3DoF to the PC through a different interface, i.e. serial or similar (still over the same USB-C cable)?

Does the DSP have direct access to the sensors or does everything have to go through Android?

i.e. Could DSP code expose 3DoF to PC?

Unless the space is fully taken up by hand tracking.


As for screen tearing and time sync, as per my understanding there should be no eye syncing needed in 2D output mode. Happy to be corrected here.

Could Qualcomm assist with testing DDC/CI responsiveness in 2D mode?

It may just work sufficiently well without complicating things.


Hoping there is another way to get 3DoF on PC whilst OpenXR support is being worked on.

OpenXR has a few runtimes that run on a PC (Steam, Monado, etc.); basically these get installed on your machine and at run time you select the one you want.


Since we're pretty heavily invested in OpenXR support, I can't see us providing a non-OpenXR interface - we had our own SDK for a while and it's been deprecated in favor of our OpenXR implementation.


Accessing/programming our DSPs is done through our Hexagon SDK. Other than knowing it exists and that it's used in machine learning tasks (like hand tracking, the CV app, etc.), I don't have any experience in utilizing it in an app. If you're not running our OpenXR services then I think there is likely some programming space.


Tearing/misalignment will occur when there's a change in the render target *or* the 6DOF values as it's being blitted to the screen. Unless these commands are queued up and executed in alignment with vsync *and* the warp in effect, you can get some tearing or eye-mismatch. Note that the 6DOF values for each eye are (possibly) different. The left eye is the "current" 6DOF and the right eye gets a "predicted" 6DOF.
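To show where those per-eye values come from, here's a minimal sketch of the standard OpenXR frame-loop calls (nothing Spaces-specific; `session` and `appSpace` are assumed to already exist, and xrBeginFrame/xrEndFrame are omitted):

```cpp
#include <openxr/openxr.h>
#include <vector>

// Sketch: the per-eye 6DOF values come from xrLocateViews, evaluated at the
// predicted display time returned by xrWaitFrame for the current frame.
void locatePerEyeViews(XrSession session, XrSpace appSpace) {
    XrFrameState frameState{XR_TYPE_FRAME_STATE};
    xrWaitFrame(session, nullptr, &frameState);   // gives predictedDisplayTime

    XrViewLocateInfo locateInfo{XR_TYPE_VIEW_LOCATE_INFO};
    locateInfo.viewConfigurationType = XR_VIEW_CONFIGURATION_TYPE_PRIMARY_STEREO;
    locateInfo.displayTime = frameState.predictedDisplayTime;
    locateInfo.space = appSpace;

    XrViewState viewState{XR_TYPE_VIEW_STATE};
    uint32_t viewCount = 0;
    std::vector<XrView> views(2, {XR_TYPE_VIEW});
    xrLocateViews(session, &locateInfo, &viewState,
                  static_cast<uint32_t>(views.size()), &viewCount, views.data());

    // views[0].pose and views[1].pose are the left/right eye poses for this
    // frame; they generally differ, which is why display shifts done outside
    // the render/warp path can end up mismatched between the eyes.
}
```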


I'm looking at DDC/CI, but as this is a Lenovo product, if I get a positive response I'll be pointing you there, as this is beyond the reach of "Snapdragon Spaces". That said - I like creative solutions :-)


- Ron

No worries Ron

Thanks for the info on Hexagon.


Awaiting response on DDC/CI.

Let us know if you are able to test on the reference-design glasses.


Cheers

Thanks for the question,

while there is a DSP unit on the glasses (that's available for programming; there's also one on the phone), the glasses just pass info to the phone. The entire rendering pipeline resides on the phone, and the rendered scene is then blitted to the glasses display. It is possible to grab the 6DOF parameters in the program, but there's nothing on the glasses that's doing "rendering". The current use case that we're working with is a tethered/wireless headset connected to a phone/Android device that's doing most of the work. Different architectures using Qualcomm XR chips (like the Quest 2) are self-contained, but essentially there's a CPU/GPU unit and the HMD display.


We are fully behind using OpenXR as a uniform standard, so that it's easy for HMD manufacturers to utilize Qualcomm (and any other conforming SoC) to easily create apps using engines that conform to OpenXR interfaces. Generic portability. OpenXR does have an overlay subsystem that could do what you describe, but you'd have to code that in the rendering engine, and that's always going to be on the CPU/GPU.


You might review the non-Snapdragon Spaces software for the Lenovo glasses - they utilize the glasses as an external 2D display - whereas Spaces is intended for XR applications.


- Snapdragon Spaces Support



Hi Ron

Appreciate your response. Perhaps it may help if I distill my query further to clarify what I am trying to achieve.

3D rendering will not be necessary on the glasses. However, what will be required is 2D Screen Position Adjustment, just like what all monitors have (i.e. horizontal and vertical). Here come the questions:

- Is Screen Position Adjustment for the glasses available through an API?

- Can this adjustment be fast enough to react to head movement? The ability to shift the screen completely off the visible FOV will be essential.

- Is Qualcomm in full control of the Display Controller firmware to add this ability if it is not currently present?

Using up battery for 3D perspective calculations with Outlook and Excel is completely unnecessary. Everything should be doable with the current Display Controller. A little 2D reactivity to head movements is what I am hoping to organise.

A small Win/Mac app should be able to instruct the Display Controller if an API is available. Ideally, this app should run on the internal DSP.

Hoping Qualcomm could assist.

Thanks
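P.S. To illustrate what I mean by a "small Win/Mac app", here is a rough Windows-only sketch using the public DDC/CI (MCCS) monitor configuration API. It assumes the glasses' display controller actually implements the standard Horizontal/Vertical Position VCP codes (0x20/0x30) - many panels don't, and whether it does, and how fast it reacts, is exactly what would need testing.

```cpp
// Rough Windows-only sketch using the public DDC/CI (MCCS) monitor API.
// Assumption: the glasses expose the standard Horizontal/Vertical Position
// VCP codes (0x20 / 0x30); this is unverified. Link against dxva2.lib.
#include <windows.h>
#include <physicalmonitorenumerationapi.h>
#include <lowlevelmonitorconfigurationapi.h>

bool nudgeScreenPosition(HWND hwndOnGlassesDisplay, DWORD x, DWORD y) {
    HMONITOR hmon = MonitorFromWindow(hwndOnGlassesDisplay, MONITOR_DEFAULTTONEAREST);

    DWORD count = 0;
    if (!GetNumberOfPhysicalMonitorsFromHMONITOR(hmon, &count) || count == 0)
        return false;

    PHYSICAL_MONITOR pm{};
    if (!GetPhysicalMonitorsFromHMONITOR(hmon, 1, &pm))
        return false;

    // 0x20 = Horizontal Position, 0x30 = Vertical Position (MCCS VCP codes).
    bool ok = SetVCPFeature(pm.hPhysicalMonitor, 0x20, x) &&
              SetVCPFeature(pm.hPhysicalMonitor, 0x30, y);

    DestroyPhysicalMonitors(1, &pm);
    return ok;
}
```

Timing a burst of these calls (e.g. with std::chrono around SetVCPFeature) would answer the 60-90 Hz responsiveness question.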

So you can see I'm working backwards through the forum posts :-)


While I wait for a response from Lenovo: what you'd need to do this is 6DOF information (the head tracking information) to know where the user is moving their head position and direction. So while you might be able to use DDC/CI to move the display, you'd almost certainly need OpenXR functionality to get the 6DOF info (at least, in a hardware-agnostic manner). Since the render pipeline is already hooked up to the 6DOF, anything you render (3D or 2D) can be placed and will respond as expected when the user moves their head. If you look up head-locked display you'll find more information on that. It's usually used for HUDs, but you do have frame-by-frame control over the position. Additionally, the render pipeline is designed not to tear when the user moves their head (each eye is independent, prediction algorithms are involved, and throw in time-warp, space-warp, etc.), something that would not hold if you're directly manipulating the display parameters outside of the rendering sync.


- Ron
