2D Mode with 6DOF Tracking done entirely by Glasses AR Chip independent of PC GPU/CPU

Hi Qualcomm,


I am excited for the upcoming Snapdragon Summit in November and have a feature request related to the 2D Mode Productivity use case outlined below:

 

One of my primary use cases on PC/Mac/Phone will be 2D Mode Productivity, i.e. screen output via DP Alt Mode.

At present, 2D Mode does not react to 6DOF tracking sensor data.

 

I would like to request the ability to utilise the 6DOF sensors in 2D Mode.

In particular, the only way this would make sense is if the SoC on the glasses fed tracking sensor data into the Display Controller, which would then shift the 2D image based on head/body position.

This perspective rotation of the 2D screen input would be handled by the chips integrated into the glasses, completely independently of the CPU/GPU on the PC/Mac/Phone.

In other words, Task Manager should report 0-3% GPU/CPU utilisation whilst I am looking at the 2D desktop with head tracking fully functional on the glasses.
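To make the request concrete, here is a rough sketch of the kind of 2D shift I imagine the Display Controller performing. This is purely illustrative pseudo-C, not any real Qualcomm API; all names and numbers are made up:

```c
/* Illustrative sketch only, not a real Qualcomm API: shift the image
 * opposite to head rotation so the virtual screen appears fixed in space. */
#include <math.h>

typedef struct { float yaw_deg, pitch_deg; } HeadPose;   /* from the 6DOF tracker */
typedef struct { int x, y; } PanelOffset;                /* pixels to shift the image */

PanelOffset world_lock_offset(HeadPose pose,
                              float panel_w_px, float panel_h_px,
                              float fov_h_deg, float fov_v_deg)
{
    PanelOffset o;
    /* Linear approximation: pixels per degree of the display's FOV. */
    float ppd_h = panel_w_px / fov_h_deg;
    float ppd_v = panel_h_px / fov_v_deg;
    /* Shift opposite to the head motion; no clamping, so the screen can
     * move fully out of the visible FOV, as requested. */
    o.x = (int)lroundf(-pose.yaw_deg   * ppd_h);
    o.y = (int)lroundf( pose.pitch_deg * ppd_v);
    return o;
}
```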

 

This request does not apply to the 3D Mode use case with OpenXR-enabled apps, nor to 3D Mode Gaming.

It does, however, still apply to 2D Mode Gaming.

 

Is the above something Qualcomm are working on specifically for 2D Mode use on PC/Mac?

Will this be supported on the XR2 through an internal firmware update?

Or will this be supported by the next AR chipset?

Could this be expected in 2022?

 

Many thanks


Thanks for the question,

While there is a DSP unit on the glasses (available for programming, as is the one on the phone), the glasses just pass sensor info to the phone. The entire rendering pipeline resides on the phone, and the rendered scene is then blitted to the glasses' display. It is possible to grab the 6DOF parameters in your program, but nothing on the glasses does any "rendering". The current use case we're working with is a tethered/wireless headset connected to a phone/Android device that does most of the work. Different architectures using Qualcomm XR chips (like the Quest 2) are self-contained, but essentially there's still a CPU/GPU unit and the HMD display.
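For example, grabbing the head pose each frame through core OpenXR looks roughly like this. A sketch only: viewSpace and localSpace are assumed to have been created with xrCreateReferenceSpace during normal session setup.

```c
/* Sketch of reading the 6DOF head pose via core OpenXR. */
#include <openxr/openxr.h>

XrBool32 read_head_pose(XrSpace viewSpace, XrSpace localSpace,
                        XrTime predictedDisplayTime, XrPosef *out)
{
    XrSpaceLocation loc = { XR_TYPE_SPACE_LOCATION };
    if (XR_FAILED(xrLocateSpace(viewSpace, localSpace,
                                predictedDisplayTime, &loc)))
        return XR_FALSE;
    /* Only trust the pose when the runtime says it is valid. */
    if ((loc.locationFlags & XR_SPACE_LOCATION_POSITION_VALID_BIT) &&
        (loc.locationFlags & XR_SPACE_LOCATION_ORIENTATION_VALID_BIT)) {
        *out = loc.pose;   /* head position + orientation in LOCAL space */
        return XR_TRUE;
    }
    return XR_FALSE;
}
```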


We are fully behind OpenXR as a uniform standard, so that it's easy for HMD manufacturers to use Qualcomm (or any other conforming SoC) and easy for developers to create apps with engines that conform to the OpenXR interfaces. Generic portability. OpenXR does have an overlay subsystem that could do what you describe, but you'd have to code that in the rendering engine, and that's always going to run on the CPU/GPU.
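The overlay mechanism I mean is the OpenXR quad composition layer. A minimal sketch, assuming the usual OpenXR session setup has already created the reference space and a swapchain holding the captured 2D desktop:

```c
/* Sketch (not a complete app): the runtime composites a 2D image as a
 * world-fixed panel, so it responds to head motion without the app
 * re-rendering the panel content itself. */
#include <openxr/openxr.h>

XrCompositionLayerQuad make_desktop_quad(XrSpace space, XrSwapchain swapchain,
                                         int32_t w, int32_t h)
{
    XrCompositionLayerQuad quad = { XR_TYPE_COMPOSITION_LAYER_QUAD };
    quad.space = space;                    /* LOCAL space => world-locked */
    quad.eyeVisibility = XR_EYE_VISIBILITY_BOTH;
    quad.subImage.swapchain = swapchain;   /* holds the captured 2D desktop */
    quad.subImage.imageRect.offset = (XrOffset2Di){ 0, 0 };
    quad.subImage.imageRect.extent = (XrExtent2Di){ w, h };
    quad.pose.orientation = (XrQuaternionf){ 0, 0, 0, 1 };
    quad.pose.position    = (XrVector3f){ 0.0f, 0.0f, -2.0f };  /* 2 m ahead */
    quad.size             = (XrExtent2Df){ 1.6f, 0.9f };        /* metres */
    return quad;
}
/* Pass a pointer to the quad (cast to XrCompositionLayerBaseHeader*) in
 * xrEndFrame's layer list each frame; the compositor does the warp. */
```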


You might review the non-Snapdragon Spaces software for the Lenovo glasses - they utilize the glasses as an external 2D display - whereas Spaces is intended for XR applications.


- Snapdragon Spaces Support



Hi Ron,

Appreciate your response. Perhaps it may help if I distill my query further to clarify what I am trying to achieve.

3D rendering will not be necessary on the glasses. However, what will be required is 2D Screen Position Adjustment, just like all monitors have (i.e. horizontal and vertical). Here come the questions:

Is Screen Position Adjustment for the glasses available through an API?

Can this adjustment be fast enough to react to head movement? The ability to shift the screen completely off the visible FOV will be essential.

Is Qualcomm in full control of the Display Controller firmware, so this ability could be added if it is not currently present?

Using up battery for 3D perspective calculations with Outlook and Excel is completely unnecessary. Everything should be doable with the current Display Controller. A little 2D reactivity to head movements is what I am hoping to organise. A small Win/Mac app should be able to instruct the Display Controller if an API is available. Ideally, this app should run on the internal DSP.

Hoping Qualcomm could assist. Thanks

So you can see I'm working backwards through the forum posts :-)


While I wait for a response from Lenovo: what you'd need for this is the 6DOF information (the head-tracking data) to know where the user is moving their head, in both position and direction. So while you might be able to use DDC/CI to move the display, you'd almost certainly need OpenXR functionality to get the 6DOF info (at least in a hardware-agnostic manner). Since the render pipeline is already hooked up to the 6DOF tracking, anything you render (3D or 2D) can be placed in space and will respond as expected when the user moves their head. If you look up "head-locked display" you'll find more information on that; it's usually used for HUDs, but you do have frame-by-frame control over the position.

Additionally, the render pipeline is designed not to tear when the user moves their head (each eye is independent, prediction algorithms are involved, and there's time-warp, space-warp, etc.), none of which would apply if you directly manipulated the display parameters outside of the rendering sync.
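If you did want to experiment with the DDC/CI route on Windows, a hedged sketch would look like the following. Whether the glasses' display controller honours the MCCS geometry codes at all, let alone at head-tracking rates, is exactly the open question here, and the tearing caveat above still applies:

```c
/* Hedged sketch: nudge the panel's position via DDC/CI on Windows using
 * the MCCS geometry VCP codes. Many displays do not implement these. */
#include <windows.h>
#include <physicalmonitorenumerationapi.h>
#include <lowlevelmonitorconfigurationapi.h>
#pragma comment(lib, "dxva2.lib")

BOOL nudge_screen(HMONITOR hMon, DWORD x, DWORD y)
{
    PHYSICAL_MONITOR pm;
    if (!GetPhysicalMonitorsFromHMONITOR(hMon, 1, &pm))
        return FALSE;
    /* 0x20 = Horizontal Position, 0x30 = Vertical Position (MCCS). */
    BOOL ok = SetVCPFeature(pm.hPhysicalMonitor, 0x20, x) &&
              SetVCPFeature(pm.hPhysicalMonitor, 0x30, y);
    DestroyPhysicalMonitors(1, &pm);
    return ok;
}
```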


- Ron
