
Camera stream with AR overlay (like Mixed Reality Capture)

On HoloLens 2, Mixed Reality Capture can obtain a "camera stream with AR overlay".

https://learn.microsoft.com/ja-jp/windows/mixed-reality/develop/advanced-concepts/mixed-reality-capture-overview


As shown in the image, the position of the CG object as seen by the device wearer and its position as seen by a third party on the display must match as closely as possible.


I want to send a "Camera stream with AR overlay" using WebRTC.

The ARCameraManager Camera Frame Access Sample ( https://docs.spaces.qualcomm.com/unity/samples/CameraFrameAccessSample.html ) provides only the raw camera stream.

1 Comment

Unfortunately, the AR Foundation API does not support an AR-overlaid camera stream, and neither does Snapdragon Spaces.

We already have a feature request recorded for an API, in addition to AR Foundation, that would provide this on the Snapdragon Spaces side. However, this feature does not have a public roadmap yet.



We had similar requests already:

https://support.spaces.qualcomm.com/a/forums/topics/72000788415

https://support.spaces.qualcomm.com/a/forums/topics/72000786435


In theory, you should be able to combine the rendered texture from Unity with the texture coming from the Camera Frame Access feature, with a couple of limitations regarding depth blending.
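As a rough illustration of that compositing step, here is a minimal per-pixel alpha-blend sketch (pure NumPy; the function name is hypothetical). It assumes the Unity render target is an RGBA image whose alpha channel marks the CG pixels, the camera frame is RGB, and both are the same resolution and already aligned (i.e. the virtual camera matches the physical camera's intrinsics and pose):

```python
import numpy as np

def composite_overlay(camera_rgb: np.ndarray, render_rgba: np.ndarray) -> np.ndarray:
    """Alpha-blend a rendered AR overlay (RGBA, uint8) onto a camera frame (RGB, uint8).

    Assumes both images have the same resolution and are already aligned.
    """
    alpha = render_rgba[..., 3:4].astype(np.float32) / 255.0   # per-pixel CG coverage
    overlay = render_rgba[..., :3].astype(np.float32)
    background = camera_rgb.astype(np.float32)
    # Standard "over" operator: CG on top, camera frame behind.
    out = overlay * alpha + background * (1.0 - alpha)
    return out.round().astype(np.uint8)
```

Note that because this is alpha-only blending, CG content always draws on top of the camera image; real objects can never occlude virtual ones. That is one instance of the depth-blending limitation mentioned above.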


There are some factors to look out for:

  • The camera lens distortion may be a factor. You may need to undistort the image for better alignment. You should be able to undistort the image using the parameters exposed by the AR Foundation API, provided you know the distortion model (for the A3, for example, a radial model with 6 parameters). 
  • The sync between RGB camera frames, device pose, and Unity virtual camera rendering. 
  • The position of the physical camera on the device
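To make the first point concrete, here is a sketch of a 6-parameter rational radial distortion model and its inverse via fixed-point iteration, operating on normalized camera coordinates. The coefficients k1..k6 and the model itself are assumptions for illustration; in practice they would come from the device calibration, and AR Foundation's camera intrinsics only cover focal length and principal point, so the distortion coefficients may need to be obtained elsewhere:

```python
import numpy as np

def distort(xy: np.ndarray, k: np.ndarray) -> np.ndarray:
    """Apply a 6-parameter rational radial distortion model (k = k1..k6)
    to normalized image coordinates of shape (..., 2)."""
    x, y = xy[..., 0], xy[..., 1]
    r2 = x * x + y * y
    num = 1 + k[0] * r2 + k[1] * r2**2 + k[2] * r2**3
    den = 1 + k[3] * r2 + k[4] * r2**2 + k[5] * r2**3
    return xy * (num / den)[..., None]

def undistort(xy_d: np.ndarray, k: np.ndarray, iters: int = 10) -> np.ndarray:
    """Invert the radial model by fixed-point iteration: repeatedly
    re-estimate the radial scale at the current guess and divide it out."""
    xy = xy_d.copy()
    for _ in range(iters):
        x, y = xy[..., 0], xy[..., 1]
        r2 = x * x + y * y
        num = 1 + k[0] * r2 + k[1] * r2**2 + k[2] * r2**3
        den = 1 + k[3] * r2 + k[4] * r2**2 + k[5] * r2**3
        xy = xy_d / (num / den)[..., None]
    return xy
```

A full image undistortion would evaluate this per pixel (mapping each undistorted output pixel back to a distorted source coordinate and sampling), which is typically done on the GPU or with a lookup table rather than point by point.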
