On HoloLens 2, Mixed Reality Capture can provide a "camera stream with AR overlay".
As shown in the image, the position of the CG object as seen by the device wearer and its position as seen by a third party on the display need to match as closely as possible.
I want to send a "Camera stream with AR overlay" using WebRTC.
The ARCameraManager Camera Frame Access sample ( https://docs.spaces.qualcomm.com/unity/samples/CameraFrameAccessSample.html ) provides only the raw camera stream, without the AR overlay.
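For reference, acquiring the raw camera frame in the way that sample describes looks roughly like the sketch below. This is a hypothetical minimal version using standard AR Foundation APIs (`ARCameraManager.frameReceived`, `TryAcquireLatestCpuImage`); the class and method names are mine, not from the sample.

```csharp
using Unity.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Hypothetical sketch: grab the latest CPU camera image each frame and copy
// it into a Texture2D. This yields the raw camera feed only - no AR overlay.
public class CameraFrameGrabber : MonoBehaviour
{
    [SerializeField] ARCameraManager cameraManager;
    Texture2D cameraTexture;

    void OnEnable()  => cameraManager.frameReceived += OnFrameReceived;
    void OnDisable() => cameraManager.frameReceived -= OnFrameReceived;

    void OnFrameReceived(ARCameraFrameEventArgs args)
    {
        if (!cameraManager.TryAcquireLatestCpuImage(out XRCpuImage image))
            return;

        using (image) // XRCpuImage must be disposed to release the native buffer
        {
            var conversion = new XRCpuImage.ConversionParams(
                image, TextureFormat.RGBA32, XRCpuImage.Transformation.MirrorY);

            if (cameraTexture == null)
                cameraTexture = new Texture2D(
                    conversion.outputDimensions.x,
                    conversion.outputDimensions.y,
                    conversion.outputFormat, false);

            NativeArray<byte> raw = cameraTexture.GetRawTextureData<byte>();
            image.Convert(conversion, raw);
            cameraTexture.Apply();
            // cameraTexture now holds the camera frame, CG content not included.
        }
    }
}
```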
Unfortunately, the AR Foundation API does not support an AR-overlaid camera stream, and neither does Snapdragon Spaces.
We already have a feature request recorded on the Snapdragon Spaces side for an API, beyond what AR Foundation offers, that would provide this. This feature does not have a public roadmap yet, though.
We have already received similar requests:
In theory, you should be able to combine the rendered texture from Unity with the texture coming from the Camera Frame Access feature, with a couple of limitations on depth blending.
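A minimal sketch of that combination is below. It assumes the camera frame is already available as a texture (e.g. from Camera Frame Access) and that a second Unity camera renders only the CG content into a render texture with a transparent clear color; the fields, class name, and the use of a stock transparent blit shader are my assumptions, not a documented workflow. Note this composites purely by alpha, so the depth-blending limitation mentioned above still applies.

```csharp
using UnityEngine;

// Hypothetical sketch: layer Unity's rendered CG content over the raw camera
// frame in a RenderTexture, which could then be fed to a WebRTC video source.
public class ArOverlayCompositor : MonoBehaviour
{
    public Texture cameraTexture;         // raw camera frame (assumed input)
    public Camera overlayCamera;          // renders CG content only
    public RenderTexture composedTexture; // camera frame + AR overlay result

    RenderTexture overlayTexture;
    Material alphaBlendMaterial;          // alpha-blending blit material (assumed)

    void Start()
    {
        overlayTexture = new RenderTexture(
            composedTexture.width, composedTexture.height, 24);
        overlayCamera.targetTexture = overlayTexture;
        overlayCamera.clearFlags = CameraClearFlags.SolidColor;
        overlayCamera.backgroundColor = Color.clear; // alpha 0 where no CG
        alphaBlendMaterial = new Material(Shader.Find("Unlit/Transparent"));
    }

    void LateUpdate()
    {
        // Camera frame first, then alpha-blend the CG layer on top.
        // No scene depth is consulted here, so CG always draws over video.
        Graphics.Blit(cameraTexture, composedTexture);
        Graphics.Blit(overlayTexture, composedTexture, alphaBlendMaterial);
    }
}
```

`composedTexture` can then be registered as the video source of your WebRTC sender, for example via Unity's com.unity.webrtc package.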
There are some factors to look out for: