Unfortunately, the AR Foundation API does not support an AR-overlaid camera stream, and neither does Snapdragon Spaces.
We already have a feature request recorded for an API, in addition to AR Foundation, that would provide this from the Snapdragon Spaces side. This feature is not on a public roadmap yet, though.
We have had similar requests before:
https://support.spaces.qualcomm.com/a/forums/topics/72000788415
https://support.spaces.qualcomm.com/a/forums/topics/72000786435
In theory, you should be able to combine the rendered texture from Unity with the texture coming from the Camera Frame Access feature, with a couple of limitations around depth blending. There are some factors to look out for, as sketched below.
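To illustrate that combination, here is a minimal, untested sketch assuming AR Foundation's CPU image API (ARCameraManager.TryAcquireLatestCpuImage, as used in the Camera Frame Access sample) and the built-in render pipeline. The OverlayCompositor class and its fields are placeholders for illustration, not part of Snapdragon Spaces:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Hypothetical compositor (not a Snapdragon Spaces API): pulls the camera
// image from ARCameraManager and renders the virtual content on top of it.
public class OverlayCompositor : MonoBehaviour
{
    public ARCameraManager cameraManager; // from the Camera Frame Access setup
    public Camera overlayCamera;          // disabled camera that renders only virtual content
    public RenderTexture composited;      // final "camera stream with AR overlay"

    Texture2D cameraTexture;

    void OnEnable()
    {
        // Clear depth only, so the blitted camera image survives as background.
        overlayCamera.clearFlags = CameraClearFlags.Depth;
        cameraManager.frameReceived += OnFrame;
    }

    void OnDisable() { cameraManager.frameReceived -= OnFrame; }

    void OnFrame(ARCameraFrameEventArgs args)
    {
        // Copy the latest camera image to the CPU and convert it to RGBA32.
        // Depending on the device, a MirrorY transformation may also be needed.
        if (!cameraManager.TryAcquireLatestCpuImage(out XRCpuImage image))
            return;

        using (image)
        {
            if (cameraTexture == null)
                cameraTexture = new Texture2D(image.width, image.height,
                                              TextureFormat.RGBA32, false);

            var conversion = new XRCpuImage.ConversionParams(image, TextureFormat.RGBA32);
            image.Convert(conversion, cameraTexture.GetRawTextureData<byte>());
            cameraTexture.Apply();
        }

        // Camera image first...
        Graphics.Blit(cameraTexture, composited);

        // ...then the virtual content on top. Real-world depth is not available
        // here, so virtual objects always draw in front of the video; this is
        // the depth-blending limitation mentioned above.
        overlayCamera.targetTexture = composited;
        overlayCamera.Render();
    }
}
```

Clearing only depth on the overlay camera is what lets the camera image act as the background; the overlay camera's pose should be driven by the same tracked pose as the user's view so the CG positions match.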
Yudaiasai0613tsukuba
On HoloLens 2, Mixed Reality Capture can obtain a "camera stream with AR overlay".
https://learn.microsoft.com/ja-jp/windows/mixed-reality/develop/advanced-concepts/mixed-reality-capture-overview
As shown in the image, the position of the CG object as seen by the device wearer and its position as seen by a third party on the display need to match as closely as possible.
I want to send a "Camera stream with AR overlay" using WebRTC.
The ARCameraManager Camera Frame Access sample (https://docs.spaces.qualcomm.com/unity/samples/CameraFrameAccessSample.html) provides only the plain camera stream.
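For what it's worth, once a composited RenderTexture like the one sketched above exists, sending it over WebRTC could look roughly like this. This is a sketch assuming Unity's com.unity.webrtc package (signaling is omitted, and none of it is specific to Snapdragon Spaces):

```csharp
using System.Collections;
using Unity.WebRTC;
using UnityEngine;

// Hypothetical sender: wraps the composited RenderTexture in a
// VideoStreamTrack and attaches it to a peer connection. Depending on the
// com.unity.webrtc version, WebRTC.Initialize()/WebRTC.Dispose() may also
// be required around usage.
public class OverlayStreamer : MonoBehaviour
{
    public RenderTexture composited; // the "camera stream with AR overlay"

    RTCPeerConnection peer;
    VideoStreamTrack track;

    IEnumerator Start()
    {
        // The package copies the texture into the video encoder once per
        // frame from this coroutine.
        StartCoroutine(WebRTC.Update());

        peer = new RTCPeerConnection();
        track = new VideoStreamTrack(composited);
        peer.AddTrack(track);

        // Create and set the local offer; exchanging offer/answer and ICE
        // candidates over a signaling channel is omitted here.
        var offerOp = peer.CreateOffer();
        yield return offerOp;
        var desc = offerOp.Desc;
        yield return peer.SetLocalDescription(ref desc);
        // ... send desc to the remote peer and apply its answer with
        // peer.SetRemoteDescription.
    }

    void OnDestroy()
    {
        track?.Dispose();
        peer?.Close();
    }
}
```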