The CFA feature gives access to the raw camera frames only. If you want a combined real-world and virtual (augmented) texture, you would need to merge them together yourself.
A hacky way, with some alignment issues, would be to render both of them into RawImage UI elements and overlay them on a canvas (rough sketch below).
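Something along these lines, untested and with placeholder names: the virtual-content camera renders into a transparent RenderTexture shown in a RawImage on top of the RawImage that displays the CFA texture. The `OnCameraFrame` hook is an assumption for wherever you already receive the frame texture in your project.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Rough sketch: overlay the raw camera frame and a render of the virtual
// content as two RawImages on a screen-space canvas. "cameraFrameTexture"
// is assumed to be the Texture2D you already build from the Camera Frame
// Access sample; "virtualCamera" is a second camera rendering only the
// virtual (AR) layer.
public class CameraOverlayCanvas : MonoBehaviour
{
    [SerializeField] RawImage cameraFrameImage;    // bottom layer: real world
    [SerializeField] RawImage virtualContentImage; // top layer: virtual content
    [SerializeField] Camera virtualCamera;

    RenderTexture virtualRT;

    void Start()
    {
        // Render the virtual content into an offscreen texture with a
        // transparent clear color so only the virtual objects are visible.
        virtualRT = new RenderTexture(Screen.width, Screen.height, 24);
        virtualCamera.targetTexture = virtualRT;
        virtualCamera.clearFlags = CameraClearFlags.SolidColor;
        virtualCamera.backgroundColor = new Color(0f, 0f, 0f, 0f);
        virtualContentImage.texture = virtualRT;
    }

    // Hypothetical hook: call this whenever a new camera frame texture is
    // available from the frame access sample.
    public void OnCameraFrame(Texture2D cameraFrameTexture)
    {
        cameraFrameImage.texture = cameraFrameTexture;
    }

    void OnDestroy()
    {
        if (virtualRT != null) virtualRT.Release();
    }
}
```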
A more involved approach would be to merge them with Graphics.Blit, taking the camera extrinsics and intrinsics into account; a sketch of that compositing step follows.
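For the Graphics.Blit route, the compositing itself could look roughly like the following. The `_OverlayTex` property and the alpha-blend shader behind the material are assumptions, not an existing API, and proper alignment would still require warping the camera image using the intrinsics/extrinsics, which this sketch omits.

```csharp
using UnityEngine;

// Rough sketch only. Assumes a simple hypothetical alpha-blend shader that
// blends _OverlayTex over the source texture. Reprojection of the camera
// image with the camera intrinsics/extrinsics is omitted here.
public class ARCompositeBlit : MonoBehaviour
{
    [SerializeField] Material compositeMaterial;    // material using the blend shader
    [SerializeField] RenderTexture compositeTarget; // e.g. the texture fed to WebRTC

    public void Compose(Texture cameraFrame, Texture virtualContent)
    {
        // Hand the virtual-content texture to the shader...
        compositeMaterial.SetTexture("_OverlayTex", virtualContent);
        // ...and blit the camera frame through it, so the shader blends the
        // overlay on top using its alpha channel.
        Graphics.Blit(cameraFrame, compositeTarget, compositeMaterial);
    }
}
```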
We do not have this on our roadmap currently. I will be filing this as a feature request for our product team to consider.
Shiota. masayoshi
AR Foundation / AR Camera can provide a "camera stream with AR overlay" on Android/iOS.
https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@5.1/manual/index.html (see the image at the top of the page; direct link: https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@5.1/manual/images/sample-simple-ar.png)
I want to get a "camera stream with AR overlay" like the one shown in that image.
Do you have any plans to implement it?
I want to send a "Camera stream with AR overlay" using WebRTC.
The ARCameraManager Camera Frame Access sample ( https://docs.spaces.qualcomm.com/unity/samples/CameraFrameAccessSample.html ) provides only the plain camera stream.