This behavior is expected: the camera frame access feature gives you direct access to the device camera feed, without any rendered augmentations.
Linking a related thread here: Motorola's AR Cast is currently the best option for capturing a video of your application.
If you want to do it yourself, you would need to combine the camera image texture with the texture rendered by Unity in an additional post-processing step. The performance impact of this would be rather high, and aligning the two kinds of content might not be straightforward either. If you would like to try anyway, a good starting point is Graphics.Blit in the Unity Scripting Reference.
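To make the Graphics.Blit suggestion concrete, here is a minimal sketch of what such a compositing step could look like. This is an assumption-heavy illustration, not SDK code: the camera feed texture, the blend material, and the `_CameraTex` shader property are all placeholders for whatever your frame-access API and compositing shader actually provide.

```csharp
using UnityEngine;

// Hypothetical sketch: composite the device camera feed with the frame
// rendered by Unity. Attach to the rendering camera.
public class CameraComposite : MonoBehaviour
{
    // RGB feed from the glasses camera (assumed to be supplied by the
    // camera frame access API as a Texture).
    public Texture cameraFeedTexture;

    // Material using a custom shader that blends the rendered frame
    // (_MainTex) over the camera image (_CameraTex) -- shader not shown.
    public Material blendMaterial;

    void OnRenderImage(RenderTexture src, RenderTexture dst)
    {
        // Hand the camera feed to the shader, then blit the rendered
        // frame through the blend material into the destination.
        blendMaterial.SetTexture("_CameraTex", cameraFeedTexture);
        Graphics.Blit(src, dst, blendMaterial);
    }
}
```

Note that `OnRenderImage` adds a full-screen pass per frame, which is where the performance cost mentioned above comes from; on a tiled mobile GPU this is a significant bandwidth hit.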
Thanks Simon, we've done this kind of blending on other hardware and I can confirm it is a costly operation.
Where does AR Cast take the stream from? Does it blend it at runtime, or is there a native, already-blended stream available somewhere deep inside?
ARCast uses the Android Camera2 API to access the glasses camera and then overlays it with the virtual content from the running app at runtime. There is no native stream that is already blended together.
Is it possible to get a camera stream with the AR content blended in when using 0.11.1? The sample app displays only the raw RGB camera feed, with no virtual objects.