I created this video, https://youtu.be/wenL3o45eEI, as an initial test and to see how I could best record video. I used AR Cast to send the video feed locally to my computer and recorded from there. Two things I noticed: 1) when you drop into immersive mode, the recording stops capturing the real world/passthrough, and 2) the framerate is quite low, so the video appears choppy. I've seen some videos from the AWE presentations that looked much smoother and with the real world still visible. I'd love to know if there are established best practices here, or if you have any tips.
I would also like to ask about this. I tried the local method, but it often lost FPS.
I also tried cloud streaming, but I could not log in with my account (even after applying for a new one).
I also saw the AWE and Twitter videos; in some of them, the virtual objects are recorded without transparency.
How is this done?
With the current AR Cast recordings, the virtual objects appear at maybe 40%~50% of the opacity I see in the glasses.
Some videos from AWE were shot directly through the glasses by positioning a digital camera behind the lens; others were shot using AR Cast, by screen-recording the AR Cast video stream on a Mac.
We really need a recording app that doesn't have to cast to another device. AR Cast is heavy, and it no longer works well with our app since we added hand tracking: it loses a lot of FPS, and hand tracking stops after a while when AR Cast is on.
Lenovo maintains the AR Cast app, and we do not provide a Qualcomm alternative for Spaces at this time. If you would like to submit a feature request, please do so with sufficient detail on our Product Roadmap page: https://spaces.qualcomm.com/snapdragon-spaces-roadmap/