Vision Pro Will Have an Avatar Webcam to Work with Popular Video Apps

In addition to offering immersive experiences, Apple says that Vision Pro will be able to run most iPad and iOS apps out of the box with no changes. For video chat apps like Zoom, Messenger, Discord, and others, the company says that an ‘avatar webcam’ feed will be supplied to apps in place of a real camera view, so video calls between the headset and other devices should work automatically.

Apple says that on day one, all suitable iOS and iPadOS apps will be available on the headset’s App Store. According to the company, “most apps don’t need any changes at all,” and the majority should run on the headset right out of the box. Developers will be able to opt out of having their apps on the headset if they’d like.

For video conferencing apps like Zoom, Messenger, Discord, and Google Meet, which expect access to the front-facing camera of an iPhone or iPad, Apple has done something clever on Vision Pro.

Instead of a live camera view, Vision Pro provides a view of the headset’s computer-generated avatar of the user (which Apple calls a ‘Persona’). That means video chat apps built according to Apple’s existing guidelines should work on Vision Pro without any changes to how they handle camera input.
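As a rough illustration (not code from Apple’s session), the sketch below shows the kind of ordinary AVFoundation front-camera pipeline many iOS video chat apps already use; the class name and structure here are illustrative, and the point is simply that, per Apple, this same unmodified code path would receive Persona frames on Vision Pro rather than a live camera image.

```swift
import AVFoundation

// A minimal sketch of the front-camera capture pipeline a typical iOS video
// chat app already has. Apple's claim is that this same code, unmodified,
// receives frames of the user's Persona on Vision Pro instead of a live
// camera image. "FrontCameraFeed" is an illustrative name, not Apple API.
final class FrontCameraFeed: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    private let output = AVCaptureVideoDataOutput()

    func start() throws {
        // Ask for the default front-facing camera, exactly as on iPhone or iPad.
        guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video,
                                                   position: .front) else {
            // This is where an app would normally fall back to an audio-only call.
            return
        }
        let input = try AVCaptureDeviceInput(device: device)
        session.beginConfiguration()
        if session.canAddInput(input) { session.addInput(input) }
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera.frames"))
        if session.canAddOutput(output) { session.addOutput(output) }
        session.commitConfiguration()
        session.startRunning()
    }

    // Each frame delivered here gets encoded and sent to the call, regardless
    // of whether it came from a real sensor or from the Persona renderer.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Hand the pixel buffer to the app's video encoder here.
    }
}
```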

How Apple Vision Pro ‘Persona’ avatars are represented | Image courtesy Apple

Personas use the headset’s front cameras to scan the user’s face and create a model, which is then animated according to head, eye, and hand inputs tracked by the headset.

Apple confirmed as much in a WWDC developer session called Enhance your iPad and iPhone apps for the Shared Space. The company also confirmed that apps asking for access to a rear-facing camera (such as a photography app) on Apple Vision Pro will get only black frames with a ‘no camera’ symbol. This alerts the user that there’s no rear-facing camera available, but also means that iOS and iPadOS apps will continue to run without errors, even when they expect to see a rear-facing camera.
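For apps that want to do better than silently showing those placeholder frames, the standard availability check below is one option. This is ordinary iOS API rather than anything from the session, and it’s an assumption on our part whether the headset reports no rear-facing device through this query; on iPhone and iPad it’s the usual way to gate camera-specific UI.

```swift
import AVFoundation

// A hedged sketch: rather than relying on the black 'no camera' frames
// described above, an app can check whether a rear camera actually exists
// before building its capture pipeline. Whether Vision Pro reports an empty
// result here is an assumption; on iPhone/iPad this is the standard check.
func hasRearCamera() -> Bool {
    let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.builtInWideAngleCamera],
        mediaType: .video,
        position: .back)
    return !discovery.devices.isEmpty
}

// Usage: hide photography-specific UI when no rear camera is reported.
// (hideCaptureButton() is a hypothetical app function, not a system API.)
// if !hasRearCamera() { hideCaptureButton() }
```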

There are potentially other reasons that video chat apps like Zoom, Messenger, or Discord might not work with Apple Vision Pro right out of the box, but at least as far as camera handling goes, it should be easy for developers to get video chats up and running using a view of the user’s Persona.

It’s even possible that ‘AR face filters’ in apps like Snapchat and Messenger will work correctly with the user’s Apple Vision Pro avatar, with the app being none the wiser that it’s actually looking at a computer-generated avatar rather than a real person.

In another WWDC session, the company explained more about how iOS and iPad apps behave on Apple Vision Pro without modification.

Developers can expect up to two simultaneous inputs from the headset (the user can pinch each hand as its own input), meaning apps expecting two-finger gestures (like pinch-to-zoom) should work just fine, but gestures requiring three or more fingers won’t be possible on the headset. As for apps that require location information, Apple says the headset can provide an approximate location via Wi-Fi, or a more precise location shared from the user’s iPhone.
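To illustrate the gesture half of that, the minimal sketch below (our own example, not Apple’s) uses a stock UIPinchGestureRecognizer, which only ever needs two simultaneous touches and so maps onto one pinch per hand; a gesture configured to require three or more touches would simply never fire on the headset.

```swift
import UIKit

// A minimal sketch: standard two-touch gestures such as pinch-zoom map onto
// the headset's two inputs (one pinch per hand), so a stock recognizer like
// this should keep working unmodified. "PinchZoomController" is illustrative.
final class PinchZoomController: NSObject {
    /// Attaches a pinch-to-zoom recognizer to any view (e.g. a photo view).
    func attach(to view: UIView) {
        view.isUserInteractionEnabled = true
        let pinch = UIPinchGestureRecognizer(target: self, action: #selector(handlePinch(_:)))
        view.addGestureRecognizer(pinch)
    }

    @objc private func handlePinch(_ gesture: UIPinchGestureRecognizer) {
        guard let view = gesture.view else { return }
        // Apply the incremental scale, then reset so changes accumulate smoothly.
        view.transform = view.transform.scaledBy(x: gesture.scale, y: gesture.scale)
        gesture.scale = 1.0
    }
}
```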

Unfortunately, existing ARKit apps won’t work out of the box on Apple Vision Pro. Developers will need to use a newly upgraded ARKit (and other tools) to make their apps ready for the headset. This is covered in the WWDC session Evolve your ARKit app for spatial experiences.
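For context, the rough sketch below is based on Apple’s public visionOS documentation rather than on that session, and shows the newer session-and-provider style that ARKit uses on the headset, which is structurally different from the ARSession and ARConfiguration pattern existing iOS ARKit apps are built around; porting generally means adopting this kind of structure rather than simply recompiling.

```swift
import ARKit

// A rough sketch (not from Apple's session) of the session/provider pattern
// visionOS ARKit uses. Providers take the place of the ARConfiguration
// objects iOS ARKit apps pass to ARSession.run().
func startWorldTracking() async {
    let session = ARKitSession()
    let worldTracking = WorldTrackingProvider()
    do {
        // The session runs a set of data providers; the app then queries
        // anchors and device pose from the providers asynchronously.
        try await session.run([worldTracking])
    } catch {
        print("Failed to start ARKit session: \(error)")
    }
}
```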
