Apple ARKit To Get People Occlusion, Body Tracking, High Level 'RealityKit' Framework

At its annual WWDC conference today, Apple announced big new updates to ARKit, including people occlusion and body tracking.

The company also announced a high-level AR framework called RealityKit and an easy-to-use AR creation tool called Reality Composer.

People Occlusion, Body Tracking

With previous versions of ARKit, and with Google’s ARCore, virtual objects always render on top of the camera image. If someone walked in front of an object, it would still be drawn as if the person were behind it. This looks wrong and instantly breaks the illusion that the virtual object is really in the environment.

ARKit 3 introduces real-time people occlusion: if a person walks in front of a virtual object, the object will now appear behind them.
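For developers, turning this on is a matter of opting into the new frame semantics on a session configuration. A minimal sketch (the configuration setup around the ARKit 3 calls is assumed boilerplate):

```swift
import ARKit

// Opt into people occlusion on a standard world-tracking session.
// personSegmentationWithDepth only works on supported hardware,
// so check for it at runtime first.
let configuration = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    configuration.frameSemantics.insert(.personSegmentationWithDepth)
}
// Then run the configuration on an existing ARSession:
// session.run(configuration)
```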

This understanding of human movement can also be used for body tracking, enabling use cases such as animating a virtual character in real time from human movement.
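In ARKit terms, this is exposed through a dedicated body-tracking configuration that delivers skeleton data to the session. A rough sketch of how an app might consume it (the class name, delegate wiring, and choice of joint are illustrative assumptions):

```swift
import ARKit

final class BodyTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Body tracking only runs on supported (A12-class) hardware
        guard ARBodyTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARBodyTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let bodyAnchor as ARBodyAnchor in anchors {
            // Each joint transform can be retargeted onto a rigged
            // virtual character in real time
            if let root = bodyAnchor.skeleton.modelTransform(for: .root) {
                _ = root // e.g. drive the character's hip joint from this
            }
        }
    }
}
```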

RealityKit & Reality Composer

Until now, most ARKit experiences have been developed using game engines like Unity. For app developers who just want to add AR elements, Unity has a relatively steep learning curve and a plethora of interface panels and configuration options that are irrelevant to that task.

RealityKit

RealityKit is a new high-level framework from Apple built specifically for AR. It handles all aspects of rendering, including materials, shadows, reflections, and even camera motion blur. It also handles networking for multiplayer AR apps, meaning developers won’t need to be network engineers to build shared AR experiences.

The framework’s rendering engine takes full advantage of hardware acceleration in Apple chips and performant Apple APIs.
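To give a sense of how little code this takes, here is roughly what a minimal RealityKit scene looks like: a box anchored to a detected horizontal plane, with rendering and session management handled by ARView (the size, color, and entity names are arbitrary choices for illustration):

```swift
import RealityKit
import UIKit

// ARView owns the AR session and the rendering pipeline
let arView = ARView(frame: .zero)

// A 10 cm box with a basic non-metallic material
let box = ModelEntity(
    mesh: .generateBox(size: 0.1),
    materials: [SimpleMaterial(color: .blue, isMetallic: false)]
)

// Anchor it to the first horizontal plane ARKit detects
let anchor = AnchorEntity(plane: .horizontal)
anchor.addChild(box)
arView.scene.addAnchor(anchor)
```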

Apple is also launching a new macOS tool called Reality Composer. This tool lets developers visually create AR scenes. Developers can add animations like movement, scaling, and spinning. These animations can be set to be triggered when a user taps on or comes close to an AR object.

Reality Composer scenes can be integrated directly into iOS apps using RealityKit. Alternatively, developers can use it purely as a prototyping tool.
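The integration works through loader code that Xcode generates from the Reality Composer file. As a hedged example, assuming a project file named Experience.rcproject containing a scene called Box (both names hypothetical), the generated loader might be used like this:

```swift
import RealityKit
import UIKit

let arView = ARView(frame: .zero)

do {
    // Experience.loadBox() is generated by Xcode from the
    // (hypothetical) Experience.rcproject scene named "Box"
    let boxScene = try Experience.loadBox()
    arView.scene.anchors.append(boxScene)
} catch {
    print("Failed to load Reality Composer scene: \(error)")
}
```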

Simultaneous Cameras, Multiple Faces

ARKit 3 also adds minor new features that enable new use cases. The front and rear cameras can now be used simultaneously, for example, so a user’s facial expressions could drive the AR experience.

Additionally, the selfie camera can now track multiple people, which could open up interactive facial augmentation experiences, similar to multi-person Snapchat filters.
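Both capabilities are opt-in on the session configuration. A brief sketch, assuming an ARKit 3 device that supports them:

```swift
import ARKit

// Simultaneous cameras: a world-tracking (rear camera) session that
// also receives face anchors from the front camera
let worldConfig = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsUserFaceTracking {
    worldConfig.userFaceTrackingEnabled = true
}

// Multiple faces: ask the TrueDepth camera to track as many faces
// as the device supports at once
let faceConfig = ARFaceTrackingConfiguration()
faceConfig.maximumNumberOfTrackedFaces =
    ARFaceTrackingConfiguration.supportedNumberOfTrackedFaces
```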

Supported Devices

ARKit 3’s new features are only supported on devices with Apple’s latest-generation chips, the A12 and A12X. As stated on the Apple developer website:

People Occlusion and the use of motion capture, simultaneous front and back camera, and multiple face tracking are supported on devices with A12/A12X Bionic chips, ANE, and TrueDepth Camera.

That means features like people occlusion will only work on the iPhone XS, iPhone XS Max, iPhone XR, and 2018 iPad Pro models.
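Rather than hard-coding a device list, apps can query these capabilities at runtime. A short sketch using ARKit’s own support checks:

```swift
import ARKit

// Each gated feature exposes a runtime check, so apps can degrade
// gracefully on pre-A12 hardware
let hasPeopleOcclusion = ARWorldTrackingConfiguration
    .supportsFrameSemantics(.personSegmentationWithDepth)
let hasBodyTracking = ARBodyTrackingConfiguration.isSupported
let hasSimultaneousCameras = ARWorldTrackingConfiguration.supportsUserFaceTracking

print("People occlusion supported:", hasPeopleOcclusion)
print("Body tracking supported:", hasBodyTracking)
print("Simultaneous cameras supported:", hasSimultaneousCameras)
```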

UPDATE: Added information about supported devices.
