ARDK 2.2 launches with experimental features like Playback Support
Learn more about playback support, person segmentation, and more.

What's new with Lightship?

We’re always looking for new ways to unlock interaction and engagement for experiences developed on the Lightship platform. In this release of ARDK, we’re launching both improved and experimental features to our mapping, developer experience, and understanding tools that will allow our developer community more freedom in what they can create.

Playback Support (Experimental)

One exciting release is Playback, an experimental tool available on Intel Macs that lets developers record real-world AR session data in the moment and play it back in Virtual Studio. Playback gives developers a faster and more reliable way to develop with real AR data.

The main purpose of this tool is to enable effective at-desk development of AR experiences by providing a variety of live AR data, like meshing, depth, segmentation, and Wayspot Anchors, without the burden of deploying to a phone, traveling to physical locations, dealing with lag, or tethering over a USB cord.

With Playback, you can capture a recording of an experience under test at any physical location, then share that recording with your development team to iterate on without having to return to that location. This has the potential to save hours of development time and lets teams who are physically apart collaborate instantly.

Person Segmentation

With Person Segmentation, ARDK instantly creates a mask of the people who appear in the scene, letting developers control how characters and objects interact with people in their AR experiences. This new level of segmentation makes it possible for developers to blur people from frames, add fun filters, or create virtual items like accessories or clothing.
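The blur use case above can be sketched with plain NumPy. This is an illustrative stand-in, not ARDK's API: it assumes you already have a per-pixel boolean person mask (the kind of output a segmentation channel produces) and simply composites a box-blurred frame inside the mask.

```python
import numpy as np

def blur_people(frame: np.ndarray, mask: np.ndarray, k: int = 9) -> np.ndarray:
    """Blur only the pixels covered by the person mask.

    frame: H x W x 3 uint8 image; mask: H x W bool array (True = person).
    A simple k x k box blur stands in for whatever filter an app prefers.
    (Hypothetical helper for illustration; not part of ARDK.)
    """
    pad = k // 2
    padded = np.pad(frame.astype(np.float32),
                    ((pad, pad), (pad, pad), (0, 0)), mode="edge")
    # Naive box blur: average each k x k neighborhood.
    blurred = np.zeros_like(frame, dtype=np.float32)
    for dy in range(k):
        for dx in range(k):
            blurred += padded[dy:dy + frame.shape[0], dx:dx + frame.shape[1]]
    blurred = (blurred / (k * k)).astype(np.uint8)
    # Composite: blurred pixels inside the mask, original pixels outside.
    return np.where(mask[..., None], blurred, frame)
```

In a real pipeline the mask would come from the segmentation channel each frame, and the blur would typically run on the GPU rather than in a Python loop.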

We’re excited to offer this new layer of segmentation, and we wanted to make sure it was robust and unbiased by taking a people-first approach. At Niantic, we strive for our technology to be inclusive and fair by following strict equality and fairness practices when building, evaluating, and deploying our models. We define person segmentation fairness as follows: a model makes fair predictions if it performs equally well on images depicting any of the identified subgroups. Our evaluation focuses on measuring the performance of the person segmentation channel across the three main subgroups: geographical region, skin tone, and gender.

To learn more about how we built this model, visit here.

Palm Tracking (Experimental)

Experiences built on Lightship can now react to palm movement: developers can detect an open, face-up hand and enable simple hand-based interactions in AR. With this experimental feature, people might place a creature on their hand, have it approach their hand, or push things into a scene. This is the first in a set of Human AR capabilities we will launch in Lightship over time.
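The "open hand that is face up" check can be sketched with a little vector math. This is a hedged illustration, not ARDK's detector: it assumes some hand-tracking source reports a palm normal vector, and simply tests whether that normal points within a tolerance angle of world up.

```python
import math

def is_palm_up(palm_normal, up=(0.0, 1.0, 0.0), max_angle_deg=30.0):
    """Return True if the palm normal points within max_angle_deg of world up.

    palm_normal: (x, y, z) vector reported by a hand tracker (hypothetical
    input for illustration; ARDK's actual palm detection runs internally).
    """
    nx, ny, nz = palm_normal
    norm = math.sqrt(nx * nx + ny * ny + nz * nz)
    if norm == 0.0:
        return False  # degenerate vector: no orientation to test
    # Cosine of the angle between the (normalized) palm normal and up.
    cos_angle = (nx * up[0] + ny * up[1] + nz * up[2]) / norm
    return cos_angle >= math.cos(math.radians(max_angle_deg))
```

An app might run a check like this each frame and, while it holds true, anchor a creature to the detected palm position.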

Wayspot Anchor API Updates

Wayspot Anchors enable content to be tied to a particular VPS-activated location with a consistent, stable pose that persists across AR sessions and users. In ARDK 2.2, we’ve added some high-reward changes, including eight new APIs to help you better troubleshoot and resolve Wayspot Anchors. We’ve also added a new Wayspot Anchor GameObject, making it easier for developers to implement a smoothing function for position changes of GameObjects that track a WayspotAnchor.
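The kind of smoothing described above can be sketched as exponential smoothing of position updates. This is a minimal Python illustration of the technique, not the ARDK GameObject helper (which is implemented in C# inside Unity); the class name and alpha parameter are assumptions for the sketch.

```python
class PositionSmoother:
    """Exponentially smooth discrete position updates, e.g. from an anchor
    whose tracked pose jumps as the underlying localization estimate refines.

    alpha in (0, 1]: higher values follow new updates more aggressively.
    """

    def __init__(self, alpha: float = 0.25):
        self.alpha = alpha
        self._pos = None  # (x, y, z); unset until the first update

    def update(self, target):
        if self._pos is None:
            # Snap to the first reported position rather than lerping from origin.
            self._pos = tuple(target)
        else:
            # Move a fixed fraction of the remaining distance each update.
            self._pos = tuple(p + self.alpha * (t - p)
                              for p, t in zip(self._pos, target))
        return self._pos
```

For example, with alpha=0.5, a GameObject whose anchor jumps from (0, 0, 0) to (1, 0, 0) glides through (0.5, 0, 0), then (0.75, 0, 0), instead of teleporting, which hides small corrections in the anchor's resolved pose.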

To learn more about everything included in this release and others, view our release notes. Want to start building with these new features? Sign in or register to download Lightship ARDK 2.2.

Published August 30, 2022
Ready to build?