Hi, I'm Justin Sneddon, Group Product Manager for Niantic Lightship, here to tell you how you can supercharge your AR experiences with our latest AR development kit (ARDK). Let's dive right in and explore the magic!
Beyond the AR Horizon
Lightship ARDK 3.0 takes what ARKit and ARCore offer in Unity via AR Foundation and cranks it up a notch. But that's just the beginning. Lightship's tools are designed to fill in the gaps and push the boundaries of computer vision technology. Buckle up, because we're about to take you through some game-changing features: Depth, Meshing, Semantics, Navigation, Shared AR (Multiplayer), the Visual Positioning System (VPS), and Debugging Tools (Playback and Mocking).
Depth - The Foundation of AR Awesomeness
Depth is the secret sauce behind every AR experience. It’s what helps us figure out where to place objects and how they should interact with the real world. Lightship’s depth is something truly special. Why, you ask? Well, it all comes down to our passion for getting people outdoors.
Lightship's depth is trained on vast outdoor environments, which means it can provide incredibly accurate depth from a single camera. Plus, it's not limited to a short range like the LiDAR sensor on iPhone Pros. Lightship's depth can reach a whopping 40+ meters, and it works on all AR-capable phones (yes, that includes all iPhones and most Androids)!
And why does that extended range matter? Imagine summoning a massive dragon into your AR world, a creature whose wingspan far exceeds LiDAR's roughly 5-meter range. With Lightship's long-range depth, you can place it 10 to 20 meters away from your camera and capture every breathtaking detail.
What else can you do with this supercharged depth? Let me break it down for you:
- Placement: Convert a screen point to a real-world position and place digital objects there.
- Measurement: Know the distance to anything on your screen.
- Occlusion: Use depth information to seamlessly blend digital objects into the real world (sketched just below).
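To make the occlusion idea concrete, here's a minimal fragment-shader sketch. The `_EnvironmentDepth` texture and the `v2f` fields are assumptions for this example, not Lightship's actual API; the idea is simply to hide any virtual fragment that sits behind the real world.

```hlsl
// Illustrative sketch only: _EnvironmentDepth and the v2f fields are
// assumptions for this example, not Lightship's actual API.
sampler2D _MainTex;          // the virtual object's texture
sampler2D _EnvironmentDepth; // real-world depth in meters, per pixel

struct v2f
{
    float4 pos      : SV_POSITION;
    float2 uv       : TEXCOORD0;
    float2 screenUV : TEXCOORD1; // screen-space UV, computed in the vertex shader
    float  eyeDepth : TEXCOORD2; // fragment distance from the camera, in meters
};

float4 frag(v2f i) : SV_Target
{
    // Real-world depth at this pixel.
    float realDepth = tex2D(_EnvironmentDepth, i.screenUV).r;

    // Discard the fragment when the real world is closer than the object,
    // so physical geometry appears to hide the digital one.
    clip(realDepth - i.eyeDepth);

    return tex2D(_MainTex, i.uv);
}
```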
But wait, there's more! When you combine depth with semantics (stay tuned for that!), the possibilities become endless. Visual effects like pulses, depth of field, toon filters, edge detection, and more come to life. I'll walk you through how to create the first two.
Pulse Effects
This is easy to do: simply follow our How-To Guide for Depth, which will give you the image on the far left.
You’ll then want to:
- Modify the shader to colorize at a specific depth from the camera.
- Update that to use depth + time so that it will pulse across your scene like in the second image.
- And instead of showing depth, choose to display the pulse.
You now have a radar effect! Here is the kind of function you can add to the Depth Overlay shader to build this effect.
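A minimal sketch, assuming depth arrives in meters; the function name and tuning constants here are illustrative, not from the How-To Guide. It uses Unity's built-in `_Time` to sweep a ring of color outward from the camera:

```hlsl
// Returns a radar-style pulse color for a fragment at the given depth.
// _Time is Unity's built-in time vector; _Time.y is seconds since load.
float4 GetPulseColor(float depth)
{
    const float pulseSpeed = 4.0;  // meters per second
    const float ringWidth  = 1.0;  // thickness of the pulse, in meters
    const float maxRange   = 40.0; // wrap the pulse at Lightship's depth range

    // Current distance of the expanding ring from the camera.
    float ringDepth = fmod(_Time.y * pulseSpeed, maxRange);

    // 1.0 exactly on the ring, fading to 0.0 within ringWidth of it.
    float intensity = 1.0 - saturate(abs(depth - ringDepth) / ringWidth);

    // Tint the ring green, like a radar sweep.
    return float4(0.0, intensity, 0.0, intensity);
}
```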
You can call it in the frag shader like this:
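Something along these lines, assuming the setup from the Depth How-To Guide with the camera image in `_MainTex` and the depth image in `_DepthTex` (your variable names may differ):

```hlsl
float4 frag(v2f i) : SV_Target
{
    float4 camColor = tex2D(_MainTex, i.uv);

    // Eye depth in meters, sampled from the depth texture.
    float depth = tex2D(_DepthTex, i.uv).r;

    // Blend the pulse over the camera image instead of visualizing depth.
    float4 pulse = GetPulseColor(depth);
    return lerp(camColor, pulse, pulse.a);
}
```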
Depth of Field
You can artificially add focus to parts of your image to mirror common photographic techniques. For example, you can use depth to decide which parts of an image should be in focus and which should not.
In the first image, you can see that we have written a simple box blur: take the four surrounding pixels and average them.
In the next image, we use depth to increase the amount of blur the farther things are from you, which makes the effect nicer: close things come into focus, and things that are farther away get more and more blurred.
And then in the third image, we changed the center point for the blur radius to wherever you click or touch on the screen. That way, the user can select the part of the image they would like to have in focus.
To do this, you can make a few small tweaks to our Depth How-To Guide by adding a blur and passing it a point that you pick on the screen. Here are some of the key code snippets for this example.
In the shader, you will need to sample a few pixels and blur them.
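For example, a simple box blur along these lines (a sketch; `_MainTex_TexelSize` is Unity's standard texel-size helper, and the radius scaling is up to you):

```hlsl
float4 _MainTex_TexelSize; // auto-filled by Unity: xy = 1 / texture size

// Averages the four pixels surrounding uv at the given radius (in texels).
float4 BoxBlur(sampler2D tex, float2 uv, float radius)
{
    float2 offset = _MainTex_TexelSize.xy * radius;

    float4 sum = tex2D(tex, uv + float2( offset.x, 0.0));
    sum += tex2D(tex, uv + float2(-offset.x, 0.0));
    sum += tex2D(tex, uv + float2(0.0,  offset.y));
    sum += tex2D(tex, uv + float2(0.0, -offset.y));

    return sum * 0.25;
}
```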
Be sure to update this with a small amount of logic that uses the distance from the point you pick on screen, rather than depth/100.0f, as the blur amount.
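A sketch of that change; `_FocusPoint` is a hypothetical uniform you would set from C# (for example via material.SetVector) with the normalized touch position:

```hlsl
float2 _FocusPoint; // normalized screen position of the user's tap, set from C#

float4 frag(v2f i) : SV_Target
{
    // 0 at the tapped point, growing toward the edges of the screen.
    float focusDist = distance(i.uv, _FocusPoint);

    // Use that distance, scaled, as the blur radius instead of depth/100.0f.
    float radius = focusDist * 8.0;

    return BoxBlur(_MainTex, i.uv, radius);
}
```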
And, there you have it, folks! Niantic Lightship is all about taking your AR game to new heights. If you’re as excited as I am, you can dig deeper into these features with my upcoming blog posts, complete with examples and source code.
Next up, I’ll dive into Meshing, Semantics and Navigation. Until then, stay curious and stay creative! 🚀✨