Depth
Depth prediction gives ARDK the ability to return an estimated depth map of the real world. With this map, applications can estimate the real-world distance (in meters) from the device camera to each pixel on the screen. This feature has a variety of use cases, including (but not limited to):
- Writing depth values into the z-buffer for occlusion.
- Placing objects in the world by fetching their 3D position from the depth buffer (see the sampling sketch after this list).
- Constructing a mesh as part of the meshing system.
- Visual effects such as depth of field.
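For instance, to place an object at a tapped point, an application can read the environment depth image on the CPU and convert a position into a distance in meters. Below is a minimal sketch using the AR Foundation AROcclusionManager described under "What's new?"; the DepthSampler class name is illustrative, and the mapping from screen coordinates to depth-image coordinates (which depends on the display matrix) is omitted:

```csharp
using Unity.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Illustrative sampler: reads the environment depth image on the CPU and
// returns the estimated distance (in meters) at a normalized coordinate.
public class DepthSampler : MonoBehaviour
{
    [SerializeField] AROcclusionManager occlusionManager;

    // 'uv' is a 0..1 coordinate in the depth image's own space; converting
    // from screen coordinates (display matrix, orientation) is omitted here.
    public bool TrySampleDepthMeters(Vector2 uv, out float depthMeters)
    {
        depthMeters = 0f;
        if (!occlusionManager.TryAcquireEnvironmentDepthCpuImage(out XRCpuImage image))
            return false; // no depth image available this frame

        using (image)
        {
            // Environment depth is typically a single-plane 32-bit float image.
            if (image.format != XRCpuImage.Format.DepthFloat32)
                return false;

            XRCpuImage.Plane plane = image.GetPlane(0);
            NativeArray<float> depths = plane.data.Reinterpret<float>(1);

            int x = Mathf.Clamp((int)(uv.x * image.width), 0, image.width - 1);
            int y = Mathf.Clamp((int)(uv.y * image.height), 0, image.height - 1);
            depthMeters = depths[y * (plane.rowStride / sizeof(float)) + x];
            return true;
        }
    }
}
```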
This is an example of using the depth camera to create a 'pulse' that travels from near to far:
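A sketch of how such a pulse might be driven from a script, assuming a hypothetical full-screen material whose shader (not shown) tints pixels whose depth lies near the advancing pulse front; the DepthPulse class and the _EnvironmentDepth and _PulseDistance property names are illustrative:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Illustrative driver for a depth 'pulse' effect. Each frame it binds the
// environment depth texture to a full-screen material and advances a
// _PulseDistance parameter from near to far; a companion shader (not shown)
// would tint pixels whose depth is close to _PulseDistance.
public class DepthPulse : MonoBehaviour
{
    [SerializeField] AROcclusionManager occlusionManager;
    [SerializeField] Material pulseMaterial;   // uses the hypothetical shader
    [SerializeField] float maxDistance = 10f;  // meters
    [SerializeField] float speed = 3f;         // meters per second

    float _distance;

    void Update()
    {
        // Advance the pulse front and wrap back to the camera.
        _distance = (_distance + speed * Time.deltaTime) % maxDistance;

        var depth = occlusionManager.environmentDepthTexture;
        if (depth == null)
            return; // no depth available yet this frame

        pulseMaterial.SetTexture("_EnvironmentDepth", depth);
        pulseMaterial.SetFloat("_PulseDistance", _distance);
    }
}
```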
This is an example of using the depth camera to create a depth of field effect, focusing on one area and slightly blurring the rest:
What's new?
ARDK 3.0 is now integrated with Unity's AR Foundation Occlusion Subsystem. When Lightship is enabled in XR Settings, we provide an implementation of the subsystem that leverages Lightship's advanced depth estimation. As a developer, all you have to do is place the standard AROcclusionManager in your scene.
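A minimal sketch of that setup done from code, assuming the script is attached to the AR camera; the OcclusionSetup class name is illustrative:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Illustrative setup: ensure the AR camera carries an AROcclusionManager,
// which Lightship's occlusion subsystem implementation feeds with its
// predicted depth.
[RequireComponent(typeof(Camera))]
public class OcclusionSetup : MonoBehaviour
{
    void Awake()
    {
        var occlusion = GetComponent<AROcclusionManager>();
        if (occlusion == null)
            occlusion = gameObject.AddComponent<AROcclusionManager>();

        // Request high-quality environment depth. Providers may not honor
        // every mode; currentEnvironmentDepthMode reflects what is in use.
        occlusion.requestedEnvironmentDepthMode = EnvironmentDepthMode.Best;
    }
}
```

Adding the component in the Editor works just as well; the script form simply makes the requested depth mode explicit.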
Learn more
You can see how to use depth in our depth-related How-tos:
You can also look at our depth-related samples: