Version: 3.4

Sample Projects

We made the ARDK samples to show the quickest way to use each of our features. The samples project includes a menu and multiple small samples that you can try out and look through to learn how to use our features. You can follow our How-tos to see how to add the features to your own apps.

In addition to these sample projects, we also offer Emoji Garden as a completed AR game that you can disassemble to learn about best practices for creating your own AR applications! Head to the Emoji Garden feature page to learn more about it and download the project.

Installing the Samples

  1. You can either download the sample directory manually or clone the samples repo with git.
    1. To clone the repo, run git clone https://github.com/niantic-lightship/ardk-samples.git in the directory where you want the repository to live. Run git pull periodically to refresh the repository and download new samples as they are added.
    2. To download the sample directory, navigate to https://github.com/niantic-lightship/ardk-samples. Use the Code button and select Download ZIP. Open the zip file and save the contents in your Unity directory or wherever you store your Unity projects.
  2. Open the samples project in Unity.
  3. Navigate to the Project tab. Each sample contains a Unity scene which demonstrates one ARDK feature. These include:
    • Samples/Depth/Scenes/DepthDisplay.unity
    • Samples/Depth/Scenes/DepthOcclusion.unity
    • Samples/Semantics/Scenes/SemanticsDisplay.unity
    • Samples/Depth/Scenes/DepthSemanticSuppression.unity
    • Samples/Meshing/Scenes/NormalMeshes.unity
    • Samples/NavigationMesh/Scenes/NavigationMesh.unity
    • Samples/PersistentAR/Scenes/RemoteAuthoring.unity
    • Samples/PersistentAR/Scenes/VPSLocalization.unity
    • Samples/SharedAR/Scenes/SharedARVPS.unity
    • Samples/SharedAR/Scenes/ImageTrackingColocalization.unity
    • Samples/Scanning/Scenes/Recording.unity
    • Samples/Meshing/Scenes/TexturedMesh.unity
  4. For some samples, including RemoteAuthoring, VPSLocalization, and SharedARVPS, you will need to add a Lightship key to Unity.

Running the Samples in Unity 2022

Our sample project is set up for Unity 2021.3.29f1, but you can update it to 2022.3.16f1 if you would prefer to work in Unity 2022. To update the sample:

  1. In Unity Hub, under Installs, install 2022.3.16f1 if you do not have it already.
  2. Under Projects, find the ARDK sample project. Click on the Editor Version and change it to 2022.3.16f1. Then click the Open with 2022.3.16f1 button.
  3. When the Change Editor version? dialog comes up, click Change Version.
  4. When the Opening Project in Non-Matching Editor Installation dialog comes up, click Continue.
  5. Disable the custom base Gradle template:
    1. In the Unity top menu, click Edit, then Project Settings.
    2. In the left-hand Project Settings menu, select Player, then click the Android tab.
    3. Scroll down to Publishing Settings, then un-check the box labeled Custom Base Gradle Template.
  6. In the Window top menu, open the Package Manager. Select Visual Scripting from the package list, then, if you are using version 1.9.0 or earlier, click the Update button.
  7. If there are any errors, the Enter Safe Mode? dialog will pop up. Click Enter Safe Mode to fix the errors.

Samples

Each sample resides in its own folder. You will see a structure like this:

\NameOfSample
    \Scenes
    \Materials
    \Prefabs
    \Shaders

This makes it easy to copy one sample into another project. Everything is in one place, so you can export a sample as a prefab and re-import it into your new project.

tip

You may need to copy some shared materials and assets from the \Common folder.

Depth Display

The depth scene demonstrates how to get the depth buffer and display it as an overlay in the scene. Open DepthDisplay.unity in the Depth folder to try it out.
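
To reproduce the core of this sample in your own scene, here is a minimal sketch, assuming a standard ARFoundation AROcclusionManager and a full-screen RawImage (the RawImage wiring is our assumption, not the sample's exact approach):

```csharp
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.XR.ARFoundation;

// Minimal sketch: fetch the environment depth texture each frame and show it
// in a full-screen RawImage. Raw depth is a single-channel metric-depth
// texture, so in practice you would sample it through a material that remaps
// depth to a visible color gradient.
public class DepthOverlay : MonoBehaviour
{
    [SerializeField] private AROcclusionManager _occlusionManager; // standard ARFoundation component
    [SerializeField] private RawImage _overlay;                    // assumed full-screen UI element

    private void Update()
    {
        // Null until the platform starts producing depth.
        var depthTexture = _occlusionManager.environmentDepthTexture;
        if (depthTexture != null)
        {
            _overlay.texture = depthTexture;
        }
    }
}
```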

Occlusion

This scene demonstrates occlusion by placing a static cube in front of the camera. Because the cube does not move, you can walk around and inspect the occlusion quality directly. To open it, see DepthOcclusion.unity in the Depth folder. This sample also demonstrates two advanced occlusion options available in Lightship, Occlusion Suppression and Occlusion Stabilization, which reduce flicker and improve the visual quality of occlusions using input from either semantics or meshing. For more information on how these capabilities work, see the How-To sections for Occlusion Suppression and Occlusion Stabilization.
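
For reference, this is roughly how depth-based occlusion is requested through ARFoundation. The suppression and stabilization toggles are Lightship-specific components configured in the sample scene and are not shown here:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Minimal sketch: request depth-based occlusion via the standard
// ARFoundation AROcclusionManager on the AR camera.
public class OcclusionSetup : MonoBehaviour
{
    [SerializeField] private AROcclusionManager _occlusionManager;

    private void Start()
    {
        // Ask for the highest-quality environment depth the device supports.
        _occlusionManager.requestedEnvironmentDepthMode = EnvironmentDepthMode.Best;

        // Prefer occluding virtual content with environment depth.
        _occlusionManager.requestedOcclusionPreferenceMode =
            OcclusionPreferenceMode.PreferEnvironmentOcclusion;
    }
}
```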

Semantics

This scene demonstrates semantics by applying a shader that colors anything recognized on-screen as part of a semantic channel. To open this sample, see SemanticsDisplay.unity in the Semantics folder.

To use this sample:

  1. Select a semantic channel from the drop-down list.
  2. Look for the corresponding object(s) on your phone camera.
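
Under the hood, the sample queries the segmentation model each frame. The sketch below shows the general shape of that query using the ARDK 3 ARSemanticSegmentationManager; treat GetSemanticChannelTexture, its signature, and the shader property names as assumptions, and check the Semantics how-to for the exact API:

```csharp
using UnityEngine;
using Niantic.Lightship.AR.Semantics;

// Hedged sketch: fetch the confidence texture for one semantic channel and
// hand it to an overlay material. Verify the exact API against the current
// ARDK 3 documentation before relying on this.
public class SemanticsOverlay : MonoBehaviour
{
    [SerializeField] private ARSemanticSegmentationManager _semanticsManager;
    [SerializeField] private Material _overlayMaterial;
    [SerializeField] private string _channel = "sky"; // one entry from the drop-down list

    private void Update()
    {
        // Assumed call: returns a texture whose pixels hold the channel confidence.
        var texture = _semanticsManager.GetSemanticChannelTexture(_channel, out Matrix4x4 samplerMatrix);
        if (texture == null)
            return; // model not ready yet

        _overlayMaterial.SetTexture("_SemanticTex", texture);        // hypothetical shader property
        _overlayMaterial.SetMatrix("_SamplerMatrix", samplerMatrix); // maps screen UVs into the texture
    }
}
```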

Meshing

This scene demonstrates how to use meshing to generate a physics mesh in your scene. It displays the mesh using a normal shader, where the colors represent the Up, Right, and Forward directions.

To open this sample, see NormalMeshes.unity in the Meshing folder.
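
To watch mesh blocks arrive as Lightship generates them, you can subscribe to the standard ARFoundation ARMeshManager event. A minimal sketch (the logging is ours, not the sample's):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Minimal sketch: listen for mesh blocks as they are added, updated, and
// removed. In this sample, the mesh prefab assigned to the ARMeshManager
// carries the MeshCollider (for physics) and the normal-coloring material.
public class MeshLogger : MonoBehaviour
{
    [SerializeField] private ARMeshManager _meshManager; // standard ARFoundation component

    private void OnEnable() => _meshManager.meshesChanged += OnMeshesChanged;
    private void OnDisable() => _meshManager.meshesChanged -= OnMeshesChanged;

    private void OnMeshesChanged(ARMeshesChangedEventArgs args)
    {
        // Each entry is a MeshFilter on an instance of the mesh prefab.
        Debug.Log($"Meshes added: {args.added.Count}, updated: {args.updated.Count}, removed: {args.removed.Count}");
    }
}
```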

Textured Mesh

This scene demonstrates how to texture a Lightship mesh. It works like the Meshing sample but uses an example tri-planar shader that demonstrates one way to do world-space UV projection. The sample tiles three textures in the scene: one each for the ground, the walls, and the ceiling.

To open this sample, see TexturedMesh.unity in the Meshing folder.
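
For reference, here is the tri-planar idea expressed in C# rather than shader code: blend three world-axis projections by how strongly the surface normal faces each axis. This helper is a worked example, not code from the sample:

```csharp
using UnityEngine;

// Worked example of tri-planar projection math. The sample's shader does the
// equivalent per-pixel on the GPU.
public static class TriPlanar
{
    // Blend weights for the X, Y, and Z projections; they sum to 1.
    // A mostly-horizontal surface (ground or ceiling) weights the Y projection
    // heavily, while walls weight X or Z.
    public static Vector3 BlendWeights(Vector3 worldNormal)
    {
        var w = new Vector3(Mathf.Abs(worldNormal.x),
                            Mathf.Abs(worldNormal.y),
                            Mathf.Abs(worldNormal.z));
        return w / (w.x + w.y + w.z);
    }

    // World-space UVs for each projection: drop the axis you project along.
    public static (Vector2 uvX, Vector2 uvY, Vector2 uvZ) Uvs(Vector3 worldPos, float tiling)
    {
        return (new Vector2(worldPos.z, worldPos.y) * tiling,  // project along X
                new Vector2(worldPos.x, worldPos.z) * tiling,  // project along Y
                new Vector2(worldPos.x, worldPos.y) * tiling); // project along Z
    }
}
```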

Navigation Mesh

This scene demonstrates using meshing to create a navigation mesh. As you move around, the sample creates and grows a navigation mesh that you can tap to tell an AI agent to move to that position. The agent can walk around corners and jump up onto objects. To open the sample, see NavigationMesh.unity in the NavigationMesh folder. A sketch of the tap-to-move logic follows the steps below.

To view this demonstration:

  1. Build the scene to your device, then point your phone at your surroundings and move around. The game pieces should appear after a moment.
  2. Tap on a game piece to set a destination.
  3. The Player Cube will find a path along the navigation mesh to reach the selected destination.
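
The tap-to-move logic amounts to a screen-space raycast against the physics mesh followed by a destination request. A hedged sketch, assuming the agent is ARDK 3's LightshipNavMeshAgent with a SetDestination method (verify both names against the Navigation Mesh how-to):

```csharp
using UnityEngine;
using Niantic.Lightship.AR.NavigationMesh; // assumed ARDK 3 namespace

// Hedged sketch: raycast the first tap of each touch against the physics mesh
// and send the agent to the hit point.
public class TapToMove : MonoBehaviour
{
    [SerializeField] private Camera _camera;
    [SerializeField] private LightshipNavMeshAgent _agent; // the Player Cube's agent (assumed type)

    private void Update()
    {
        if (Input.touchCount == 0 || Input.GetTouch(0).phase != TouchPhase.Began)
            return;

        var ray = _camera.ScreenPointToRay(Input.GetTouch(0).position);
        if (Physics.Raycast(ray, out RaycastHit hit))
        {
            // The agent plans a path along the navigation mesh, including jumps.
            _agent.SetDestination(hit.point);
        }
    }
}
```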

Remote Authoring

note

This sample only works in portrait orientation.

This scene demonstrates localization by targeting a VPS Anchor. To open this sample, see RemoteAuthoring.unity in the PersistentAR folder.

To use this sample:

  1. Go to the Geospatial Browser.

  2. Copy the Blob for a Default Anchor of your choice.

  3. In the scene Hierarchy, navigate to XR Origin. In the Inspector window, add the Blob to the list of Default Anchor Payloads To Localize.

  4. Build the sample to your device.

  5. Physically visit the location you’ve chosen in the Geospatial Browser and localize to it.

  6. A green cube will appear at the mesh origin indicated in the Geospatial Browser by the Coordinate Axis Marker.

Changing the Blob at Runtime

You can open the Geospatial Browser on your test device, copy the Blob of a different anchor, and paste it into the Payload text box while the app is running.
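
In code, swapping in a payload amounts to constructing an anchor payload from the Blob string and asking the anchor manager to track it. A hedged sketch based on the ARDK 3 persistent-anchors API; treat the payload constructor and the TryTrackAnchor signature as assumptions:

```csharp
using UnityEngine;
using Niantic.Lightship.AR.PersistentAnchors; // assumed ARDK 3 namespace

// Hedged sketch: begin tracking an anchor from a pasted payload string.
public class PayloadLoader : MonoBehaviour
{
    [SerializeField] private ARPersistentAnchorManager _anchorManager;
    [SerializeField] private GameObject _cubePrefab; // spawned at the anchor origin

    // Call this with the Blob copied from the Geospatial Browser.
    public void TrackPayload(string base64Payload)
    {
        var payload = new ARPersistentAnchorPayload(base64Payload); // assumed constructor
        if (_anchorManager.TryTrackAnchor(payload, out ARPersistentAnchor anchor))
        {
            // Parent content to the anchor; it snaps into place once localized.
            Instantiate(_cubePrefab, anchor.transform);
        }
    }
}
```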

VPS Localization

Attention

This sample requires a Lightship API Key.

This scene shows a list of VPS locations within a radius, lets you choose a Waypoint from the Coverage API as a localization target, and then interfaces with your phone's map to guide you to it. To open this sample, see VPSLocalization.unity in the PersistentAR folder.

To use this sample:

  1. Build to device and open the app. Make sure to allow location permissions.
  2. To search from your current location, set a radius and tap Request Areas. To search from another location, fill in its latitude and longitude coordinates instead.
  3. Physically visit the location and tap the Localize button.
  4. Wait for the status to change to Tracking; a cube will then appear at the mesh's origin.
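
Step 4 can also be handled in code by watching the anchor's tracking state. A hedged sketch; that ARPersistentAnchor exposes ARFoundation's trackingState is our assumption:

```csharp
using UnityEngine;
using UnityEngine.XR.ARSubsystems;
using Niantic.Lightship.AR.PersistentAnchors; // assumed ARDK 3 namespace

// Hedged sketch: spawn the cube once the anchor reports Tracking. The sample
// drives this through its own UI instead.
public class SpawnOnTracking : MonoBehaviour
{
    [SerializeField] private ARPersistentAnchor _anchor;
    [SerializeField] private GameObject _cubePrefab;

    private bool _spawned;

    private void Update()
    {
        if (_spawned || _anchor == null)
            return;

        if (_anchor.trackingState == TrackingState.Tracking) // assumed property
        {
            // The anchor's transform is the mesh origin shown in the Geospatial Browser.
            Instantiate(_cubePrefab, _anchor.transform.position, _anchor.transform.rotation);
            _spawned = true;
        }
    }
}
```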

Shared AR VPS

Attention

This sample requires a Lightship API Key.

This scene allows you to choose a Waypoint from the Coverage API and create a shared AR experience around it. To open this sample, see SharedARVPS.unity in the SharedAR folder.

To use this sample on mobile devices:

  1. Follow the VPS Localization instructions above to localize to an available location.
  2. Physically visit the location and tap the Localize button, along with 2-10 other phones. This process localizes everyone to the same location and automatically joins everyone into the same room.
  3. Wait for the status to change to Tracking; every player in the session will then see a name tag. A name tag turns red when that player loses tracking. You can hide the stats UI by tapping on it, but it will not return for that session.
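
The name-tag behavior in step 3 boils down to a color swap driven by the networking layer. An illustrative sketch; the IsTracking flag is hypothetical and would be set by the sample's shared-AR code:

```csharp
using UnityEngine;

// Illustrative sketch: color a player's name tag red while that player is not
// tracking. IsTracking is a hypothetical flag; the sample updates the
// equivalent state from its networking messages.
public class NameTag : MonoBehaviour
{
    [SerializeField] private Renderer _tagRenderer;

    // Hypothetical state supplied by the sample's networking code.
    public bool IsTracking { get; set; } = true;

    private void Update()
    {
        _tagRenderer.material.color = IsTracking ? Color.white : Color.red;
    }
}
```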

To use this sample with Playback in the Unity editor:

  1. Set up playback of the scene at a location. See How to Set Up Playback.
  2. Provide a default anchor payload string from the Geospatial Browser to use with playback. See How to Use the Geospatial Browser.
  3. Copy the default anchor payload string into the In Editor Payload field in the Vps Colocalization Demo component.
  4. Start the VPS Colocalization scene. It should use the payload string to automatically start tracking.
  5. When the network UI comes up, choose whether to join as Host or Client.

Shared AR Image Tracking Colocalization

Attention

This sample requires a Lightship API Key.

This scene allows multiple users to join a shared room without a VPS location, using a static image as the origin point. To open this sample, see ImageTrackingColocalization.unity in the SharedAR folder.

To use this sample:

  1. Print the image in Assets/Samples/SharedAR/IMG-2689.png so that it is 9cm wide.
  2. Place the image on a surface.
  3. Point the device camera at the image. Select Create New Room.
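
The core trick is that every device detects the same printed image, so the image's pose can serve as a common origin. A minimal sketch using the standard ARFoundation ARTrackedImageManager (Lightship's room joining is layered on top and not shown):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Minimal sketch: align a shared content root with the detected image so that
// every device agrees on the same origin.
public class ImageOrigin : MonoBehaviour
{
    [SerializeField] private ARTrackedImageManager _imageManager; // standard ARFoundation component
    [SerializeField] private Transform _sharedContentRoot;        // content placed relative to the image

    private void OnEnable() => _imageManager.trackedImagesChanged += OnChanged;
    private void OnDisable() => _imageManager.trackedImagesChanged -= OnChanged;

    private void OnChanged(ARTrackedImagesChangedEventArgs args)
    {
        foreach (var image in args.added)
        {
            // Align shared content with the physical image (the 9 cm print).
            _sharedContentRoot.SetPositionAndRotation(image.transform.position, image.transform.rotation);
        }
    }
}
```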

Recording

This scene allows you to scan a real-world location for playback in your editor. To open this sample, see Recording.unity in the Scanning folder. To learn how to use this sample, see How to Create Datasets for Playback.