Glossary

Agent - A character who can navigate on a Navigation Mesh.

AR Scene - A Unity Scene that has an ARSession and XROrigin setup to enable AR. See Setting up an AR Scene.

Augmented Reality (AR) - Augmented Reality is the result of adding digital objects and experiences to real-world locations.

Colocalization - The process of placing two or more users into a Shared AR experience based on their location. See Shared AR.

Component - A Unity Component.

Dataset - A recording of your AR application in a real-world location for playback and testing in the Unity editor. See How to Create Datasets for Playback.

Depth - An estimation of how far objects are from the camera. This is used to place AR objects convincingly. See Depth.

Depth Occlusion - Using depth to estimate whether an AR object would be visually blocked by a real-world object. See the Depth Occlusion sample.
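
To make the idea concrete, here is a minimal sketch of the per-pixel test behind depth occlusion, assuming you already have the real-world depth and the virtual object's depth for the same pixel, both in meters from the camera. It is illustrative only, not the Lightship API.

```csharp
// Illustrative sketch: the per-pixel comparison behind depth occlusion.
// Assumes both depths are measured in meters from the camera for the same pixel.
public static class DepthOcclusionExample
{
    // Returns true when the real-world surface is closer than the virtual object,
    // meaning the virtual pixel should be hidden (occluded).
    public static bool IsVirtualPixelOccluded(float realWorldDepthMeters, float virtualDepthMeters)
    {
        return realWorldDepthMeters < virtualDepthMeters;
    }
}
```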

Geospatial Browser - Allows you to search and find VPS-Activated Locations across the globe. You can select the location you want to use for your project and download the associated 3D mesh assets. See How to Use the Geospatial Browser.

Image Tracking Colocalization - Using an image in the real world to Colocalize users. See Shared AR.

Localization - The process of placing a user into an AR experience. See Creating Location AR Experiences with VPS.

Location AR - Using a real-world location to act as the center of an AR experience. See Creating Location AR Experiences with VPS.

Meshing - Meshing uses depth and tracking data to generate a mesh representing the estimated geometry of the scanned real world. See Meshing.

Mock Colocalization - Mock Colocalization uses the Colocalization feature to test networking without creating a shared AR experience. See Shared AR.

Navigation Mesh - A special kind of mesh that allows AR objects to navigate real-world obstacles. See Navigation Mesh.

Neural Network Model - Neural Network Models are trained to enable features such as Depth or Semantic Segmentation to estimate per-pixel values, like distance or class labels, in an AR environment. See Neural Network Model Preloading.

Niantic Wayfarer - An app that allows you to capture and submit scans at real-world locations to improve the coverage of Lightship VPS. See How to Install and Use Niantic Wayfarer.

Object Detection - Identifies a subset of objects captured by the camera. See Object Detection for more details and a list of the objects that can be identified.

Occlusion - Occlusion gives depth to virtual objects, allowing them to appear behind or in front of objects in the real world. See Occlusion.

ParrelSync - ParrelSync is a Unity editor extension that allows users to test multiplayer gameplay without building the project by opening a second Unity editor window that mirrors changes from the original project.

Persistent Anchors - An object placed in an AR scene to serve as the shared reference point (hence 'anchor') for other network-spawned objects in the scene. The anchor is shared between all participants in a shared AR experience and persists between sessions.

Playback - A feature that allows you to import pre-recorded video of specific locations (such as a Dataset) and run it in the Unity editor. See Playback.

Playback Data - A Dataset you created for playback in the Unity editor. See How to Create Datasets for Playback.

Private VPS Locations - VPS locations in your surrounding area used for testing in the Niantic Wayfarer app.

Project Validation - Niantic Lightship offers an upgraded version of the Unity Project Validation System, allowing developers to check their projects for common errors, such as configuration issues in Scenes and Projects. See Project Validation.

Real-World Occlusion - Depth Occlusion as applied to real-world objects. See Occlusion.

Real-World Objects - Objects in the real world identified by Semantic Segmentation. See How to Query Semantics to Find Real-World Objects.

Real-World Position - The position in the real world that matches a screen point on a scene captured by a camera. See How to Convert a Screen Point to Real-World Position Using Depth.
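
As a rough illustration of the conversion (not the Lightship API), the sketch below assumes a depth value in meters has already been sampled from a depth image at the screen point; Unity's Camera.ScreenToWorldPoint then treats that value as the distance from the camera.

```csharp
using UnityEngine;

// Minimal sketch, assuming depthMeters was sampled from a depth image at the
// given screen point; the class and method names here are illustrative.
public static class ScreenPointToWorldExample
{
    public static Vector3 ToRealWorldPosition(Camera camera, Vector2 screenPoint, float depthMeters)
    {
        // Camera.ScreenToWorldPoint interprets the z component as the distance
        // from the camera, so the sampled depth plugs in directly.
        return camera.ScreenToWorldPoint(new Vector3(screenPoint.x, screenPoint.y, depthMeters));
    }
}
```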

Room - A virtual space, also called a Shared Room, where multiple players can participate in the same Shared AR experience.

Simulation - The ability to move the camera in a virtual environment to test AR features. See How to Set Up and Run Lightship Simulation for details.

Screen Point - A 2D position in a camera image. See How to Convert a Screen Point to Real-World Position Using Depth.

Semantics - See Semantic Segmentation.

Semantic Depth Suppression - Enables a user to suppress depth values for specific semantic channels by pushing them out to the far depth plane. See Lightship Occlusion Extension.

Semantic Segmentation - The process of assigning class labels to specific regions in an image. See Semantics.

Shared AR - An AR experience that multiple players can participate in. See Shared AR.

Shared Room - A virtual space where multiple players can participate in the same Shared AR experience.

Shared Objects - A shared object is a virtual object in a real-world location visible to multiple players. See How to Display Shared Objects.

Test Scans - A Private VPS Location used for testing.

Visual Positioning System (VPS) - Lightship VPS lets you synchronize your device with real-world locations by locating and understanding real-world VPS-Activated Locations. See Creating Location AR Experiences with VPS for more information.

VPS-Activated Location - A unique or notable, publicly accessible, real-world location that Lightship VPS apps can engage with. See Creating Location AR Experiences with VPS.

VPS Coverage Areas - Geographic regions where users can localize with VPS and interact with persistent AR content. See Creating Location AR Experiences with VPS.

VPS Localization - The process of using a VPS-Activated Location to start a Shared AR experience. See Creating Location AR Experiences with VPS.

World Positioning System (WPS) - Provides a stable geographic position and orientation for the device and support for conversions between geographic and AR tracking coordinates. See Building an App Using WPS.